# announcements
s
One more question 🙂: Is there a way we can run Dagster as a microservice (executing with the Python API) and store the metadata/logs somewhere central, so that Dagit could consume them, visualize the runs, and even pick up the pipeline code and run it manually again? This would support our use case, where we run our data pipelines as a step in our overall (Kafka-event-driven) pipeline. But if we wanted to visualize errors along the way, or restart manually, Dagit could be used for that. Is that use case foreseen, and does it make sense? Or how would you approach such a case? Is that a "Dagit as a service" deployment? Thanks for any hints or comments.
n
https://docs.dagster.io/latest/deploying/instance I think this might be what you’re looking for. Dagit can use Postgres as a backing data store
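For reference, a minimal sketch of what that `dagster.yaml` (placed in `$DAGSTER_HOME`) could look like, assuming the `dagster-postgres` package is installed; the hostname, credentials, and database name are placeholders:

```yaml
# dagster.yaml in $DAGSTER_HOME
# Points run and event-log storage at a shared Postgres database so that
# Dagit and any process executing pipelines see the same run history.
run_storage:
  module: dagster_postgres.run_storage
  class: PostgresRunStorage
  config:
    postgres_url: "postgresql://user:password@pg-host:5432/dagster"

event_log_storage:
  module: dagster_postgres.event_log
  class: PostgresEventLogStorage
  config:
    postgres_url: "postgresql://user:password@pg-host:5432/dagster"
```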
a
Ya, this should actually work really well. The instance stuff isn't documented that well yet, so let us know what questions you have while getting it set up.
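To make a programmatic run land in that shared storage, the execution just needs to target the configured instance. A minimal sketch, assuming `DAGSTER_HOME` points at the `dagster.yaml` above; the pipeline and solid names are hypothetical:

```python
from dagster import DagsterInstance, execute_pipeline, pipeline, solid


@solid
def hello(context):
    context.log.info("hello from the microservice")


@pipeline
def my_pipeline():
    hello()


# DagsterInstance.get() reads $DAGSTER_HOME/dagster.yaml, so this run's
# metadata and event logs go to the shared Postgres store, where a
# separately deployed Dagit can display the run and re-execute it.
result = execute_pipeline(my_pipeline, instance=DagsterInstance.get())
assert result.success
```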
s
thank you both, will check that!