Simon Späti

03/06/2020, 10:47 AM
One more question 🙂: Is there a way that we can run Dagster as a microservice (executing with the Python API) and store the metadata/logs somewhere central, so that Dagit could consume them, visualize the runs, and even pick up the pipeline code and run it again manually? This would support our use case, where we run our data pipelines as a step in our overall (Kafka-event-driven) pipeline; if we want to visualize errors along the way, or restart manually, Dagit could be used for that. Is that use case foreseen, and does it make sense? Or how would you attack such a case? Is that a Dagit-as-a-service deployment? Thanks for any hints or comments.


03/06/2020, 3:03 PM
I think this might be what you're looking for: Dagit can use Postgres as a backing data store.


03/06/2020, 4:56 PM
Yeah, this should actually work really well. The instance stuff isn't documented that well yet, so let us know what questions you have while getting it set up.
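The "instance stuff" mentioned above is typically configured via a `dagster.yaml` file in the `DAGSTER_HOME` directory. A rough sketch of pointing run and event-log storage at Postgres might look like the following (the storage classes come from the `dagster-postgres` package; the connection URL is a placeholder you would replace with your own):

```yaml
# dagster.yaml in $DAGSTER_HOME -- sketch, assuming the dagster-postgres
# package is installed; the postgres_url values below are placeholders.
run_storage:
  module: dagster_postgres.run_storage
  class: PostgresRunStorage
  config:
    postgres_url: "postgresql://user:password@db-host:5432/dagster"

event_log_storage:
  module: dagster_postgres.event_log
  class: PostgresEventLogStorage
  config:
    postgres_url: "postgresql://user:password@db-host:5432/dagster"
```

With something like this in place, pipeline runs executed via the Python API and runs launched from Dagit would both write their metadata and event logs to the same Postgres database, so Dagit can visualize runs that happened elsewhere.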

Simon Späti

03/06/2020, 11:07 PM
thank you both, will check that!