One more question 🙂: Is there a way to run Dagster as a microservice (executing pipelines via the Python API) while storing the metadata/logs somewhere central, so that Dagit could consume them, visualize the runs, and even pick up the pipeline code to re-run it manually?
This would support our use case, where our data pipelines run as one step in our overall (Kafka-event-driven) pipeline. If we wanted to inspect errors along the way, or restart a run manually, Dagit could be used for that.
Is that use case foreseen, and does it make sense? Or how would you approach such a case? Would that be a "Dagit as a service" deployment? Thanks for any hints or comments.