Hey everyone, haven’t poked around here in a while. I’m looking at whether Dagster can replace one of my typical pipeline setups: running a pipeline via Airflow or Luigi, where Airflow/Luigi run in local execution mode as a service in a single container, and individual jobs run as containers on ECS/Batch/Kubernetes.
Pipelines can take minutes to hours to run, so tracking pipeline state and being able to recover is important. Airflow and Luigi handle this by tracking individual task state and overall workflow state, persisting it to a database or to disk.
In Dagster world, is there a way to track pipeline state and recover a run from where it left off?
Looking through the code, I see ReexecutionConfig, but I’m not sure how to obtain or construct one.
Maybe by tracking execution through execute_pipeline_iterator and recording the StepOutputHandles?
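To make the idea concrete, here’s a minimal sketch of the bookkeeping I have in mind. Note the event and handle classes below are stand-ins I made up for illustration, not real Dagster types; the point is just “consume the event stream, record which steps succeeded, and use those handles to seed a retry that skips completed work”:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StepOutputHandle:
    # Stand-in for the handle Dagster associates with a step's output.
    step_key: str
    output_name: str

@dataclass
class StepEvent:
    # Stand-in for events yielded while iterating over a pipeline execution.
    step_key: str
    success: bool
    output_name: str = "result"

def record_completed_steps(events):
    """Consume the event stream and collect handles for steps that succeeded."""
    completed = []
    for event in events:
        if event.success:
            completed.append(StepOutputHandle(event.step_key, event.output_name))
    return completed

# Simulated run: two steps succeed, then one fails partway through.
events = [
    StepEvent("extract", True),
    StepEvent("transform", True),
    StepEvent("load", False),
]
handles = record_completed_steps(events)
# handles now holds the two successful steps; on retry, these are what I'd
# want to feed into something like ReexecutionConfig so only "load" re-runs.
```

Is that roughly the intended pattern, or is there a built-in mechanism that does this persistence and recovery for me?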