# ask-community
j
I have a setup where sometimes I need to re-execute a run when input data is adjusted or added. I'd like to maintain the _run_key_ and programmatically trigger the re-execute action on the dashboard interface. I see that there's a reexecute() API function, which takes a _run_id_, but the only API I ran across to look up a _run_id_ is listed as internal and subject to change. Is there a better approach I can take?
m
which API were you thinking of? (for looking up a run_id)
j
There's a get_runs() listed here: https://docs.dagster.io/_modules/dagster/core/storage/runs/base. It seemed to match what you can do via the pipeline filters on the webpage interface.
My _run_key_ is tied to the data, so I can use that as my filter
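(For reference, a rough sketch of that kind of lookup, assuming the sensor's _run_key_ ends up on each run as the dagster/run_key tag, which is how sensor-requested runs are tagged; note this reaches through the instance into run storage, which is the part documented as internal and subject to change.)

```
from dagster import DagsterInstance

def run_ids_for_key(run_key):
    """Scan stored runs and keep those whose dagster/run_key tag matches."""
    instance = DagsterInstance.get()
    return [
        run.run_id
        for run in instance.run_storage.get_runs()  # storage-level API, the internal part
        if run.tags.get("dagster/run_key") == run_key
    ]
```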
m
are you kicking off your runs programmatically or in Dagit?
j
To generate the initial run, I have a sensor which calls RunRequest(). However, sometimes I need to re-execute some of the runs.
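(For context, a minimal sketch of such a sensor, assuming a pipeline named my_pipeline and a hypothetical find_new_data() helper that yields one identifier per new or adjusted input; the run_key is what keeps re-triggering on the same data idempotent.)

```
from dagster import RunRequest, sensor

def find_new_data():
    # Hypothetical helper: return one identifier per new or adjusted input dataset.
    return []

@sensor(pipeline_name="my_pipeline")  # placeholder pipeline name
def input_data_sensor(context):
    for data_key in find_new_data():
        yield RunRequest(
            run_key=data_key,  # also recorded on the run as the dagster/run_key tag
            run_config={},  # placeholder; real config would point the run at data_key
        )
```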
m
gotcha. I think you might be better off calling methods on the DagsterInstance
this is less likely to change than the backing storages
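(A sketch of that instance-level route, assuming PipelineRunsFilter is importable from this path in your version and that the run key is stored as the dagster/run_key tag; both are worth double-checking against your Dagster release.)

```
from dagster import DagsterInstance
from dagster.core.storage.pipeline_run import PipelineRunsFilter

def run_ids_for_run_key(run_key):
    """Look up run_ids whose sensor run_key matches, via DagsterInstance rather than raw storage."""
    instance = DagsterInstance.get()
    runs = instance.get_runs(
        filters=PipelineRunsFilter(tags={"dagster/run_key": run_key})
    )
    return [run.run_id for run in runs]
```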
j
Is there an API for that?
I was able to pull most of the data I need from get_runs(), but the PipelineRun class doesn't have the startTime and endTime of the pipeline. I see that I do have a snapshot_id, which it appears I can use to look that info up - but I don't see a function to return it.
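(One workaround sketch: approximate the start and end from the run's event-log timestamps via the instance, which avoids the snapshot entirely; depending on the Dagster version, the instance may also expose run records that carry start and end times directly.)

```
from dagster import DagsterInstance

def run_start_end(run_id):
    """Approximate a run's start/end as the first/last event-log timestamps (Unix seconds)."""
    instance = DagsterInstance.get()
    events = instance.all_logs(run_id)  # timestamped event records for this run
    if not events:
        return None, None
    timestamps = [event.timestamp for event in events]
    return min(timestamps), max(timestamps)
```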