# ask-community

Son Do

07/26/2023, 1:46 PM
Hey Dagster team, I was wondering if there's a way to set up a job (or jobs) so that the job still runs even if some of its dependencies fail? We currently have a job that runs a lot of different extract logic (broken up into Dagster software-defined assets), and we want to trigger the same downstream assets only once all of them have either failed or succeeded. cc: @Anthony Yim @Simon Weber

sandy

07/26/2023, 5:08 PM
Hi Son - are you able to be a little more specific? When you say "its dependencies", do you mean that you want to materialize assets even if some of the upstream assets fail to materialize?

Son Do

07/26/2023, 5:10 PM
Yes, that is exactly it. To be more specific, the downstream asset is a single dbt model that essentially combines all the data, and I want it to materialize even if its upstream dependencies fail.
Hey @sandy I just wanted to bump this question. Is this possible?

sandy

07/27/2023, 11:36 PM
This is not currently possible. A hacky approach would be to put a try/except block inside your function that logs relevant information on failure but still "succeeds" from Dagster's perspective. I'd also encourage you to file a GitHub issue with this request.
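A minimal sketch of the workaround sandy describes: wrap the extract logic in try/except so the function always returns normally, which Dagster treats as success, letting downstream assets run. The names here (`extract_with_fallback`, `flaky_source`) are hypothetical illustrations, not Dagster APIs; in practice the try/except would live inside your `@asset`-decorated function.

```python
def flaky_source():
    """Stand-in for extract logic that can fail (hypothetical)."""
    raise ConnectionError("upstream source unreachable")

def extract_with_fallback(source, logger=print):
    """Run an extract; on failure, log it and return None instead of raising.

    Because no exception escapes, Dagster would see a normal return and
    mark the asset as materialized, so downstream assets still run.
    """
    try:
        return source()
    except Exception as exc:
        logger(f"extract failed, continuing anyway: {exc}")
        return None  # sentinel the downstream dbt model would need to tolerate

result = extract_with_fallback(flaky_source)  # logs the error, returns None
```

The trade-off is that the downstream dbt model must be written to tolerate the missing/sentinel data, since Dagster can no longer distinguish a real success from a swallowed failure.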