I guess you are right with your recommendation here @sandy. However, I am facing a similar problem when interacting with DuckDB from a sensor, where the sensor might fire multiple executions in parallel. Writing the parquet files works, but Dagster fails to obtain a lock on DuckDB because another thread/process is busy registering results. How can I take a lock on DuckDB solely for the purpose of registering a parquet file with it (which is a pretty instant operation)? I do not want to de-parallelize the other processing (i.e. fall back to the in-process executor), and for now I also do not want to 1) restructure the graph or 2) use K8s with Celery.
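Something like the sketch below is what I have in mind: a cross-process file lock wrapped around only the short registration step, so the parallel runs keep their parallelism everywhere else. This is just a sketch, not a confirmed solution - it assumes the `filelock` package, and the paths, table name, and `register_parquet` helper are made-up placeholders.

```python
import duckdb
from filelock import FileLock, Timeout

# Hypothetical paths - replace with whatever your pipeline actually uses.
DUCKDB_PATH = "/data/warehouse.duckdb"
LOCK_PATH = "/data/warehouse.duckdb.lock"


def register_parquet(parquet_path: str, table_name: str) -> None:
    """Register a parquet file in DuckDB under a cross-process file lock.

    Only this quick registration step is serialized; the expensive work that
    produced the parquet file stays fully parallel across runs.
    """
    lock = FileLock(LOCK_PATH, timeout=60)  # wait up to 60s for other runs
    try:
        with lock:
            # Open the write connection only while we hold the lock, so no
            # two runs try to grab DuckDB's own write lock at the same time.
            con = duckdb.connect(DUCKDB_PATH)
            try:
                con.execute(
                    f"CREATE OR REPLACE VIEW {table_name} AS "
                    f"SELECT * FROM read_parquet('{parquet_path}')"
                )
            finally:
                con.close()
    except Timeout:
        # Another run held the lock too long; re-raise so Dagster surfaces it.
        raise RuntimeError(f"Could not acquire DuckDB lock at {LOCK_PATH}")


# Example usage inside an op/asset body after the parquet file is written:
# register_parquet("/data/results/run_123.parquet", "results_run_123")
```

Would this kind of external lock around the registration be a reasonable pattern here, or is there a more idiomatic Dagster way (e.g. an IO manager) to serialize just this step?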