# dagster-plus
s
What’s the practical limit for run concurrency in dagster cloud serverless?
d
Hi Sterling - the maximum value you can set for `max_concurrent_runs` in Serverless is 50, i.e. at most 50 runs executing at any given time.
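For reference, this limit is typically configured through the Dagster+ deployment settings; a minimal sketch of that settings block, assuming the standard `run_queue` section (the value shown is illustrative):

```yaml
# Dagster+ deployment settings (sketch) - configures the run queue limit.
run_queue:
  max_concurrent_runs: 50  # Serverless caps this at 50
```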
s
Thanks!
I currently have an asset that batch processes thousands of files every night, with custom logic to determine which files have already been processed. There are also events during the day where I want to process just one of those files shortly after the event happens (usually only 10-20 events per day), and I'm struggling with how to represent that in Dagster.

I first thought about using dynamic partitions, with one partition per file. However, there's overhead to each run, and if I processed files one at a time I'd likely be running Snowflake jobs all day long instead of just 1-2 hours overnight. I also expect this would get worse given I can only run 50 Dagster runs at a time.

Then I thought I could run the asset in either partitioned or non-partitioned mode: when running via a partition, it would process just the file detected by a sensor, and when running without a partition, it would use my current batch-processing logic. However, it doesn't look like Dagster supports running an asset that is defined with partitions without specifying a partition.
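One way to sketch the sensor-driven side of this, assuming a dynamic-partitions-per-file design (all names below are hypothetical, and `BackfillPolicy.single_run()` is one possible way to keep the nightly batch in a single run rather than one run per file):

```python
import dagster as dg

# Hypothetical partition set: one dynamic partition per event file.
event_files = dg.DynamicPartitionsDefinition(name="event_files")


def detect_new_event_files() -> list[str]:
    # Hypothetical helper: return file names created by today's events
    # that have not been processed yet.
    return []


@dg.asset(
    partitions_def=event_files,
    # Assumption: with a single-run backfill policy, a nightly backfill over
    # many partitions executes as one run instead of one run per file.
    backfill_policy=dg.BackfillPolicy.single_run(),
)
def processed_file(context: dg.AssetExecutionContext) -> None:
    # partition_keys contains a single key for a sensor-triggered run,
    # or the full selected set for a single-run backfill.
    for file_name in context.partition_keys:
        ...  # run the Snowflake job for this file


@dg.sensor(asset_selection=[processed_file], minimum_interval_seconds=60)
def file_event_sensor(context: dg.SensorEvaluationContext):
    new_files = detect_new_event_files()
    if not new_files:
        return dg.SkipReason("no new event files")
    return dg.SensorResult(
        # Register the new files as partitions, then request one run per file.
        dynamic_partitions_requests=[event_files.build_add_request(new_files)],
        run_requests=[dg.RunRequest(partition_key=f) for f in new_files],
    )
```

Under these assumptions, the daytime events produce small single-partition runs via the sensor, while a nightly backfill over all partitions would execute as one run, keeping the Snowflake work in a single overnight window.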