Hello Dagster community! I'm excited to be here! I have a question about the design of my workflow.
I'm designing a workflow that includes some compute-heavy tasks (mostly ffmpeg jobs).
If I understand correctly, it's bad practice to run compute-heavy tasks inside the Dagster executors, so I thought about extracting these workloads to an external service and having Dagster just send requests to that service.
My questions are: is this a good approach, and if it is, what should handle the materialization of the files after processing?
My external service can upload the processed files to object storage and return the path to Dagster, or it can return the whole huge byte array over the network and let Dagster persist it to object storage (the latter sounds bad to me). How would you handle this pipeline? Thanks!
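For reference, here's a minimal sketch of the first option as I picture it. The service URL, endpoint, and request/response shapes are all made up; I'm assuming the service uploads the processed file to object storage itself and just responds with the path:

```python
import requests

from dagster import AssetExecutionContext, MaterializeResult, asset

# Hypothetical endpoint of the external ffmpeg service (not a real URL).
FFMPEG_SERVICE_URL = "http://ffmpeg-service.internal/transcode"


@asset
def transcoded_video(context: AssetExecutionContext) -> MaterializeResult:
    # The heavy lifting happens in the external service; Dagster only
    # sends the request and records where the output landed.
    resp = requests.post(
        FFMPEG_SERVICE_URL,
        json={"source": "s3://raw-videos/input.mp4"},  # assumed request shape
        timeout=3600,
    )
    resp.raise_for_status()

    # Assumed response shape: the service returns the object-storage path
    # of the file it uploaded, so no bytes flow back through Dagster.
    output_path = resp.json()["output_path"]
    context.log.info(f"ffmpeg service wrote output to {output_path}")

    # Record the path as asset metadata instead of returning file contents.
    return MaterializeResult(metadata={"path": output_path})
```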