# ask-community
I have a series of `op`s (some with `DynamicOutput`s) that pass some complex objects that can't be pickled. This threw an error, and Dagster asked me to change the IO manager, which I did, and now it's asking to switch back since the outputs aren't persisting. How exactly can I solve this?
```
dagster._core.errors.DagsterUnmetExecutorRequirementsError: You have attempted to use an executor that uses multiple processes, but your job includes op outputs that will not be stored somewhere where other processes can retrieve them. Please use a persistent IO manager for these outputs. E.g. with
    the_graph.to_job(resource_defs={"io_manager": fs_io_manager})
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_grpc\impl.py", line 498, in get_external_execution_plan_snapshot
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\api.py", line 960, in create_execution_plan
    return ExecutionPlan.build(
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 1026, in build
    return plan_builder.build()
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 167, in build
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 1239, in _check_persistent_storage_requirement
    raise DagsterUnmetExecutorRequirementsError(
```
Do you have a way of serializing these objects? If so, you could write an IO manager that includes that implementation. You could extend one of Dagster's built-in IO manager base classes to get a lot of the basic functionality out of the box. Alternatively, you could switch your executor to the `in_process_executor`, but then you lose the ability to execute ops in parallel because of Python's concurrency limitations.