I have a series of `op` (some with `DynamicOutput...
# ask-community
c
I have a series of `op`s (some with `DynamicOutput`) that pass some complex objects that can't be pickled. This threw an error, and Dagster asked me to change the `io_manager` to `mem_io_manager`, which I did, and now it's asking to switch back since the outputs aren't persisting. How exactly can I solve this?
```
dagster._core.errors.DagsterUnmetExecutorRequirementsError: You have attempted to use an executor that uses multiple processes, but your job includes op outputs that will not be stored somewhere where other processes can retrieve them. Please use a persistent IO manager for these outputs. E.g. with
    the_graph.to_job(resource_defs={"io_manager": fs_io_manager})
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_grpc\impl.py", line 498, in get_external_execution_plan_snapshot
    create_execution_plan(
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\api.py", line 960, in create_execution_plan
    return ExecutionPlan.build(
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 1026, in build
    return plan_builder.build()
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 167, in build
    _check_persistent_storage_requirement(
  File "D:\Documents\GitHub\mycelium\venv\lib\site-packages\dagster\_core\execution\plan\plan.py", line 1239, in _check_persistent_storage_requirement
    raise DagsterUnmetExecutorRequirementsError(
```
s
Do you have a way of serializing these objects? If so, you could write an IO manager that includes that implementation. You could extend `UPathIOManager` to get a lot of the basic functionality out of the box. Alternatively, you could switch your executor to the `in_process_executor`, but then you lose the ability to execute in parallel because of Python's concurrency limitations.
🙏 1
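For anyone landing here later, a minimal sketch of the first suggestion, assuming the objects can be serialized with something like `cloudpickle` (swap in whatever serialize/deserialize logic your objects actually support). The `SerializingIOManager` / `serializing_io_manager` names are illustrative, not part of Dagster:

```python
# A minimal sketch, assuming dagster's UPathIOManager base class and the
# third-party cloudpickle and universal_pathlib packages are installed.
from typing import Any

import cloudpickle  # stand-in: replace with your own serialize/deserialize logic
from dagster import InputContext, OutputContext, UPathIOManager, io_manager
from upath import UPath


class SerializingIOManager(UPathIOManager):
    # File extension used for the stored outputs (arbitrary choice).
    extension: str = ".cpkl"

    def dump_to_path(self, context: OutputContext, obj: Any, path: UPath) -> None:
        # Called per output (including each DynamicOutput) to persist the object.
        with path.open("wb") as f:
            cloudpickle.dump(obj, f)

    def load_from_path(self, context: InputContext, path: UPath) -> Any:
        # Called when a downstream op needs the value back.
        with path.open("rb") as f:
            return cloudpickle.load(f)


@io_manager(config_schema={"base_path": str})
def serializing_io_manager(init_context):
    return SerializingIOManager(base_path=UPath(init_context.resource_config["base_path"]))
```

Wiring it in would look something like `your_graph.to_job(resource_defs={"io_manager": serializing_io_manager.configured({"base_path": "./storage"})})`. The other route mentioned above is to keep the default in-memory IO manager and pass `executor_def=in_process_executor` (importable from `dagster`) to `to_job`, at the cost of parallel execution.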