# ask-community
Hello folks 👋 As we switched our logging to JSON (using the …) we started noticing weird behaviour in some of our pipelines. Sometimes a pipeline fails with the following exception:
dagster._core.errors.DagsterSubprocessError: During celery execution errors occurred in workers:
[fetch_data]: dagster._serdes.errors.DeserializationError: Output of deserialized json_str was not expected type of tuple. Received type <class 'dict'>.

Stack Trace:
File "/usr/local/lib/python3.9/site-packages/dagster_celery/core_execution_loop.py", line 84, in core_celery_execution_loop
step_events = result.get()
File "/usr/local/lib/python3.9/site-packages/celery/result.py", line 220, in get
File "/usr/local/lib/python3.9/site-packages/celery/result.py", line 336, in maybe_throw
self.throw(value, self._to_remote_traceback(tb))
File "/usr/local/lib/python3.9/site-packages/celery/result.py", line 329, in throw
self.on_ready.throw(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/vine/promises.py", line 234, in throw
reraise(type(exc), exc, tb)
File "/usr/local/lib/python3.9/site-packages/vine/utils.py", line 30, in reraise
raise value

  File "/usr/local/lib/python3.9/site-packages/dagster/_core/execution/api.py", line 990, in pipeline_execution_iterator
    for event in pipeline_context.executor.execute(pipeline_context, execution_plan):
  File "/usr/local/lib/python3.9/site-packages/dagster_celery/core_execution_loop.py", line 164, in core_celery_execution_loop
    raise DagsterSubprocessError(
If I re-run the pipeline with the same configuration, sometimes it succeeds and sometimes it fails again. Most runs complete correctly, so it's not easy to reproduce; it just happens occasionally. Any ideas?
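For context on the error message itself: JSON has no tuple type, so any tuple pushed through a plain JSON round trip comes back as a list (or as a dict, if it was encoded as a mapping). This is a minimal illustration of that general behaviour with the standard library, not Dagster's actual serdes code:

```python
import json

# JSON has no tuple type: a tuple survives a round trip only as a list.
payload = ("fetch_data", {"attempts": 2})
restored = json.loads(json.dumps(payload))

print(type(restored).__name__)  # prints "list", not "tuple"
```

Any deserializer that expects a tuple back therefore has to coerce the list itself; if that coercion is skipped on some code path, you get exactly a "was not expected type of tuple" style error.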
There's a fix for this going out in the release today!
Oh, that's cool! Any link to the commit / issue?