# integration-dbt
Hello! When launching a backfill on a partitioned asset, some of my backfill jobs fail with this error:
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 1900543 (char 1900542)

  File "/usr/local/lib/python3.10/site-packages/dagster/_grpc/impl.py", line 120, in core_execute_run
    recon_job.get_definition()
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/reconstruct.py", line 243, in get_definition
    return self.repository.get_definition().get_maybe_subset_job_def(
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/reconstruct.py", line 117, in get_definition
    return repository_def_from_pointer(self.pointer, self.repository_load_data)
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/reconstruct.py", line 793, in repository_def_from_pointer
    target = def_from_pointer(pointer)
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/reconstruct.py", line 685, in def_from_pointer
    target = pointer.load_target()
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/code_pointer.py", line 175, in load_target
    module = load_python_file(self.python_file, self.working_directory)
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/code_pointer.py", line 83, in load_python_file
    return import_module_from_path(module_name, python_file)
  File "/usr/local/lib/python3.10/site-packages/dagster/_seven/__init__.py", line 49, in import_module_from_path
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/opt/dagster/home/orchestration/repository.py", line 52, in <module>
    def orchestration():
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/decorators/repository_decorator.py", line 379, in repository
    return _Repository()(definitions_fn)
  File "/usr/local/lib/python3.10/site-packages/dagster/_core/definitions/decorators/repository_decorator.py", line 111, in __call__
    repository_definitions = fn()
  File "/opt/dagster/home/orchestration/repository.py", line 79, in orchestration
    assets = build_assets(dbt_assets)
  File "/opt/dagster/home/orchestration/assets/assets.py", line 72, in build_assets
    dbt_pdf_assets = load_assets_from_dbt_project(
  File "/usr/local/lib/python3.10/site-packages/dagster_dbt/asset_defs.py", line 516, in load_assets_from_dbt_project
    manifest_json, cli_output = _load_manifest_for_project(
  File "/usr/local/lib/python3.10/site-packages/dagster_dbt/asset_defs.py", line 95, in _load_manifest_for_project
    return json.load(f), cli_output
  File "/usr/local/lib/python3.10/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/usr/local/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.10/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
It looks like the dbt project is compiled for each run, and I suspect there are conflicts when several runs write and read manifest.json in parallel, but I'm not sure. Do you have any idea what is happening, or how I could avoid it? I'm using dagster 1.3.3 and dagster-dbt 0.19.3.
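To illustrate what I suspect is happening: if one run reads manifest.json while another run's `dbt compile` is still writing it, the reader sees a truncated file and `json.load` fails mid-string, exactly like the error above. A minimal sketch of that failure mode (the manifest content here is made up for illustration):

```python
import json

# Build a small stand-in for manifest.json, then cut it in half to
# simulate reading the file while another process is still writing it.
full = json.dumps({"nodes": {"model.my_project.my_model": {"name": "my_model"}}})
partial = full[: len(full) // 2]  # truncated mid-string, like a half-written file

try:
    json.loads(partial)
except json.JSONDecodeError as err:
    # Same class of error as in the traceback: "Unterminated string starting at..."
    print(f"JSONDecodeError: {err.msg}")
```

If this is indeed the cause, it would explain why only some backfill jobs fail: only the runs whose read happens to overlap a concurrent recompile hit the truncated file.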