Melle Minderhoud
03/11/2021, 4:29 PM
Usage: dagster api execute_step [OPTIONS] INPUT_JSON
Try 'dagster api execute_step --help' for help.
Error: Got unexpected extra arguments ("ExecuteStepArgs", "instance_ref": {"__class__": "InstanceRef", "compute_logs_data": {"__class__": "ConfigurableClassData", "class_name": "LocalComputeLogManager", "config_yaml": "base_dir: /tmp/storage\n", "module_name": "dagster.core.storage.local_compute_log_manager"}, "custom_instance_class_data": null, "event_storage_data": {"__class__": "ConfigurableClassData", "class_name": "SqliteEventLogStorage", "config_yaml": "base_dir: /tmp/history/runs/\n", "module_name": "dagster.core.storage.event_log"}, "local_artifact_storage_data": {"__class__": "ConfigurableClassData", "class_name": "LocalArtifactStorage", "config_yaml": "base_dir: /tmp\n", "module_name": "dagster.core.storage.root"}, "run_coordinator_data": {"__class__": "ConfigurableClassData", "class_name": "DefaultRunCoordinator", "config_yaml": "{}\n", "module_name": "dagster.core.run_coordinator"}, "run_launcher_data": {"__class__": "ConfigurableClassData", "class_name": "DefaultRunLauncher", "config_yaml": "{}\n", "module_name": "dagster"}, "run_storage_data": {"__class__": "ConfigurableClassData", "class_name": "SqliteRunStorage", "config_yaml": "base_dir: /tmp/history/\n", "module_name": "dagster.core.storage.runs"}, "schedule_storage_data": {"__class__": "ConfigurableClassData", "class_name": "SqliteScheduleStorage", "config_yaml": "base_dir: /tmp/schedules\n", "module_name": "dagster.core.storage.schedules"}, "scheduler_data": {"__class__": "ConfigurableClassData", "class_name": "DagsterDaemonScheduler", "config_yaml": "{}\n", "module_name": "dagster.core.scheduler"}, "settings": {"backfill": null, "sensor_settings": null, "telemetry": null}}, "pipeline_origin": {"__class__": "PipelinePythonOrigin", "pipeline_name": "hello_cereal_pipeline", "repository_origin": {"__class__": "RepositoryPythonOrigin", "code_pointer": {"__class__": "ModuleCodePointer", "fn_name": "hello_cereal_pipeline", "module": "airflow_test.airflow"}, "container_image": null, "executable_path": 
"Library/Caches/pypoetry/virtualenvs/churn-metrics-ENa28q3B-py3.8/bin/python"}}, "pipeline_run_id": "manual__2021-03-11T15:04:26.729658+00:00", "retries_dict": {}, "should_verify_step": false, "step_keys_to_execute": ["hello_cereal"]})
Any advice on how we can solve this issue?
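One hypothesis, not confirmed anywhere in this thread: Click-based CLIs report "Got unexpected extra arguments" when they receive more positional arguments than expected, which is exactly what happens if the JSON payload is word-split by the shell instead of being passed as one quoted INPUT_JSON argument. A minimal illustration of the splitting itself (plain shell, no Dagster involved; `count_args` is a made-up stand-in that just reports how many arguments it received):

```shell
# Assumption: the failure mode is shell word-splitting of the JSON payload.
# count_args is a hypothetical stand-in for any CLI expecting one argument.
count_args() { echo "$#"; }

payload='{"__class__": "ExecuteStepArgs", "step_keys_to_execute": ["hello_cereal"]}'

count_args $payload    # unquoted: the shell splits the JSON into several arguments
count_args "$payload"  # quoted: the CLI sees a single INPUT_JSON argument
```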
The DAG we are using is:
daniel
03/11/2021, 4:49 PM
yuhan
03/11/2021, 7:51 PM
Melle Minderhoud
03/12/2021, 5:55 AM
daniel
03/12/2021, 10:24 PM
Melle Minderhoud
04/08/2021, 10:45 AM
daniel
04/08/2021, 11:14 AM
Melle Minderhoud
04/11/2021, 11:12 AM
make_airflow_dag_containerized imports the module when creating the DAG and its tasks, so all the dependencies still need to be available in the (global) Airflow environment. Is this correct, or am I missing something here?
daniel
04/11/2021, 5:17 PM
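On that last point, a sketch of the concern under stated assumptions: `importlib.import_module` stands in for whatever make_airflow_dag_containerized does internally when resolving the module named in the repository origin, and the factory name `make_dag_from_module` is hypothetical. If the factory imports the target module eagerly at DAG-definition time, the module's dependencies must be importable in the Airflow environment even though step execution later happens in a container:

```python
import importlib


def make_dag_from_module(module_name: str):
    # Hypothetical stand-in: importing the module eagerly means its
    # dependencies must be installed in the environment that *defines*
    # the DAG, regardless of where the steps eventually run.
    return importlib.import_module(module_name)


# A module available in the current environment imports fine...
make_dag_from_module("json")

# ...but a module with missing dependencies fails at DAG-creation time,
# not at step-execution time inside the container.
try:
    make_dag_from_module("no_such_dependency")
except ImportError as exc:
    print(f"DAG creation failed: {exc}")
```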