Jean Gonzalez
03/10/2023, 10:56 PM
dagster._check.CheckError: Expected non-None value: None

  File "/usr/local/lib/python3.9/site-packages/dagster/_core/errors.py", line 206, in user_code_error_boundary
    yield
  File "/usr/local/lib/python3.9/site-packages/dagster/_grpc/impl.py", line 328, in get_external_sensor_execution
    return sensor_def.evaluate_tick(sensor_context)
  File "/usr/local/lib/python3.9/site-packages/dagster/_core/definitions/sensor_definition.py", line 428, in evaluate_tick
    result = list(self._evaluation_fn(context))
  File "/usr/local/lib/python3.9/site-packages/dagster/_core/definitions/sensor_definition.py", line 598, in _wrapped_fn
    for item in result:
  File "/usr/local/lib/python3.9/site-packages/dagster/_core/definitions/run_status_sensor_definition.py", line 589, in _wrapped_fn
    external_repository_origin = check.not_none(
  File "/usr/local/lib/python3.9/site-packages/dagster/_check/__init__.py", line 1081, in not_none
    raise CheckError(f"Expected non-None value: {additional_message}")
chris
03/10/2023, 11:13 PM

Jean Gonzalez
03/14/2023, 4:26 AM
@run_status_sensor(
    run_status=DagsterRunStatus.STARTED,
    request_job=source_status_job,
    monitored_jobs=[source_job],
    default_status=DefaultSensorStatus.RUNNING,
    description="Sensor listening on a source job starting its execution",
)
def source_job_on_started_sensor(context):
    pass
This was all working fine but recently I added a new op to my job. This op basically does the following:
@op(
    required_resource_keys={"dbt_projects_resource"},
    ins={"start_after": In(Nothing)},
    config_schema={
        "project_name": Field(str),
        "assets_keys": Field(list),
        "type_selection": Field(str, default_value=""),
    },
    tags={"kind": "seek-connect"},
)
def execute_asset_dbt_deps_job_op(context: OpExecutionContext):
    """
    Execute according to the provided asset keys selection.

    Config:
        1. project_name: str
        2. assets_keys: [
               ["netflix", "src_reviews"],
               ["netflix", "dim_hosts_cleansed"]
           ]
        3. type_selection: AssetTypeSelection
    """
    from mydagster_project.repository import my_repository

    # Look up the asset job named in the op config and run it in-process
    # on the current Dagster instance.
    project_name = context.op_config["project_name"]
    job = my_repository.get_job(project_name)
    job.execute_in_process(
        instance=context.instance,
    )
Long story short, I am executing an asset job which materializes some dbt asset keys.
Both jobs execute as expected, but the SUCCESS and STARTED sensors are failing on pipeline_run.external_pipeline_origin?
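My best guess at why this happens: runs started with execute_in_process never go through a code location, so the stored run has no external origin, and that is exactly the not_none check in the traceback above. A minimal, self-contained sketch (the job and op names are placeholders, not from my project) of what the sensor machinery would see:

from dagster import DagsterInstance, job, op

@op
def noop():
    pass

@job
def some_job():
    noop()

if __name__ == "__main__":
    instance = DagsterInstance.ephemeral()
    result = some_job.execute_in_process(instance=instance)
    run = instance.get_run_by_id(result.run_id)
    # An in-process run is not launched through a code location, so as far as
    # I can tell the run record carries no external origin, which is what
    # check.not_none(pipeline_run.external_pipeline_origin) trips on in the trace.
    print(run.external_pipeline_origin)  # presumably None for this run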

Arsenii Poriadin
04/13/2023, 4:25 PM
It got solved by adding monitor_all_repositories=True. Thanks for that, man!!! Such an annoying bug, @claire is it known or should somebody create a GH issue for it?
monitored_jobs
adding monitor_all_repositories=False fixed it for a couple of evaluations but then it started failing again...
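For anyone landing on this thread later, here is roughly what the monitor_all_repositories=True change looks like applied to the sensor posted above. It is a sketch, not a tested fix: it assumes monitored_jobs has to be dropped (as far as I know the two options cannot be combined) and that source_status_job is importable from the project in the original messages.

from dagster import DagsterRunStatus, DefaultSensorStatus, run_status_sensor

# Assumed import path; source_status_job is the job from the original message.
from mydagster_project.repository import source_status_job

@run_status_sensor(
    run_status=DagsterRunStatus.STARTED,
    request_job=source_status_job,
    monitor_all_repositories=True,  # replaces monitored_jobs=[source_job]
    default_status=DefaultSensorStatus.RUNNING,
    description="Sensor listening on a source job starting its execution",
)
def source_job_on_started_sensor(context):
    # Body unchanged from the original message; the sensor now evaluates
    # without hitting the external_pipeline_origin check.
    pass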

Bala Samynathan
06/03/2023, 3:15 PM
monitor_all_repositories works, but it seems to trigger on all jobs??
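One possible way around that, building on the sketch above (untested, and "source_job" is just a placeholder for whichever job name you actually care about): since monitor_all_repositories=True drops the monitored_jobs filter, re-apply it by hand inside the sensor body.

from dagster import RunRequest, SkipReason

# Assumes the @run_status_sensor(monitor_all_repositories=True, ...) decorator
# from the sketch above.
def source_job_on_started_sensor(context):
    # The sensor now ticks for every run on the instance, so skip any run
    # whose job is not the one this sensor is meant to watch.
    if context.dagster_run.job_name != "source_job":
        return SkipReason(f"Ignoring run of {context.dagster_run.job_name}")
    return RunRequest(run_key=context.dagster_run.run_id)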

Jean Gonzalez
06/03/2023, 6:20 PM