Ted Conbeer
11/14/2020, 8:42 PM
# dagster schedule debug
Scheduler Configuration
=======================
Scheduler:
  module: dagster_cron.cron_scheduler
  class: SystemCronScheduler
  config:
    {}

Scheduler Info
==============
Running Cron Jobs:
  30 7 * * * /opt/dagster/dagster_home/schedules/scripts/c2d87ddff6867c6180a2bf164218586ffbb64c0b.sh > /opt/dagster/dagster_home/schedules/logs/c2d87ddff6867c6180a2bf164218586ffbb64c0b/scheduler.log 2>&1 # dagster-schedule: c2d87ddff6867c6180a2bf164218586ffbb64c0b
  5 11 * * * /opt/dagster/dagster_home/schedules/scripts/2d84fba9c7a01dae736f3ca8ec65c4625b2d5608.sh > /opt/dagster/dagster_home/schedules/logs/2d84fba9c7a01dae736f3ca8ec65c4625b2d5608/scheduler.log 2>&1 # dagster-schedule: 2d84fba9c7a01dae736f3ca8ec65c4625b2d5608

Scheduler Storage Info
======================
my_first_pipeline:
  cron_schedule: 30 7 * * *
  pipeline_origin_id: c2d87ddff6867c6180a2bf164218586ffbb64c0b
  python_path: /usr/local/bin/python
  repository_origin_id: 8651c1dcac6632fd200ddfc60b20f7e7bee30fc6
  repository_pointer: -f /opt/dagster/app/pipelines.py -a analytics -d /opt/dagster/app
  schedule_origin_id: c2d87ddff6867c6180a2bf164218586ffbb64c0b
  status: RUNNING

my_second_pipeline:
  cron_schedule: 5 11 * * *
  pipeline_origin_id: 2d84fba9c7a01dae736f3ca8ec65c4625b2d5608
  python_path: /usr/local/bin/python
  repository_origin_id: 8651c1dcac6632fd200ddfc60b20f7e7bee30fc6
  repository_pointer: -f /opt/dagster/app/pipelines.py -a analytics -d /opt/dagster/app
  schedule_origin_id: 2d84fba9c7a01dae736f3ca8ec65c4625b2d5608
  status: RUNNING
sashank
11/14/2020, 8:44 PM
dagster schedule logs schedule_name
Ted Conbeer
11/14/2020, 8:47 PM
# dagster schedule logs my_second_schedule
/opt/dagster/dagster_home/schedules/logs/2d84fba9c7a01dae736f3ca8ec65c4625b2d5608/scheduler.log
# cat /opt/dagster/dagster_home/schedules/logs/2d84fba9c7a01dae736f3ca8ec65c4625b2d5608/scheduler.log
#
sashank
11/14/2020, 8:50 PM
/opt/dagster/dagster_home/schedules/scripts/c2d87ddff6867c6180a2bf164218586ffbb64c0b.sh
Ted Conbeer
11/14/2020, 8:54 PM
sashank
11/14/2020, 8:54 PM
Ted Conbeer
11/14/2020, 8:55 PM
Operation name: FetchScheduleYaml
Message: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "failed to connect to all addresses"
debug_error_string = "{"created":"@1605387068.582981700","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":4165,"referenced_errors":[{"created":"@1605387068.582973300","description":"failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":397,"grpc_status":14}]}"
>
I'm trying to read from a yaml file containing secrets, so I set the yaml file's path as an ENV var, but the schedule needs that ENV var set separately, if I understand correctly.
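One plausible reading of the error above (not stated in the thread) is that the process loading repo.py doesn't see those ENV vars, so the os.environ['...'] lookups raise KeyError and the repository never loads. A minimal check, assuming the variable names used further down, is to print what that environment actually sees:

import os

# Hypothetical diagnostic: report whether each expected variable is visible
# in the environment that is loading repo.py.
for var in ("DBT_CLOUD_CONFIG_FILE", "REDSHIFT_CONFIG_FILE"):
    print(var, "=", os.environ.get(var, "<not set>"))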
sashank
11/14/2020, 8:56 PM
Ted Conbeer
11/14/2020, 9:02 PM
Here's my run_config, which I build by reading in from yaml files:
from dagster.utils import file_relative_path, merge_yamls

run_config = merge_yamls([
    file_relative_path(__file__, "config/urt_prod.yaml"),
    DBT_CLOUD_CONFIG_FILE,
    REDSHIFT_CONFIG_FILE,
])
where those CONFIG_FILE paths are set elsewhere in the repo.py file as:
import os

DBT_CLOUD_CONFIG_FILE = os.environ['DBT_CLOUD_CONFIG_FILE']
REDSHIFT_CONFIG_FILE = os.environ['REDSHIFT_CONFIG_FILE']
which I'm doing because I'm using docker-compose to copy those config files in as secrets, and I thought it would be handy to store the paths as ENV vars at the same time.
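If the cron-launched schedule environment really doesn't inherit those docker-compose ENV vars, one workaround is to fall back to a path next to repo.py so the repository can still load. This is only a sketch; the helper name and the default filenames are made up:

import os

def _config_path(env_var, default_relative):
    # Prefer the ENV var when it is set (e.g. under docker-compose); otherwise
    # fall back to a file relative to repo.py so a bare cron environment works.
    path = os.environ.get(env_var)
    if path:
        return path
    return os.path.join(os.path.dirname(os.path.abspath(__file__)), default_relative)

# Placeholder default paths; substitute the real locations of the secret files.
DBT_CLOUD_CONFIG_FILE = _config_path("DBT_CLOUD_CONFIG_FILE", "config/dbt_cloud.yaml")
REDSHIFT_CONFIG_FILE = _config_path("REDSHIFT_CONFIG_FILE", "config/redshift.yaml")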
sashank
11/16/2020, 11:49 AM
Ted Conbeer
11/16/2020, 2:38 PM
sashank
11/16/2020, 2:39 PM