# announcements

esztermarton

12/16/2020, 2:51 PM
Hi all! I'm having some trouble with schedules. The schedule seems appropriately configured, a backfill has run successfully, and the partitions all seem to be generated correctly. However, runs aren't triggered automatically. How/where can I see the logs for whatever is supposed to be triggering them, so that I can figure out why it isn't? Thanks for the help in advance!
(I can share the repo code if that is of interest)

daniel

12/16/2020, 4:21 PM
Hi Rebeka - is this using the SystemCronScheduler?

esztermarton

12/16/2020, 4:26 PM
Hey Dan! It is.

daniel

12/16/2020, 4:30 PM
Got it - if you run `crontab -l` on the machine that's supposed to be executing the schedules, do you see a set of crontab entries, one for each schedule?

esztermarton

12/16/2020, 4:33 PM
I have 3 schedules, only 1 turned on via the UI. I see one schedule when I run that command - but I'm not sure which one it corresponds to because it is not human readable 🙂
does that sound right?

sashank

12/16/2020, 4:35 PM
Yup that’s right! You should also be able to see that information from `dagster schedule debug`.

esztermarton

12/16/2020, 4:36 PM
yes that command shows 1 running schedule!

sashank

12/16/2020, 4:36 PM
The next thing I would try is `dagster schedule logs {schedule_name}`. That should give you the log file, where you can potentially find the error the schedule is running into.

esztermarton

12/16/2020, 4:37 PM
empty file 😞
(but file exists)

sashank

12/16/2020, 4:42 PM
Ah interesting
Could I see your debug output?

esztermarton

12/16/2020, 4:45 PM
```
Scheduler Configuration
=======================
Scheduler:
     module: dagster_cron.cron_scheduler
     class: SystemCronScheduler
     config:
       {}


Scheduler Info
==============
Running Cron Jobs:
* * * * * /app/schedules/scripts/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad.sh > /app/schedules/logs/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad/scheduler.log 2>&1 # dagster-schedule: c0687ec3dc2f13c099a4e27b325f848bdbe1dfad


Scheduler Storage Info
======================
my_minute_schedule:
  cron_schedule: '* * * * *'
  pipeline_origin_id: ba44b35a377e542ff92a40e47d300290f00ca809
  python_path: /opt/bitnami/python/bin/python
  repository_origin_id: 62e469cf665b76477f27c2c06d7e99cce4e2be73
  repository_pointer: -f /app/lena_tweets/repo.py -a repo -d /app
  schedule_origin_id: ba44b35a377e542ff92a40e47d300290f00ca809
  status: STOPPED

my_minute_schedule_tweet:
  cron_schedule: '* * * * *'
  pipeline_origin_id: 67617e3619ce2afdce2b2046a2e80871ebb03eba
  python_path: /opt/bitnami/python/bin/python
  repository_origin_id: 62e469cf665b76477f27c2c06d7e99cce4e2be73
  repository_pointer: -f /app/lena_tweets/repo.py -a repo -d /app
  schedule_origin_id: 67617e3619ce2afdce2b2046a2e80871ebb03eba
  status: STOPPED

my_minute_schedule_tweet_history:
  cron_schedule: '* * * * *'
  pipeline_origin_id: c0687ec3dc2f13c099a4e27b325f848bdbe1dfad
  python_path: /opt/bitnami/python/bin/python
  repository_origin_id: 62e469cf665b76477f27c2c06d7e99cce4e2be73
  repository_pointer: -f /app/lena_tweets/repo.py -a repo -d /app
  schedule_origin_id: c0687ec3dc2f13c099a4e27b325f848bdbe1dfad
  status: RUNNING
```

daniel

12/16/2020, 4:46 PM
Would it be possible to print the results of `crontab -l` as well?

esztermarton

12/16/2020, 4:47 PM
```
* * * * * /app/schedules/scripts/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad.sh > /app/schedules/logs/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad/scheduler.log 2>&1 # dagster-schedule: c0687ec3dc2f13c099a4e27b325f848bdbe1dfad
```

sashank

12/16/2020, 4:47 PM
The `crontab -l` output is actually already included right there at the top of that `debug` output.
Just to be sure, the log file you checked is `/app/schedules/logs/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad/scheduler.log`?

esztermarton

12/16/2020, 4:48 PM
confirmed, just looked again 🙂

sashank

12/16/2020, 4:48 PM
The next thing I would try is running `/app/schedules/scripts/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad.sh` yourself and seeing if you get any errors.

esztermarton

12/16/2020, 4:49 PM
ran without any reported errors or logs 🤔
or a new run showing up in the UI
(very very happy to be running commands and printing things here, but if it's helpful, also happy to share the repo so that you can reproduce locally and/or give you access to the place where this is running)

sashank

12/16/2020, 4:50 PM
yeah let’s do that, I can dig in
sorry that you’re running into this – I would expect the logs to have the appropriate info
these errors where it works outside the context of cron but not in cron usually have to do with environment variables, are you using any in your pipeline?
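[Editor's note: the environment-variable point above can be checked with a small sketch. Cron typically runs jobs with a near-empty environment (often little more than `HOME`, `PATH`, and `SHELL`), so a variable exported in an interactive shell may be absent when the same code runs under cron. `MY_PIPELINE_TOKEN` below is a made-up name, not something from this thread.]

```python
import os

def check_env(name):
    """Report whether an environment variable is visible to this process.

    Running this from your shell and then from inside a cron job is a
    quick way to spot variables that cron doesn't inherit.
    """
    value = os.environ.get(name)
    return value if value is not None else "<not set>"

# Hypothetical variable name used purely for illustration.
print(check_env("MY_PIPELINE_TOKEN"))
```

Dropping a call like this into the scheduled pipeline (or the generated cron script) makes the difference between the interactive and cron environments visible in the logs.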

esztermarton

12/16/2020, 4:52 PM
I'm not (I think)

daniel

12/16/2020, 4:53 PM
one other question, what platform is this running on?

esztermarton

12/16/2020, 4:55 PM
it's running inside a docker container (bitnami python 3.8) on an ubuntu 18 machine. Is that what you're asking? 🙂

daniel

12/16/2020, 4:56 PM
it is, yeah

sashank

12/16/2020, 5:05 PM
Ah, so it looks like the pipeline itself has a tiny bug in it. A `should_execute` function takes one argument (a `ScheduleExecutionContext`). So just change your method

```python
def outstanding_tweet_history():
```

to be:

```python
def outstanding_tweet_history(_):
```

Also, this is a mistake on our part, but the `logs` command should return the path to the folder containing that file rather than the file itself. If you look at the files in `/app/schedules/logs/c0687ec3dc2f13c099a4e27b325f848bdbe1dfad/`, they contain your error.
We will be sure to update that in the next release. Thanks for helping us catch that oversight!
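[Editor's note: the signature fix above can be illustrated with a plain-Python stand-in for how a scheduler invokes `should_execute`; no Dagster needed. `run_should_execute` and the function names below are made up for illustration and are not part of the Dagster API.]

```python
def run_should_execute(should_execute_fn, context):
    # Stand-in for the scheduler's call site: it always passes one
    # positional argument (the ScheduleExecutionContext).
    return should_execute_fn(context)

def broken_should_execute():        # bug: missing the context parameter
    return True

def fixed_should_execute(_context):  # accepts (and ignores) the context
    return True

# Calling the zero-argument version the way the scheduler does raises
# TypeError, which is exactly the kind of error that ends up in the
# schedule's log folder rather than in the pipeline run.
try:
    run_should_execute(broken_should_execute, object())
    outcome = "no error"
except TypeError:
    outcome = "TypeError"

print(outcome)                                              # TypeError
print(run_should_execute(fixed_should_execute, object()))   # True
```

The `_` parameter in the suggested fix is the conventional Python name for an argument that must be accepted but is not used.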

esztermarton

12/16/2020, 10:24 PM
Thank you Sashank! I was sure it was an error on my part - next time I’ll know to look in that folder. I actually saw that there were all those other files there but never thought to open the other ones, only the `scheduler.log` file 🤦‍♀️

sashank

12/16/2020, 10:40 PM
Great! Let us know if you run into any other issues
And that was definitely all our fault, sorry again for not pointing you to the correct file!