# announcements
e
Hi all! I'm getting a little stuck with setting up schedules. I expect this is just some configuration that I'm missing or have set up incorrectly... It would be great if somebody who's had some experience with setting this up could help 😌 I'll post the relevant files in this thread - thank you in advance for any pointers!
So this is what I was following to set it up: https://docs.dagster.io/tutorial/advanced_scheduling
I deploy in Docker, so I also used the examples from "Deploying Dagit as a service" - https://docs.dagster.io/deploying/local
This is what my Dockerfile looks like:
```dockerfile
FROM bitnami/python:3.8

COPY ./*.txt /app

RUN apt-get update && apt-get install -yqq cron \
    && pip install -r requirements.txt

COPY . /app
RUN chmod +x /app/entrypoint.sh && mkdir -p /app/data

ENV PYTHONPATH=$PYTHONPATH:/app
ENV DAGSTER_HOME=/app

EXPOSE 3000

ENTRYPOINT ["/app/entrypoint.sh"]
```
This is my entrypoint.sh
```sh
#!/bin/sh
export DAGSTER_HOME=/app

# This block may be omitted if not packaging a repository with cron schedules:
####################################################################################################
# see: <https://unix.stackexchange.com/a/453053> - fixes inflated hard link count
touch /etc/crontab /etc/cron.*/*

service cron start

# Add all schedules defined by the user
dagster schedule up
####################################################################################################

# Launch Dagit as a service
DAGSTER_HOME=/app dagit -h 0.0.0.0 -p 3003
```
this is my dagster.yaml
```yaml
scheduler:
  module: dagster_cron.cron_scheduler
  class: SystemCronScheduler
```
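I guess one way to rule out the dagster.yaml not being picked up from DAGSTER_HOME would be something like this inside the container (just a sketch - I'm assuming DagsterInstance.get() reads $DAGSTER_HOME/dagster.yaml and that the instance exposes its configured scheduler on these pre-1.0 versions):
```python
# Sketch: print which scheduler the instance actually loaded.
# Assumption: DagsterInstance.get() reads $DAGSTER_HOME/dagster.yaml and the
# instance exposes the configured scheduler on pre-1.0 Dagster.
from dagster import DagsterInstance

instance = DagsterInstance.get()  # DAGSTER_HOME=/app inside the container
print(type(instance.scheduler))   # hoping for dagster_cron's SystemCronScheduler
```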
my requirements.txt
```text
dagit<1.0.0
dagster<1.0.0
dagster-cron<1.0.0
tweepy==3.9.0
pandas==1.1.4
jupyter==1.0.0
pytest==6.1.2
nest_asyncio
pylint==2.6.0
rope==0.18.0
```
my workspace.yaml
```yaml
# This file defines the dagster workspace.
load_from:
  - python_file:
      relative_path: my_module/repo.py
      location_name: my_module
```
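(Side note: since workspace.yaml points at my_module/repo.py and relies on PYTHONPATH=/app, the import path itself is easy to sanity-check by hand - a rough sketch, assuming only that the repository definition exposes a name attribute:)
```python
# Sketch: confirm the repository module imports under the same PYTHONPATH that
# dagit uses, since workspace.yaml loads my_module/repo.py by relative path.
# Assumption: the @repository-decorated object exposes a `name` attribute.
from my_module.repo import repo

print(repo.name)  # expected to be "my_module" if the repository builds correctly
```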
And finally repo.py, referenced above!
```python
from datetime import datetime

from dagster import repository, daily_schedule

from my_module.config import TIMESTAMP_FORMAT
from my_module.pipelines import my_pipeline


@daily_schedule(pipeline_name="my_pipeline", start_date=datetime(2020, 11, 26))
def my_daily_schedule(date):
    return {
        "solids": {
            "solid_1": {
                "config": {"timestamp": date.strftime(TIMESTAMP_FORMAT)}
            },
            "solid_2": {
                "config": {"timestamp": date.strftime(TIMESTAMP_FORMAT)}
            },
        }
    }


@repository(name="my_module")
def repo():
    return [my_pipeline, my_daily_schedule]
```
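(If it helps with reproducing: the same run config the schedule builds can be exercised outside the scheduler with something like the sketch below - the hard-coded date and the execute_pipeline call are just a manual test, not part of the deployment.)
```python
# Sketch: run my_pipeline once, by hand, with the same run config the schedule
# would build for a given date - to separate "config is broken" from "scheduler
# never fires". The date is arbitrary; execute_pipeline is the pre-1.0 Python API.
from datetime import datetime

from dagster import execute_pipeline

from my_module.config import TIMESTAMP_FORMAT
from my_module.pipelines import my_pipeline

date = datetime(2020, 11, 26)
result = execute_pipeline(
    my_pipeline,
    run_config={
        "solids": {
            "solid_1": {"config": {"timestamp": date.strftime(TIMESTAMP_FORMAT)}},
            "solid_2": {"config": {"timestamp": date.strftime(TIMESTAMP_FORMAT)}},
        }
    },
)
print(result.success)
```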
I think that's all I'd need to define a schedule. Running the commands in entrypoint.sh does make the schedule show up in the UI. However, when I attempt to run a backfill, I click the button and nothing appears to happen; the scheduled runs also don't start automatically. I have an Airflow background and was fairly comfortable with how scheduling worked there, but I don't fully see the analogy in Dagster yet 🙂
thanks in advance for your help!
d
Hi Rebeka - just to confirm, is the button in the UI you're referring to on the partitions page? Like a "Launch Backfill" button? Or is this the UI toggle to turn on the schedule?
e
Launch Backfill. But I also toggled the switch that turns the schedule on
d
Got it. Will take a look - one thing that would be helpful to know is if there are any new errors or other output in the container logs when you kick off the backfill
e
unfortunately not - although I don't think the level is DEBUG. There are also no errors from `dagster schedule debug`. It's like nothing happens 🤔
Here's a screenshot, in case that's also helpful.
(I took the warning as just a warning - I only really want to run one or a few partitions, so I don't mind the lack of queueing!)
d
Yeah, that's strange. I'll try to repro - could I have the pipelines.py and config.py files in my_module as well?
e
ofc - I'll ping you the entire repo in a dm in a sec. Thanks for taking a look.