# announcements
Hi guys, loving Dagster so far, and it's really nice to see the progress you've made with 0.7.0. I've encountered an issue with the scheduler and setting Google authentication credentials. When I manually trigger dagit to run a pipeline with mode A, it runs as expected, but when the same pipeline and mode A are run by a schedule, it fails because Dagster cannot find the service account (I'll add the error log in the thread). I'm running dagit locally through the dagit command, so nothing fancy. This issue was already present in 0.6.9, and I hoped upgrading to 0.7.0 would solve it, but unfortunately it didn't.
The error log
Also, the resource specification:
# Bigquery
import os

from dagster import Field, resource
from google.cloud import bigquery


@resource(config={"project": Field(str), "service_account": Field(str)})
def bigquery_resource(init_context):
    # Point the Google client libraries at the service-account key file
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = init_context.resource_config["service_account"]
    project_id = init_context.resource_config["project"]
    client = bigquery.Client(project=project_id)
    return client
and the environment YAML specification:
    project: "private-project"
    service_account: "./GoogleCloud_ServiceAccount.json"
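The thread doesn't confirm the root cause, but one common reason a credentials file "disappears" under a scheduler is the relative path above: `./GoogleCloud_ServiceAccount.json` resolves against the process's working directory, and cron-launched runs typically start in a different cwd than an interactive dagit session. A minimal sketch of anchoring the path to a source file instead (the helper name `resolve_relative_to` is hypothetical; Dagster's own `file_relative_path`, which appears later in this thread, serves the same purpose):

```python
import os


def resolve_relative_to(module_file: str, relative_path: str) -> str:
    """Resolve a path relative to a given source file into an absolute path,
    so it no longer depends on the process's working directory."""
    return os.path.abspath(os.path.join(os.path.dirname(module_file), relative_path))


# Hypothetical usage inside the resource definition:
# creds = resolve_relative_to(__file__, "./GoogleCloud_ServiceAccount.json")
# os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = creds
```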
Hi Bob. So with schedules, you actually need to pass your environment variables along via a parameter. Here is an example: https://github.com/dagster-io/dagster/blob/master/examples/dagster_examples/bay_bikes/schedules.py#L15. The reason for this, I believe, is some annoying constraints set by cron.
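The cron constraint mentioned above can be illustrated with plain subprocesses: a child process only sees the environment variables it is explicitly handed, which is why variables exported in the shell that launched dagit don't reach cron-launched schedule runs unless the schedule definition forwards them. A stdlib-only sketch (the paths are made up for illustration):

```python
import os
import subprocess
import sys

# Pretend this was exported in the shell that launched dagit.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/tmp/creds.json"

probe = "import os; print(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS'))"

# Cron starts jobs with a minimal environment, not the shell's. Simulate that
# by giving the child only PATH: the exported variable does not carry over.
child = subprocess.run(
    [sys.executable, "-c", probe],
    env={"PATH": os.environ.get("PATH", "")},
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # prints "None"

# Forwarding the variable explicitly (what a schedule definition has to do
# for you) makes it visible to the child again.
child = subprocess.run(
    [sys.executable, "-c", probe],
    env={
        "PATH": os.environ.get("PATH", ""),
        "GOOGLE_APPLICATION_CREDENTIALS": os.environ["GOOGLE_APPLICATION_CREDENTIALS"],
    },
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # prints "/tmp/creds.json"
```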
Hi Abhi, thanks for your reply. I do pass the environment variable to the schedule:
""" Load Resource Config"""
with open(
    file_relative_path(__file__, "pipelines/environments/docker_dev.yaml"), "r"
) as yaml_in:
    config = yaml.safe_load(yaml_in)  # yaml_object will be a list or a dict

""" Individual Pipeline Schedules"""

test_slack_schedule = ScheduleDefinition(
    cron_schedule="*/1 * * * *",