Megan Beckett
06/21/2022, 1:25 PM
from dagster import config_from_files, file_relative_path, graph

@graph
def update_db_metadata():
    """
    Generalised job that will pull all metadata required. There is a test and prod instance of each below.
    """
    process_data(pull_data(connect_api()))
# Test job - will use local testing database connection
update_db_metadata_test_job = update_db_metadata.to_job(
    name='update_db_metadata_test_job',
    resource_defs={"database": database_connection},
    config=config_from_files(
        [file_relative_path(__file__, "../config/run_config_test.yaml")]
    ),
)
# Production job - will use production database connection
update_db_metadata_prod_job = update_db_metadata.to_job(
    name='update_db_metadata_prod_job',
    resource_defs={"database": database_connection},
    config=config_from_files(
        [file_relative_path(__file__, "../config/run_config_prod.yaml")]
    ),
)
In the Dagit UI, I can then run either the testing or the production job (this isn't deployed yet either).
Is this the recommended way to handle connecting to different DB environments in testing vs production? Or how else?

David Hyman
06/21/2022, 1:35 PM
update_db_metadata.to_job(
)
alternatively can switch out the resources instead of the configs:
https://docs.dagster.io/guides/dagster/graph_job_op#a-pipeline-with-prod-and-dev-modes

Megan Beckett
06/22/2022, 8:27 AM
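[Editor's sketch] The resource-swapping alternative David links to can be illustrated without Dagster itself: one graph body, two jobs that differ only in the injected database resource. Everything here is illustrative — `make_update_db_metadata_job`, the placeholder op bodies, and both connection URLs are assumptions, with the factory standing in for `update_db_metadata.to_job(resource_defs={"database": ...})`.

```python
# Standalone sketch of "swap the resources instead of the configs".
# make_update_db_metadata_job is a hypothetical factory mimicking Dagster's
# to_job(resource_defs=...) pattern; the op bodies and URLs are placeholders.

def connect_api(database_url):
    # placeholder for the real API/DB connection step
    return {"source": database_url}

def pull_data(conn):
    # placeholder: pretend we pulled one metadata record
    return [{"table": "users", "rows": 42, "source": conn["source"]}]

def process_data(records):
    # placeholder: summarise what was pulled
    return f"processed {len(records)} record(s) from {records[0]['source']}"

def make_update_db_metadata_job(database_url):
    """Same graph logic for every environment; only the resource differs."""
    def job():
        return process_data(pull_data(connect_api(database_url)))
    return job

# One graph, two environments: the injected resource is the only difference.
update_db_metadata_test_job = make_update_db_metadata_job("sqlite:///test.db")
update_db_metadata_prod_job = make_update_db_metadata_job("postgresql://prod-host/metadata")
```

With real Dagster, the same shape falls out of passing a different `resource_defs` dict to each `to_job` call, as the linked prod-and-dev-modes guide shows.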