# ask-community
r
Hi Community! I'm playing a bit with the Dagster orchestrator and I have some doubts. My intention is to use this tool only as a scheduler and runner for commands, since the business logic is developed in other repositories and containers. So, the architecture would be:
1. A container with Dagit
2. A container with the Dagster daemon
3. A container with dbt models (business logic)
4. A container with runnable Python functions
Here are my doubts:
• What would be the best way to define ops and jobs that execute the dbt models (3) from the Dagster daemon environment (2)? Is there a specific Dagster library that makes this easier? The command run by the Dagster daemon should translate to something similar to (one possible op wrapping it is sketched after this message):
docker exec -t [container_DBT] bash -c "dbt run ...."
• The same question for executing the Python scripts hosted in the other Docker container (4). Thanks in advance!
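A minimal sketch of one way to wrap that command in a Dagster op, assuming the daemon's environment can invoke the Docker CLI against the host (e.g. via a mounted Docker socket); the container name and dbt arguments are placeholders:

```python
import subprocess

from dagster import job, op


@op
def run_dbt_models(context):
    # Shell out to the existing dbt container; "container_DBT" and the dbt
    # selection flags are placeholders for your own setup.
    cmd = ["docker", "exec", "container_DBT", "bash", "-c", "dbt run"]
    context.log.info(f"Running: {' '.join(cmd)}")
    result = subprocess.run(cmd, capture_output=True, text=True)
    context.log.info(result.stdout)
    if result.returncode != 0:
        raise Exception(f"dbt run failed:\n{result.stderr}")


@job
def dbt_job():
    run_dbt_models()
```

This keeps Dagster purely as a scheduler/runner, at the cost of losing lineage and per-model visibility that the dagster-dbt integration would give you.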
t
Hi! This seems to go against the philosophy of Dagster as a data orchestrator. There is great built-in support for dbt in the dagster-dbt package (https://docs.dagster.io/integrations/dbt/reference#dagster-dbt-integration-reference). Have you also looked at software-defined assets? (https://docs.dagster.io/concepts/assets/software-defined-assets) They make it really easy to mix dbt with any other Python processing/functions, i.e., your Python scripts could be defined as individual software-defined assets.
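A rough sketch of what this suggests, assuming a dagster-dbt version that provides load_assets_from_dbt_project (newer releases expose @dbt_assets instead); the paths, the model name my_dbt_model, and the asset my_python_step are placeholders:

```python
from dagster import asset, repository, with_resources
from dagster_dbt import dbt_cli_resource, load_assets_from_dbt_project

DBT_PROJECT_DIR = "/opt/dbt/project"    # hypothetical path to the dbt project
DBT_PROFILES_DIR = "/opt/dbt/profiles"  # hypothetical path to profiles.yml

# Every model in the dbt project becomes a software-defined asset.
dbt_assets = load_assets_from_dbt_project(
    project_dir=DBT_PROJECT_DIR, profiles_dir=DBT_PROFILES_DIR
)


# A downstream Python asset; the parameter name expresses a dependency on the
# asset key of a dbt model ("my_dbt_model" is a placeholder). Loading the
# actual data requires an IO manager that can read from your warehouse.
@asset
def my_python_step(my_dbt_model):
    return my_dbt_model


@repository
def my_repo():
    return [
        with_resources(
            [*dbt_assets, my_python_step],
            {
                "dbt": dbt_cli_resource.configured(
                    {"project_dir": DBT_PROJECT_DIR, "profiles_dir": DBT_PROFILES_DIR}
                )
            },
        )
    ]
```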
r
Thanks Thomas for your answer. I know; I'm trying to find an easy, isolated solution before going for a proper Dagster integration in our current code.
a
You’ll end up getting the separate-container behavior by configuring a run launcher. As an example, the k8s run launcher automatically runs jobs in isolated containers (ok, pods) in your k8s cluster. If these containers already exist, you could look at just writing ops that run them, but I agree with the previous comment that it’s definitely not Dagster-like.
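For the Python-scripts container, a rough alternative to a run launcher is an op that starts the existing image as a one-off container via docker-py; the image name and command here are placeholders, and this again assumes the Dagster daemon can reach the Docker socket:

```python
import docker

from dagster import job, op


@op
def run_python_container(context):
    client = docker.from_env()
    # Runs the container to completion and returns its combined stdout/stderr.
    logs = client.containers.run(
        "my-python-scripts",            # hypothetical image with the Python functions
        command="python /app/main.py",  # hypothetical entrypoint
        remove=True,
    )
    context.log.info(logs.decode())


@job
def python_scripts_job():
    run_python_container()
```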