# ask-community
Hello all, I'm currently running a POC with Airbyte and Dagster. I've installed Dagster locally and I'm trying to understand whether there's a way to run some Python code synced from a GitHub repo with Dagster Open Source? Thank you in advance.
Thank you @Paul Ellica Padilla, but I'm still a bit lost. I've read the article you shared, but that's about getting metadata from GitHub, not checking out Python files to run with Dagster, right?
Hi @Hatem L -- can you say a bit more about your deployment model? Is this Python code in a separate GitHub repo from your regular Dagster code? In general, we would recommend co-locating your orchestration code with the code that it's orchestrating.
Hi @owen, I'm running Dagster locally right now to understand how it works, but the target setup will look like this:
• Airbyte for data ingestion (on an AWS instance)
• dbt for transformation
• Dagster to orchestrate Airbyte, dbt, and some custom Python code (on another AWS instance, or should it share the one running Airbyte?)
The dbt and Python code lives on GitHub, so when something changes there's no need to re-import the files onto the Dagster instance. My main question here is how to use Dagster (open source) with GitHub and import multiple files coming from multiple teams (DE, DS & BI). I hope this is clear. Thank you in advance!
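(For anyone finding this thread later: Dagster OSS can load code from several teams as separate code locations via `workspace.yaml`. A minimal sketch, where the module names `de_pipelines`, `ds_pipelines`, and `bi_pipelines` are hypothetical stand-ins for each team's package:

```yaml
# workspace.yaml -- read by the Dagster webserver/daemon.
# Each entry is an independent code location, so each team
# (DE, DS, BI) can ship and version its own Python package.
load_from:
  - python_module:
      module_name: de_pipelines.definitions   # hypothetical DE package
  - python_module:
      module_name: ds_pipelines.definitions   # hypothetical DS package
  - python_module:
      module_name: bi_pipelines.definitions   # hypothetical BI package
```

Each code location loads in its own process, so one team's broken import doesn't take down the others.)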
Hi @owen, Any thoughts on this?
šŸ‘ 1
With open source, power users generally build some CI/CD pipeline that packages up Dagster code from a repo, builds an image, and redeploys the k8s Helm chart / ECS stack.
šŸ‘ 2
With multiple teams, it might build separate images for each team.
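(A concrete sketch of that kind of pipeline, as a GitHub Actions workflow in one team's repo; the registry name, image name, release name, and values file below are all assumptions, not something from this thread:

```yaml
# .github/workflows/deploy.yml -- hypothetical CI/CD sketch:
# on every push to main, build the team's code-location image
# and roll the Dagster Helm release to the new tag.
name: deploy-dagster-code
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and push the code-location image
        run: |
          docker build -t my-registry/de-pipelines:${GITHUB_SHA} .
          docker push my-registry/de-pipelines:${GITHUB_SHA}
      - name: Redeploy the Dagster Helm release with the new tag
        run: |
          helm upgrade dagster dagster/dagster \
            --set "dagster-user-deployments.deployments[0].image.tag=${GITHUB_SHA}" \
            -f values.yaml
```

With separate images per team, each repo gets its own copy of this workflow pointing at its own image and deployment entry.)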
šŸ‘ 1
Thank you Johann! I'll give it a try and will keep you posted, as I will definitely have more questions :)