# ask-community
Saul Burgos
Hi, I am using this link as a guide: https://github.com/dagster-io/dagster/blob/1.0.8/examples/deploy_docker/docker-compose.yml

I am trying to integrate Airbyte, so I took the docker-compose.yaml from the Airbyte GitHub repo and put it in my folder together with my other compose file: https://github.com/airbytehq/airbyte/blob/v0.40.3/docker-compose.yaml

So at this moment I have two docker-compose files, "docker-compose.yml" and "docker-compose.airbyte.yml", and I run this command: `docker-compose -f docker-compose.yml -f docker-compose.airbyte.yml up --build`

Everything is built and created correctly. I am using the "airbyte_resource" from "dagster_airbyte". In my env variables I pass the Airbyte client host like this:

AIRBYTE_CLIENT_HOST=airbyte-webapp
AIRBYTE_CLIENT_PORT=8000

I have a sensor that calls a job with a single operation:

```python
@job(resource_defs={"airbyte": airbyte_resource})
def main():
    airbyte_sync_op()
```

The problem is that when the sensor triggers my job, I cannot reach the Airbyte API. List of things that I have tried:

- Joining all the services from Dagster and Airbyte in a single network
- Exposing port 8000 on the webapp service
- Using "airbyte-webapp", "webapp", or "airbyte-server" as the host
- Using port 8000 or 8001

Does anyone have any idea what is happening? Which container is supposed to have access to the Airbyte API: "docker_example_user_code", "docker_example_dagit", or "docker_example_daemon"? I assume I have some problem with my Docker configuration... but I have tried everything.
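For context, a minimal sketch of how that wiring typically fits together with dagster_airbyte; the `.configured` plumbing and the connection_id value below are illustrative assumptions, not the exact code from this thread:

```python
# Minimal sketch: feeding the env vars into airbyte_resource and wiring a
# configured sync op into the job. The connection_id is a placeholder.
from dagster import job
from dagster_airbyte import airbyte_resource, airbyte_sync_op

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_CLIENT_HOST"},  # e.g. airbyte-webapp
        "port": {"env": "AIRBYTE_CLIENT_PORT"},  # e.g. 8000
    }
)

sync_op = airbyte_sync_op.configured(
    {"connection_id": "<your-connection-id>"}, name="sync_op"
)

@job(resource_defs={"airbyte": my_airbyte_resource})
def main():
    sync_op()
```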
Adam Bloom
Is airbyte running on the same host/docker network as your dagster deployment?
it's the dagster job container that needs access to the airbyte webapp/api. where that specifically lives depends on your run launcher configuration
Saul Burgos
@Adam Bloom yes, all services are on the same network.
@Adam Bloom But I don't have a run launcher configuration; I have it commented out.
But I don't want to use the Docker launcher, I want to use the default one. Should I use it?
daniel
Hey Saul - the daemon never loads your code directly. Your sensors are considered your code and will run in docker_example_user_code
(as are your runs - if you're using the default run launcher, those will run on docker_example_user_code too)
Saul Burgos
I got it working. I was using the wrong port; it's 80. But I have noticed another issue: when all the services are starting and Dagit becomes available, I go to the runs and see that my sensor was triggered, but I get the error "Request to Airbyte API failed: 502 Server Error: Bad Gateway for URL". If I look at the terminal logs of airbyte_webapp, I see that it is still loading... so I assume the sensor was triggered while the API was not yet available. I could confirm this because I re-executed the previously failed job in Dagit and it ran successfully. So my new question is: is there a way to wait for the Airbyte API? I tried to use Docker's "depends_on" but it is not working.
daniel
could you write a resource or op that polls the airbyte API until it is available?
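A rough sketch of what such a polling resource could look like; the Airbyte health-endpoint path and the retry parameters are assumptions:

```python
# Rough sketch: a resource that blocks until the Airbyte API responds.
# The /api/v1/health endpoint and the retry numbers are assumptions.
import os
import time

import requests
from dagster import Failure, resource


@resource
def airbyte_readiness(init_context):
    host = os.environ["AIRBYTE_CLIENT_HOST"]
    port = os.environ["AIRBYTE_CLIENT_PORT"]
    url = f"http://{host}:{port}/api/v1/health"
    for _ in range(30):  # up to ~5 minutes of waiting
        try:
            if requests.get(url, timeout=5).ok:
                init_context.log.info("Airbyte API is available")
                return
        except requests.exceptions.ConnectionError:
            pass
        time.sleep(10)  # Airbyte may still be booting; retry
    raise Failure(f"Airbyte API at {url} never became available")
```

Requiring this resource on the job (alongside the airbyte resource) would make any run fail with a clear message, instead of a 502, if Airbyte never comes up.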
Saul Burgos
Yes, I could possibly do that... but do you know if I can do something like: create a container that tests whether the API is available, and only build/start the other dependent containers afterwards?
daniel
I'm not aware offhand of a way to do that
Saul Burgos
@daniel ... I tried it with Docker, but I found this link, where Docker suggests: "The best solution is to perform this check in your application code, both at startup and whenever a connection is lost for any reason" https://docs.docker.com/compose/startup-order/ So... my only option is to do it in Dagster. Which is the best option in your view? A resource? A hook? An event? ... I have no idea.
daniel
I think a resource sounds like a good fit since this has to do with an external service
Saul Burgos
@daniel I have an idea: what if I create my sensors with DefaultSensorStatus.STOPPED, and later, when the "API resource checker" finds that the API is available, I turn on all my sensors? What do you think? Is it possible to change the status of a sensor from an operation?
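For reference, this is roughly what a sensor that starts out stopped looks like with the standard Dagster API (the sensor body is a placeholder):

```python
# Defining a sensor that starts out stopped; the body is a placeholder.
from dagster import DefaultSensorStatus, RunRequest, sensor


@sensor(job=main, default_status=DefaultSensorStatus.STOPPED)  # `main` is the job above
def my_airbyte_sensor(context):
    yield RunRequest(run_key=None)
```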
Adam Bloom
@Saul Burgos I think you're overdesigning this. What scenario do you expect dagster to try to run when airbyte is not yet available?
Saul Burgos
My scenario is that the Airbyte API is not available when my sensors trigger the job that executes the Airbyte operations, @Adam Bloom. So I am trying to figure out how to validate that the Airbyte API is available before my sensors trigger.
Adam Bloom
are you not planning to deploy airbyte separately from dagster? you're trying to launch airbyte within dagster? if airbyte is always running as a service, then perhaps you hit this on the first run, but things should be ok after that?
Saul Burgos
You are right... I have tested that situation, but my use case requires that it works the first time. The idea is that with a single docker-compose command all the services will be built and started, and that is why I have this little problem. I agree with you that the subsequent requests will be successful, but I need it to work for the first one. My plan is to create my sensors stopped, check the API, and start them afterwards. But I don't know if I can start sensors from Dagster or via the GraphQL API.
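A possible alternative sketch that avoids toggling sensor status entirely: gate the sensor itself so it skips its tick until the Airbyte API answers. The health-endpoint path is an assumption; SkipReason and the sensor decorator are standard Dagster APIs:

```python
# Alternative sketch: the sensor skips (no run launched) until Airbyte is up,
# so the first tick after startup cannot hit a 502.
# The /api/v1/health endpoint path is an assumption.
import os

import requests
from dagster import RunRequest, SkipReason, sensor


def airbyte_is_up():
    host = os.environ["AIRBYTE_CLIENT_HOST"]
    port = os.environ["AIRBYTE_CLIENT_PORT"]
    try:
        return requests.get(
            f"http://{host}:{port}/api/v1/health", timeout=5
        ).ok
    except requests.exceptions.ConnectionError:
        return False


@sensor(job=main)  # `main` is the job defined earlier
def airbyte_gated_sensor(context):
    if not airbyte_is_up():
        # The sensor ticks again on its normal interval; no run is launched.
        yield SkipReason("Airbyte API is not yet available")
        return
    yield RunRequest(run_key=None)
```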