# integration-airbyte

Ohad

02/17/2023, 7:57 PM
Hi everyone! My DAG execution starts with many Airbyte assets, and it executes two Airbyte assets in parallel. I found it's much faster when each Airbyte asset executes individually. The question is how to limit the execution to 1 at a time. In the job ops docs there is a parameter called `max_concurrent`, but I could not find something similar for `load_assets_from_airbyte_instance`. Any ideas?

owen

02/21/2023, 7:43 PM
hm, that's definitely interesting. I can see how two Airbyte syncs reading from the same source or writing to the same destination simultaneously could slow things down. I think you're on the right track with `max_concurrent`, as it applies to all ops in the graph, regardless of how they're loaded. Setting it to 1 with the config described in the docs (although with `define_asset_job` instead of `@job`) should prevent multiple Airbyte syncs from running at the same time in that job. If you're not using `define_asset_job` (and just have raw definitions), then you can set up a default executor with that config by supplying `executor=multiprocess_executor.configured({"max_concurrent": 1})` to your `Definitions` object.
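Putting the second suggestion together, a minimal sketch might look like the following. This assumes `dagster` and `dagster-airbyte` are installed; the `host` and `port` values are placeholders for your own Airbyte deployment, not values from the thread.

```python
# Sketch: limit all Airbyte syncs (and every other op) to one at a time
# by making a serialized multiprocess executor the default for Definitions.
from dagster import Definitions, multiprocess_executor
from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

# Placeholder connection details for the Airbyte instance.
airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",  # assumption: your Airbyte host
        "port": "8000",       # assumption: your Airbyte port
    }
)

# Loads one software-defined asset per Airbyte connection.
airbyte_assets = load_assets_from_airbyte_instance(airbyte_instance)

defs = Definitions(
    assets=[airbyte_assets],
    # Default executor for all jobs built from these definitions:
    # at most one op (i.e. one Airbyte sync) runs at a time.
    executor=multiprocess_executor.configured({"max_concurrent": 1}),
)
```

Note that `max_concurrent: 1` serializes every op in the run, not just the Airbyte ones; if the job contains non-Airbyte ops you want to keep parallel, the `define_asset_job`-with-run-config approach mentioned above scopes the limit to that one job instead.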