# ask-community
b
Hey All! I’m a new Dagster user and I’m wondering if it’s possible to have a single job/schedule that will execute or materialize a set of ops or assets in sequence. My case here is that I want to move several datasets (Oracle views) from an Oracle database into S3, but I am limited to only one query against the database at a time, so they would need to execute sequentially even though there are no real dependencies from one to the other. The kicker is that I'd also like logging for each of these assets or ops, capturing information like how long each one takes. Are there any typical patterns to follow to accomplish this, or should I just schedule each one separately and offset the scheduled times? Thanks!
z
I think you're looking for Nothing dependencies. Dagster will automatically log how long each step in a run takes. For custom logs that you want emitted to the Dagster console, you can get a Dagster logger with:
from dagster import get_dagster_logger
then you can use the logger in an op like:

from dagster import get_dagster_logger, op

@op
def something():
    logger = get_dagster_logger()
    logger.info("print to logs")
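For the sequential-ordering part, here's a minimal sketch of Nothing dependencies; the op names and the copy logic are hypothetical placeholders, assuming each op moves one Oracle view to S3:

from dagster import In, Nothing, get_dagster_logger, job, op

@op
def copy_view_a():
    # hypothetical: run the single allowed Oracle query and land view A in S3
    get_dagster_logger().info("copied view_a to s3")

@op(ins={"start": In(Nothing)})
def copy_view_b():
    # the Nothing input enforces ordering without passing any data
    get_dagster_logger().info("copied view_b to s3")

@op(ins={"start": In(Nothing)})
def copy_view_c():
    get_dagster_logger().info("copied view_c to s3")

@job
def sequential_copies():
    # wire the Nothing dependencies so the ops run strictly one after another
    copy_view_c(start=copy_view_b(start=copy_view_a()))

Note that Nothing inputs are wired in the job body but never passed to the op function itself, so the function signatures stay empty.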
j
hey @Benjamin Faught, what Zach said is a good option and will definitely work. Another option is to check out the concurrency options: https://docs.dagster.io/guides/limiting-concurrency-in-data-pipelines#limiting-concurrency-in-data-pipelines
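If you go the concurrency route, one way (a sketch, not the only option; op names are hypothetical) is to cap the default multiprocess executor at one concurrent op per run, so the ops stay independent in the graph but still hit Oracle one at a time:

from dagster import job, op

@op
def copy_view_a():
    ...

@op
def copy_view_b():
    ...

# max_concurrent: 1 tells the multiprocess executor to run only one
# op at a time, even though the ops have no dependencies on each other
@job(
    config={
        "execution": {
            "config": {
                "multiprocess": {
                    "max_concurrent": 1,
                }
            }
        }
    }
)
def one_query_at_a_time():
    copy_view_a()
    copy_view_b()

The trade-off versus Nothing dependencies is that execution order isn't guaranteed here, only that no two queries overlap.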
b
Thank you both! I'll take some time to review these and pop back with any questions!