# ask-ai
Are there any docs on how to use/think with Dagster when you're building on-demand workflows? Our workflows run when a user presses a button or a row changes, not on a set schedule.
Yes. The documentation covers automating data pipelines with Dagster, and while it focuses primarily on scheduled workflows, it also describes sensors, which trigger runs in response to specific events. For workflows that should run on an event, such as a user action or a change in data, sensors are the relevant feature: they monitor for a condition and launch computations in response. This fits on-demand workflows where you want your data updated when something happens.

A brief overview of how you might use sensors for on-demand workflows:

1. **Event-based triggering:** Define a sensor that monitors for the specific event that should trigger your workflow — for example, a new file appearing in a storage bucket, a new row added to a database, or a user action like pressing a button.
2. **Sensor configuration:** Within the sensor, specify the logic that determines when a run should be triggered. This logic can be as simple or as complex as your definition of an "event" requires.
3. **Run initiation:** When the sensor detects the event, it initiates a run of the relevant job or asset. You can pass information from the event into the run configuration so the workflow has the context it needs to execute correctly.
4. **Monitoring and re-execution:** After a run is triggered, use Dagster's UI to monitor it and, if necessary, re-execute individual steps or the entire workflow if it fails or needs to be rerun.

While the documentation does not provide a step-by-step guide specifically for on-demand workflows, the concepts of sensors and event-based triggering apply directly.
For more detailed guidance on implementing on-demand workflows with sensors, see the Dagster documentation on sensors: https://docs.dagster.io/concepts/partitions-schedules-sensors/sensors. The exact implementation details will depend on your specific use case and on the events you monitor to trigger your workflows.