Can Dagster be used for streaming pipelines? I have a file-listing sensor that yields a RunRequest for each file found. The job converts the file into zero or more rows in an asset. But for each subsequent file added to the filesystem, the new rows wipe out the existing rows in the asset instead of being appended. How do I get it to append the new rows for each new file detected?
Ah, never mind. As far as I can tell, Dagster does not seem to be a good choice for real-time streaming pipelines (unless you want to create a dynamic partitions definition and create a partition for every unique row).