# ask-community
a
I'm trying to set up a pipeline that is triggered by many different assets from different customers, all of which are processed the same way. The files land in an S3 bucket and need to be processed, with the results loaded into a relational database. The @asset will rematerialize when a new file comes in... will this destroy the previous copy? How do I look at a specific instance of a previous run (i.e. for a specific client)?
I guess I would not use an asset for this use case and would just use a regular op?
d
We have a similar setup: we use a sensor to monitor the bucket and then process the files (it boils down to a graph of ops that takes in one file), yielding asset metadata so everything shows up right in Dagit.
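A minimal sketch of the sensor approach described in this reply. The bucket name (`customer-drops`), the job (`process_file_job`), and the op name (`process_file`) are hypothetical placeholders, not names from the thread:

```python
import boto3
from dagster import RunRequest, sensor

# Hypothetical job; a sketch of it appears later in the thread.
from my_project.jobs import process_file_job


@sensor(job=process_file_job)
def customer_file_sensor(context):
    # List the current objects in the bucket on every sensor tick.
    s3 = boto3.client("s3")
    response = s3.list_objects_v2(Bucket="customer-drops")
    for obj in response.get("Contents", []):
        key = obj["Key"]
        # run_key makes the sensor idempotent: Dagster skips any key
        # it has already launched a run for, so only new files trigger runs.
        yield RunRequest(
            run_key=key,
            run_config={
                "ops": {"process_file": {"config": {"s3_key": key}}}
            },
        )
```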
p
An `op` might be a better fit here… I would use `asset` here if I could determine the asset key (and downstream asset dependencies) at definition time (as opposed to at runtime, when the file hits the s3 bucket).
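A minimal sketch of the op-based approach with a runtime-determined asset key, matching the job referenced in the sensor sketch above. The per-client key scheme under `customer_files` is illustrative, not prescribed by the thread:

```python
from dagster import AssetKey, AssetMaterialization, Output, job, op


@op(config_schema={"s3_key": str})
def process_file(context):
    key = context.op_config["s3_key"]
    # ... download the file from S3, transform it, and load the
    # results into the relational database ...

    # Yield an AssetMaterialization with a key derived at runtime
    # (e.g. per client/file), so each processed file shows up in
    # Dagit's asset catalog even though no @asset is defined.
    yield AssetMaterialization(
        asset_key=AssetKey(["customer_files", key.replace("/", "_")]),
        metadata={"s3_key": key},
    )
    # When an op yields events, its outputs must be yielded explicitly.
    yield Output(None)


@job
def process_file_job():
    process_file()
```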