# dagster-cloud

Sam Van Den Berghe

02/15/2023, 3:46 PM
I am experiencing some problems with our sensor.
```
dagster._core.errors.DagsterUserCodeUnreachableError: dagster._core.errors.DagsterUserCodeUnreachableError: The sensor tick timed out due to taking longer than 60 seconds to execute the sensor function.
```
But we have a similar sensor that does the same thing on a different dataset, and it works fine. When run locally, the sensor function completes quickly enough. I tried adding logs, but they don't appear.
The sensor reads files from S3.


02/15/2023, 6:17 PM
We should do a better job of exposing those sensor logs back to you in Cloud - I'll make sure my team is aware of it. I do see a log line internally that implies the sensor is finding a lot of S3 keys (I'll DM you the specifics) - this might be an issue where it's struggling to finish in time on its first run. Chunking the work further, so that each tick only ever processes n keys before moving on to the next tick, might get this working again.
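The chunking idea could be sketched like this (a minimal illustration, not code from the thread: the function name, batch size, and cursor convention are all assumptions; in a real Dagster sensor the cursor would be persisted with `context.update_cursor()`):

```python
# Keep a cursor (the last key handled) and process at most `batch_size`
# keys per sensor tick, so no single tick has to touch every S3 key.

def next_batch(all_keys, cursor, batch_size=100):
    """Return (keys_to_process, new_cursor) for one tick.

    `all_keys` is the full key listing; `cursor` is the last key processed
    on a previous tick, or None on the first tick.
    """
    keys = sorted(all_keys)
    if cursor is not None:
        # Skip everything up to and including the cursor.
        keys = [k for k in keys if k > cursor]
    batch = keys[:batch_size]
    new_cursor = batch[-1] if batch else cursor
    return batch, new_cursor
```

Each tick then stays well under the 60-second limit regardless of how many keys the bucket holds, at the cost of needing several ticks to work through a backlog.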

Joseph Sayad

02/15/2023, 9:00 PM
I'm experiencing the same issue with a sensor I have deployed that reads files from S3
My S3 bucket contains ~120K files, and I ran the key listing over the bucket (isolated in a notebook), which took ~23 seconds to get all keys. I still hit a timeout in Dagster Cloud, though, and no runs are requested. We might be able to get around the issue by editing our code to only extract a limited number of keys from S3 per tick, as you mentioned @jordan
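One way to cap the per-tick listing cost is to page through the bucket with `list_objects_v2`, which returns keys in lexicographic order and accepts a `StartAfter` key, so a sensor cursor can double as the resume point. A hedged sketch (not the actual code from this thread; the client is passed in so the function is easy to test):

```python
def list_next_page(s3, bucket, start_after="", max_keys=1000):
    """List at most `max_keys` keys that sort after `start_after`.

    `s3` is a boto3 S3 client (boto3.client("s3")). list_objects_v2 caps
    MaxKeys at 1000 per request, so larger pages need multiple calls.
    """
    resp = s3.list_objects_v2(
        Bucket=bucket,
        StartAfter=start_after,  # resume from where the previous tick stopped
        MaxKeys=max_keys,        # bound the listing work done in one tick
    )
    return [obj["Key"] for obj in resp.get("Contents", [])]
```

Note this only works as a "new files" cursor if newly created keys sort after previously seen ones, which is the caveat jordan raises below.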


02/16/2023, 7:56 PM
At that point, you'll probably want to switch from listing the files to something like a bucket notification, and have your sensor pop messages from a queue. Unless your keys have a matching alphabetical and timestamp sort, the S3 API alone can't limit a listing to only recently created keys.
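With S3 event notifications delivered to an SQS queue, the sensor's work shrinks to parsing event messages instead of listing the bucket. A sketch of the parsing step, using the standard S3 event-notification JSON shape (queue setup and the Dagster wiring are omitted; all names are illustrative):

```python
import json
from urllib.parse import unquote_plus

def keys_from_messages(messages):
    """Extract (bucket, key) pairs from SQS messages carrying S3 event
    notifications.

    Each message's Body is the S3 event JSON; object keys arrive
    URL-encoded (spaces become '+'), so they are decoded here.
    """
    pairs = []
    for msg in messages:
        body = json.loads(msg["Body"])
        for record in body.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = unquote_plus(record["s3"]["object"]["key"])
            pairs.append((bucket, key))
    return pairs
```

This keeps each tick proportional to the number of *new* objects rather than the total bucket size, which is what makes it scale past the 120K-key listing above.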