# ask-community
s
I have an issue where my sensor logs an error on every evaluate_tick saying it cannot instantiate GCSComputeLogManager because it cannot access the bucket. However, access should exist: if I bash into the daemon pod and run
```python
from google.cloud import storage

storage.Client().bucket(...).exists()
```
— the exact code that fails in Dagster — it happily returns True. There is no JSON secret configured; it should work via Workload Identity. How can the code in the daemon tick fail if I can run it inside the pod fine? I assumed all sensor code executes inside the daemon pod, probably in some executor. Dagster 0.14.12, latest version. Stack trace below.
```
google.api_core.exceptions.Forbidden: 403 GET https://storage.googleapis.com/storage/v1/b/retailai-staging-artifacts?fields=name&prettyPrint=false: Caller does not have storage.buckets.get access to the Google Cloud Storage bucket.
  File "/usr/local/lib/python3.8/site-packages/dagster/core/errors.py", line 184, in user_code_error_boundary
    yield
  File "/usr/local/lib/python3.8/site-packages/dagster/grpc/impl.py", line 284, in get_external_sensor_execution
    return sensor_def.evaluate_tick(sensor_context)
  File "/usr/local/lib/python3.8/site-packages/dagster/core/definitions/sensor_definition.py", line 334, in evaluate_tick
    result = list(ensure_gen(self._evaluation_fn(context)))
  File "/usr/local/lib/python3.8/site-packages/dagster/core/definitions/sensor_definition.py", line 494, in _wrapped_fn
    for item in result:
  File "/usr/local/lib/python3.8/site-packages/dagster/core/definitions/sensor_definition.py", line 611, in _fn
    event_records = context.instance.get_event_records(
  File "/usr/local/lib/python3.8/site-packages/dagster/core/definitions/sensor_definition.py", line 109, in instance
    DagsterInstance.from_ref(self._instance_ref)
  File "/usr/local/lib/python3.8/site-packages/dagster/core/instance/__init__.py", line 434, in from_ref
    compute_log_manager=instance_ref.compute_log_manager,
  File "/usr/local/lib/python3.8/site-packages/dagster/core/instance/ref.py", line 258, in compute_log_manager
    return self.compute_logs_data.rehydrate()
  File "/usr/local/lib/python3.8/site-packages/dagster/serdes/config_class.py", line 86, in rehydrate
    return klass.from_config_value(self, result.value)
  File "/usr/local/lib/python3.8/site-packages/dagster_gcp/gcs/compute_log_manager.py", line 102, in from_config_value
    return GCSComputeLogManager(inst_data=inst_data, **config_value)
  File "/usr/local/lib/python3.8/site-packages/dagster_gcp/gcs/compute_log_manager.py", line 70, in __init__
    check.invariant(self._bucket.exists())
```
d
Hey Samuel - the sensor code executes inside your user code deployment. The daemon never directly runs your code; it interacts with the user code deployment over a gRPC interface.
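The split can be pictured with a toy model (illustrative names only, not Dagster's real classes): the daemon only issues the tick request over gRPC, while the sensor body — including the `DagsterInstance` rehydration in the trace above — runs in the user code server process, so it is that pod's service account that needs the GCS access.

```python
# Toy model of the daemon / user-code split. Class and method names are
# illustrative stand-ins, not Dagster's actual API.

class UserCodeServer:
    """Stands in for the gRPC user code deployment pod."""

    def __init__(self, sensor_fn):
        self.sensor_fn = sensor_fn

    def external_sensor_execution(self):
        # The sensor body runs HERE, in the user code process, including
        # any instance/compute-log-manager rehydration -- so this pod's
        # service account is the one that needs bucket access.
        return self.sensor_fn()


class Daemon:
    """Stands in for dagster-daemon: it never executes user code itself."""

    def __init__(self, server):
        self.server = server

    def evaluate_tick(self):
        # The daemon only requests an evaluation from the server.
        return self.server.external_sensor_execution()


result = Daemon(UserCodeServer(lambda: "RunRequest")).evaluate_tick()
```

In this model, granting the daemon pod bucket access changes nothing, because `sensor_fn` never executes there.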
👍 1
s
I will check that side then; I had misunderstood that. I also wondered how that many sensors would scale if there is only one daemon.