# ask-community
m
I'm struggling to get my (Helm-deployed) instance of Dagit to access the compute logs that are written to S3. The logs are getting written, but when I try to view them via Dagit I get an Unexpected GraphQL error:
```
ComputeLogsSubscription
Unable to locate credentials
```
I've passed the AWS credentials to the user-deployment & the k8sRunLauncher as per Deploying with Helm | Dagster.
And validated that they are correctly set on the
dagster-dagster-user-deployments-my-deployment
container:
```shell
root@dagster-dagster-user-deployments-my-deploymenwt8k8:/vdk# env | grep AWS
AWS_SECRET_ACCESS_KEY=REDACTED
AWS_ACCESS_KEY_ID=REDACTED
```
and in the
dagster-run-<run-id>
container during a job run
The log data is ending up in the S3 bucket.
It's just that I can't access the logs via Dagit after they've been written, sigh
Help! Where should I debug next?
Hmm. I just added the AWS secrets to the Dagit container, e.g. in the Helm values.yaml:
```yaml
dagit:
  envSecrets:
    - name: dagster-aws-access-key-id
    - name: dagster-aws-secret-access-key
```
And now Dagit is able to fetch the log data from S3. Did I not see that config requirement in the docs, or is it missing?
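For anyone landing here from search: `envSecrets` only references existing Kubernetes secrets, so they need to exist in the release namespace before the chart is installed. A minimal sketch of the two secret manifests, assuming the secret names used in the values.yaml above and placeholder credential values:

```yaml
# Hypothetical manifests for the two secrets referenced by dagit.envSecrets.
# The key names become the environment variable names in the Dagit container.
apiVersion: v1
kind: Secret
metadata:
  name: dagster-aws-access-key-id
stringData:
  AWS_ACCESS_KEY_ID: <your-access-key-id>
---
apiVersion: v1
kind: Secret
metadata:
  name: dagster-aws-secret-access-key
stringData:
  AWS_SECRET_ACCESS_KEY: <your-secret-access-key>
```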
d
Hi David - Dagit pulls directly from S3 when loading the compute logs, so the fix you've identified is right. Totally agree that this should be clearer in the docs - I don't think that Helm page covers compute logs at all. @Dagster Bot docs add section on compute logs to "Deploying with Helm" page
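For context, the piece that routes compute logs to S3 in the first place lives in the chart's `computeLogManager` section. A sketch of that values.yaml fragment, with an assumed bucket name and prefix:

```yaml
# Hypothetical values.yaml fragment: configure the S3 compute log manager.
# Bucket and prefix below are placeholders, not values from this thread.
computeLogManager:
  type: S3ComputeLogManager
  config:
    s3ComputeLogManager:
      bucket: my-compute-log-bucket
      prefix: dagster-compute-logs
```

With this in place, the run pods write compute logs to S3; the `dagit.envSecrets` fix above is what additionally lets Dagit read them back.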