# ask-community
m
Hi, I am running a Kubernetes deployment of Dagster, but I cannot see raw compute logs (stdout / stderr). I saw in helm’s `values.yaml` file that this stuff is handled by `computeLogManager`. Should I switch from `LocalComputeLogManager` to another one? Perhaps `S3ComputeLogManager`? Or can I do something to make it work with `LocalComputeLogManager` itself?
a
> Should I switch from `LocalComputeLogManager` to another one? Perhaps `S3ComputeLogManager`?

Yea, that’s likely the right move.
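For reference, a minimal sketch of what that switch could look like in the Helm values; the bucket name and prefix below are placeholders, and the exact field set may vary by chart version:

```yaml
# values.yaml: route raw compute logs (stdout/stderr) to S3
computeLogManager:
  type: S3ComputeLogManager
  config:
    s3ComputeLogManager:
      bucket: "my-dagster-compute-logs"   # placeholder bucket name
      prefix: "dagster-logs"              # key prefix within the bucket
```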
m
Thanks, just mentioning here that I am using the `k8sRunLauncher` for this, hope it will work with that.
Hi @alex, can I use a private bucket? If yes, where do I create a Kubernetes secret containing the S3 credentials? I don’t see the field `envSecrets` in the `values.yaml` file for `S3ComputeLogManager`.
a
cc @johann
j
You can create the secret in the `extraManifests:` field of your Helm values, then reference it in the config of your run launcher with `env_secrets`, so that launched pipelines will have the secret attached.

Alternatively, if you are running on EKS, you can assign an IAM role to a k8s service account that you use for your Dagster deployment, and give that role access to the bucket.
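Put together, a sketch of that first option; the secret name and credential values are placeholders, and the field layout may differ slightly by chart version:

```yaml
# values.yaml: create the secret via extraManifests...
extraManifests:
  - apiVersion: v1
    kind: Secret
    metadata:
      name: dagster-s3-credentials   # placeholder name
    type: Opaque
    stringData:
      AWS_ACCESS_KEY_ID: "<access key id>"
      AWS_SECRET_ACCESS_KEY: "<secret access key>"

# ...and attach it to launched run pods
runLauncher:
  type: K8sRunLauncher
  config:
    k8sRunLauncher:
      envSecrets:
        - name: dagster-s3-credentials
```

For the EKS alternative, the usual pattern is an IRSA annotation on the service account; the role ARN here is a placeholder:

```yaml
serviceAccount:
  create: true
  annotations:
    eks.amazonaws.com/role-arn: "arn:aws:iam::123456789012:role/dagster-compute-logs"
```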
m
thanks, I already have `envSecrets` under `k8sRunLauncher`. Should that be fine? Or do I have to take the long route of `extraManifests`?
because the `values.yaml` file did have a field for `envSecrets` (for `k8sRunLauncher`), so I used it directly
j
`envSecrets` specifies secrets to attach to your pods. It won’t create the k8s secret object for you.
m
oh, yeah, so I create the secret beforehand using `kubectl apply -f xxx.yaml`
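A sketch of what that applied manifest could contain (the file name `xxx.yaml` is from the chat; the secret name and values are placeholders):

```yaml
# xxx.yaml: a standard Opaque secret holding the S3 credentials
apiVersion: v1
kind: Secret
metadata:
  name: dagster-s3-credentials   # must match the name listed under envSecrets
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: "<access key id>"
  AWS_SECRET_ACCESS_KEY: "<secret access key>"
```

Applied with `kubectl apply -f xxx.yaml` before installing or upgrading the Helm release.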
j
Gotcha. In that case you don’t need to use `extraManifests`.
m
thanks a lot, I will give this a try
Does the value I pass into `prefix` have to already exist on S3? Or will that directory be created automatically?
j
I believe it will be created, but could you try it and report back?
m
okay sure
it got created automatically
I get this though:

```
Message: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
```
it’s a private bucket
but I can see the logs stored in S3
so it can write to S3, but not read from it?
cc: @johann
a
likely you need to update the `dagit` deployment section to include the secrets, since it’s what’s attempting to read
👍 1
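A sketch of that fix in the Helm values, assuming the `dagit` section accepts the same `envSecrets` list as the run launcher (secret name as above):

```yaml
# values.yaml: dagit reads the logs back from S3, so it needs the credentials too
dagit:
  envSecrets:
    - name: dagster-s3-credentials
```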
m
aah I see
let me try and report back to you
thanks a lot, I was able to set it up
🎉 2