# deployment-kubernetes
n
Hi, I could use some help with setting environment variables in Dagster jobs when running on Kubernetes via the Helm chart. Below is the `values.yaml` I am using with Helm. Note that I've included both the `dagit` section and the `dagster-user-deployments` section. I do not see the environment variables from the `dagster-user-deployments` section in the jobs/pods that run on Kubernetes during a job run, but I do see them in the dagit pod. We are using chart version `dagster-1.1.20`.
```yaml
# values.yaml
dagit:
  env:
    SOME_HOST: "1.2.3.4"
  envSecrets:
    - name: some-password

dagster-user-deployments:
  enabled: true
  deployments:
    - name: "some-name"
      image:
        repository: "foo-bar-docker.pkg.dev/some-path"
        tag: sometag
        pullPolicy: Always
      dagsterApiGrpcArgs:
        - "--python-file"
        - "/foo/bar/repository.py"
      port: 3030
      env:
        SOME_HOST: "1.2.3.4"
      envSecrets:
        - name: some-password
```
(I've replaced most values above with placeholders.) I'm trying to follow this example from the docs that uses `envSecrets`: https://docs.dagster.io/deployment/guides/kubernetes/deploying-with-helm#step-62-run-step_isolated_job
```yaml
dagster-user-deployments:
  enabled: true
  deployments:
    - name: "k8s-example-user-code-1"
      image:
        repository: "docker.io/dagster/user-code-example"
        tag: latest
        pullPolicy: Always
      dagsterApiGrpcArgs:
        - "--python-file"
        - "/example_project/example_repo/repo.py"
      port: 3030
      envSecrets:
        - name: dagster-aws-access-key-id
        - name: dagster-aws-secret-access-key
```
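For context, the `envSecrets` entries in that docs example refer to Kubernetes Secrets that must already exist in the release namespace; the chart wires each one into the container via `envFrom`, so every key in the Secret becomes an environment variable. A minimal sketch of one such Secret, assuming the namespace is `dagster` and using the name from the docs example (the key and value are placeholders):

```yaml
# Hypothetical Secret matching the docs example. Each key under
# stringData becomes an environment variable in the container once
# the chart references the Secret via envFrom.
apiVersion: v1
kind: Secret
metadata:
  name: dagster-aws-access-key-id
  namespace: dagster
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: "<your-access-key-id>"
```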
Let me know what further info I can provide. (Originally posted in #dagster-support https://dagster.slack.com/archives/C01U954MEER/p1678150166235829, but moved to this channel when I saw it.)
```shell
kubectl get pod -n dagster dagster-dagit-foo-bar -o yaml
# Gives the expected:
spec:
  envFrom:
  - configMapRef:
      name: dagster-dagit-env
  - secretRef:
      name: some-password
```
However:
```shell
kubectl get pod -n dagster dagster-job-foo-bar -o yaml
# Output has no envFrom section
```
The expected env vars do show up in the `dagster-dagster-user-deployments-leash-dags-foo-bar` container (verified by exec'ing in).
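For anyone retracing this debugging, the two checks above can be sketched as the commands below. The pod names are placeholders from this thread (substitute your own), and the `kubectl` calls are guarded so the sketch is a no-op on a machine without cluster access:

```shell
# Placeholder pod names from this thread; substitute your own.
if command -v kubectl >/dev/null 2>&1; then
  # Does the run pod's spec include envFrom entries for the
  # user deployment's ConfigMap/Secret?
  kubectl get pod -n dagster dagster-run-xxx \
    -o jsonpath='{.spec.containers[0].envFrom}'
  # Compare with the env actually visible inside the user-code container:
  kubectl exec -n dagster dagster-dagster-user-deployments-leash-dags-foo-bar \
    -- env | grep SOME_HOST
fi
```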
d
Hi Nathan - is it "env" or "envSecrets" that isn't getting passed through to the job pod? or both?
The run pod should start with dagster-run-xxx not dagster-job-xxx right?
Is your user code deployment image also using a recent Dagster version?
n
Thanks for the response, Daniel.
> is it "env" or "envSecrets" that isn't getting passed through to the job pod? or both?

Both.
> The run pod should start with dagster-run-xxx not dagster-job-xxx right?

I was inspecting the `dagster-job-xxx` pod (which I assume is created on a per-op basis), but I just now inspected the `dagster-run-xxx` pod during a run and it does not have the env vars either (checked by exec'ing in and inspecting the pod's yaml).
> Is your user code deployment image also using a recent Dagster version?

Looks like we're running an old one (`0.14.2`), thanks for the idea to check it out. I'll try updating to `1.1.20` (which is the version of the chart we're using) and let you know if that fixes it.
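In case it helps anyone hitting the same symptom: the likely fix here is keeping the Dagster version baked into the user code image in step with the Helm chart version. A minimal Dockerfile sketch under that assumption (the base image, file path, and the inclusion of `dagster-k8s` are my assumptions, not from this thread):

```dockerfile
# Hypothetical user-code image. Pin dagster (and dagster-k8s, which
# shares dagster's version number since 1.0) to match the Helm chart
# version, so run/job pods behave consistently with the chart.
FROM python:3.10-slim
RUN pip install dagster==1.1.20 dagster-k8s==1.1.20
COPY repository.py /foo/bar/repository.py
```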
d
Ah I think that upgrade will help!