# ask-community
n
Hi! I have dagster deployed in k8s with helm using separate `dagster` and `user-code` deployments. For the `user-code` deployment I configured a custom `serviceAccountName`, while the `dagster` deployment uses the default role for the cluster. When I run a job using the `K8sRunLauncher` it runs under the default role rather than the custom one specified in my `user-code` deployment. I confirmed that the pod running my `user-code` correctly assumes the role specified with `serviceAccountName` and assumed that jobs would inherit this role as well. Is this the expected behavior, or how can I configure my job to run using a specified service account?
d
Hi Nolan - there's an 'includeConfigInLaunchedRuns' flag that you can set in your dagster-user-deployments that will pass along config, including the service account, to any launched runs from that user code deployment. We're planning on making this the default behavior in the next major release, but until then you can turn it on following the example here (search for 'includeConfigInLaunchedRuns'): https://docs.dagster.io/deployment/guides/kubernetes/deploying-with-helm#configure-your-user-deployment
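For reference, a minimal sketch of where that flag sits in the Helm values for a user code deployment — the deployment name, image, and module here are placeholders, not taken from the thread:

```yaml
dagster-user-deployments:
  enabled: true
  deployments:
    - name: "user-code"                  # placeholder deployment name
      image:
        repository: "my-registry/user-code"  # placeholder image
        tag: "latest"
      dagsterApiGrpcArgs:
        - "-m"
        - "my_module"                    # placeholder module
      port: 3030
      # Pass this deployment's config (including its service account)
      # along to any runs launched from it:
      includeConfigInLaunchedRuns:
        enabled: true
```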
n
bah, missed that - thanks for the quick response!
d
no problem, I don't think the docs are very clear that it applies to the service account too
j
@daniel hey Daniel, I tried setting this flag and it doesn't seem to have an effect. Do you think it might be because we are specifying the service account globally? e.g.:
global:
  serviceAccountName: "our-service-account"
d
@Juan Arrivillaga what version of dagster are you using? Including the service account when includeConfigInLaunchedRuns is set was added pretty recently
specifically looks like 0.14.13: https://docs.dagster.io/changelog#01413
j
@daniel on 0.14.14
d
Could you post the result of kubectl describe for a pod with a different service account than what you expect?
And could you post the result of kubectl describe for the user code deployment pod that you're hoping will transfer its service account to the run pod?
n
To close the loop, we ended up getting this to work by adding the following tag to the job.
dagster-k8s/config: {"pod_spec_config": {"service_account_name": "dagster-access-control-sa"}}
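A sketch of how that tag could be attached in code, based on what the thread shows — the job name is a placeholder, and the mapping of `pod_spec_config` keys onto Kubernetes pod spec fields is my reading of the tag, not something confirmed above:

```python
import json

# Hypothetical sketch: the run-level tag that the K8sRunLauncher reads
# to override the launched run pod's spec. The snake_case keys under
# "pod_spec_config" mirror Kubernetes PodSpec fields
# (service_account_name -> serviceAccountName).
RUN_POD_TAGS = {
    "dagster-k8s/config": {
        "pod_spec_config": {
            "service_account_name": "dagster-access-control-sa",
        }
    }
}

# In a job definition this would be attached as, e.g.:
#   @job(tags=RUN_POD_TAGS)
#   def my_job(): ...
# (my_job is a placeholder name.)
print(json.dumps(RUN_POD_TAGS))
```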