# ask-community

Jonah Liebert

07/20/2022, 6:47 PM
Fellow Dagsters! We've got Dagster running in a Kubernetes cluster. One project has a sensor that looks for changes in a Google Sheet. On my local machine I used a service account file for authentication, but in the cloud the service account file doesn't work, and I don't know what to use instead. The available authentication methods for the pygsheets Python module are:
- client secret
- service account
- service_account_env_var – use an environment variable to provide service account credentials
- credentials_directory – location of the token file created by the OAuth2 process. Use 'global' to store it in a global, OS-dependent location; the default, None, stores the token file in the current working directory. Note that this will override your client secret.
- custom_credentials – a custom or pre-made credentials object; ignores all other params
- scopes – the scopes for which the authentication applies
If anyone can even point me in the right direction, I would be grateful.
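Of the methods listed above, `service_account_env_var` is the one designed for containerized environments: the full service-account JSON goes into a single environment variable, which pygsheets reads directly. A minimal sketch (the variable name `GDRIVE_API_CREDENTIALS` is an arbitrary choice here, not a pygsheets requirement — it just has to match what you pass to `authorize`):

```python
import json
import os

# Arbitrary variable name -- any name works as long as it matches what you
# pass to pygsheets.authorize(service_account_env_var=...).
ENV_VAR = "GDRIVE_API_CREDENTIALS"

def looks_like_service_account(var: str = ENV_VAR) -> bool:
    """Sanity-check that the env var holds parseable service-account JSON."""
    raw = os.environ.get(var)
    if raw is None:
        return False
    try:
        info = json.loads(raw)
    except ValueError:
        return False
    return info.get("type") == "service_account"

# With the variable populated (e.g. from a Kubernetes Secret), pygsheets
# can consume it directly:
#   import pygsheets
#   gc = pygsheets.authorize(service_account_env_var=ENV_VAR)
```

The pre-flight check is useful in a cluster, where a missing or truncated Secret otherwise surfaces only as an opaque auth error inside the sensor.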

jamie

07/20/2022, 10:08 PM
Hi @Jonah Liebert, my Kubernetes is a bit rusty, but I believe you can set up Secrets in your cluster that can be accessed as env vars or as mounted files, depending on how you set it up: https://kubernetes.io/docs/concepts/configuration/secret/ Let me know if this helps, otherwise I can pull in someone else who knows k8s a bit better!
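A concrete sketch of this suggestion, under the assumption that the credentials should land in an environment variable (all names here — `gsheets-creds`, `creds.json`, `GDRIVE_API_CREDENTIALS`, the deployment and image names — are hypothetical placeholders):

```yaml
# Create the Secret from the local key file:
#   kubectl create secret generic gsheets-creds \
#     --from-file=creds.json=service-account.json
#
# Then reference it in the pod spec so the JSON content lands in an env var:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: dagster-user-code          # hypothetical deployment name
spec:
  selector:
    matchLabels:
      app: dagster-user-code
  template:
    metadata:
      labels:
        app: dagster-user-code
    spec:
      containers:
        - name: user-code
          image: my-dagster-image:latest   # hypothetical image
          env:
            - name: GDRIVE_API_CREDENTIALS
              valueFrom:
                secretKeyRef:
                  name: gsheets-creds
                  key: creds.json
```

Mounting the Secret as a volume instead (via `volumes`/`volumeMounts`) is the other option the Kubernetes docs describe, and is what you would use if the library insists on a file path.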

Jonah Liebert

07/21/2022, 2:40 PM
Thank you @jamie! I was unsuccessful with uploading the JSON file to Kubernetes. However, what does seem to work is using google.auth to pull the default credentials and then passing them into the Google Sheets Python package's authentication.
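That working approach can be sketched like this. This is a sketch under stated assumptions, not the exact code from the thread: it assumes the pod has Application Default Credentials available (e.g. via GKE Workload Identity or the node's service account) and that the third-party `google-auth` and `pygsheets` packages are installed.

```python
# Sketch: build a pygsheets client from Application Default Credentials (ADC).
SCOPES = (
    "https://www.googleapis.com/auth/spreadsheets",
    "https://www.googleapis.com/auth/drive",
)

def authorize_with_default_credentials(scopes=SCOPES):
    """Pull default credentials via google.auth and hand them to pygsheets."""
    # Imports are local so this sketch can be imported even where the
    # third-party packages are not installed.
    import google.auth   # pip install google-auth
    import pygsheets     # pip install pygsheets

    # google.auth.default() searches the standard ADC locations
    # (env var, gcloud config, metadata server) for credentials.
    credentials, _project = google.auth.default(scopes=list(scopes))
    return pygsheets.authorize(custom_credentials=credentials)

# Usage (requires real credentials in the environment):
#   gc = authorize_with_default_credentials()
#   sheet = gc.open("my-sheet")   # hypothetical sheet title
```

This relies on the `custom_credentials` parameter from the list earlier in the thread, which ignores all other authentication params.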