Hey all, wondering what the best practice is for authenticating boto3 in production with Dagster. In Airflow I would use the secrets manager in the UI; I'm not sure how to do the equivalent in Dagster. Any thoughts on best practices here?
Aside: I can use Secrets Manager to get my Snowflake, dbt, and RDS credentials. I'm wondering how to authenticate boto3 safely so that I can use Secrets Manager.
Mike Grabbe
06/08/2022, 1:17 PM
You can use the built-in EC2/container IAM role; then there's no reason to store AWS credentials on the machine or image.
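For context on why the IAM role approach needs no stored keys: boto3 (via botocore) resolves credentials by walking a fixed lookup chain, and on an EC2 instance or ECS task with a role attached, the chain ends at the metadata service, which hands out short-lived credentials for that role. A rough sketch (the boto3 calls are shown as comments since they need boto3 and an AWS environment; the chain below is a simplified summary, not the exhaustive botocore list):

```python
# With a role attached to the host, no keys are passed anywhere:
#
#   import boto3
#   s3 = boto3.client("s3")   # credentials resolved automatically
#   s3.list_buckets()         # signed with the role's temporary creds
#
# Simplified resolution order that boto3's credential chain walks:
CREDENTIAL_CHAIN = [
    "explicit arguments to boto3.client() / boto3.Session()",
    "environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)",
    "shared credentials file (~/.aws/credentials)",
    "AWS config file (~/.aws/config)",
    "container credentials (ECS task role)",
    "EC2 instance metadata (instance-profile role)",
]
```

The practical upshot: in production you rely on the last two entries and never write a key to disk; locally you rely on the env-var or credentials-file entries.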
Alec Ryan
06/08/2022, 1:18 PM
For local development, do you suggest just using env variables?
Mike Grabbe
06/08/2022, 1:19 PM
Ah, for local dev, yes, you could use env variables, but I usually volume-mount my own AWS creds into the image and let the image use those.
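The env-variable route mentioned above can be sketched like this: these are the standard variable names botocore reads, and once they're set, boto3 picks them up with no code changes. The values here are obvious placeholders; never commit real keys:

```python
import os

# Standard AWS environment variables that boto3/botocore read
# automatically (placeholder values only):
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAPLACEHOLDER"
os.environ["AWS_SECRET_ACCESS_KEY"] = "placeholder-secret"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

# import boto3
# boto3.client("s3")  # would now pick these env vars up automatically
```

For the volume-mount alternative, the Docker side would be roughly a read-only bind mount of your host `~/.aws` directory into the container's home (e.g. something like `-v ~/.aws:/root/.aws:ro` in a `docker run`, path hypothetical depending on the container user).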
Alec Ryan
06/08/2022, 1:19 PM
So just the creds file?
Mike Grabbe
06/08/2022, 1:20 PM
Right.
~/.aws/credentials
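For reference, that credentials file is a plain INI file where each profile is a section header, which is why mounting it into a container works with zero code changes. A small sketch of the format boto3 expects (parsed here with stdlib `configparser` just to illustrate; the key values are placeholders):

```python
import configparser

# Placeholder content mirroring a minimal ~/.aws/credentials file:
sample = """\
[default]
aws_access_key_id = AKIAPLACEHOLDER
aws_secret_access_key = placeholder-secret
"""

cp = configparser.ConfigParser()
cp.read_string(sample)

# boto3 reads the [default] section unless AWS_PROFILE says otherwise.
print(cp["default"]["aws_access_key_id"])  # → AKIAPLACEHOLDER
```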
Alec Ryan
06/08/2022, 1:20 PM
Thanks, I think that can work.
I haven't gotten to productionizing my pipeline in the cloud yet, so I wanted to get a local project working first.