Basil V
02/16/2021, 11:47 PM
I'm using the dagster-aws library to read/write files to S3. It works locally; however, when running within Docker it can't find my credentials. The specific error is botocore.exceptions.NoCredentialsError: Unable to locate credentials, and it seems to come from dagster_aws/s3/compute_log_manager.py when boto3 tries to instantiate the client. I volume-mounted my ~/.aws directory (containing creds/config) into the Docker containers at /root using docker-compose, but Dagster/boto3 still isn't able to find the credentials. Does anyone know how to resolve this, or where the credentials should be stored so boto3 can find them? (Note: for the actual deployment we plan to use IAM, but for testing Docker locally I need to figure out how to pass the creds.) Thanks again!

alex
02/16/2021, 11:57 PM

Cameron Gallivan
02/17/2021, 12:28 AM

Basil V
02/17/2021, 12:35 AM

Cameron Gallivan
02/17/2021, 12:36 AM

Basil V
02/17/2021, 12:38 AM
python_modules/libraries/dagster-aws/dagster_aws/s3/compute_log_manager.py (line 63)?
self._s3_session = boto3.resource(
"s3", use_ssl=use_ssl, verify=_verify, endpoint_url=endpoint_url
).meta.client
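For context on the error: the boto3.resource(...) call above passes no explicit credentials, so inside the container boto3 falls back on its default lookup chain (environment variables, the shared credentials file at ~/.aws/credentials, then instance/container IAM metadata). A stdlib-only sketch of the first two lookups; the env var names and file path match botocore's defaults, but the helper function itself is hypothetical:

```python
import os

# Hypothetical diagnostic helper: reports which of boto3's first two
# credential sources would be found, checking the same env var names
# and file location that botocore's default chain uses.
def available_credential_sources(env=None, home=None):
    env = dict(os.environ) if env is None else env
    home = os.path.expanduser("~") if home is None else home
    sources = []
    # 1. Environment variables
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        sources.append("environment")
    # 2. Shared credentials file (e.g. /root/.aws/credentials when HOME=/root)
    if os.path.isfile(os.path.join(home, ".aws", "credentials")):
        sources.append("shared-credentials-file")
    return sources
```

Running something like this inside the failing container shows whether the mounted files are where boto3 expects them. Note the mount target matters: if ~/.aws is mounted directly at /root, the files end up at /root/credentials, and the chain only ever looks at /root/.aws/credentials.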
Cameron Gallivan
02/17/2021, 12:39 AM
I set
AWS_ACCESS_KEY_ID="$(aws configure get aws_access_key_id)"
AWS_SECRET_ACCESS_KEY="$(aws configure get aws_secret_access_key)"
in a build.sh script. You just need to expose them in the docker-compose file for each service; I don't think I even reference them in the actual Dockerfiles.

Basil V
02/17/2021, 12:40 AM

jordan
02/17/2021, 12:41 AM

Basil V
02/17/2021, 12:43 AM
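Cameron's approach maps onto docker-compose like this. A sketch of the compose-side half, assuming AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are exported in the host shell before running docker-compose; the service name "dagster" is illustrative:

```yaml
# docker-compose.yml fragment (sketch; service name is hypothetical)
services:
  dagster:
    environment:
      # ${VAR} substitutes the value exported in the host shell,
      # so the container sees the same credentials as the host.
      AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
      AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
```

With these set, boto3's environment-variable lookup succeeds inside the container without any volume mounts or changes to the Dockerfiles.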