#announcements

Basil V

02/16/2021, 11:47 PM
Hey all, I have a Docker/Dagster question. Right now in my Dagster pipeline I'm using the dagster-aws library to read/write files to S3. It works locally; however, when running within Docker, it can't find my credentials. The specific error is botocore.exceptions.NoCredentialsError: Unable to locate credentials and it seems to be coming from dagster_aws/s3/compute_log_manager.py when boto3 tries to instantiate the client. I volume-mounted my ~/.aws directory (containing creds/config) into the Docker containers at /root using docker-compose, but Dagster/boto3 still isn't able to find the credentials. Does anyone know how to resolve this, or where the credentials should be stored so boto3 can find them? (Note: for actual deployment we plan to use IAM, but for testing Docker locally I need to figure out how to pass the creds.) Thanks again!

alex

02/16/2021, 11:57 PM
could try environment variables

Cameron Gallivan

02/17/2021, 12:28 AM
Are you using a single Docker container or running a multi-container setup with docker-compose?

Basil V

02/17/2021, 12:35 AM
Multi-container setup with docker-compose.
Env variables would be great, but given the way dagster-aws constructs the boto3 client, I'm not sure that would currently work with env variables, would it?

Cameron Gallivan

02/17/2021, 12:36 AM
It does; it just uses botocore, so it'll look for env vars to provide the credentials.
I'm doing the same approach after I couldn't figure out how to properly mount the .aws dir for it.
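e.g. here's a quick sanity check you could run inside the container (my sketch, just plain boto3 calls, nothing dagster-specific):

import boto3

# With AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY set in the container's
# environment, the same no-argument call dagster-aws makes will resolve
# credentials through botocore's default chain.
creds = boto3.Session().get_credentials()
print("credential source:", creds.method if creds else "none found")  # "env" when env vars are used

client = boto3.resource("s3").meta.client
print([b["Name"] for b in client.list_buckets()["Buckets"]])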
👍 1

Basil V

02/17/2021, 12:38 AM
Oh awesome. That would be ideal then. This was the line where I think the client gets created, right (python_modules/libraries/dagster-aws/dagster_aws/s3/compute_log_manager.py, line 63)?
self._s3_session = boto3.resource(
    "s3", use_ssl=use_ssl, verify=_verify, endpoint_url=endpoint_url
).meta.client
Cool, I'll try env variables then since you had success with that. I had just thought, based on that code snippet above, that it didn't load the env variables anywhere.
Thanks!

Cameron Gallivan

02/17/2021, 12:39 AM
If you don't have the vars set by default you can do:
# Pull the credentials from your local AWS CLI profile and export them
# so docker-compose can pass them through to the containers.
export AWS_ACCESS_KEY_ID="$(aws configure get aws_access_key_id)"
export AWS_SECRET_ACCESS_KEY="$(aws configure get aws_secret_access_key)"
in a build.sh script. You just need to expose them in the docker-compose file for each service; I don't think I even reference them in the actual Dockerfiles.
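For reference, here's roughly what that looks like on the compose side (a minimal sketch; the service and image names are made up, only the environment section matters):

version: "3.7"
services:
  dagit:
    image: my-dagster-image          # placeholder image name
    environment:
      # substituted from the host shell where build.sh exported them
      AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
      AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}

(Listing just the variable name in list form, e.g. - AWS_ACCESS_KEY_ID with no value, also works and pulls it straight from the host environment.)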

Basil V

02/17/2021, 12:40 AM
Oh awesome thanks!

Basil V

02/17/2021, 12:43 AM
Oh awesome thanks! That was the piece of info I was missing. Really appreciate it all!