# announcements
m
After my failed attempt with filesystem storage above 😃 I tried another option: deploying with local self-hosted MinIO S3-like storage, but I got stuck at providing credentials. JFYI, I'll post it again: there is a bug in boto3 that makes it impossible to connect to an endpoint_url containing underscores (as mine did): https://github.com/boto/boto3/issues/703 (I just renamed my containers to get around it). So I wanted to use s3_storage from dagster_aws and initialized it with my own s3 endpoint_url. It connected, but pipeline execution failed with
botocore.exceptions.NoCredentialsError: Unable to locate credentials
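A note for readers: boto3 never derives credentials from the endpoint itself; it walks its default chain (explicit arguments, environment variables, ~/.aws files, instance metadata) and raises NoCredentialsError when all of them come up empty. A minimal sketch of talking to a MinIO-style endpoint with explicit credentials; the endpoint and key values below are placeholders, not values from this thread:

```python
import boto3

# Hedged sketch: endpoint_url and both keys are hypothetical placeholders.
# Passing credentials explicitly bypasses the default lookup chain entirely.
s3 = boto3.client(
    "s3",
    endpoint_url="http://minio:9000",  # custom endpoint; note: no underscores
    aws_access_key_id="minio-access-key",
    aws_secret_access_key="minio-secret-key",
)
print(s3.list_buckets()["Buckets"])
```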
I tried to run
dagster-aws init
inside the container, but it requires me to provide an AWS region, which is obviously irrelevant for me:
botocore.exceptions.NoRegionError: You must specify a region.
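A note for readers: the region check comes from botocore's client construction, not from the storage backend, so any placeholder region satisfies it, and MinIO-style servers simply ignore the value. A sketch under that assumption, with a hypothetical endpoint:

```python
import os

import boto3

# "us-east-1" is a pure placeholder here; exporting AWS_DEFAULT_REGION in
# the container environment has the same effect as setting it in code.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

s3 = boto3.client("s3", endpoint_url="http://minio:9000")  # hypothetical endpoint
```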
How can I provide S3 connection creds in this case?
I actually made it work by hard-patching my creds in here: https://github.com/dagster-io/dagster/blob/master/python_modules/libraries/dagster-aws/dagster_aws/s3/utils.py#L7 How can I make this cleaner?
I tried with
~/.aws/credentials
but no luck
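A note for readers: when the shared credentials file route fails inside a container, the usual culprit is that boto3 resolves ~ against the home directory of the user actually running the process, or that the file never made it into the container at all. For reference, the expected file format, with placeholder values:

```ini
# ~/.aws/credentials — must live in the home directory of the user the
# process runs as; AWS_SHARED_CREDENTIALS_FILE can point boto3 elsewhere.
[default]
aws_access_key_id = minio-access-key
aws_secret_access_key = minio-secret-key
```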
Oh yes, I finally got it: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html just passing them through env vars works.
👍 1
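A note for readers: environment variables sit near the top of boto3's credential chain, ahead of the ~/.aws files, so exporting them in the container is enough with no code changes. A sketch with placeholder values; in practice you would set these in the container environment rather than in code:

```python
import os

import boto3

# Placeholder values for illustration only; export AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY in the container environment in real deployments.
os.environ["AWS_ACCESS_KEY_ID"] = "minio-access-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio-secret-key"

s3 = boto3.client("s3", endpoint_url="http://minio:9000")  # hypothetical endpoint
print(s3.list_buckets()["Buckets"])
```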
s
Hi @matas May I ask how you specified the correct endpoint? As it's not S3, I don't need a region but an endpoint, something like https://s3.noobaa.svc.cluster.local in our case. But if I set this as region-name, I get the following error when starting
dagster-aws init
which is obviously wrong:
botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "https://ec2.s3.noobaa.svc.cluster.local.amazonaws.com/"
Or didn’t you use the dagster-aws init for that?
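A note for readers: the mangled hostname in that error shows what went wrong. botocore treats region_name as a label to splice into its standard amazonaws.com hostname template, which is how https://s3.noobaa.svc.cluster.local turned into ec2.s3.noobaa.svc.cluster.local.amazonaws.com. A cluster-local server belongs in endpoint_url instead; a sketch with a dummy region:

```python
import boto3

# region_name is only a hostname label to botocore; the actual server
# address goes in endpoint_url. NooBaa/MinIO-style backends ignore the region.
s3 = boto3.client(
    "s3",
    region_name="us-east-1",  # placeholder to satisfy botocore
    endpoint_url="https://s3.noobaa.svc.cluster.local",
)
```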
m
I've just put it in my dagster.yaml as shown here: https://docs.dagster.io/docs/apidocs/libraries/dagster_aws and then passed the credentials through env variables:
environment:
  AWS_ACCESS_KEY_ID: your-id-here
  AWS_SECRET_ACCESS_KEY: your-secret-here
👍 1
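A note for readers: for reference, a dagster.yaml sketch in the shape the linked page describes. The module and class paths are real dagster_aws names, but treat the config keys, endpoint_url especially, as assumptions to verify against the docs for your Dagster version:

```yaml
# dagster.yaml — hedged sketch; verify the config keys against the
# dagster_aws docs for your version.
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: my-dagster-bucket         # hypothetical bucket name
    prefix: dagster-compute-logs
    endpoint_url: http://minio:9000   # assumed key; hypothetical MinIO endpoint
```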