# ask-community

Alec Ryan

04/14/2022, 9:06 PM
Anyone know how to upload a file to s3 using resources?
from dagster import op
from dagster_aws.s3 import s3_resource


@op(required_resource_keys={"s3"})
def load_json_to_s3(context, json_game_data_list):
    # put_object takes Body/Bucket/Key (capitalized); there is no Prefix argument
    for i, game_json in enumerate(json_game_data_list):
        context.resources.s3.put_object(
            Body=game_json,
            Bucket="nhl-analytics",
            Key=f"test/game_{i}.json",
        )
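(Editor's note: the resource here is just a boto3 S3 client, whose `put_object` takes `Body`/`Bucket`/`Key` and has no `Prefix` parameter. A dependency-free sketch of the call shape, with a stub client standing in for `context.resources.s3` and hypothetical payloads:)

```python
import json


class FakeS3Client:
    """Stub standing in for a boto3 S3 client; only put_object is modeled."""

    def __init__(self):
        self.objects = {}

    def put_object(self, Body, Bucket, Key):
        # Mirrors boto3's keyword names: Body/Bucket/Key, no Prefix argument.
        self.objects[(Bucket, Key)] = Body


s3 = FakeS3Client()
game_data = [{"game_id": 1}, {"game_id": 2}]  # hypothetical payloads
for i, game in enumerate(game_data):
    s3.put_object(
        Body=json.dumps(game),
        Bucket="nhl-analytics",
        Key=f"test/game_{i}.json",  # one Key per object; "test/" acts as the prefix
    )
```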

johann

04/14/2022, 9:16 PM
Could take a look at https://docs.dagster.io/deployment/guides/aws#using-s3-for-io-management. A common way is with
AWS_ACCESS_KEY_ID
and
AWS_SECRET_ACCESS_KEY

Zach

04/14/2022, 9:19 PM
yeah I usually just configure aws creds as the standard boto AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables and the s3 resource works. it's just a boto3 s3 client so you can configure it using a credentials profile too
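(Editor's note: boto3's default credential chain reads these standard environment variables, so nothing credential-related needs to appear in the Dagster config. A minimal sketch, with obviously fake placeholder values:)

```python
import os

# boto3's default credential chain picks these up automatically;
# the values below are fake placeholders, not real credentials.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLEKEYID"
os.environ["AWS_SECRET_ACCESS_KEY"] = "example-secret-key"
```

With these set in the environment where the op runs, `s3_resource` works without any extra configuration.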

Alec Ryan

04/14/2022, 9:19 PM
where do you place the keys?

johann

04/14/2022, 9:20 PM
In the env wherever your Op is running. That would depend on how you have Dagster deployed

Alec Ryan

04/14/2022, 9:21 PM
got it, so I can store them as environment variables

johann

04/14/2022, 9:21 PM
Yep

Alec Ryan

04/14/2022, 9:23 PM
how would I then specify a profile?

Alec Ryan

04/14/2022, 9:24 PM
say I have two aws profiles set in my environment
how would I pass the right one to dagster?
nvm, I'll use profile
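(Editor's note: two common ways to pick a named profile are setting the standard AWS_PROFILE environment variable, or, in recent dagster_aws versions, the `profile_name` field in the resource's config. A run_config sketch with a hypothetical profile name:)

```python
# Hypothetical run_config selecting a named AWS profile for the s3 resource.
# "my-nhl-profile" is a placeholder entry from ~/.aws/credentials.
run_config = {
    "resources": {
        "s3": {
            "config": {
                "profile_name": "my-nhl-profile",
            }
        }
    }
}
```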