dwall
02/20/2020, 9:05 PM
will file an issue once I get a chance to dig in more, but I think something may be funky with the `s3_resource` in the `dagster_aws` library. I get a `PutObject` Access Denied error when trying to use `context.resources.s3.upload_fileobj(fp, bucket, key)`, but when I swap it out with:
```
s3 = boto3.resource('s3').meta.client
s3.upload_fileobj(fp, bucket, key)
```
everything works fine

max
02/20/2020, 9:26 PM
anything else on the error?
could be a credentials chain thing

dwall
02/20/2020, 9:27 PM
I imagine it is but the traceback is super unhelpful
will dig in and try to find more

schrockn
02/20/2020, 9:51 PM
@dwall what is the traceback?

dwall
02/20/2020, 9:54 PM
```
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/errors.py", line 159, in user_code_error_boundary
    yield
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/engine/engine_inprocess.py", line 680, in _user_event_sequence_for_step_compute_fn
    for event in gen:
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/execution/plan/compute.py", line 87, in _execute_core_compute
    for step_output in _yield_compute_results(compute_context, inputs, compute_fn):
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/execution/plan/compute.py", line 64, in _yield_compute_results
    for event in user_event_sequence:
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/definitions/decorators.py", line 413, in compute
    for item in result:
  File "/Users/dwall/repos/dataland-dagster/pipelines/vc_pipeline.py", line 171, in df_to_s3
    context.resources.s3.upload_fileobj(fp, bucket, key)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster_aws/s3/resources.py", line 23, in upload_fileobj
    return self.session.upload_fileobj(fileobj, bucket, key)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/boto3/s3/inject.py", line 539, in upload_fileobj
    return future.result()
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
    return self._execute_main(kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
    return_value = self._main(**kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/upload.py", line 692, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/botocore/client.py", line 661, in _make_api_call
    raise error_class(parsed_response, operation_name)
```

schrockn
02/20/2020, 9:55 PM
yeah that’s a boto thing
that would be the same error message whether dagster were in play or not

dwall
02/20/2020, 9:55 PM
yep. what's weird is that it works fine when swapped out with what seems to be an identical implementation from boto directly

schrockn
02/20/2020, 9:55 PM
our engine adds some noise into the stack trace

dwall
02/20/2020, 9:55 PM
(see above)
```
s3 = boto3.resource('s3').meta.client
s3.upload_fileobj(fp, bucket, key)
```
this works fine ^

schrockn
02/20/2020, 9:59 PM
cool. could you use this (untested, just wrote it):
```
@resource
def boto_client():
    return boto3.resource('s3').meta.client
```
instead of our s3 resource
`'s3': boto_client` should do it

dwall
02/20/2020, 10:12 PM
totally works

schrockn
02/20/2020, 10:17 PM
cool
could you file an issue about this?
so we can track

dwall
02/20/2020, 10:18 PM
yep 👍

schrockn
02/20/2020, 10:18 PM
🙏🏻🙏🏻🙏🏻
probably needed a context arg

max
02/20/2020, 10:25 PM
for my curiosity, @dwall, do you have `AWS_DEFAULT_REGION` set in your env

dwall
02/20/2020, 10:26 PM
I do not

max
02/20/2020, 10:27 PM
but you are using the `AWS_` env variables to set the secret access key and access key id presumably

dwall
02/20/2020, 10:27 PM
yes

max
02/20/2020, 10:30 PM
really peculiar
i think i see the issue, we have a really bad default set
probably to make the tutorial smoother or something
i think i'm responsible for this
believe that https://dagster.phacility.com/D2091 will resolve

dwall
02/20/2020, 11:26 PM
heh - nice
thanks @max!
Lol. Love it

schrockn
02/20/2020, 11:27 PM
our diffs are a pretty consistent source of solid lols