# announcements
d
will file an issue once I get a chance to dig in more, but I think something may be funky with the `s3_resource` in the `dagster_aws` library. I get a `PutObject Access Denied` error when trying to use `context.resources.s3.upload_fileobj(fp, bucket, key)`
but when I swap it out with:
```python
s3 = boto3.resource('s3').meta.client
s3.upload_fileobj(fp, bucket, key)
```
everything works fine
m
anything else on the error?
could be a credentials chain thing
d
I imagine it is but the traceback is super unhelpful
will dig in and try to find more
s
@dwall what is the traceback?
d
```
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/errors.py", line 159, in user_code_error_boundary
    yield
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/engine/engine_inprocess.py", line 680, in _user_event_sequence_for_step_compute_fn
    for event in gen:
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/execution/plan/compute.py", line 87, in _execute_core_compute
    for step_output in _yield_compute_results(compute_context, inputs, compute_fn):
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/execution/plan/compute.py", line 64, in _yield_compute_results
    for event in user_event_sequence:
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster/core/definitions/decorators.py", line 413, in compute
    for item in result:
  File "/Users/dwall/repos/dataland-dagster/pipelines/vc_pipeline.py", line 171, in df_to_s3
    context.resources.s3.upload_fileobj(fp, bucket, key)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/dagster_aws/s3/resources.py", line 23, in upload_fileobj
    return self.session.upload_fileobj(fileobj, bucket, key)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/boto3/s3/inject.py", line 539, in upload_fileobj
    return future.result()
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
    return self._execute_main(kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
    return_value = self._main(**kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/s3transfer/upload.py", line 692, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/dwall/.local/share/virtualenvs/dataland-dagster-Z2VR7MFq/lib/python3.8/site-packages/botocore/client.py", line 661, in _make_api_call
    raise error_class(parsed_response, operation_name)
```
s
yeah that's a boto thing
that would be the same error message whether dagster were in play or not
d
yep. what's weird is that it works fine when swapped out with what seems to be an identical implementation from boto directly
s
our engine adds some noise into the stack trace
d
(see above)
```python
s3 = boto3.resource('s3').meta.client
s3.upload_fileobj(fp, bucket, key)
```
this works fine ^
s
cool. could you use this (untested, just wrote it):
```python
@resource
def boto_client():
    return boto3.resource('s3').meta.client
```
instead of our s3 resource
`'s3': boto_client` should do it
d
totally works
s
cool
could you file an issue about this?
so we can track
d
yep 👍
s
๐Ÿ™๐Ÿป๐Ÿ™๐Ÿป๐Ÿ™๐Ÿป
probably needed a context arg
m
for my curiosity, @dwall, do you have `AWS_DEFAULT_REGION` set in your env?
d
I do not
m
but you are using the `AWS_` env variables to set the secret access key and access key id presumably
d
yes
m
really peculiar
i think i see the issue, we have a really bad default set
probably to make the tutorial smoother or something
i think i'm responsible for this
believe that https://dagster.phacility.com/D2091 will resolve
d
heh - nice
thanks @max!
Lol. Love it
s
our diffs are a pretty consistent source of solid lols