# ask-community

Mykola Palamarchuk

03/29/2022, 2:50 PM
Hi! Please help me make my code cleaner. I have a resource that provides an "s3_bucket" value. Something like:
```python
my_resource = context.resources.my_resource
my_bucket = my_resource["s3_bucket"]
```
Is there a way to pass this value to `s3_pickle_io_manager`?

prha

03/29/2022, 4:28 PM
Hi Nick… I believe the `s3_pickle_io_manager` already has access to the s3 bucket (it specifies the bucket in config, and declares the `s3` required resource key): https://docs.dagster.io/_apidocs/libraries/dagster-aws#dagster_aws.s3.s3_pickle_io_manager Is there some workflow you’re trying to enable where you won’t have access to that?
if you are trying to read the bucket name at resource initialization time, instead of at config time, you might have to define your own IO manager:
```python
from dagster import io_manager
from dagster_aws.s3.io_manager import PickledObjectS3IOManager

@io_manager(required_resource_keys={"s3", "my_s3_bucket_providing_resource"})
def my_custom_s3_io_manager(context):
    # Read the bucket name from the other resource at initialization time
    bucket_name = context.resources.my_s3_bucket_providing_resource["s3_bucket"]
    return PickledObjectS3IOManager(bucket_name, context.resources.s3)
```

Mykola Palamarchuk

03/29/2022, 4:41 PM
@prha, this is how I actually resolved it myself. But I'm not sure about the stability of code outside the public API. Can I safely use `PickledObjectS3IOManager`?

prha

03/29/2022, 4:45 PM
These APIs are pretty stable… we don’t provide hard guarantees against changing these APIs in dot releases, but we’re pretty good about listing any changes as breaking changes. I can see about exporting `PickledObjectS3IOManager` into the public API of `dagster_aws`, though.
@Dagster Bot issue export PickledObjectS3IOManager from dagster_aws module

Dagster Bot

03/29/2022, 4:46 PM