
Prratek Ramchandani

02/24/2021, 8:24 PM
Hey! I’m trying to pass a BigQuery table schema into the config for the import_gcs_paths_to_bq solid in dagster_gcp and had a question - I’d like to avoid typing out the entire schema in a YAML file, and it seems my best bet is to use the configured API and write Python to load the schema from a JSON file. Is there any way I can instead specify the path to the file in a YAML file and write a custom type with a corresponding type loader to pass the schema to the solid? My concern is that the solid config expects a schema of type Array(inner_type=dict), but this would then be some other custom type. Also, are custom types even supported for config, or is that only for inputs?

I don’t actually have a strong preference for whether or not I use the configured API. Really what I’d like to be able to do is:
1. Not type out the entire schema
2. Perform some validation to check that the schema is valid - similar to what I’d do with a type_check_fn for a custom type
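(A minimal sketch of the configured route described above: load the schema JSON in Python, run a light validity check, and map a small YAML-friendly config onto the solid's full config. The outer config keys, the helper names, and the returned config shape ("destination", "load_job_config", "schema") are illustrative assumptions; check the import path and the actual config schema of import_gcs_paths_to_bq in your dagster_gcp version and adjust accordingly.)

```python
import json

from dagster import configured
from dagster_gcp import import_gcs_paths_to_bq  # import path may differ by version


def _validate_schema(schema):
    # Light validation standing in for what a type_check_fn would do:
    # every field entry must carry at least a name and a type.
    for field in schema:
        if "name" not in field or "type" not in field:
            raise ValueError(f"Invalid BigQuery schema field: {field}")
    return schema


# Map a small, YAML-friendly config ({"schema_path": ..., "destination": ...})
# onto the solid's full config. The exact shape of the returned dict is an
# assumption -- verify it against the solid's config schema before using.
@configured(import_gcs_paths_to_bq, config_schema={"schema_path": str, "destination": str})
def import_gcs_paths_to_bq_from_schema_file(config):
    with open(config["schema_path"]) as f:
        schema = _validate_schema(json.load(f))
    return {
        "destination": config["destination"],
        "load_job_config": {"schema": schema},
    }
```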

alex

02/24/2021, 8:58 PM
ya, custom config types are not really a feature at the moment
you could try to model this as an input or maybe even a resource
you will likely need to implement your own version of import_gcs_paths_to_bq to do that, but I think that is probably the right way to go once you are doing something more complex like this
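(A rough sketch of the input-based approach alex suggests: a DagsterType whose type_check_fn does the validation, plus a dagster_type_loader so run config can supply just a path to the schema JSON. The schema_path config key, the solid name, and the bigquery resource key are assumptions; the solid body is a stub where a hand-rolled replacement for import_gcs_paths_to_bq would build and run the load job.)

```python
import json
from typing import List

from dagster import DagsterType, InputDefinition, dagster_type_loader, solid


def _is_bq_schema(_context, value):
    # type_check_fn: expect a list of field dicts, each with at least name and type.
    return isinstance(value, list) and all(
        isinstance(f, dict) and "name" in f and "type" in f for f in value
    )


@dagster_type_loader({"schema_path": str})
def _bq_schema_loader(_context, config):
    # Lets run config (YAML) supply only the path to the schema JSON file.
    with open(config["schema_path"]) as f:
        return json.load(f)


BigQueryTableSchema = DagsterType(
    name="BigQueryTableSchema",
    type_check_fn=_is_bq_schema,
    loader=_bq_schema_loader,
)


@solid(
    input_defs=[
        InputDefinition("paths", List[str]),
        InputDefinition("schema", BigQueryTableSchema),
    ],
    required_resource_keys={"bigquery"},  # assumes a BigQuery client resource
)
def load_gcs_paths_to_bq(context, paths, schema):
    # Hand-rolled stand-in for import_gcs_paths_to_bq: the schema arrives here
    # already type-checked, so the body only needs to build and run the load
    # job. Left as a stub because the destination table, job options, and
    # client resource depend on your setup.
    raise NotImplementedError
```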

Prratek Ramchandani

02/25/2021, 8:00 PM
yeah writing my own solid or resource and modeling it as an input makes sense. thanks @alex!