Justin Albinet
05/11/2023, 4:54 PM
For `@op` / `@asset` resources, I declare them like this in my `__init__.py` file:
defs = Definitions(
    assets=load_assets_from_modules([assets]),
    resources={
        "bigquery": bigquery_resource.configured(
            {
                "project": "NAME_PROJECT",
                "location": "EU",  # EU, US, etc.
                "gcp_credentials": {"env": "GCP_CREDS"},
            }
        )
    },
)
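[Editor's note: the `{"env": "GCP_CREDS"}` value tells Dagster to pull the credentials from an environment variable at runtime rather than hard-coding them. A minimal pure-Python sketch of that substitution, illustrative only and not Dagster's actual implementation:]

```python
import os

def resolve_config(value):
    # Mimics Dagster's {"env": VAR} convention: a dict of the exact form
    # {"env": "SOME_VAR"} is replaced by the value of that environment
    # variable; any other value passes through unchanged.
    if isinstance(value, dict) and set(value) == {"env"}:
        var = value["env"]
        if var not in os.environ:
            raise KeyError(f"environment variable {var!r} is not set")
        return os.environ[var]
    return value

# GCP_CREDS must be exported before the code location loads:
os.environ["GCP_CREDS"] = "base64-encoded-service-account-json"
print(resolve_config({"env": "GCP_CREDS"}))  # the exported value
print(resolve_config("EU"))                  # plain values pass through: EU
```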
For the op, I declared a job calling this `@op`, and I get the error: resource with key 'bigquery' required by op 'getBigQueryBatch' was not provided. Please provide a <class 'dagster._core.definitions.resource_definition.ResourceDefinition'> to key 'bigquery'
The `@op` is declared like this: @op(required_resource_keys={"bigquery"}, out=DynamicOut())
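[Editor's note: for anyone hitting the same error, it means that at the moment the job is resolved, no resource named `bigquery` is attached to it. A pure-Python sketch of the check behind the message, illustrative only and not Dagster source:]

```python
def missing_resources(required_keys, provided_resources):
    """Keys an op requires that the surrounding definitions do not supply."""
    return sorted(set(required_keys) - set(provided_resources))

# The op declares required_resource_keys={"bigquery"}.
required = {"bigquery"}

# If the job is resolved without the resource attached, the key is missing...
assert missing_resources(required, {}) == ["bigquery"]

# ...but with a "bigquery" entry bound, the check passes.
assert missing_resources(required, {"bigquery": object()}) == []
```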
Any idea? What am I missing here? 🤔
Thx!

sean
05/11/2023, 6:51 PM
Can you share the `Definitions` that shows this behavior? Your posted `Definitions` snippet does not include any jobs.

Justin Albinet
05/12/2023, 8:22 AM
from dagster import Definitions, load_assets_from_modules, define_asset_job, AssetSelection, ScheduleDefinition
from dagster_gcp import bigquery_resource
from .assets import BatchToBigQuery
batchAPI_schedule = ScheduleDefinition(
    job=BatchToBigQuery,
    cron_schedule="*/30 * * * *",
)

defs = Definitions(
    resources={
        "bigquery": bigquery_resource.configured(
            {
                "project": "calcium-field-296116",
                "location": "EU",  # EU, US, etc.
                "gcp_credentials": {"env": "GCP_CREDS"},
            }
        )
    },
    schedules=[batchAPI_schedule],
)
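[Editor's note: the cron string `*/30 * * * *` above fires at minutes 0 and 30 of every hour. A tiny sketch of how a cron minute field like `*/30` is matched, illustrative only and not Dagster's scheduler:]

```python
def minute_matches(field, minute):
    # Supports "*", "*/n" step syntax, and plain numbers -- enough for "*/30".
    if field == "*":
        return True
    if field.startswith("*/"):
        return minute % int(field[2:]) == 0
    return minute == int(field)

# "*/30" matches minutes 0 and 30 only.
matching = [m for m in range(60) if minute_matches("*/30", m)]
print(matching)  # [0, 30]
```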
And I've put the job BatchToBigQuery in my assets.py (yeah, I'm gonna rename the file since it's not only assets now):
@job
def BatchToBigQuery():
    df = getBigQueryBatch()
    pushData = df.map(apiBatch)
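[Editor's note: for readers unfamiliar with `DynamicOut`, `getBigQueryBatch` emits a dynamic set of outputs and `.map(apiBatch)` runs `apiBatch` once per output. A plain-Python analogy of the fan-out; the op names come from the thread, the data here is made up:]

```python
def get_big_query_batch():
    # Stand-in for the upstream op: yield one chunk per dynamic output.
    yield [1, 2]
    yield [3, 4, 5]

def api_batch(chunk):
    # Stand-in for the downstream op: process a single chunk.
    return len(chunk)

# .map(api_batch) in the @job corresponds to running api_batch once
# per dynamic output:
results = [api_batch(chunk) for chunk in get_big_query_batch()]
print(results)  # [2, 3]
```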
Maybe I need to link the `@op` to the resources, @sean? Must be my issue I think