Alexey Kulakov
02/14/2023, 4:25 PM
1. I have a partitioned asset defined as:

```python
from dagster import MonthlyPartitionsDefinition, asset

parts = MonthlyPartitionsDefinition(start_date='2022-01-01')

@asset(
    required_resource_keys={"spark"},
    io_manager_key="io_spark_iceberg_parts",
    partitions_def=parts,
)
def parts_proc(context, parts_source):
    parts_proc = some_func(parts_source)
    return parts_proc
```
2. I have an asset job defined as:

```python
from dagster import define_asset_job

assets_for_job = ['parts_proc']
asset_job = define_asset_job(
    name='asset_job',
    selection=assets_for_job,
)
```
3. I have some code to start the job from an external Jupyter notebook:

```python
from dagster_graphql import DagsterGraphQLClient

client = DagsterGraphQLClient("dagster", port_number=3070)
client.submit_job_execution(
    job_name="asset_job",
    repository_name="my_repo",
)
```

How can I set the list of asset partitions that I need to materialize in `submit_job_execution` at step 3?

owen
02/14/2023, 6:53 PM
You can pass the partition key as a run tag:

```python
client.submit_job_execution(
    job_name="asset_job",
    repository_name="my_repo",
    tags={"dagster/partition": "2020-01-01"},
)
```
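Owen's answer targets a single partition per submitted run. For a *list* of partitions, a minimal sketch is to loop over the partition keys and submit one run per key; the `submit_partitions` helper and its default argument values are my own illustration, not part of this thread:

```python
# Sketch: materialize several partitions by submitting one run per key,
# each tagged with "dagster/partition" as in owen's answer above.
# The client is any object exposing submit_job_execution (e.g. a
# dagster_graphql.DagsterGraphQLClient connected to the webserver).

def submit_partitions(client, partition_keys,
                      job_name="asset_job", repository_name="my_repo"):
    """Submit job_name once per partition key; return the run ids."""
    run_ids = []
    for key in partition_keys:
        run_id = client.submit_job_execution(
            job_name,
            repository_name=repository_name,
            tags={"dagster/partition": key},  # one partition per run
        )
        run_ids.append(run_id)
    return run_ids

# Usage (not run here; assumes a webserver at dagster:3070 as above):
# from dagster_graphql import DagsterGraphQLClient
# client = DagsterGraphQLClient("dagster", port_number=3070)
# submit_partitions(client, ["2022-01-01", "2022-02-01", "2022-03-01"])
```

Each call returns the run id of the submitted run, so the helper collects them for later status checks.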
Alexey Kulakov
02/15/2023, 8:02 AM

Julius
04/26/2023, 10:53 AM