Neil
04/13/2023, 7:03 PM
Detected conflicting node definitions with the same name when generating two ops using the Databricks op factories. Could you tell me what I'm doing wrong? Code in thread:

from dagster import graph_asset
from dagster_databricks import create_databricks_run_now_op

databricks_job_id = 12
# Two run-now ops for the same Databricks job, differing only in spark_submit_params
my_databricks_run_now_op = create_databricks_run_now_op(
    databricks_job_id=databricks_job_id,
    databricks_job_configuration={
        "spark_submit_params": '1'
    },
)
my_databricks_run_now_op2 = create_databricks_run_now_op(
    databricks_job_id=databricks_job_id,
    databricks_job_configuration={
        "spark_submit_params": '2'
    },
)


# Each graph-backed asset wraps one of the ops
@graph_asset
def databricks_asset():
    return my_databricks_run_now_op()


@graph_asset
def databricks_asset_2():
    return my_databricks_run_now_op2()
claire
04/13/2023, 9:37 PM
This is coming from the create_databricks_run_now_op function, which always returns an op named _databricks_run_now_op; that's why you're getting the conflicting definitions error. I think we need to make the function accept a name param that can be passed into the op definition to make the ops unique.
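Roughly what that could look like from the calling side, as a sketch against a hypothetical name parameter (it doesn't exist on the factory yet, so this won't run as-is):

# Sketch only: `name` is the proposed keyword argument, not a current one.
my_databricks_run_now_op = create_databricks_run_now_op(
    databricks_job_id=databricks_job_id,
    databricks_job_configuration={"spark_submit_params": '1'},
    name="databricks_run_now_op_1",  # unique per factory call, so the definitions no longer collide
)
my_databricks_run_now_op2 = create_databricks_run_now_op(
    databricks_job_id=databricks_job_id,
    databricks_job_configuration={"spark_submit_params": '2'},
    name="databricks_run_now_op_2",
)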
Neil
04/13/2023, 11:22 PM