Pablo Beltran
11/07/2022, 10:30 PM
Selene Hines
11/08/2022, 12:24 AM
saravan kumar
11/08/2022, 2:25 AM
Ray Hilton
11/08/2022, 4:09 AM
Joshua Smart-Olufemi
11/08/2022, 12:05 PM
In dagster.yaml, in the Compute Log Storage and Local Artifact Storage sections, should base_dir use a / or a \ as the path separator? My computer automatically uses the backslash when copying paths, but the Dagster docs show a forward slash for base_dir.
Robert Lawson
11/08/2022, 12:44 PM
Samuel Stütz
11/08/2022, 1:10 PM
An exception was thrown during execution that is likely a framework error, rather than an error in user code.
dagster._check.CheckError: Invariant failed. Description: all_assets_materialize has no op named account10__loadtest1__asset1.
We ran a load test of dagit (dagster v 1.0.16) and generated all these generic assets in a factory. Locally this execution works, but it fails when deployed on k8s.
These are the asset ops:
@asset(key_prefix=context.asset_key_prefix(), name=asset_name, ins={"upstream": AssetIn(key=upstream_key)}, group_name=asset_group_name, partitions_def=daily_partitions_definition)
def anAsset(upstream):
    time.sleep(asset_materialization_sleep_sec)
    return 42

assets += [anAsset]
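For context, a self-contained sketch of the factory pattern this snippet appears to come from; the account and load-test names, the upstream source asset, and the partition start date are all made up for illustration:

import time

from dagster import AssetIn, AssetKey, DailyPartitionsDefinition, SourceAsset, asset

daily_partitions_definition = DailyPartitionsDefinition(start_date="2022-01-01")

# Hypothetical upstream source asset that every generated asset reads from.
upstream_source = SourceAsset(key=AssetKey(["account10", "loadtest1", "source"]))

def build_load_test_assets():
    generated = []
    for account in ["account10"]:    # made-up account list
        for test in ["loadtest1"]:   # made-up load-test list
            for i in range(1, 3):
                asset_name = f"asset{i}"

                @asset(
                    key_prefix=[account, test],
                    name=asset_name,
                    ins={"upstream": AssetIn(key=upstream_source.key)},
                    group_name=f"{account}_{test}",
                    partitions_def=daily_partitions_definition,
                )
                def an_asset(upstream):
                    time.sleep(1)  # simulate work
                    return 42

                generated.append(an_asset)
    return generated

Since the invariant error says the job has no op named account10__loadtest1__asset1, one thing worth checking is whether the generated names and prefixes are deterministic between the local code and the image deployed on k8s.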
Abhishek Sawant
11/08/2022, 2:34 PM
<path>/lib/python3.9/site-packages/dagster/_core/workspace/context.py:538: UserWarning: Error loading repository location dagster_redshift.repository:AssertionError
....
dagster_redshift/op/copy.py", line 76, in <module>
assert example_redshift_op(context) == [(1,)]
The query runs, but it gives the above error. Below is the sample code (the same as in the example):
from dagster import build_op_context, op
from dagster_aws.redshift import redshift_resource

@op(required_resource_keys={'redshift'})
def example_redshift_op(context):
    return context.resources.redshift.execute_query('SELECT 1', fetch_results=True)

redshift_configured = redshift_resource.configured({
    'host': 'my-redshift-cluster.us-east-1.redshift.amazonaws.com',
    'port': 5439,
    'user': '<usr_id>',
    'password': '<pwd>',
    'database': 'dev',
})

# Invoke the op directly with a context that supplies the configured resource.
context = build_op_context(resources={'redshift': redshift_configured})
assert example_redshift_op(context) == [(1,)]
Zachary Bluhm
11/08/2022, 3:08 PM
Jesper Bagge
11/08/2022, 3:37 PM
Zachary Bluhm
11/08/2022, 3:50 PM
Taylor
11/08/2022, 7:25 PM
nickvazz
11/08/2022, 7:49 PM
dagit -f ./dagstermill_output_notebook_io_manager_bug.py
Mike Atlas
11/08/2022, 9:27 PM
nickvazz
11/09/2022, 1:23 AM
Why would dagster._check.CheckError: Invariant failed. Description: Can't supply a ConfigMapping for 'config' when 'partitions_def' is supplied.
happen with an asset job containing a dagstermill notebook asset?
https://github.com/dagster-io/dagster/issues/10041
Son Giang
11/09/2022, 8:51 AM
What is the difference between celery_executor and celery_k8s_job_executor?
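For reference, a minimal sketch of how either executor is attached to a job; the op and job names are made up, and in a real deployment each executor also needs the matching run launcher and instance configuration:

from dagster import job, op
from dagster_celery import celery_executor
from dagster_celery_k8s import celery_k8s_job_executor

@op
def say_hello():
    return "hello"

# Steps are queued to Celery and executed inside the Celery worker processes.
@job(executor_def=celery_executor)
def celery_job():
    say_hello()

# Steps are still queued through Celery, but each step is launched as its own Kubernetes Job.
@job(executor_def=celery_k8s_job_executor)
def celery_k8s_job():
    say_hello()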
Marco Jacopo Ferrarotti
11/09/2022, 11:15 AM
Harry James
11/09/2022, 11:29 AM
from dagster import SourceAsset, asset, materialize_to_memory

@asset
def asset_1():
    return 5

@asset
def asset_2(asset_1):
    return asset_1 + 2

result = materialize_to_memory(assets=[asset_2, SourceAsset(key="asset_1", value=5)])
print(result.output_for_node("asset_2"))
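A related pattern, sketched here only for comparison: software-defined assets can also be invoked directly in tests, so the upstream value can be passed as a plain argument (assuming the asset bodies above):

# Direct invocation: call the downstream asset function with a stubbed upstream value.
assert asset_2(asset_1=5) == 7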
Deepa Vasant
11/09/2022, 1:54 PM
Jose Estudillo
11/09/2022, 2:05 PM
The defined dependency range is grpcio<1.48.1,>=1.32.0. The thing is, dagit hangs as soon as I start it; if I update to grpcio==1.50.0 it works normally, but that version is outside the defined range. Any thoughts? Thanks in advance.
Zachary Bluhm
11/09/2022, 2:43 PM
My flow looks like: some_conditional_op > dbt_graph. Do I need to just set some_conditional_op as an explicit upstream in each AssetsDefinition I create as part of the dbt load-assets function?
geoHeil
11/09/2022, 2:48 PM
>>> A context.advance_cursor call marks specific materializations as consumed. In future context calls, these materializations will not be returned. <<<
What does this mean if not all assets are partitioned in the same way and, for example, a daily asset and an hourly asset are to be merged? For the first hour of the day it should work fine, but once the daily asset is consumed (at least once) it is no longer available (if I understand this correctly). How would such a join of assets with non-equal partitions work, where one asset needs to be joined for a whole period of time?
The multi_asset_sensor lists:
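For orientation, a minimal multi_asset_sensor sketch showing where advance_cursor sits in the flow; the asset keys, op, and job are made up, and parameter names may differ slightly across Dagster versions:

from dagster import AssetKey, RunRequest, job, multi_asset_sensor, op

@op
def merge_assets():
    pass  # placeholder for the join logic

@job
def join_job():
    merge_assets()

@multi_asset_sensor(
    monitored_assets=[AssetKey("daily_asset"), AssetKey("hourly_asset")],  # made-up keys
    job=join_job,
)
def join_sensor(context):
    # One latest unconsumed materialization record per monitored key (or None).
    records = context.latest_materialization_records_by_key()
    if all(records.values()):
        # This call is what marks the returned materializations as consumed.
        context.advance_cursor(records)
        yield RunRequest(run_key=None)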
Mike Atlas
11/09/2022, 3:43 PM
op finished without success or failure event. Downstream steps will not execute.
Dependencies for step next_op were not executed: [...]. Not executing.
Process for run exited (pid: 1).
The logs show an error like:
Exception while writing logger call to event log: (mysql.connector.errors.ProgrammingError) 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'PIPELINE_FAILURE', '2022-11-09 15:42:15.864446', NULL, NULL, NULL)]\\n[parameter' at line 1
I'm running Dagster daemon 1.0.6 - is there a bug fix or something for this in a newer version?
Charles Lariviere
11/09/2022, 3:59 PM
We could use the handle_output function to log each as separate metadata, but we would ideally prefer to record it as a single value.
I see that Dagit is able to surface this using the STEP_INPUT and STEP_SUCCESS/STEP_FAILURE events, but I wonder whether that's available within the context of the asset definition?
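One possible angle, sketched here under the assumption that the value in question can be computed inside the asset body itself: the asset can time its own work and attach it as a single metadata value with context.add_output_metadata (the computation below is a made-up placeholder):

import time

from dagster import asset

@asset
def timed_asset(context):
    start = time.monotonic()
    value = sum(range(1_000_000))  # placeholder for the real computation
    # Record the duration as one metadata entry on this materialization.
    context.add_output_metadata({"duration_seconds": time.monotonic() - start})
    return value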
Szymon Zaborowski
11/09/2022, 5:17 PM
Is it possible to have {asset1: {key: our_model, ops: [train_model, deploy_model(optional)]}} instead of {asset1: {key: our_model, ops: train_model}, asset2: {key: deployed_status, ops: deploy_model}}, where asset1 is upstream of asset2?
Thanks for the reply and have a nice day 🙂
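If the goal is a single asset produced by several ops, a graph-backed asset is one way to express it; a minimal sketch with made-up op bodies (the deploy step is unconditional here for simplicity):

from dagster import AssetsDefinition, graph, op

@op
def train_model():
    return {"weights": [1, 2, 3]}  # placeholder training result

@op
def deploy_model(model):
    return model  # placeholder deploy step; the graph's final output becomes the asset value

@graph
def our_model():
    return deploy_model(train_model())

# A single asset, keyed "our_model", computed by multiple ops.
our_model_asset = AssetsDefinition.from_graph(our_model)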
Chris Dong
11/09/2022, 5:38 PM
Joshua Smart-Olufemi
11/09/2022, 8:30 PM
I tried running the "Define an asset" tutorial after setting up Dagster in an environment in PowerShell, and got this error when I ran dagit -f cereal.py. Stack trace:
File "C:\Users\josh\Desktop\toggle assignment\assignment\venv\lib\site-packages\dagster\_grpc\server.py", line 230, in __init__
    self._loaded_repositories: Optional[LoadedRepositories] = LoadedRepositories(
File "C:\Users\josh\Desktop\toggle assignment\assignment\venv\lib\site-packages\dagster\_grpc\server.py", line 104, in __init__
    loadable_targets = get_loadable_targets(
File "C:\Users\josh\Desktop\toggle assignment\assignment\venv\lib\site-packages\dagster\_grpc\utils.py", line 33, in get_loadable_targets
    else loadable_targets_from_python_file(python_file, working_directory)
File "C:\Users\josh\Desktop\toggle assignment\assignment\venv\lib\site-packages\dagster\_core\workspace\autodiscovery.py", line 27, in loadable_targets_from_python_file
    loaded_module = load_python_file(python_file, working_directory)
File "C:\Users\josh\Desktop\toggle assignment\assignment\venv\lib\site-packages\dagster\_core\code_pointer.py", line 75, in load_python_file
    os.stat(python_file)
warnings.warn(
2022-11-09 21:23:40 +0100 - dagit - INFO - Serving dagit on http://127.0.0.1:3000 in process 15224
Zach P
11/09/2022, 9:04 PM
Aaron Hoffer
11/09/2022, 9:22 PM
Does max_resume_run_attempts retry jobs that fail to start when run_monitoring is enabled? I noticed I'll occasionally get that if a Kubernetes pod isn't created within the specified timeout, but the run doesn't get retried, it just fails.
Zach
11/10/2022, 2:06 AM