Noah K
11/10/2020, 9:58 PM

Noah K
11/11/2020, 9:07 AM

Manas Jain
11/11/2020, 9:56 AM

sashank
11/11/2020, 3:48 PM

Brian Abelson
11/11/2020, 4:17 PM

Brian Abelson
11/11/2020, 4:21 PM

schrockn
11/11/2020, 8:30 PM

schrockn
11/11/2020, 8:31 PM

Basil V
11/11/2020, 11:39 PM
Is there a way to use the snowflake_resource in a SQLAlchemy connection? Or would you be open to a PR to make the snowflake_resource use SQLAlchemy? The reason is that some pandas functions, such as .to_sql, require a SQLAlchemy connection, so as far as I can tell (and via my own testing) the pandas SQL utilities won't be available for Snowflake using the Dagster snowflake_resource directly as it is now. Any thoughts?

Jessica Stewart
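On the SQLAlchemy question above: one way to hand pandas' .to_sql a usable engine is to build a Snowflake SQLAlchemy URL directly, outside the snowflake_resource. This is only a sketch: build_snowflake_url and every credential value here are hypothetical, and the snowflake-sqlalchemy package must be installed before create_engine can resolve the snowflake:// dialect.

```python
# Sketch: build a Snowflake SQLAlchemy URL by hand so that a pandas
# DataFrame can be written with .to_sql. All values are placeholders.
def build_snowflake_url(user, password, account, database, schema):
    # snowflake-sqlalchemy registers the "snowflake://" dialect; the
    # URL shape below follows its documented format.
    return "snowflake://{}:{}@{}/{}/{}".format(
        user, password, account, database, schema
    )

url = build_snowflake_url("me", "secret", "my_account", "analytics", "public")

# With snowflake-sqlalchemy installed, the remaining steps would be:
#   import sqlalchemy
#   engine = sqlalchemy.create_engine(url)
#   df.to_sql("leads", engine, if_exists="append", index=False)
```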
11/12/2020, 12:44 AM
Traceback (most recent call last):
File "hello_dagster.py", line 1, in <module>
from dagster import execute_pipeline, pipeline, solid
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/__init__.py", line 5, in <module>
from dagster.core.definitions import (
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/__init__.py", line 2, in <module>
from .decorators import (
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/decorators/__init__.py", line 3, in <module>
from .job import job
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/decorators/job.py", line 2, in <module>
from dagster.core.definitions.job import JobDefinition
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/job.py", line 4, in <module>
from dagster.core.instance import DagsterInstance
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/instance/__init__.py", line 12, in <module>
from dagster.core.definitions.pipeline import PipelineDefinition, PipelineSubsetDefinition
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/pipeline.py", line 24, in <module>
from .mode import ModeDefinition
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/mode.py", line 5, in <module>
from dagster.core.definitions.executor import ExecutorDefinition, default_executors
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/core/definitions/executor.py", line 138, in <module>
"marker_to_close": Field(str, is_required=False),
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/config/field.py", line 237, in __init__
self.config_type = check.inst(self._resolve_config_arg(config), ConfigType)
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/config/field.py", line 221, in _resolve_config_arg
config_type = resolve_to_config_type(config)
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/config/field.py", line 81, in resolve_to_config_type
if is_typing_type(dagster_type):
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/utils/typing_api.py", line 152, in is_typing_type
or is_closed_python_optional_type(ttype)
File "/Users/jessicas/.pyenv/versions/dagster-env/lib/python3.6/site-packages/dagster/utils/typing_api.py", line 16, in is_closed_python_optional_type
return origin == typing.Union and len(ttype.__args__) == 2 and ttype.__args__[1] == type(None)
File "/Users/jessicas/.pyenv/versions/3.6.0/lib/python3.6/typing.py", line 760, in __eq__
return self._subs_tree() == other
File "/Users/jessicas/.pyenv/versions/3.6.0/lib/python3.6/typing.py", line 760, in __eq__
return self._subs_tree() == other
File "/Users/jessicas/.pyenv/versions/3.6.0/lib/python3.6/typing.py", line 760, in __eq__
return self._subs_tree() == other
[Previous line repeated 227 more times]
File "/Users/jessicas/.pyenv/versions/3.6.0/lib/python3.6/typing.py", line 759, in __eq__
if not isinstance(other, _Union):
RecursionError: maximum recursion depth exceeded in __instancecheck__
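The recursion inside typing.py above appears characteristic of Python 3.6.0 specifically: the interpreter path in the traceback is .pyenv/versions/3.6.0, and the provisional typing module in 3.6.0 had Union-comparison bugs that later 3.6.x releases fixed. Upgrading the pyenv interpreter is the likely fix; the guard below is an illustrative sketch, not Dagster code.

```python
import sys

# Python 3.6.0's provisional typing module can recurse without bound
# when comparing Union types, which Dagster's config system exercises.
# Later 3.6.x releases fixed this, so refuse to run on 3.6.0.
assert sys.version_info >= (3, 6, 1), (
    "Python 3.6.0 has known typing-module bugs; upgrade the interpreter"
)
```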
Jeff Tilton
11/12/2020, 4:45 AM
2020-11-11 20:29:01 - dagster - DEBUG - inputs_pipeline - 7480db29-cf8b-4e18-b69c-6695a58b6867 <- uuid
Any advice on how to apply serverless to dagster?

Xu Zhang
11/12/2020, 2:22 PM

Leor
11/12/2020, 9:51 PM

Leor
11/12/2020, 9:52 PM

user
11/12/2020, 11:30 PM

szelee
11/13/2020, 12:41 PM
load_from:
  - python_file:
      relative_path: src/common/repositories.py
  - python_file:
      relative_path: src/project_alpha/repositories.py
  - python_file:
      relative_path: src/project_beta/repositories.py
With 0.9.19, I am getting this error:
dagster.check.CheckError: Invariant failed. Description: Cannot have multiple locations with the same name, got multiple "repositories.py"
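One way to avoid the duplicate-location error above is to give each entry an explicit name, so Dagster no longer derives the location name from the (identical) file basename. A sketch, assuming the workspace.yaml schema accepts a location_name key on python_file entries; the names themselves are illustrative:

```yaml
load_from:
  - python_file:
      relative_path: src/common/repositories.py
      location_name: common
  - python_file:
      relative_path: src/project_alpha/repositories.py
      location_name: project_alpha
  - python_file:
      relative_path: src/project_beta/repositories.py
      location_name: project_beta
```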
Are we supposed to have a unique name for each repository now?

Mose
11/13/2020, 2:51 PM
clean_leads solids?

Nate Loker
11/13/2020, 7:11 PM

Xu Zhang
11/13/2020, 8:00 PM

Xu Zhang
11/13/2020, 8:15 PM

Xu Zhang
11/13/2020, 9:53 PM

Ted Conbeer
11/14/2020, 8:42 PM

Slackbot
11/14/2020, 11:25 PM

Xu Zhang
11/15/2020, 6:08 AM
I used execute_pipeline to run a pipeline that exists only in memory; since I passed in a Dagster instance, the run details were picked up by the Dagit UI. However, it only shows the pipeline tree structure, and if I click a solid, it doesn't show anything. Is this expected?

Xu Zhang
11/15/2020, 6:34 AM
Is there something like an InMemoryWorkspace that requires no physical Python files? Or is there a way to make the snapshot contain more metadata about the solids?

Simon Späti
11/15/2020, 9:41 AM
"Dagster's mechanism for conditional execution is non-required outputs. For any solid output definition, we can set the is_required argument to False. If any of the inputs to a solid come from non-required outputs, and any of those non-required outputs are not yielded by the upstream solid, then the solid won't run."
This doesn't account for composite_solids, correct? Because I'm struggling to achieve the same with composites. How would you do it there?

Xu Zhang
11/15/2020, 5:45 PM

Xu Zhang
11/16/2020, 4:19 AM
dagster.serdes.ipc.DagsterIPCProtocolError: Timeout: read stream has not received any data in 15 seconds
This could possibly be caused by the deployment process taking longer than the defined timeout. The error was raised from wait_for_grpc_server; however, I explicitly added the config below to dagster.yaml, so why did it still try to start a gRPC server?
opt_in:
  local_servers: false
stacktrace:
/export/apps/python/3.7/bin/python3.7: Error while finding module specification for 'dagster.grpc' (ModuleNotFoundError: No module named 'dagster')
Loading repository...
KafkaClient cannot be created for Log Handler in DEV fabric. No events will be sent!
/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/nbformat/notebooknode.py:4: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import Mapping
/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/jsonschema/compat.py:6: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import MutableMapping, Sequence # noqa
/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/jsonschema/compat.py:6: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import MutableMapping, Sequence # noqa
/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/graphene/relay/connection.py:2: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import Iterable, OrderedDict
/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/cli/workspace/workspace.py:50: UserWarning: Error loading repository location dagsterweb:(DagsterIPCProtocolError) - dagster.serdes.ipc.DagsterIPCProtocolError: Timeout: read stream has not received any data in 15 seconds
Stack Trace:
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/cli/workspace/workspace.py", line 43, in _load_handle
handle = RepositoryLocationHandle.create_from_repository_location_origin(origin)
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/core/host_representation/handle.py", line 47, in create_from_repository_location_origin
return ManagedGrpcPythonEnvRepositoryLocationHandle(repo_location_origin)
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/core/host_representation/handle.py", line 172, in __init__
lazy_load_user_code=True,
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/grpc/server.py", line 1095, in __init__
lazy_load_user_code=lazy_load_user_code,
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/grpc/server.py", line 1008, in open_server_process
wait_for_grpc_server(output_file)
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/grpc/server.py", line 955, in wait_for_grpc_server
event = read_unary_response(ipc_output_file, timeout=timeout)
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/serdes/ipc.py", line 39, in read_unary_response
messages = list(ipc_read_event_stream(output_file, timeout=timeout))
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/serdes/ipc.py", line 142, in ipc_read_event_stream
timeout=timeout
location_name=location_name, error_string=error_info.to_string()
Exception ignored in: <function ManagedGrpcPythonEnvRepositoryLocationHandle.__del__ at 0x108704950>
Traceback (most recent call last):
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/core/host_representation/handle.py", line 245, in __del__
self.is_cleaned_up,
File "/export/content/lid/apps/dagster-web/i001/libexec/dagster-web_e1306745056a4ad8d3f2195a879de5b31aa2840bd78ffce543abb9b3e57d908a/site-packages/dagster/core/host_representation/handle.py", line 241, in is_cleaned_up
return not self.client
AttributeError: 'ManagedGrpcPythonEnvRepositoryLocationHandle' object has no attribute 'client'
Xu Zhang
11/16/2020, 4:25 AM
load_from:
  - python_package: dagsterweb
dagsterweb is the package of the Flask app I wrote.

Xu Zhang
11/16/2020, 5:51 AM
It seems local_servers is not being used anywhere, so no matter what, Dagit will create a ManagedGrpcPythonEnvRepositoryLocationOrigin, unless it is different from GrpcServerRepositoryLocationOrigin.
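For context on the distinction raised above: the GrpcServerRepositoryLocationOrigin path corresponds to pointing the workspace at a gRPC server you run and manage yourself, rather than letting Dagit spawn a managed one. A sketch of that workspace.yaml shape; the host, port, and location name are placeholders:

```yaml
load_from:
  - grpc_server:
      host: localhost
      port: 4266
      location_name: dagsterweb
```

With an entry like this, Dagit connects to the already-running server instead of launching its own subprocess, which also sidesteps the startup timeout in wait_for_grpc_server.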