Nicolas Gaillard (10/20/2021, 3:51 PM)
I'm using DbtRpcClient, and everything looks fine when I run or test models, but when I run the dbt ls command I can't get the result on the Dagster side, and there is no error on the RPC server:
{'error': {'code': -32000, 'message': 'Server error', 'data': {'type': 'InternalException', 'args': ['No matching handler found for rpc method None (which=list)'], 'message': 'No matching handler found for rpc method None (which=list)'}}, 'id': '38ac7b10-31bb-11ec-a63a-0242ac120004', 'jsonrpc': '2.0'}
I don't understand; the command should be supported by dbt, since if I pass an invalid command I get the following message:
{'error': {'code': -32602, 'message': 'Invalid params', 'data': {'type': 'TypeError', 'args': ["dbt: error: invalid choice: 'foo' (choose from 'docs', 'source', 'init', 'clean', 'debug', 'deps', 'list', 'ls', 'build', 'snapshot', 'rpc', 'run', 'compile', 'parse', 'test', 'seed', 'run-operation')\n"], 'message': "dbt: error: invalid choice: 'foo' (choose from 'docs', 'source', 'init', 'clean', 'debug', 'deps', 'list', 'ls', 'build', 'snapshot', 'rpc', 'run', 'compile', 'parse', 'test', 'seed', 'run-operation')\n"}}, 'id': '79795a54-31bc-11ec-9a83-0242ac120004', 'jsonrpc': '2.0'}
I use this block of code to run dbt ls:
dbt_rpc_client = DbtRpcClient()
ls_command = "ls"
out = dbt_rpc_client.cli(command=ls_command, task_tags={})
I tried upgrading dbt and nothing changed. All other commands work.
Has anyone had this problem? Thank you in advance and have a nice day!
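One stopgap, sketched here only (dbt_ls_via_cli is a made-up helper, and it assumes the dbt CLI and project are available on the machine running Dagster), is to bypass the RPC server and shell out to dbt ls directly:

import subprocess

def dbt_ls_via_cli(project_dir: str, profiles_dir: str) -> list:
    """Run `dbt ls` as a plain CLI call and return the listed resource names."""
    result = subprocess.run(
        ["dbt", "ls", "--project-dir", project_dir, "--profiles-dir", profiles_dir],
        capture_output=True,
        text=True,
        check=True,  # raise if dbt exits non-zero
    )
    return [line for line in result.stdout.splitlines() if line.strip()]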
David Loewenstern (10/20/2021, 4:07 PM)
Alexis M (10/20/2021, 7:50 PM)
dagster.check.CheckError: Invariant failed. Description: Parent pipeline snapshot id out of sync with passed parent pipeline snapshot
File "/usr/local/lib/python3.7/site-packages/dagster_graphql/implementation/utils.py", line 29, in _fn
return fn(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/dagster_graphql/implementation/execution/launch_execution.py", line 16, in launch_pipeline_execution
return _launch_pipeline_execution(graphene_info, execution_params)
File "/usr/local/lib/python3.7/site-packages/dagster_graphql/implementation/execution/launch_execution.py", line 50, in _launch_pipeline_execution
run = do_launch(graphene_info, execution_params, is_reexecuted)
File "/usr/local/lib/python3.7/site-packages/dagster_graphql/implementation/execution/launch_execution.py", line 34, in do_launch
pipeline_run = create_valid_pipeline_run(graphene_info, external_pipeline, execution_params)
File "/usr/local/lib/python3.7/site-packages/dagster_graphql/implementation/execution/run_lifecycle.py", line 89, in create_valid_pipeline_run
pipeline_code_origin=external_pipeline.get_python_origin(),
File "/usr/local/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 917, in create_run
pipeline_code_origin=pipeline_code_origin,
File "/usr/local/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 772, in _construct_run_with_snapshots
if pipeline_snapshot
File "/usr/local/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 815, in _ensure_persisted_pipeline_snapshot
"Parent pipeline snapshot id out of sync with passed parent pipeline snapshot",
File "/usr/local/lib/python3.7/site-packages/dagster/check/__init__.py", line 167, in invariant
raise CheckError(f"Invariant failed. Description: {desc}")
Egor - (10/20/2021, 10:56 PM)
Remi Gabillet (10/21/2021, 9:01 AM)
raaid (10/21/2021, 9:46 AM)
Levan (10/21/2021, 9:52 AM)
Error 1: Received unexpected config entry "container_kwargs" at path root:run_launcher. Expected: "{ class: String config?: { } module: String }".
sourabh upadhye (10/21/2021, 10:08 AM)
raaid (10/21/2021, 10:36 AM)
Andrea Giardini (10/21/2021, 1:05 PM)
One solid of my pipeline should just run a container and run a command in it, so I wrote something like:
@solid(
    tags={
        'dagster-k8s/config': {
            'container_config': {
                'image': "MYIMAGE"
            },
        },
    },
)
def my_solid_test():
    mycommand = create_shell_command_op('mycommand --help', name="s2p_help")
    mycommand()
But when I run it, the step fails because it's trying to find dagster. MYIMAGE in this case is a public image, so it doesn't have dagster installed. Does Dagster need to be installed in all the Docker containers that I am going to use?
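For comparison, a sketch of the same step using the 0.13 graph/job APIs, with the shell op wired in at module scope (names like s2p_graph are illustrative). Note that the step container still runs Dagster's own step entrypoint, so whatever image it uses needs dagster and dagster_shell installed, which is why a bare public image fails:

from dagster import graph
from dagster_shell import create_shell_command_op

# Build the op once at module scope instead of inside another solid's body.
s2p_help = create_shell_command_op("mycommand --help", name="s2p_help")

@graph
def s2p_graph():
    s2p_help()

# The k8s image override can be attached when turning the graph into a job.
s2p_job = s2p_graph.to_job(
    tags={
        "dagster-k8s/config": {
            "container_config": {"image": "MYIMAGE"},
        },
    },
)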
Mohit.ASingh (10/22/2021, 6:46 AM)
Is there a way to do access control on the Dagit UI?
Andrea Giardini (10/22/2021, 8:30 AM)
Jonathan PHOON (10/22/2021, 10:45 AM)
Rodrigo Schammass (10/22/2021, 12:40 PM)
Dan Stoner (10/22/2021, 5:27 PM)
We are on 0.11.3, but Dagster just released 0.13, so I'm not sure if this is a safe operation.
Dan Stoner (10/22/2021, 5:32 PM)
Dan Stoner (10/22/2021, 5:45 PM)
"Add the remote url under the namespace dagster to install the Dagster charts."
$ helm repo add dagster https://dagster-io.github.io/helm
Is that a Kubernetes namespace, or should that say "name" instead of "namespace"? The default Kubernetes namespace is explicitly used elsewhere in that document.
marcos (10/23/2021, 2:04 AM)
My Cloud SQL instance is reached through a unix socket path (/cloudsql/GCP_PROJECT:us-central1:CLOUD_SQL_INSTANCE_NAME). Using that unix socket path in the host field does not work. I believe that's because I need to use a different format for the connection string than what I see being used. I am looking for guidance on how I should proceed from here. Thank you!
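For Postgres over a Cloud SQL unix socket, the commonly used SQLAlchemy/psycopg2 form leaves the network host empty and passes the socket directory as a query parameter. A sketch with placeholder credentials; whether the resulting string helps depends on the storage config accepting a full connection URL rather than a separate hostname:

from urllib.parse import quote_plus

user = "dagster"                 # placeholder
password = quote_plus("s3cr3t")  # placeholder; quote any special characters
database = "dagster"             # placeholder
socket_dir = "/cloudsql/GCP_PROJECT:us-central1:CLOUD_SQL_INSTANCE_NAME"

# Empty network host; socket directory supplied via the "host" query parameter.
postgres_url = f"postgresql://{user}:{password}@/{database}?host={socket_dir}"
print(postgres_url)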
DaveKalpak (10/23/2021, 2:41 PM)
mrdavidlaing (10/23/2021, 5:07 PM)
Martin Carlsson (10/24/2021, 1:36 PM)
Martin Carlsson (10/24/2021, 3:31 PM)
I can't get dagster-azure to work.
I’m using the code from: https://docs.dagster.io/_apidocs/libraries/dagster-azure
However, when I run: dagit -f jobs_blob.py
I get an error, and Dagit doesn’t pick up the job
To me it looks like I haven't supplied the config, but where do I add that?
Since I'm not able to open Dagit properly, I cannot provide it there: https://docs.dagster.io/concepts/configuration/config-schema#dagit
The error I get:
dagster.core.errors.DagsterInvalidConfigError: Error in config for resource adls2 Error 1: Missing required config entry "credential" at path root:config. Sample config for missing entry: {'credential': '<selector>'}
File "/usr/local/lib/python3.9/site-packages/dagster/grpc/server.py", line 205, in __init__
self._repository_symbols_and_code_pointers.load()
File "/usr/local/lib/python3.9/site-packages/dagster/grpc/server.py", line 90, in load
self._loadable_repository_symbols = load_loadable_repository_symbols(
File "/usr/local/lib/python3.9/site-packages/dagster/grpc/server.py", line 108, in load_loadable_repository_symbols
loadable_targets = get_loadable_targets(
File "/usr/local/lib/python3.9/site-packages/dagster/grpc/utils.py", line 25, in get_loadable_targets
else loadable_targets_from_python_file(python_file, working_directory)
File "/usr/local/lib/python3.9/site-packages/dagster/core/workspace/autodiscovery.py", line 17, in loadable_targets_from_python_file
loaded_module = load_python_file(python_file, working_directory)
File "/usr/local/lib/python3.9/site-packages/dagster/core/code_pointer.py", line 94, in load_python_file
return import_module_from_path(module_name, python_file)
File "/usr/local/lib/python3.9/site-packages/dagster/seven/__init__.py", line 50, in import_module_from_path
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 850, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "jobs_blob.py", line 11, in <module>
context = build_op_context(resources={'adls2': adls2_resource_configured})
File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/context/invocation.py", line 441, in build_op_context
return build_solid_context(
File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/context/invocation.py", line 492, in build_solid_context
return UnboundSolidExecutionContext(
File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/context/invocation.py", line 73, in __init__
self._resources = self._resources_cm.__enter__() # pylint: disable=no-member
File "/usr/local/lib/python3.9/contextlib.py", line 119, in __enter__
return next(self.gen)
File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/build_resources.py", line 73, in build_resources
mapped_resource_config = _get_mapped_resource_config(resource_defs, resource_config)
File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/build_resources.py", line 34, in _get_mapped_resource_config
return config_map_resources(resource_defs, config_value)
File "/usr/local/lib/python3.9/site-packages/dagster/core/system_config/objects.py", line 262, in config_map_resources
raise DagsterInvalidConfigError(
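A minimal sketch of one way to satisfy that missing entry, assuming SAS-token auth and the config keys documented for the adls2 resource on the page linked above; the storage account name and token are placeholders:

from dagster_azure.adls2 import adls2_resource

adls2_resource_configured = adls2_resource.configured(
    {
        "storage_account": "mystorageaccount",  # placeholder
        "credential": {"sas": "my-sas-token"},  # the selector the error reports as missing
    }
)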
Koby Kilimnik (10/24/2021, 4:10 PM)
Koby Kilimnik (10/24/2021, 4:11 PM)
I had two modes, one prod mode and the other a poc1 mode; each defined configuration and a default env configuration for all of my resources. Now it belongs to the repo, which is a tad weird abstraction-wise.
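For reference, the 0.13 pattern that replaces modes is to build several jobs from one graph, each bundling its own resources. A self-contained sketch with illustrative names (db_resource, ping, my_graph):

from dagster import graph, op, resource

@resource(config_schema={"conn_string": str})
def db_resource(init_context):
    return init_context.resource_config["conn_string"]

@op(required_resource_keys={"db"})
def ping(context):
    context.log.info(f"connected via {context.resources.db}")

@graph
def my_graph():
    ping()

# One job per former mode, each carrying its own resource configuration.
prod_job = my_graph.to_job(
    name="my_graph_prod",
    resource_defs={"db": db_resource.configured({"conn_string": "postgresql://prod-host/db"})},
)
poc1_job = my_graph.to_job(
    name="my_graph_poc1",
    resource_defs={"db": db_resource.configured({"conn_string": "postgresql://poc1-host/db"})},
)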
Koby Kilimnik (10/24/2021, 4:12 PM)
dinya (10/25/2021, 8:51 AM)
@graph(name="process_records_graph")
def process_records_graph():
    process_records()

process_records_job = process_records_graph.to_job(
    name="do_process_records",
    config=cfg1,
)
process_records_job2 = process_records_graph.to_job(
    name="do_process_records2",
    config=cfg2,
)
Why is there not a single process_records_graph element (one "parent") on the "Graphs" tab, instead of two (process_records_job and process_records_job2, named the same as the "forked" jobs, with the same contents)? Is this a feature of the Dagit user interface or a bug?
Navneet Sajwan (10/25/2021, 10:03 AM)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "DNS resolution failed for service: example-user-code-5:3030"
debug_error_string = "{"created":"@1635156090.651482028","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":2137,"referenced_errors":[{"created":"@1635156090.651480836","description":"DNS resolution failed for service: example-user-code-5:3030","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":361,"grpc_status":14,"referenced_errors":[{"created":"@1635156090.651465992","description":"C-ares status is not ARES_SUCCESS qtype=A name=example-user-code-5 is_balancer=0: Domain name not found","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":716}]}]}"
>
File "/usr/local/lib/python3.7/site-packages/dagster/scheduler/scheduler.py", line 93, in launch_scheduled_runs
repo_location = location_manager.get_location(origin)
File "/usr/local/lib/python3.7/site-packages/dagster/core/host_representation/location_manager.py", line 35, in get_location
else repository_location_origin.create_location()
File "/usr/local/lib/python3.7/site-packages/dagster/core/host_representation/origin.py", line 247, in create_location
return GrpcServerRepositoryLocation(self)
File "/usr/local/lib/python3.7/site-packages/dagster/core/host_representation/repository_location.py", line 428, in __init__
list_repositories_response = sync_list_repositories_grpc(self.client)
File "/usr/local/lib/python3.7/site-packages/dagster/api/list_repositories.py", line 13, in sync_list_repositories_grpc
api_client.list_repositories(), (ListRepositoriesResponse, SerializableErrorInfo)
File "/usr/local/lib/python3.7/site-packages/dagster/grpc/client.py", line 141, in list_repositories
res = self._query("ListRepositories", api_pb2.ListRepositoriesRequest)
File "/usr/local/lib/python3.7/site-packages/dagster/grpc/client.py", line 87, in _query
response = getattr(stub, method)(request_type(**kwargs), timeout=timeout)
File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 923, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
raise _InactiveRpcError(state)
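The underlying failure is a plain DNS lookup error for the user-code service. A quick stdlib check, not Dagster-specific, that can be run from the pod reporting the error (service name and port copied from the message above):

import socket

try:
    infos = socket.getaddrinfo("example-user-code-5", 3030)
    print("resolved to:", sorted({info[4][0] for info in infos}))
except socket.gaierror as exc:
    print("DNS resolution failed:", exc)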
Navneet Sajwan (10/25/2021, 10:04 AM)
Remi Gabillet (10/25/2021, 10:29 AM)
After adding a yield AssetMaterialization statement, the op now fails (while the materialization does work). Any ideas?
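A common cause, sketched here with an illustrative my_op: once an op body yields anything, such as an AssetMaterialization, its return value is no longer wrapped into an output automatically, so the output has to be yielded explicitly as well:

from dagster import AssetMaterialization, Output, op

@op
def my_op():
    result = 42  # placeholder computation
    yield AssetMaterialization(asset_key="my_asset", description="illustrative")
    yield Output(result)  # required once the body is a generator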
Anaqi Afendi (10/25/2021, 2:31 PM)