Adrien
09/09/2022, 9:13 AM
dagster-daemon history type of data, only in the Cloud Serverless environment, but I can't find anything. Is anyone aware of any such API?

Todd
09/09/2022, 3:00 PM

Alan Dao
09/09/2022, 3:47 PM

Alan Dao
09/09/2022, 3:48 PM

Alan Dao
09/09/2022, 3:48 PM

Charlie Bini
09/09/2022, 4:15 PM
resync graph contains nodes named compose_queries, compose_queries_2, etc.
@graph
def execute_query(dynamic_query):
    # trunk-ignore(flake8/F841)
    data = extract(dynamic_query=dynamic_query)


@graph
def resync():
    # log
    dynamic_query = compose_queries()
    dynamic_query.map(execute_query)

    # attendance
    dynamic_query = compose_queries()
    dynamic_query.map(execute_query)

    # storedgrades
    dynamic_query = compose_queries()
    dynamic_query.map(execute_query)

    # pgfinalgrades
    dynamic_query = compose_queries()
    dynamic_query.map(execute_query)

    # assignmentscore
    dynamic_query = compose_queries()
    dynamic_query.map(execute_query)
Yevhen Samoilenko
09/09/2022, 4:19 PM
[dagit] The launchpad tab is no longer shown for Asset jobs. Asset jobs can be launched via the "Materialize All" button shown on the Overview tab. To provide optional configuration, hold shift when clicking "Materialize".
This feature doesn't work for me in Dagster Cloud or in open-source Dagster.

Saul Burgos
09/09/2022, 4:31 PM

Anthony Reksoatmodjo
09/09/2022, 4:39 PM

Michal Malyska
09/09/2022, 5:01 PM

Scott Hood
09/09/2022, 5:20 PM

jose
09/09/2022, 7:48 PM

Tom Reilly
09/09/2022, 10:48 PM
operation queue.declare caused a channel exception precondition_failed: inequivalent arg 'x-max-priority' for queue 'dagster' in vhost '/': received the value '10' of type 'signedint' but current is none
Any ideas?

Jiamin Chen
09/10/2022, 1:17 AM

geoHeil
09/10/2022, 6:44 AM

Alan Dao
09/11/2022, 6:08 AM

Alan Dao
09/11/2022, 6:08 AM

Alan Dao
09/11/2022, 6:08 AM

Baris Cekic
09/11/2022, 10:57 AM
headless service that points to the driver job/pod when the job is executed. Otherwise the Spark executor pods cannot communicate with the Dagster job/pod (which is the Spark driver).

Saad Anwar
09/11/2022, 1:17 PM

Tamas Foldi
09/11/2022, 6:59 PM
key_prefix does not work with seeds and snapshots. Any easy ways to prefix all dbt outputs, not just models?

Alexander Whillas
09/12/2022, 4:24 AM

Frank Dekervel
09/12/2022, 7:58 AM

teodorpaius
09/12/2022, 12:27 PM
run_storage:
  module: dagster_postgres.run_storage
  class: PostgresRunStorage
  config:
    postgres_db:
      hostname:
        env: DAGSTER_POSTGRES_HOST
      username:
        env: DAGSTER_POSTGRES_USER
      password:
        env: DAGSTER_POSTGRES_PASSWORD
      db_name:
        env: DAGSTER_POSTGRES_DB
      port: 5432
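If the same Postgres database should also back the event log and schedule storage, recent Dagster versions accept a consolidated `storage` block in dagster.yaml instead of configuring `run_storage`, `event_log_storage`, and `schedule_storage` separately. A sketch reusing the same environment variables; verify the key names against the docs for your Dagster version:

```yaml
# Hedged sketch: one storage block covering runs, event logs, and schedules.
storage:
  postgres:
    postgres_db:
      hostname:
        env: DAGSTER_POSTGRES_HOST
      username:
        env: DAGSTER_POSTGRES_USER
      password:
        env: DAGSTER_POSTGRES_PASSWORD
      db_name:
        env: DAGSTER_POSTGRES_DB
      port: 5432
```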
Gowtham Manne
09/12/2022, 1:38 PM

Leo Qin
09/12/2022, 2:30 PM
botocore.errorfactory.InvalidRequestException: An error occurred (InvalidRequestException) when calling the StartQueryExecution operation: The S3 location provided to save your query results is invalid. Please check your S3 location is correct and is in the same region and try again. If you continue to see the issue, contact customer support for further assistance.
If I run the image locally, this error doesn't happen. Any ideas what is happening?

Slackbot
09/12/2022, 2:31 PM

Gowtham Manne
09/12/2022, 3:05 PM
from dagster import op, job, repository, In, Out, OpExecutionContext, logger
from pydantic import Field
import logging
import json


@logger(
    {
        "log_level": Field(str, is_required=False, default_value="INFO"),
        "name": Field(str, is_required=False, default_value="dagster"),
    },
    description="A JSON-formatted console logger",
)
def json_console_logger(context: OpExecutionContext):
    level = context.logger_config["log_level"]
    name = context.logger_config["name"]

    klass = logging.getLoggerClass()
    logger_ = klass(name, level=level)
    handler = logging.StreamHandler()

    class JsonFormatter(logging.Formatter):
        def format(self, record):
            return json.dumps(record.__dict__)

    handler.setFormatter(JsonFormatter())
    logger_.addHandler(handler)
    return logger_


@op(
    name="MyCOperationInput",
    out={"result_str": Out(dagster_type=str)},
    config_schema={"name": str},
)
def my_C_op_input(context: OpExecutionContext):
    name = context.op_config["name"]
    context.log.info(f"My name is {name} in MyCOperationInput")
    return name


@op(
    name="MyCOperation",
    ins={"result_str": In()},
    out={"result": Out(dagster_type=str)},
)
def my_C_op(context: OpExecutionContext, result_str: str):
    print(context)
    print(result_str)
    context.log.info(f"My name is {result_str} in MyCOperation")
    return result_str


@job
def my_C_job():
    print("job started")
    my_C_op(my_C_op_input())


@repository(default_logger_defs={"json_logger": json_console_logger})
def my_C_repo():
    return [my_C_job]

but I'm getting this error:

dagster.core.errors.DagsterInvalidConfigDefinitionError: Error defining config. Original value passed: {'log_level': FieldInfo(default=<class 'str'>, extra={'is_required': False, 'default_value': 'INFO'}), 'name': FieldInfo(default=<class 'str'>, extra={'is_required': False, 'default_value': 'dagster'})}. Error at stack path :log_level. FieldInfo(default=<class 'str'>, extra={'is_required': False, 'default_value': 'INFO'}) cannot be resolved. This value can be a: - Field - Python primitive types that resolve to dagster config types - int, float, bool, str, list. - A dagster config type: Int, Float, Bool, Array, Optional, Selector, Shape, Permissive, Map - A bare python dictionary, which is wrapped in Field(Shape(...)). Any values in the dictionary get resolved by the same rules, recursively. - A python list with a single entry that can resolve to a type, e.g. [int]
sar
09/12/2022, 3:23 PM

Olivier Doisneau
09/12/2022, 3:40 PM

Olivier Doisneau
09/12/2022, 3:40 PM

Vinnie
09/12/2022, 3:49 PM

Bianca Rosa
09/12/2022, 4:11 PM

Olivier Doisneau
09/12/2022, 4:21 PM

Bianca Rosa
09/12/2022, 4:23 PM

Vinnie
09/12/2022, 4:23 PM

Olivier Doisneau
09/12/2022, 4:24 PM

Bianca Rosa
09/12/2022, 4:25 PM

Olivier Doisneau
09/12/2022, 4:26 PM