Ramnath Vaidyanathan  12/05/2021, 7:13 PM

Qwame  12/07/2021, 5:14 PM
resources:
  environs:
    config:
      year: 2021
How can I tell dagster that when launching from the dagit UI, if no year config is passed, use 2020.
In my Python file, I have the job defined as
@job(resource_defs={"environs": make_values_resource(year=2020)})
def pipeline_job():
    job_a(job_b)
The idea is to pass the default value in the job definition but I am getting errors.
However, when I do this, I don't get any errors:
@job(resource_defs={"environs": make_values_resource(year=int)})
def pipeline_job():
    job_a(job_b)
Any help?
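A sketch of one way to get the default being asked about, assuming make_values_resource accepts a Dagster Field in place of a bare type: wrapping the value in Field with a default_value makes the key optional, so a launch from Dagit that supplies no year config falls back to 2020. Op names follow the question above and are illustrative.

from dagster import Field, job, make_values_resource, op

@op(required_resource_keys={"environs"})
def job_a(context, upstream):
    # Reads the shared value; resolves to 2020 when no run config is provided.
    context.log.info(f"year: {context.resources.environs['year']}")

@op
def job_b():
    return 1

@job(resource_defs={"environs": make_values_resource(year=Field(int, default_value=2020))})
def pipeline_job():
    job_a(job_b())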
bitsofinfo  12/08/2021, 9:00 PM

DK  12/20/2021, 3:27 PM
@op(config_schema={"param": str},
    config={"param": "some_value"})
def do_something(_):
    ...
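If the intent of the snippet above is to bake a default value into the op (as far as I can tell, config= is not an argument that @op accepts), a sketch of one way to express that with the same names is a Field with a default_value:

from dagster import Field, op

@op(config_schema={"param": Field(str, default_value="some_value")})
def do_something(context):
    # Runs that supply no config for this op still see "some_value" here.
    context.log.info(context.op_config["param"])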
medihack  12/20/2021, 6:08 PM
The @root_input_manager and @io_manager decorators IMHO should work the same way, especially as RootInputManager and IOManager behave so similarly. At the moment @root_input_manager, in contrast to @io_manager, creates a new RootInputManager itself. That's why you can't simply write
class DatabaseManager(RootInputManager, IOManager):
    def handle_output(self, context, obj):
        ...

    def load_input(self, context):
        ...

@io_manager(required_resource_keys={"database_client"})
def database_io_manager():
    return DatabaseManager()

@root_input_manager(required_resource_keys={"database_client"})
def database_root_manager():
    return DatabaseManager()
but you have to write
@root_input_manager(required_resource_keys={"database_client"})
def database_root_manager(context: InputContext):
    manager = DatabaseManager()
    return manager.load_input(context)
I know that it is not a big deal, but just want to leave some feedback while @root_input_manager is still experimental.

Jeremy Fisher  12/20/2021, 8:16 PM

Andrea Giardini  12/22/2021, 2:06 PM

Colin Sullivan  01/02/2022, 5:42 PM
context argument for ops. It feels odd to use the context object in the body of an op function to access a value, especially since the body of the job function doesn't pass the context argument explicitly. I'd rather express those values as parameters to the op function and pass them explicitly in the body of the job function. This would make testing more straightforward (no need to mock a context object) and simplify migrating existing projects. Similarly, I feel like I'd prefer to write a schema for an entire job rather than on a per-op basis, and then have the job function take a context or run_config argument that defines the necessary values to pass to each op in the body. Am I missing the magic that explains why I ought to resist the urge to structure a pipeline like this? Is it possible to access the run_config in the body of the job function? And can I define the schema for an entire job?
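On the last question, a sketch of the closest mechanism I'm aware of for a job-wide schema: a config_mapping attached to the job declares one top-level schema and translates it into per-op config. Op and field names here are illustrative.

from dagster import config_mapping, job, op

@op(config_schema={"year": int})
def fetch_data(context):
    context.log.info(f"year: {context.op_config['year']}")

# A single job-level schema; the mapping fans the values out to each op.
@config_mapping(config_schema={"year": int})
def job_wide_config(val):
    return {"ops": {"fetch_data": {"config": {"year": val["year"]}}}}

@job(config=job_wide_config)
def yearly_job():
    fetch_data()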
James Miller  01/04/2022, 2:12 PM

Chris Chan  01/04/2022, 7:00 PM

Alex Service  01/04/2022, 9:56 PM
dagster new-project to utilize pyproject.toml (from PEP 518). I may look into whether I can open-source the project setup I have as an example of how that could be used in conjunction with env/package managers like poetry (I'd be happy to have a chat with any of the folks at Elementl if you're interested in follow-up conversations 🙂).
Nick Dellosa  01/11/2022, 2:29 PM

Daniel Suissa  01/13/2022, 2:43 PM

Anatoly Laskaris  01/17/2022, 9:26 AM

Matthias Queitsch  01/18/2022, 9:19 AM

VxD  01/20/2022, 12:15 AM
Failure exception, but it got "fixed" to abide by the retry policy. Could we please get an Abort special exception that aborts the current pipeline?
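For readers skimming the request above, a minimal sketch of the behaviour being described (op and job names are illustrative): an explicit Failure raised inside an op that has a RetryPolicy is still re-attempted, and there is no built-in exception that bypasses the policy and ends the run immediately, which is what the proposed Abort would do.

from dagster import Failure, RetryPolicy, job, op

@op(retry_policy=RetryPolicy(max_retries=3, delay=5))
def validate_input():
    # This Failure is retried under the op's policy before the run fails,
    # even though retrying an unrecoverable input is pointless.
    raise Failure(description="input is unrecoverable; retrying will not help")

@job
def nightly_job():
    validate_input()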
Huib Keemink  01/21/2022, 12:15 PM

Huib Keemink  01/21/2022, 12:17 PM

Huib Keemink  01/21/2022, 12:19 PM

VxD  01/26/2022, 1:04 AM

Natalie Novitsky  01/31/2022, 8:08 PM

Zach  02/02/2022, 5:51 PM

George Pearse  02/03/2022, 1:39 PM

Alessandro Marrella  02/09/2022, 12:27 PM

Alex Service  02/17/2022, 7:13 PM

Hugo Saavedra  02/17/2022, 10:44 PM

Mike  02/18/2022, 12:27 AM
[1, 2, 3] + [4]
why do I care?

Mike  02/18/2022, 12:30 AM

Mike  02/18/2022, 12:38 AM

Alex Service  02/18/2022, 5:42 PM

Alex Service  02/18/2022, 5:42 PM