addy
05/05/2020, 3:04 PM
<s3://bucket/prefix/step.result>, I'm also using dagster-pandas and TypeCheck to create summary statistics on dataframes (and other types). I would like to send these to s3 as well, so that I have all of my intermediate results and event_specific_data in one place. I suppose I could do this with materializations, but I'm already automatically materializing them, and I'm already creating the EventMetadataEntry in the type checks, so it would feel hacky to re-add them to a materialization.

Loving dagster so far, and it's crazy how fast you guys are improving it. I started using it about 3 weeks ago, and every day it feels a little more awesome to use.
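For context, the kind of summary statistics being attached as type-check metadata can be sketched with nothing but the stdlib. This is a hedged, dagster-free illustration: dagster-pandas computes comparable stats from DataFrame columns, and each key/value pair here is the sort of thing one might surface as an EventMetadataEntry. The function name and stat selection are made up for the example.

```python
import statistics


def summarize_column(values):
    """Summary statistics for one column of numbers (stdlib-only sketch;
    dagster-pandas derives similar stats from a DataFrame column)."""
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
    }


stats = summarize_column([1.0, 2.0, 3.0, 4.0])
# each entry here could back one EventMetadataEntry on the TypeCheck
```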
alex
05/05/2020, 4:10 PM

alex
05/05/2020, 4:11 PM
you could pull the event_specific_data off of the type checks and send it somewhere

alex
05/05/2020, 4:14 PM
```python
import logging

from dagster import logger


class EventListener(logging.Handler):
    def emit(self, record):
        # dagster attaches structured metadata to its log records
        dagster_meta = getattr(record, "dagster_meta", None)
        if dagster_meta is None:
            return
        dagster_event = dagster_meta.get("dagster_event")
        if dagster_event is None:
            return
        # bad name - just means output event
        if dagster_event.is_successful_output:
            # send this wherever you like, e.g. up to s3
            type_check_data = dagster_event.step_output_data.type_check_data


@logger
def logger_based_event_listener(init_context):
    klass = logging.getLoggerClass()
    logger_ = klass("event_listener", level=logging.INFO)
    logger_.addHandler(EventListener())
    return logger_
```
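The mechanism alex is leaning on can be demonstrated without dagster at all: a `logging.Handler` whose `emit` fishes structured metadata off log records that were logged with `extra=`. In this stdlib-only sketch, the `dagster_meta` attribute and its contents are stand-ins for what dagster attaches to its records.

```python
import logging


class CollectingHandler(logging.Handler):
    """Collects structured metadata attached to log records via extra=."""

    def __init__(self):
        super().__init__()
        self.collected = []

    def emit(self, record):
        # stand-in for the dagster_meta dict dagster puts on its records
        meta = getattr(record, "dagster_meta", None)
        if meta is None:
            return  # plain log line, nothing structured attached
        self.collected.append(meta)


log = logging.Logger("listener_demo")
handler = CollectingHandler()
log.addHandler(handler)

log.info("plain message")  # no metadata, ignored by the handler
log.info("step finished", extra={"dagster_meta": {"type_check_data": {"rows": 42}}})
```

Anything passed via `extra=` becomes an attribute on the `LogRecord`, which is exactly how the `EventListener` above gets at `record.dagster_meta`.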
alex
05/05/2020, 4:26 PM
> would feel hacky to re-add them to a materialization
ya, it's not great, but it might be your best bet
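If the type-check results do end up being shipped to s3 by hand, the payload-building half is simple enough to sketch with the stdlib. The key layout below is hypothetical, just echoing the s3://bucket/prefix/step.result layout mentioned at the top of the thread, and the actual upload call is left as a comment since it needs boto3 and credentials.

```python
import json


def type_check_payload(step_key, type_check_data):
    """Build the (key, body) pair to upload for one step's type-check results.

    The "prefix/<step>" key shape is a hypothetical mirror of the
    intermediates layout; adjust to taste.
    """
    key = f"prefix/{step_key}.type_check.json"
    body = json.dumps(type_check_data, sort_keys=True).encode("utf-8")
    return key, body


key, body = type_check_payload("make_dataframe", {"success": True, "nrows": 42})
# with boto3, the upload itself would then be roughly:
#   boto3.client("s3").put_object(Bucket="bucket", Key=key, Body=body)
```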
addy
05/08/2020, 2:04 PM