# ask-community
j
Hi, please can someone explain what the "event_logs" table in the supporting Postgres DB is? My colleague and I configured our logs to be stored in Azure Blob Storage (which they are), so I'm trying to understand what these "event" logs are for, and whether it's safe to delete them periodically, e.g. on a daily basis.
p
Hi Jonathan. We support three types of logs in Dagster:
1. Default structured logs. These are events emitted by the Dagster framework, like "run started" or "run failed". We use these events to keep track of run status, track asset materializations over time, power retries, etc.
2. Custom structured logs. These are logs emitted when, in your code, you call something like `context.log.info("stuff")`.
3. Compute logs. These are the stdout/stderr that the code emits, including output from libraries like pyspark that might not be captured in the Python layer. This often includes the text output of 1 and 2.

We read from the event log to present certain views in Dagit, historically:
A. The run view, where all the events for a particular run are shown.
B. Asset materialization views, where the materialization events for a particular asset are shown.
C. Retries of runs, which read from the event log to see whether particular steps have succeeded or failed, in order to determine which steps should be executed in the retry.
D. Step duration stats, which determine the history of step durations for a particular run.

Azure Blob Storage keeps the history of log type 3. From the event log table, it should be safe to delete log type 2 (rows with a `dagster_event_type` value of null), but doing so would affect the appearance of those events in scenario A. Deleting log type 1 would affect scenarios A/B/C/D, and so is much more complicated to do.
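A minimal sketch of the periodic cleanup described above. This assumes the `event_logs` table and `dagster_event_type` column named in the thread; the `timestamp` column and the exact schema are assumptions that may vary by Dagster version, so verify against your own database before running anything destructive:

```sql
-- Delete only custom structured logs (type 2): rows whose
-- dagster_event_type is NULL are user log messages, not framework events.
-- Framework events (run started/failed, materializations, step results)
-- are kept so run status, asset history, and retries keep working.
DELETE FROM event_logs
WHERE dagster_event_type IS NULL
  AND timestamp < NOW() - INTERVAL '1 day';
```

Running this on a daily schedule would implement the cleanup being asked about, at the cost of those user log lines no longer appearing in the run view (scenario A).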
j
@prha Ah okay, I understand! Thank you for your clear and detailed response.