paul.q (02/11/2021, 9:47 PM)
…

alex (02/11/2021, 11:18 PM)
…

max (02/11/2021, 11:22 PM)
… `extra` onto the underlying LogRecord …
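For reference, this is the standard-library behaviour being pointed at: anything passed via `extra` is attached as attributes on the emitted LogRecord, where a custom handler or formatter can pick it up. A minimal sketch (names like `run_tag` are illustrative, not from the thread):

```python
import logging


class ExtraAwareHandler(logging.Handler):
    """Toy handler showing that keys from `extra` land on the LogRecord."""

    def emit(self, record: logging.LogRecord) -> None:
        # `extra={"run_tag": ...}` below becomes an attribute on the record
        print(record.getMessage(), "| run_tag =", getattr(record, "run_tag", None))


demo = logging.getLogger("extra_demo")
demo.setLevel(logging.INFO)
demo.propagate = False
demo.addHandler(ExtraAwareHandler())

# The logging machinery merges the `extra` dict into the record's __dict__
demo.info("loading table", extra={"run_tag": "nightly"})
```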
paul.q (02/11/2021, 11:37 PM)
…

max (02/12/2021, 1:42 AM)
…

paul.q (03/22/2021, 10:10 PM)
…

alex (07/21/2021, 2:48 PM)
… dagit display of these messages also useful for you?

paul.q (07/24/2021, 12:52 AM)
… Logstash to clean up and 'normalise' what's sent to Elastic. Up until now, that effort is entirely contained in our logger implementation, where it's pure Python and easy enough to debug as well. A further option is to write another package on top of Python logging, which would give us control of everything that's written to disk - essentially transplanting the logic from our Dagster logger implementation.
But we then wouldn't get the benefit of seeing user messages (via context.log) in the dagit UI, would we?
We've also built a REST API that gets pipeline run stats together with messages about pipeline/solid failures (via GraphQL using the logs). I guess these would continue to work because event logs would be unaffected?
Let us know whether it's worth waiting or we should switch approaches.
Thanks
Paul
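A minimal sketch of the kind of custom Dagster logger being weighed here, assuming the `@logger` decorator with a JSON formatter attached to a standard Python logger; the config fields and formatter are illustrative, not the actual implementation from this thread:

```python
import json
import logging

from dagster import Field, logger


@logger(
    {"log_level": Field(str, is_required=False, default_value="INFO")},
    description="Illustrative JSON logger; field handling is assumed.",
)
def json_console_logger(init_context):
    class JsonFormatter(logging.Formatter):
        def format(self, record: logging.LogRecord) -> str:
            # Dump the whole record; a real logger would whitelist the fields
            # it ships to Logstash/Elastic.
            return json.dumps(record.__dict__, default=str)

    handler = logging.StreamHandler()
    handler.setFormatter(JsonFormatter())

    klass = logging.getLoggerClass()
    logger_ = klass("json_console_logger", level=init_context.logger_config["log_level"])
    logger_.addHandler(handler)
    return logger_
```

Dagster then routes `context.log` calls to whichever loggers the run is configured with, so the only Dagster-specific piece is the logger definition; the formatting itself is plain Python logging.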
paul.q
… `context` object to it along with `extra`. With the `extra` dict passed to it, I munge it into a string and add it to the message, before calling the `context`'s log.log method at the end - so after that it's over to our custom JSON logger. Inside that I can unmunge the extra dict out of the message, add the elements into the log record, and also clean up the message (to remove the munged bit).
It all works fine, except that the message that appears in the dagit console includes the munged portion as well. In our JSON logger, the message is cleaned up as desired. What I don't understand is: isn't the same log record being passed to all the handlers in the custom Dagster logger? If so, wouldn't we expect to get the same thing in the console log as we see in the JSON log records?
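A rough reconstruction of the munge/unmunge flow described above (the marker and helper names are hypothetical), which also shows where the observed behaviour comes from: every configured logger receives the already-munged message string, and only the formatter that knows about the marker strips it back out:

```python
import json
import logging

# Hypothetical marker; the actual munging scheme isn't shown in the thread.
_MARKER = " ::extra::"


def log_with_extra(context, level, message, extra):
    """Munge `extra` into the message before handing it to Dagster's log manager."""
    context.log.log(level, message + _MARKER + json.dumps(extra))


class UnmungingJsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        message = record.getMessage()
        fields = {}
        if _MARKER in message:
            # Unmunge: recover the extra dict and clean up the message text.
            message, blob = message.split(_MARKER, 1)
            fields = json.loads(blob)
        # Only output produced by *this* formatter is cleaned. Any other logger
        # attached to the run (e.g. the console logger surfaced in dagit) gets
        # the same munged message and renders it untouched.
        return json.dumps({"message": message, **fields}, default=str)
```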
alex (07/26/2021, 2:34 PM)
> But we then wouldn't get the benefit of seeing user messages (via context.log) in the dagit UI, would we?
They would no longer be in the structured event stream, but the raw stdout/stderr logs should be visible via dagit, assuming you have your ComputeLogManager set up correctly for your deployment. It's not as nice as the structured event stream entries, but there should still be a way to see them.

> I guess these would continue to work because event logs would be unaffected?
Yep, this should be right.

> I munge it into a string and add it to the message, before calling the `context`'s log.log method at the end
> What I don't understand is: isn't the same log record being passed to all the handlers in the custom Dagster logger? If so, wouldn't we expect to get the same thing in the console log as we see in the JSON log records?
The loggers should all receive what's passed to the `context`'s log method, which sounds like what is happening? I could be misunderstanding the details.