# announcements
I was wondering about the Dagster log when you want to log from a supporting module (one that doesn't have the context
object and whose functions are not solids... they can also be completely different modules). 1. I tried
 , but I don't see them on 
when I run the pipeline. 2. I don't see any of my logs using
context.log.info('My logs here')
in GCP logs (deployed using k8s); only the dagster logs (e.g. Engine events... Started process for pipeline (pid: 17024).) can be seen.
Hey Mose. For general logging, you can view stderr and stdout by clicking on the following:
Our text could probably be clearer
For #2, our default logger logs to stderr. Are you using a custom logger?
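(To illustrate why the stderr point matters, here's a stdlib-only sketch, not Dagster-specific: a logger whose handler writes to a stream, the way the default handler writes to stderr. In a k8s pod, stderr is what GCP Logging picks up. The logger name is just illustrative.)

```python
import io
import logging

# Stand-in for sys.stderr so we can inspect what was written.
stream = io.StringIO()
logger = logging.getLogger("helper_module_demo")  # illustrative name
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(stream))

logger.info("visible in pod stderr")
print(stream.getvalue().strip())  # → visible in pod stderr
```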
I am able to see all the logs (using context.log.info()) from the solids with no issue in
. Thanks. I have another module that I use to access some APIs to get and post values from this pipeline (we already have those modules/packages for most of our API endpoints, so I can't change them or make them part of the pipeline). But it is important for us to see the logs in one place: dagit. I don't know if that is possible. (Are you using a custom logger?) I don't think I am; just dagster's logger.
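(One workaround sketch for getting helper-module logs into the same place as the solid's logs; the handler class and module name here are hypothetical, not an official Dagster API: attach a `logging.Handler` to the helper modules' stdlib loggers that forwards each record to a callback, and inside the solid point that callback at `context.log.info`.)

```python
import logging

class ForwardingHandler(logging.Handler):
    """Forward stdlib log records to a callback (e.g. context.log.info)."""
    def __init__(self, callback):
        super().__init__()
        self.callback = callback

    def emit(self, record):
        self.callback(self.format(record))

# Inside a solid, the callback would be context.log.info; here a list
# stands in so the sketch is self-contained and runnable.
captured = []
helper_logger = logging.getLogger("my_api_module")  # hypothetical module name
helper_logger.setLevel(logging.INFO)
helper_logger.addHandler(ForwardingHandler(captured.append))

helper_logger.info("POST /values succeeded")  # what the helper would log
print(captured)  # → ['POST /values succeeded']
```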
GCP logs (deployed using k8s),
I think the missing piece of the puzzle is a GCS
The compute log manager is responsible for capturing stdout/stderr from solid execution. By default it just uses the local file system, which in the k8s case will not be available.
just landed
which should go out in the release tomorrow
I see the logs now (I didn't change anything) in GCP logs. The only issue is that all of dagster's own logs are marked as Info (in blue), while all the log content from context.log.info is marked as Error (in orange). It might be GCP, but I thought you would like to know.
cool - glad that's working. The thing I added is an opt-in system for persisting logs to Google Cloud Storage in a way that dagit knows how to interact with, so they can be viewed in dagit directly.
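(For readers following along: opting in to a cloud-backed compute log manager is done in `dagster.yaml`. The sketch below is from memory and hedged; the module path, class name, and config field names should be checked against the dagster-gcp docs for the release in question, and the bucket name is hypothetical.)

```yaml
# dagster.yaml sketch (verify exact schema in the dagster-gcp docs)
compute_logs:
  module: dagster_gcp.gcs.compute_log_manager
  class: GCSComputeLogManager
  config:
    bucket: my-dagster-logs        # hypothetical bucket name
    local_dir: /tmp/dagster-logs   # local staging dir inside the pod
```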