# ask-community
p
Hi guys. I’ve spent some time reading the docs, searching the chat history and the web, but haven’t found an answer. We have Dagster with Celery running in k8s. Celery uses k8s jobs and everything works fine, except that the Dagit logs are empty; the actual logs are only visible inside the k8s job logs. I want to use Google Cloud Logging (since we’re running there) by adding a custom logger, but I’m not sure how to make Dagit display those logs?
c
Are no logs displaying at all? Or is it just the compute logs?
We have a GCSComputeLogManager, which can stream compute logs to/from Google Cloud Storage, or you can use the captured Python loggers feature to configure a Python logger that streams the way you’re looking for.
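A minimal dagster.yaml sketch of that compute log manager setup (the bucket and prefix values are assumed placeholders, not anything from this thread):
```yaml
# dagster.yaml -- write captured stdout/stderr for each step to a GCS
# bucket so Dagit can stream it back. Bucket/prefix below are placeholders.
compute_logs:
  module: dagster_gcp.gcs.compute_log_manager
  class: GCSComputeLogManager
  config:
    bucket: "my-dagster-logs"        # assumed bucket name
    prefix: "dagster-compute-logs"   # optional key prefix
```
With something like this in place, the raw logs from the isolated k8s jobs should land in the bucket and become viewable from Dagit instead of staying empty.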
p
Sorry, I’m a bit new to all this. In Dagit, the run logs are completely blank, both raw and structured, while in the k8s job logs they’re all present. Locally everything works as expected, so most probably it’s related to runs being executed in isolated k8s jobs. We have some logging config, but I believe it doesn’t change any of the defaults:
```yaml
handlers:
  console:
    class: logging.StreamHandler
    level: INFO
    formatter: default
    stream: "ext://sys.stderr"

root:
  level: INFO
  propagate: yes
  handlers: [console]
```
Thanks, I’ll try GCSComputeLogManager. I was searching for it, but under a different name 🙂
c
Yeah, I think you essentially just need to sync your logs to cloud storage.
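And for the captured Python loggers route mentioned earlier, a minimal dagster.yaml sketch; the logger name `my_logger` is a placeholder for whatever logger the code actually uses:
```yaml
# dagster.yaml -- capture a named Python logger so its records show up
# as structured events in Dagit's run logs ("my_logger" is a placeholder)
python_logs:
  managed_python_loggers:
    - my_logger
  python_log_level: INFO
```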