# deployment-kubernetes

Binoy Shah

05/26/2022, 4:40 PM
We have New Relic for log monitoring, which scans all Kubernetes containers' stdout/stderr, so all our logs get aggregated in NR. In that situation, how do I configure logging so that logs only print to stdout/stderr? I am trying to deploy via the Helm charts.

johann

05/26/2022, 5:53 PM
> so that they only print on stdout/stderr
I might be misunderstanding the question: where are they printing currently?

Binoy Shah

05/26/2022, 5:58 PM
I am not there yet, but I saw some components that store logs in Postgres, so I was wondering about the default behavior.

johann

05/26/2022, 6:01 PM
By default we’ll store logs in Postgres (required for the UI etc.) and write everything to stdout/err.

Binoy Shah

05/27/2022, 1:10 PM
Is there a way to not save logs to Postgres?

johann

05/27/2022, 2:53 PM
So we will always write system logs to a DB (it's how we know when a run has finished, etc.). If you're writing directly to stdout/stderr, though, we won't capture that by default.

Binoy Shah

05/27/2022, 2:54 PM
So if I want to keep my Postgres data light, can I remove the logs every few days, or will that cause failures in the UI?
My infra guys are reluctant to host a heavy Postgres instance in our setup; I convinced them that we'll mostly store metadata and some light logs.

johann

05/27/2022, 2:59 PM
If you delete a run from the UI or via the GraphQL client, that will remove all the storage related to it. And yeah, the storage should be pretty light; it's mostly events like `STEP_STARTED` etc., unless you start making heavy use of `context.log`, which would write to Postgres.
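Deleting old runs via GraphQL can be scripted so Postgres stays light. A minimal sketch, assuming the `deletePipelineRun` mutation (present in Dagster's GraphQL schema around this time; verify against your deployed version's schema in the GraphQL playground) and a hypothetical Dagit endpoint URL:

```python
import json
import urllib.request

def build_delete_run_request(graphql_url: str, run_id: str) -> urllib.request.Request:
    """Build a POST request that asks Dagit's GraphQL API to delete a run,
    which also removes the run's event-log rows from Postgres.

    The mutation name `deletePipelineRun` is an assumption based on the
    Dagster GraphQL schema of this era; check your version's schema.
    """
    mutation = """
    mutation DeleteRun($runId: String!) {
      deletePipelineRun(runId: $runId) {
        __typename
      }
    }
    """
    payload = json.dumps({"query": mutation, "variables": {"runId": run_id}}).encode()
    return urllib.request.Request(
        graphql_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a reachable Dagit instance, e.g.:
# req = build_delete_run_request("http://localhost:3000/graphql", "<run-id>")
# urllib.request.urlopen(req)
```

A cron job looping over runs older than N days and issuing this mutation per run ID would keep the event log bounded without touching the UI.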

Binoy Shah

05/27/2022, 3:01 PM
Okay, so we should not use the `context.log` logger in Python jobs for user-defined log statements.
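The alternative implied above would be Python's standard `logging` module configured to write only to stdout, which New Relic's container scraping picks up and which Dagster does not capture into Postgres by default (per johann's earlier note). A minimal sketch; the logger name and format are illustrative:

```python
import logging
import sys

# Send log records only to stdout, so the container log scraper (e.g. New
# Relic) aggregates them and nothing is persisted in Dagster's event log.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s")
)

logger = logging.getLogger("my_job")  # illustrative name
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("this line goes to stdout only, not to Postgres")
```

Inside an op you would call this module-level `logger` instead of `context.log`, reserving `context.log` for the few messages you want visible in the Dagit UI.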