# announcements
c
hey, random question/feedback about docs - there are a few places in the documentation where it says "update your config yaml" but it doesn't say which config yaml, and it also doesn't say which section: https://dagster.readthedocs.io/en/0.6.6/sections/learn/guides/logging/logging.html#configuring-the-built-in-loggers i assumed it meant `dagster.yaml`, but i get
```
Undefined field "loggers" at the root. Expected: ...
```
so i'm not sure where to put this
p
Hey Chris.
The loggers get configured on a per-execution basis
c
gotcha
p
so these would get set in the config editor (e.g. playground)
But thanks for raising that this is confusing…. we’ll need to find a better way of distinguishing all of our config
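e.g. a `loggers:` section at the root of the run config in the playground editor - a rough sketch, the logger name and log level here are just examples:

```yaml
# run config (playground editor), NOT dagster.yaml -
# "console" and DEBUG are example values
loggers:
  console:
    config:
      log_level: DEBUG
```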
c
that makes sense, i guess when it says "config yaml" i should have thought of that
p
probably something like pipeline config vs instance config
c
i feel like i'm never sure which config it's referring to, since there are so many yaml configs
since there is `repository.yaml`, `dagster.yaml`, the playground execution yaml, `celery.yaml`, `scheduler.yaml`
the issue i'm trying to debug, actually, is that when i click "view raw step output" on steps that were run on celery, i just get "No log file available". any idea?
p
Do you have a `compute_logs` section in your instance yaml (`dagster.yaml`)?
also, are your celery workers on a remote box?
You might need to configure a `ComputeLogManager` that stores the compute logs remotely… check the `dagster-aws` package for the `S3ComputeLogManager`
Your `dagster.yaml` would then include a section like this:
```yaml
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: "my-bucket"
    prefix: "my-prefix"
```
c
yup, here is my section:
```yaml
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: myp-pipeline
    prefix: logs
```
i'm having the issue both on our production ECS cluster and on my local docker with dagit + redis worker setup
p
hmm
c
s3 is otherwise working though
(i'm calling `context.resources.s3.download_file`, which uses `dagster_aws.s3.resources`, for example, and it works correctly)
p
taking a look…
c
thank you!
p
did this used to work and was a regression, or are you trying to set this up for the first time?
c
i'm not sure tbh. it's been set up for a month or two already but i haven't really tried to access the stdout/stderr feature in production until now
p
and you see ‘No log file available’ for in-flight steps, or also for completed steps?
It’s a known issue that the logs do not currently stream as the step is executing…
c
completed
p
Can you check if the compute logs did get written to the bucket?
c
it's writing the log files, but they are empty