# ask-ai

Tom Reilly

02/15/2024, 12:54 AM
Dagster UI shows "No log file available" when trying to view the stderr or stdout. Using the `LocalComputeLogManager`, I can see the raw logs on my gRPC container but not in the Dagster webserver container. If I copy the file from the gRPC container to the webserver container, I can see the logs in the UI. Is there some configuration needed to ensure logs are saved to the webserver properly?


02/15/2024, 12:54 AM
When using the default `LocalComputeLogManager` in Dagster, compute logs (stdout/stderr) are stored locally on disk in the environment where the job executes. If you are running Dagster in a distributed environment with separate containers for the gRPC server and the Dagster webserver, the compute logs generated by the gRPC server will not automatically be available to the webserver container.

The "No log file available" message in the Dagster UI typically indicates that the webserver is unable to locate or access the compute logs. This can happen if the logs are not shared between the containers, or if the webserver does not have the correct file path or permissions to access them.

To resolve this issue, you have a few options:

1. **Shared volume:** Configure a shared volume between the gRPC server container and the webserver container, mounted at the same path in both. When logs are written by the gRPC server, they are then also accessible to the webserver.
2. **Compute log manager configuration:** Ensure that the `LocalComputeLogManager` is configured with a base directory that is accessible to both the gRPC server and the webserver. This might involve setting the `base_dir` option in your `dagster.yaml` to point to the shared volume.
3. **Centralized storage:** Instead of `LocalComputeLogManager`, consider a centralized log storage solution such as `S3ComputeLogManager` if you are on AWS, which uploads the logs to an S3 bucket. Both the gRPC server and the webserver then access the logs from this centralized location.

Here's an example of how you might configure the `LocalComputeLogManager` in your `dagster.yaml`:
```yaml
compute_logs:
  module: dagster._core.storage.local_compute_log_manager
  class: LocalComputeLogManager
  config:
    base_dir: /path/to/shared/volume
```
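For option 1, the shared volume itself is defined at the container-orchestration level. Here is a minimal Docker Compose sketch of the idea; the service names, image names, and volume name are hypothetical placeholders for your setup, and the mount path must match the `base_dir` in `dagster.yaml`:

```yaml
# Hypothetical docker-compose.yml excerpt: both containers mount the
# same named volume at the same path, so compute logs written by the
# gRPC code server are readable by the webserver.
services:
  dagster_webserver:
    image: my-dagster-webserver   # placeholder image
    volumes:
      - compute_logs:/path/to/shared/volume

  dagster_code_server:
    image: my-dagster-code-server # placeholder image
    volumes:
      - compute_logs:/path/to/shared/volume

volumes:
  compute_logs:
```

Note that both mounts use the identical container path; if the paths differ, the webserver will look for log files at locations that don't exist in its filesystem.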
Please ensure that the path you provide is correctly mounted and accessible in both containers. If you continue to face issues, check the container logs for errors related to file access or permissions, and verify that the Dagster instance in the webserver container is configured to look for logs in the shared volume. If you need further help with the configuration or encounter specific errors, please share additional details about your setup and we can offer more targeted advice.
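For option 3, here is a sketch of what the `S3ComputeLogManager` configuration can look like in `dagster.yaml` (requires the `dagster-aws` package; the bucket name and prefix below are hypothetical):

```yaml
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: my-dagster-logs-bucket   # placeholder bucket name
    prefix: compute-logs             # placeholder key prefix
```

With this approach, no shared volume is needed: the run worker uploads stdout/stderr to S3, and the webserver fetches them from the bucket, so both containers only need AWS credentials with access to that bucket.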