Dagster UI shows "No log file available" when trying to view the stderr or stdout
# ask-ai
Dagster UI shows "No log file available" when trying to view the stderr or stdout. Using the `LocalComputeLogManager`, I can see the raw logs on my gRPC container but not in the Dagster webserver container. If I copy the file from the gRPC container to the webserver container, I can see the logs in the UI. Is there some configuration needed to ensure logs are saved to the webserver properly?
When using the `LocalComputeLogManager` in Dagster, the compute logs (stdout/stderr) are stored locally on disk in the environment where the job is executed. If you are running Dagster in a distributed environment with separate containers for the gRPC server and the Dagster webserver, the compute logs generated by the gRPC server will not automatically be available to the webserver container.

The "No log file available" message in the Dagster UI typically indicates that the webserver is unable to locate or access the compute logs. This can happen if the logs are not shared between the containers or if the webserver does not have the correct file path or permissions to access them.

To resolve this issue, you have a few options:

1. Shared Volume: Configure a shared volume between the gRPC server container and the webserver container. This volume should be mounted at the same path in both containers so that logs written by the gRPC server are also accessible to the webserver (see the Docker Compose sketch below).
2. Compute Log Manager Configuration: Ensure that the `LocalComputeLogManager` is configured with a base directory that is accessible to both the gRPC server and the webserver. This might involve setting the `base_dir` configuration option in your `dagster.yaml` to point to the shared volume.
3. Centralized Storage: Instead of using `LocalComputeLogManager`, consider a centralized log storage solution such as `S3ComputeLogManager` if you are on AWS, which uploads the logs to an S3 bucket. Both the gRPC server and the webserver would then read the logs from this centralized location (an example configuration is shown at the end of this answer).

Here's an example of how you might configure the `LocalComputeLogManager` in your `dagster.yaml`:
```yaml
compute_logs:
  module: dagster.core.storage.local_compute_log_manager
  class: LocalComputeLogManager
  config:
    base_dir: /path/to/shared/volume  # must be the same mounted path in both containers
```
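For the shared-volume approach (option 1), the key point is that the same volume is mounted at an identical path in both containers. Below is a minimal Docker Compose sketch of that idea; the service names, image names, and mount path are illustrative assumptions, not something Dagster requires:

```yaml
# Sketch only: service names, images, and the mount path are placeholders
services:
  dagster_webserver:
    image: my-dagster-webserver                    # assumption: your webserver image
    volumes:
      - compute-logs:/opt/dagster/compute_logs     # webserver reads compute logs from here
  dagster_code_server:
    image: my-dagster-code-location                # assumption: your gRPC / code location image
    volumes:
      - compute-logs:/opt/dagster/compute_logs     # gRPC server writes compute logs here
volumes:
  compute-logs:
```

With a layout like this, `base_dir: /opt/dagster/compute_logs` in the `dagster.yaml` used by both containers points at the same files.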
Please ensure that the path you provide is correctly mounted and accessible in both containers. If you continue to face issues, you may need to check the container logs for any errors related to file access or permissions, and ensure that the Dagster instance in the webserver container is correctly configured to look for logs in the shared volume. If you need further assistance with the configuration or encounter specific errors, please provide additional details about your setup, and we can offer more targeted advice.
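If you decide to go with option 3 instead, the logs no longer need to live on a shared filesystem at all. Here is a sketch of the corresponding `dagster.yaml`, assuming the `dagster-aws` package is installed and using placeholder values for the bucket and prefix:

```yaml
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: my-dagster-compute-logs   # assumption: replace with your S3 bucket
    prefix: compute-logs              # optional key prefix within the bucket
```

Both the webserver and the gRPC server containers would need AWS credentials with read/write access to this bucket.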