# ask-community
a
Hello šŸ‘‹ I've configured my compute_logs like this:
compute_logs:
  module: dagster_aws.s3.compute_log_manager
  class: S3ComputeLogManager
  config:
    bucket: konpyutaika-dagster-staging
    prefix: dagster-logs
The logs are correctly stored in S3 for my run:
āÆ aws s3 ls <s3://konpyutaika-dagster-staging/dagster-logs/storage/0791bf89-ff65-433c-8011-f2685a46da99/compute_logs/>
2023-02-20 12:18:47      19818 koygczmb.err
2023-02-20 12:18:47      82842 koygczmb.out
But I have nothing when I go into the UI... Am I missing something (I'm using K8sRunLauncher and k8s_job_executor)?
Does anyone have any thoughts on this šŸ˜•? I really don't understand what I'm missing šŸ„²
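One way to check which compute log manager the dagit process actually loads (a sketch, assuming a Helm-style deployment; the deployment name below is a placeholder) is to run the dagster instance info CLI from inside that pod:
# Replace deploy/dagster-dagit with your actual dagit/webserver deployment.
kubectl exec -it deploy/dagster-dagit -- dagster instance info
# The output describes the instance configuration loaded by that process; it should
# reflect the compute_logs block above if dagit is reading the same dagster.yaml.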
p
Couple things to check:
1. Can you confirm that the config in dagit matches what you have up top? (It's clear that your k8s workers already have the compute log manager configured correctly since the logs are available in S3; a Helm values sketch follows below.)
2. At the very bottom of the panel, is there a path visible (e.g. s3://…)?
3. Is there a download link button on the top right of the log panel?
4. Can you confirm that you're seeing 'No log file available' for both stdout and stderr?
5. In the structured event log, what do you see for CAPTURED_LOG event types?
6. Can you confirm what version of dagster you're running?
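A minimal sketch for point 1, assuming the standard Dagster Helm chart (bucket and prefix copied from the config at the top of the thread): setting computeLogManager at the instance level in values.yaml makes the webserver and the launched K8s jobs share the same S3 location.
computeLogManager:
  type: S3ComputeLogManager
  config:
    s3ComputeLogManager:
      bucket: konpyutaika-dagster-staging
      prefix: dagster-logs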
a
1 - Yes (check the image above)
2 - No, I don't have anything (I noticed that when I use local storage, I have the local path).
3 - No šŸ˜•
4 - Yes, I confirm that it is for both.
5 - I don't have CAPTURED_LOG, but I have LOGS_CAPTURED; I put the screenshot below!
6 - I use version 1.1.19.
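As a stopgap while the UI issue is open, the captured logs can also be read straight from S3 with the AWS CLI, reusing the run ID and file names from the listing above (aws s3 cp <object> - streams the object to stdout):
aws s3 cp s3://konpyutaika-dagster-staging/dagster-logs/storage/0791bf89-ff65-433c-8011-f2685a46da99/compute_logs/koygczmb.out -
aws s3 cp s3://konpyutaika-dagster-staging/dagster-logs/storage/0791bf89-ff65-433c-8011-f2685a46da99/compute_logs/koygczmb.err -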