Vivek
07/09/2021, 1:21 PM

rex
07/09/2021, 5:10 PM

Vivek
07/10/2021, 3:44 AM

rex
07/12/2021, 4:04 AM
By default, Dagster uses the LocalComputeLogManager, which stores the stdout/stderr for each compute step in your pipeline on disk. See https://docs.dagster.io/deployment/dagster-instance#default-local-behavior for more information.
A Kubernetes job is an abstraction over a Kubernetes pod, so you should be able to edit the security permissions (in the pod template) to allow it to access the container filesystem explicitly. If you want to do this with your pipeline, you can follow https://docs.dagster.io/deployment/guides/kubernetes/customizing-your-deployment#solid-or-pipeline-kubernetes-configuration.
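A sketch of what that per-pipeline Kubernetes configuration can look like, using the dagster-k8s/config tag from the linked guide. The security_context values below are illustrative assumptions, not recommendations for your cluster:

```python
# Hedged sketch: per-pipeline Kubernetes config via the "dagster-k8s/config"
# tag described in the customizing-your-deployment guide. The raw pod config
# is given in snake_case; the IDs here are hypothetical.
K8S_CONFIG_TAG = {
    "dagster-k8s/config": {
        "pod_spec_config": {
            # Give the step pod's processes a filesystem group / user so
            # they can write to the container filesystem (assumed values).
            "security_context": {"fs_group": 1000, "run_as_user": 1000},
        },
    }
}
```

You would then attach this as `tags=K8S_CONFIG_TAG` on the pipeline (or solid) definition.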
If you don’t want to go through that, you could also just enable the NoOpComputeLogManager in the Helm chart, which skips compute log capture entirely:
computeLogManager:
  # Type can be one of [
  #   LocalComputeLogManager,
  #   AzureBlobComputeLogManager,
  #   GCSComputeLogManager,
  #   S3ComputeLogManager,
  #   CustomComputeLogManager,
  # ]
  type: CustomComputeLogManager
  config:
    customComputeLogManager:
      module: dagster.core.storage.noop_compute_log_manager
      class: NoOpComputeLogManager
      config: {}
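For reference, those Helm values render to roughly this compute_logs block in the generated dagster.yaml (a sketch; the exact rendering depends on the chart version):

```yaml
compute_logs:
  module: dagster.core.storage.noop_compute_log_manager
  class: NoOpComputeLogManager
  config: {}
```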
Vivek
07/13/2021, 6:30 AM
dagster.yaml:

compute_logs:
  module: dagster_azure.blob.compute_log_manager
  class: AzureBlobComputeLogManager
  config:
    container: REDACTED
    local_dir: /tmp/cool
    prefix: dagster-test-
    secret_key: REDACTED
    storage_account: REDACTED

rex
07/13/2021, 5:35 PM
> I assume I don’t need to write an import statement in my user deployment code.
This is an incorrect assumption - you need this dependency (dagster_azure) in your user code. https://docs.dagster.io/deployment/guides/kubernetes/deploying-with-helm#build-docker-image-for-user-code
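Concretely, the user-code image needs the dagster-azure package installed alongside the pipeline code; a minimal Dockerfile sketch (base image, versions, and paths are assumptions):

```dockerfile
FROM python:3.8-slim

# dagster-azure provides dagster_azure.blob.compute_log_manager, which the
# AzureBlobComputeLogManager config above imports at runtime.
RUN pip install dagster dagster-azure

# Copy your repository code into the image (path is hypothetical).
COPY ./my_repo /opt/dagster/app
WORKDIR /opt/dagster/app
```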
Vivek
07/14/2021, 3:40 AM