# deployment-kubernetes
a
Hello, I've deployed Dagster through Kubernetes with Helm and Celery queues. I'm using `celery_k8s_job_executor` this way:
```python
from dagster import Definitions
from dagster_celery_k8s import celery_k8s_job_executor

defs = Definitions(
    assets=all_assets,
    executor=celery_k8s_job_executor.configured(
        {
            "env_secrets": [".."],
        }
    ),
)
```
Is there a way to run the code in a local service, maybe using Celery outside of Kubernetes, or bypassing the cluster deployment entirely? I want an easier way to run it for testing purposes.
a
There is a `celery_executor`, as well as the default multiprocess / in-process executors: https://docs.dagster.io/concepts/ops-jobs-graphs/job-execution#controlling-job-execution
a
Is there a way to bypass this in local development? I mean programmatically, so we don't have to change the code locally.
a
You could set up the code in different ways to make it load differently in local vs. deployed environments:
• use an environment variable to decide which executor to attach (see the sketch below)
• have different entry points that use the same shared asset-loading code but create different final `Definitions` objects with different executors
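A minimal sketch of the environment-variable approach, assuming a hypothetical `DAGSTER_DEPLOYMENT` variable and the same shared `all_assets` list from above (the module path is made up for illustration):

```python
import os

from dagster import Definitions, multiprocess_executor
from dagster_celery_k8s import celery_k8s_job_executor

from my_project.assets import all_assets  # hypothetical module holding the shared assets

# DAGSTER_DEPLOYMENT is a hypothetical env var set only in the Helm deployment;
# locally it is unset, so the lightweight multiprocess executor is used instead.
if os.getenv("DAGSTER_DEPLOYMENT") == "k8s":
    executor = celery_k8s_job_executor.configured(
        {
            "env_secrets": [".."],
        }
    )
else:
    executor = multiprocess_executor

defs = Definitions(
    assets=all_assets,
    executor=executor,
)
```

The entry-point variant is similar: keep the asset-loading code in a shared module and have two small entry files, each building its own `Definitions` object with a different executor, with the Helm deployment pointing at the Kubernetes one.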
a
Thanks! I think that should work fine.