# dagster-kubernetes

Charles Leung

06/08/2022, 3:45 PM
Hey all, so i have a Dagster Kubernetes instance, and i was wondering: is it possible to have my instance execute on my local VM? The use case is to run tests on code that hasn't been deployed as an image yet. I wondered if this was possible through some combination of the celery-worker and the gRPC API.. 😅 LMK if more clarification is needed.

johann

06/08/2022, 3:51 PM
Hi Charles, you’re looking to test your jobs locally?

Charles Leung

06/08/2022, 3:52 PM
yes 🙂 . For a regression test, for example: the code is in git, but i don't want to produce an artifact until the test passes
so i have a docker image built with the checked-in git code, but i don't want to push it to the repository and deploy; i just want it to run in-process in that image

johann

06/08/2022, 3:54 PM
Ideally you shouldn’t need to mess with Celery at all; the executor and run launcher abstractions exist to help you run jobs locally or in production without changing business logic
Of course, if your job requires credentials etc., it can take some work to provide those or mock out those components

Charles Leung

06/08/2022, 3:56 PM
the complexity comes from the fact that we are trying to avoid giving our build agent permission to execute locally in CI 😞
hence i have to execute it “remotely” on a separate dagster instance
maybe i’m overcomplicating it; i should just have a job do a git checkout before executing on the image :x

johann

06/08/2022, 4:01 PM
Not sure I fully follow, but some thoughts:
• The best case would be if you could test your job in process, possibly by mocking out the resources etc. that you don’t want available in your CI: https://docs.dagster.io/tutorial/intro-tutorial/testable#testing-ops-and-jobs
• If you do have to commit to launching jobs ‘remotely’, doing so over the GraphQL API is likely the best option
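(For the second option, a sketch of the request body for Dagster's `launchRun` GraphQL mutation. The location, repository, and job names are placeholders; the snippet only builds the payload — you would POST it to your Dagit server's `/graphql` endpoint.)

```python
import json

# The launchRun mutation from Dagster's GraphQL API; on success it
# returns the runId of the launched run.
LAUNCH_RUN_MUTATION = """
mutation LaunchRunMutation($executionParams: ExecutionParams!) {
  launchRun(executionParams: $executionParams) {
    __typename
    ... on LaunchRunSuccess {
      run { runId }
    }
  }
}
"""

payload = {
    "query": LAUNCH_RUN_MUTATION,
    "variables": {
        "executionParams": {
            "selector": {
                "repositoryLocationName": "my_location",  # placeholder
                "repositoryName": "my_repository",        # placeholder
                "jobName": "my_job",                      # placeholder
            },
            "runConfigData": {},
        }
    },
}

# POST to Dagit, e.g. with requests:
#   requests.post("http://localhost:3000/graphql", json=payload)
print(json.dumps(payload, indent=2))
```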