# ask-community

Oliver

07/01/2022, 6:09 AM
Does anyone have any good approaches for improving this workflow? I want to test my dagster pipeline. Locally I would just run dagit and loop: materialise asset, see error, fix error, materialise asset, see error, fix error, ad infinitum. Now I can't have local access to prod data, so I run dagit in k8s, and the loop becomes: materialise asset, see error, fix error, docker build and push, restart user code deployment, materialise asset, see error, fix error. Could I run the user code locally and somehow get the k8s dagit to connect to that? Would that still work, given the K8sRunLauncher then doesn't have the user code?
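For reference, a minimal sketch of the idea Oliver raises: Dagster can load a code location from an external gRPC server, so in principle dagit in the cluster could point at a server running on a developer machine. This assumes a plain workspace.yaml setup rather than the Helm chart's user-code-deployments mechanism, and assumes the cluster can actually reach the developer machine over the network; Oliver's caveat about the K8sRunLauncher still needing the code available to launched runs still applies.

```yaml
# On the developer machine, serve the user code over gRPC:
#   dagster api grpc -f repo.py --host 0.0.0.0 --port 4266
#
# workspace.yaml for dagit, pointing at that server:
load_from:
  - grpc_server:
      host: my-laptop.example.internal   # hypothetical address reachable from the cluster
      port: 4266
      location_name: local_user_code
```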

Ilnur Garifullin

07/01/2022, 6:16 AM
I don’t think there’s a way to upload your code to Dagster other than through docker build/push
I can suggest optimizing/organising the Dockerfile as a way to speed things up, so the build and push process takes less time
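A sketch of that kind of layer ordering, assuming a pip-based project (file and entry-point names here are illustrative):

```dockerfile
FROM python:3.10-slim

WORKDIR /opt/dagster/app

# Install dependencies first so this layer stays cached
# and is only rebuilt when requirements.txt changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the frequently-changing user code last, so a code-only
# change invalidates just this final, cheap layer.
COPY . .

# Serve the repository over gRPC for Dagster; repo.py is an assumed entry point.
EXPOSE 4266
CMD ["dagster", "api", "grpc", "-f", "repo.py", "--host", "0.0.0.0", "--port", "4266"]
```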

Oliver

07/01/2022, 6:36 AM
Yea, it's only the user code changing, the rest is cached; it's just a bunch more steps. That local flow is so nice! I guess a shared file system would work. But if the user code deployment distributes the code as well, then running that locally and getting a connection to dagster would be the easiest solution

Ilnur Garifullin

07/01/2022, 6:46 AM
Yes, but this introduces some security concerns, i.e. how this shared fs will be managed and who would have access to it. Because once you have that, you (or other folks in your team) can essentially execute any code in a production environment without passing approval steps, etc. It’s up to you whether that is acceptable. The story of eval(…)
Or curl … | bash

Oliver

07/01/2022, 6:54 AM
yea for sure!