# ask-community
m
Hi, I am trying to run PySpark locally from within a job. It works locally, but once I deploy Dagster in k8s using Helm, I get the error `Java gateway process exited before sending its port number`. I am out of ideas. I wanted to try https://github.com/dagster-io/dagster/issues/2748, but all the links there are broken.
d
Hi Manuel, I see this issue reported here with some suggested resolution steps (installing Java as part of the container running the code): https://dagster.slack.com/archives/C01U954MEER/p1669810139914499
m
Thank you, the issue is fixed now. I had been downloading the jars at runtime via `spark.jars.packages`, which did not work in the Dagster environment. I now bake the jars into the image and reference them with `spark.jars`.
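For anyone hitting the same issue, a minimal sketch of the working approach: instead of `spark.jars.packages` (which resolves and downloads artifacts from Maven at session startup, something that failed inside the k8s container), `spark.jars` takes a comma-separated list of jar files already present on disk in the image. The jar paths and filenames below are hypothetical placeholders, not from the thread.

```python
# Hypothetical paths to jars baked into the container image at build time.
EXTRA_JARS = [
    "/opt/spark/extra-jars/hadoop-aws-3.3.4.jar",
    "/opt/spark/extra-jars/aws-java-sdk-bundle-1.12.262.jar",
]

def spark_jar_conf(jar_paths):
    """Build the Spark conf entry for pre-baked jars.

    spark.jars expects a single comma-separated string of jar
    locations, so we join the list here.
    """
    return {"spark.jars": ",".join(jar_paths)}

# The resulting conf would then be applied when building the session, e.g.:
#   builder = SparkSession.builder.master("local[*]")
#   for key, value in spark_jar_conf(EXTRA_JARS).items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
```

Since the jars ship with the image, the session no longer depends on network access or Maven resolution inside the pod at startup.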