# ask-community
hey folks, another pyspark on kubernetes question. As you may know, PySpark on Kubernetes with multiple executors requires a headless service (docs here). I wonder if there is any out-of-the-box functionality to create a headless service that points to the driver job/pod when the job is executed. Otherwise the Spark executor pods cannot communicate with the Dagster job/pod (which is the Spark driver).
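For anyone following along, here's a rough sketch of the kind of headless service manifest the driver pod would need — names, labels, and ports are hypothetical placeholders, but the key detail from the Spark docs is `clusterIP: None` so executors can resolve the driver pod directly via DNS:

```python
# Sketch of a headless Service manifest so Spark executors can reach the
# driver pod. Names, labels, and ports here are hypothetical placeholders.
def headless_driver_service(name: str, driver_labels: dict) -> dict:
    """Build a Kubernetes Service manifest with clusterIP: None (headless),
    selecting the driver pod so executor pods can resolve it via DNS."""
    return {
        "apiVersion": "v1",
        "kind": "Service",
        "metadata": {"name": name},
        "spec": {
            "clusterIP": "None",  # headless: DNS resolves straight to the pod IP
            "selector": driver_labels,  # must match the driver pod's labels
            "ports": [
                {"name": "driver-rpc", "port": 7077},    # e.g. spark.driver.port
                {"name": "blockmanager", "port": 7078},  # e.g. spark.driver.blockManager.port
            ],
        },
    }

svc = headless_driver_service("spark-driver-svc", {"job-name": "my-dagster-job"})
```

The driver would then set `spark.driver.host` to the service's DNS name so executors connect back through it.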
cc @daniel @rex
any update, guys?
I could add the kubernetes Python library and create the service inside the pipeline, but that requires additional authorization in Kubernetes, and it isn't safe to grant the Dagster service account permission to create services.
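To make the authorization concern concrete, the extra grant would look roughly like this RBAC Role — a sketch only, with hypothetical names and namespace — which is exactly the kind of broadened permission the Dagster service account shouldn't normally have:

```python
# Sketch of the RBAC Role the Dagster service account would need in order
# to create services from inside the pipeline -- the extra grant described
# above. Role name and namespace are hypothetical.
role = {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "Role",
    "metadata": {"name": "dagster-service-creator", "namespace": "dagster"},
    "rules": [
        {
            "apiGroups": [""],          # "" is the core API group (Services live here)
            "resources": ["services"],
            "verbs": ["create", "delete", "get"],  # create + clean up the headless service
        }
    ],
}
```

A RoleBinding would then attach this Role to the Dagster service account, which is why doing it in-pipeline widens the account's blast radius.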