# ask-community
Matt
Hi, I've got a gRPC server for my code, but when I click "Reload location", the new jobs aren't loaded. The only way I can get the new jobs to show up is to restart the gRPC Docker container. Is there any config I'm missing?
daniel
Hey Matt - you're not missing any config, but right now restarting the Docker container is the only way to pick up the code change if you're running your own gRPC server.
Matt
Ok thanks @daniel! Just making sure I'm doing it right! It's not too difficult to do anyway. I just switched from Airflow last month, and I love Dagster!! SO good!!
🙏 1
daniel
Nice! Glad you're enjoying it, and appreciate your patience with this particular rough edge. We'd like to move to a place where the daemon more directly manages the gRPC servers, which would make this possible.
❤️ 1
j
@daniel out of curiosity, what would be the technical reason for the gRPC server not to pick up the new jobs?
daniel
The technical reason is that the process needs to restart in order to reload the Python module with the jobs in it.
And dagit doesn't know enough about where the server is currently running to restart it.
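(For illustration - a minimal sketch of that module-caching behavior; the module name user_code here is a hypothetical stand-in for your own code location:)
```python
import importlib
import sys

import user_code  # hypothetical module containing the @repository definition

# Python caches imported modules in sys.modules, so a repeated
# `import user_code` is a no-op: the job definitions built at first
# import stay frozen for the lifetime of the process.
assert "user_code" in sys.modules

# importlib.reload would re-execute the module, but the gRPC server
# process has no hook that calls it, so restarting the process is the
# only reliable way to pick up changed definitions.
importlib.reload(user_code)
```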
j
I have a similar scenario: I have my gRPC server with the user code, where the repository function is generating some sensors for me. The only way for me to get the sensors (if my API has new entries or changes) is to restart the gRPC server, right?
```python
from dagster import repository

@repository
def core_generic():
    definitions = [
        main_job,
    ]

    pipelines = api.pipeline.get().json()

    for pipe in pipelines:
        definitions.append(sftp_sensor.create_sensor(pipe))

    return definitions
```
daniel
That's right - the repository is loaded on process startup. You do have one option which is to do option 3 here and return a RepositoryData object: https://docs.dagster.io/_apidocs/repositories#dagster.repository
```python
import os

from dagster import RepositoryData, repository

######################################################################
# A complex repository that lazily constructs jobs from a directory
# of files in a bespoke YAML format
######################################################################

class ComplexRepositoryData(RepositoryData):
    def __init__(self, yaml_directory):
        self._yaml_directory = yaml_directory

    def get_all_pipelines(self):
        return [
            self._construct_job_def_from_yaml_file(
                self._yaml_file_for_job_name(file_name)
            )
            for file_name in os.listdir(self._yaml_directory)
        ]

    ...

@repository
def complex_repository():
    return ComplexRepositoryData('some_directory')
```
That get_all_pipelines() call will be made every time dagit reloads the repository.
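To make that concrete for the sensor case above - a sketch of the same repository rewritten to return a RepositoryData object, assuming the installed Dagster version's RepositoryData also exposes an overridable get_all_sensors (check the API docs linked above); main_job, api, and sftp_sensor are the names from the earlier snippet:
```python
from dagster import RepositoryData, repository

class DynamicSensorRepositoryData(RepositoryData):
    def get_all_pipelines(self):
        # Static definitions, as in the original snippet
        return [main_job]

    def get_all_sensors(self):
        # Re-queried on every repository reload, so new API entries
        # show up as sensors without restarting the gRPC server
        pipelines = api.pipeline.get().json()
        return [sftp_sensor.create_sensor(pipe) for pipe in pipelines]

@repository
def core_generic():
    return DynamicSensorRepositoryData()
```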
❤️ 1
j
ty!