# announcements
j
Hello 😄 👋 I'm running a pipeline deployed with Dask on my local machine and I'm encountering the following error message:
distributed.worker - WARNING - Memory use is high but worker has no data to store to disk.  Perhaps some other process is leaking memory?  Process memory: 907.54 MB -- Worker memory limit: 1.07 GB
As a fix, I have tried to increase the memory limit for my workers to around 3 GB for a memory-expensive solid:
@dg.solid(
  ...
  tags={'dagster-dask/resource_requirements': {"MEMORY": 3e9}},
)
def parse_dimensions(context, ...):
But the solid doesn't seem to be executed at all when I run the pipeline. Any insight or general knowledge about this type of problem would be much appreciated.
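For reference, a minimal sketch of this kind of setup, assuming the legacy @solid/@pipeline API with the dagster_dask executor; the pipeline name, the solid body, and the mode wiring below are illustrative rather than taken from the original code:

```python
# Illustrative sketch only: a solid tagged with dagster-dask resource requirements,
# wired into a pipeline whose mode makes the dask executor available.
import dagster as dg
from dagster_dask import dask_executor


@dg.solid(
    tags={"dagster-dask/resource_requirements": {"MEMORY": 3e9}},
)
def parse_dimensions(context):
    # memory-expensive work would happen here (body is illustrative)
    context.log.info("parsing dimensions")


@dg.pipeline(
    mode_defs=[
        dg.ModeDefinition(
            executor_defs=dg.default_executors + [dask_executor],
        )
    ]
)
def my_pipeline():
    parse_dimensions()
```

For what it's worth, in plain Dask a task that declares a resource requirement like `MEMORY` is only scheduled onto workers that were started advertising that resource (e.g. `dask-worker <scheduler-address> --resources "MEMORY=3e9"`); tasks whose requirements no worker advertises stay pending.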
a
> on my local machine
You could try to just use the multiprocess executor and see if you hit the same issues?
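A minimal sketch of what selecting the multiprocess executor via run config can look like from Python, assuming the legacy execute_pipeline API; the exact config keys (e.g. `intermediate_storage` vs. `storage`) and the need for a persistent DagsterInstance vary by Dagster version, so treat this as illustrative:

```python
# Illustrative sketch only: runs a pipeline with the multiprocess executor.
import dagster as dg

from my_repo import my_pipeline  # hypothetical module; use your own pipeline here

if __name__ == "__main__":
    result = dg.execute_pipeline(
        # multiprocess execution requires a reconstructable pipeline
        dg.reconstructable(my_pipeline),
        run_config={
            "execution": {"multiprocess": {"config": {"max_concurrent": 4}}},
            # older Dagster versions use "storage": {"filesystem": {}} instead
            "intermediate_storage": {"filesystem": {}},
        },
        # multiprocess execution needs a persistent (non-ephemeral) instance
        instance=dg.DagsterInstance.get(),
    )
    assert result.success
```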
j
Thank you for the direction! I don't hit the same issue when using the multiprocess executor.
a
Not sure what guidance to provide on debugging the Dask-specific issues.