Joseph Sayad
05/29/2020, 6:13 PM
distributed.worker - WARNING - Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 907.54 MB -- Worker memory limit: 1.07 GB
As a fix, I have tried to increase the memory limit for my workers to around 3 GB for a memory-expensive solid:
@dg.solid(
...
tags={'dagster-dask/resource_requirements': {"MEMORY": 3e9}},
)
def parse_dimensions(context, ...):
But the solid doesn't seem to be executed at all when I run the pipeline. Any insight or general knowledge regarding this type of problem would be very much appreciated.
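[For context: dagster-dask forwards the dagster-dask/resource_requirements tag to Dask as task resource requirements, and Dask will only schedule such a task onto a worker that advertises matching resources. If no worker declares a MEMORY resource, the task sits unscheduled indefinitely, which is consistent with the solid never executing. A minimal sketch of starting workers that both advertise the resource and raise the per-worker memory limit past the 1.07 GB from the warning; the cluster sizes are assumptions, and whether LocalCluster forwards resources through its worker kwargs can vary by distributed version:]

# Sketch, not the thread's actual setup: local Dask workers that
# advertise a MEMORY resource and carry a higher memory limit.
from dask.distributed import Client, LocalCluster

cluster = LocalCluster(
    n_workers=2,
    threads_per_worker=1,
    memory_limit="3GB",          # raises the 1.07 GB limit seen in the warning
    resources={"MEMORY": 3e9},   # lets tasks tagged {"MEMORY": 3e9} be scheduled
)
client = Client(cluster)

# Roughly equivalent CLI form for a distributed cluster:
#   dask-worker tcp://<scheduler>:8786 --memory-limit 3GB --resources "MEMORY=3e9"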
alex
05/29/2020, 6:34 PM
> on my local machine
You could try to just use the multiprocess executor and see if you hit the same issues?
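[A minimal sketch of alex's suggestion, assuming the 0.7/0.8-era Dagster API in use when this thread was written; the pipeline and the trivial solid body are hypothetical stand-ins, and older releases spelled run_config as environment_dict:]

from dagster import ModeDefinition, default_executors, execute_pipeline, pipeline, solid

@solid
def parse_dimensions(context):
    # stand-in for the real memory-hungry solid
    context.log.info("parsing dimensions")

# default_executors enables both the in_process and multiprocess executors
@pipeline(mode_defs=[ModeDefinition(executor_defs=default_executors)])
def my_pipeline():
    parse_dimensions()

if __name__ == "__main__":
    execute_pipeline(
        my_pipeline,
        run_config={
            "execution": {"multiprocess": {}},  # one OS process per solid
            "storage": {"filesystem": {}},      # multiprocess needs persistent intermediates
        },
    )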
Joseph Sayad
06/02/2020, 3:40 PM
alex
06/02/2020, 3:54 PM