1. This is an appealing explanation, but I'm not entirely convinced: the files are held simultaneously by separate Python processes, so memory used in the parsing should be returned to the OS. Rerunning `concurrent_parse()` grinds my machine to a halt (I gave it about ten minutes) as memory maxes out and it starts paging everything. If I rerun it with only 2-4 files, the memory does seem to stabilize around 2 GB. However, rerunning with 4-6 files sometimes works fine and other times hits the memory limit. Either way, `multiprocessing` may not be the magic bullet I was hoping for!
2. Were you resetting `trees` to all `None`? This matters, as I have discovered, because the child processes get a copy of the objects from the main process, so if `trees` holds a lot of data, that data gets multiplied by the number of processes. After a bit of experimentation, there appears to be no increase in memory after running `concurrent_parse()` repeatedly, so long as `trees` is reset between runs (at least with Python 2.7 on CentOS 5). I would guess that the increase in memory usage when using `multiprocessing` is due to IPC serialisation.
3. Hmm, I see what you mean. I think you are probably right that it is not a memory leak. But I'm still not entirely satisfied as to why the original process ends up using an extra 700 MB of memory, when all of the copying of instances goes into different processes. Either way, I'll lay it to rest, as the Python garbage collector is beyond the scope of the question. Thanks!