Your question is pretty unclear. Have you tried initializing your `Queue` to have a maximum size of, say, 64?

```python
myq = Queue.Queue(maxsize=64)
```

Then a producer (one or more) trying to `.put()` new items on `myq` will block until consumers reduce the queue size to less than 64. This will correspondingly limit the amount of memory consumed by the queue. By default, queues are unbounded: if the producer(s) add items faster than consumers take them off, the queue can grow to consume all the RAM you have.

**EDIT**

> This is the current script. It appears that it is reading the entire csv file into a queue and doing a queue.join(). Is it correct that it is loading the entire csv into a queue then spawning the threads?

The indentation is messed up in your post, so I have to guess some, but:

1. The code obviously starts 32 threads before it opens the CSV file.
2. You didn't show the code that creates the queue. As already explained above, if it's a `Queue.Queue`, by default it's unbounded, and *can* grow to any size if your main loop puts items on it faster than your threads remove items from it. Since you haven't said anything about what `worker()` does (or shown its code), we don't have enough information to guess whether that's the case. But the fact that memory use is out of hand *suggests* that's the case.
3. And, as also explained, you can stop that easily by specifying a maximum size when you create the queue.

To get better answers, supply better info ;-)

**ANOTHER EDIT**

Well, the indentation is still messed up in spots, but it's better. Have you *tried* any suggestions? Looks like your worker threads each spawn a new process, so they'll take very much longer than it takes just to read another line from the CSV file. So it's indeed very likely that you put items on the queue *far* faster than they're taken off. So, for the umpteenth time ;-), **TRY** initializing the queue with (say) `maxsize=64`. Then reveal what happens. (A minimal sketch of the whole bounded-queue pattern is at the end of this answer.)

BTW, the bare `except:` clause in `worker()` is a Really Bad Idea. If anything goes wrong, you'll never know. If you *have* to ignore every possible exception (including even `KeyboardInterrupt` and `SystemExit`), at least log the exception info (a sketch of that follows below too).

And note what @JamesAnderson said: unless you have extraordinary hardware resources, trying to run 32 processes at a time is almost certainly slower than running a number of processes that's no more than twice the number of available cores (a one-line way to compute such a cap is also sketched below). Then again, that also depends a lot on what your PHP program does. If, for example, the PHP program uses disk I/O heavily, *any* multiprocessing may be slower than none.
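
Here's a minimal sketch of the bounded-queue pattern described above, written for Python 3 (where the `Queue` module was renamed to `queue`). The file name, worker count, and the body of `worker()` are placeholders, not your actual code:

```python
# Minimal bounded-queue producer/consumer sketch (Python 3).
# "input.csv" and the worker body are placeholders -- substitute your own.
import csv
import queue
import threading

NUM_WORKERS = 8                  # keep this modest; see the note on core counts
myq = queue.Queue(maxsize=64)    # bounded: .put() blocks once 64 items are waiting

def worker():
    while True:
        row = myq.get()
        try:
            pass  # ... process the row here (e.g. launch your PHP job) ...
        finally:
            myq.task_done()

for _ in range(NUM_WORKERS):
    t = threading.Thread(target=worker)
    t.daemon = True              # let the process exit when the main thread is done
    t.start()

with open("input.csv", newline="") as f:
    for row in csv.reader(f):
        myq.put(row)             # blocks here whenever the queue is full

myq.join()                       # wait until every queued row has been processed
```

Because `.put()` blocks when the queue is full, the main loop can never get more than 64 rows ahead of the workers, no matter how large the CSV file is.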
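And here's what logging the exception info could look like, as a drop-in replacement for the `worker()` in the sketch above. `process()` is a stand-in for whatever your real worker does:

```python
# Sketch: if you must keep a catch-all, at least record what was caught.
# Catching Exception (rather than using a bare except:) also lets
# KeyboardInterrupt and SystemExit propagate, which is usually what you want.
import logging

logging.basicConfig(level=logging.ERROR)

def process(row):
    ...  # placeholder for the real work (e.g. spawning the PHP process)

def worker():
    while True:
        row = myq.get()
        try:
            process(row)
        except Exception:
            # logging.exception records the full traceback at ERROR level
            logging.exception("worker failed on row %r", row)
        finally:
            myq.task_done()
```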
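Finally, one way to cap the worker count along the lines @JamesAnderson suggested, instead of hard-coding 32:

```python
# Cap workers at twice the number of cores (os.cpu_count() is Python 3;
# use multiprocessing.cpu_count() on Python 2). cpu_count() can return
# None, hence the "or 1" fallback.
import os

NUM_WORKERS = min(32, 2 * (os.cpu_count() or 1))
```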