Check your hardware. Are you maxing out the I/O? Do you have buffers configured properly? Is your hardware sized correctly? Memory for buffering and fast disks are key.

If you have too many indexes, inserts will slow down substantially.

How are you doing your inserts? If you're doing one record per INSERT statement:

```sql
INSERT INTO blah VALUES (?, ?, ?, ?)
```

and calling it 500K times, your performance will suck. I'm surprised it's finishing in hours. With MySQL you can insert hundreds or thousands of rows at a time:

```sql
INSERT INTO blah VALUES (?, ?, ?, ?), (?, ?, ?, ?), (?, ?, ?, ?)
```

If you're doing one insert per web request, you should consider logging to the file system and doing bulk imports on a crontab. I've used that design in the past to speed up inserts. It also means your web pages don't depend on the database server.

It's also much faster to use `LOAD DATA INFILE` to import a CSV file. See [http://dev.mysql.com/doc/refman/5.1/en/load-data.html](http://dev.mysql.com/doc/refman/5.1/en/load-data.html)

The other thing I can suggest is to be wary of the SQL hammer -- you may not have SQL nails. Have you considered using a tool like [Pig](http://wiki.apache.org/pig/PigOverview) or [Hive](http://wiki.apache.org/hadoop/Hive) to generate optimized data sets for your reports?

*EDIT*

If you're having trouble batch-importing 500K records, you need to compromise somewhere. I would drop some indexes on your master table, then create optimized views of the data for each report.
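To illustrate the multi-row pattern from application code, here is a minimal sketch in Python using the stdlib `sqlite3` module as a stand-in for MySQL (the `blah` table, its columns, and the batch size are illustrative assumptions, not from the original question):

```python
import sqlite3

def batched(rows, size):
    """Yield successive chunks of `rows` containing at most `size` items."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def bulk_insert(conn, rows, batch_size=250):
    """Insert rows using multi-row INSERT statements in one transaction,
    instead of one single-row INSERT per record."""
    cur = conn.cursor()
    for chunk in batched(rows, batch_size):
        # Build "(?, ?, ?), (?, ?, ?), ..." with one group per row.
        placeholders = ", ".join(["(?, ?, ?)"] * len(chunk))
        # Flatten the row tuples into a single parameter list.
        flat = [value for row in chunk for value in row]
        cur.execute(f"INSERT INTO blah VALUES {placeholders}", flat)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE blah (a INTEGER, b INTEGER, c INTEGER)")
rows = [(i, i * 2, i * 3) for i in range(5000)]
bulk_insert(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM blah").fetchone()[0]
print(count)  # 5000
```

The batch size of 250 keeps the number of bound parameters small; with a real MySQL driver you would tune it against `max_allowed_packet`, and the same chunking structure applies.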