
1. Best practices of dealing with very large arrays? DB?

In another question, I stated that I was running a PHP script that grabbed 150 rows of data from a MySQL database, then did some calculations on that data, which it put into an array (120 elements with an array of 30 for each, or roughly 3,600 elements total). The "results" array is needed because I create a graph with the data. This script works fine.

I wanted to expand my script to a denser dataset (which would provide better results). The dataset is 1,700 rows, which would end up with a "results" array of 1,340 elements with an array of 360 for each, or roughly 482,400 elements total. The problem is, I've tried this and ran into some heinous memory errors.

As described to me in the previous question I posted, the size of that results array is probably overwhelming the server memory:

> In your second larger sample it will be array(1700,1699). At 144 bytes per element that's 415,915,200 bytes, slightly over 406 MB plus remaining bucket space, just to hold the results of your calculations.

I am not familiar with the typical ways to deal with this issue. For the larger set of data, I was considering serializing and base64_encode'ing each of the 1,340 result array elements as the script runs (or every 10 or 20, since 1,340 DB calls might be too much), uploading them to a SQL server, and unsetting the results array to free up memory. I could then build my report and my graph by querying the DB for the specific information, rather than having it ALL in a huge array.

Is there any other way of doing this?
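
For reference, here is a minimal sketch of the batched write-to-DB approach described above. Everything specific in it is an assumption rather than something taken from the question: the PDO connection details, the results_cache and source_data table names and columns, the batch size of 20, and the calculate_row() stub standing in for the real per-row calculation.

```php
<?php
// Sketch of the "write results to the DB in batches" idea from the question.
// Assumed schema (not from the original post):
//   CREATE TABLE results_cache (row_id INT PRIMARY KEY, payload MEDIUMBLOB);

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Stand-in for the real per-row calculation (~360 values per source row).
function calculate_row(array $row): array
{
    return array_fill(0, 360, 0.0);
}

// Flush one batch of serialized results inside a single transaction.
function flush_batch(PDO $pdo, array $batch): void
{
    $insert = $pdo->prepare(
        'INSERT INTO results_cache (row_id, payload) VALUES (:id, :payload)'
    );
    $pdo->beginTransaction();
    foreach ($batch as $id => $payload) {
        $insert->execute([':id' => $id, ':payload' => $payload]);
    }
    $pdo->commit();
}

$batchSize = 20;   // flush every 20 rows instead of 1,340 separate calls
$batch     = [];

$rows = $pdo->query('SELECT * FROM source_data ORDER BY id')
            ->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $i => $row) {
    $batch[$i] = serialize(calculate_row($row));

    if (count($batch) >= $batchSize) {
        flush_batch($pdo, $batch);
        $batch = [];                  // release the memory held by this batch
    }
}
if ($batch) {
    flush_batch($pdo, $batch);        // flush the final partial batch
}

// Later, build the graph by pulling back only the slice that is needed:
$stmt = $pdo->prepare(
    'SELECT row_id, payload FROM results_cache WHERE row_id BETWEEN :a AND :b'
);
$stmt->execute([':a' => 0, ':b' => 99]);
foreach ($stmt as $r) {
    $result = unserialize($r['payload']);
    // ...feed $result into the graphing code...
}
```

One note on the idea as stated: with prepared statements and a BLOB/TEXT column, the base64_encode step is usually unnecessary; serialize() output can be stored as-is, and skipping the encode avoids roughly 33% of extra storage and some CPU work.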
 
