
Smoothing irregularly sampled time data
<p>Given a table where the first column is seconds past a certain reference point and the second is an arbitrary measurement:</p> <pre><code>6    0.738158581
21   0.801697222
39   1.797224596
49   2.77920469
54   2.839757536
79   3.832232283
91   4.676794376
97   5.18244704
100  5.521878863
118  6.316630137
131  6.778507504
147  7.020395216
157  7.331607129
176  7.637492223
202  7.848079136
223  7.989456499
251  8.76853608
278  9.092367123
...
</code></pre> <p>As you can see, the measurements are sampled at irregular time points. I need to smooth the data by averaging the readings up to 100 seconds prior to each measurement (in Python). Since the data table is huge, an iterator-based method is strongly preferred. Unfortunately, after two hours of coding I can't figure out an efficient and elegant solution.</p> <p>Can anyone help me?</p> <p><strong>EDIT</strong>s</p> <ol> <li><p>I want one smoothed reading for each raw reading, and the smoothed reading is to be the arithmetic mean of the raw reading and any others in the previous 100 (delta) seconds. (John, you are right)</p></li> <li><p>Huge means roughly 1e6&ndash;10e6 lines, and it needs to work with tight RAM</p></li> <li><p>The data is approximately a random walk</p></li> <li><p>The data is sorted</p></li> </ol> <h1>RESOLUTION</h1> <p>I have tested the solutions proposed by J Machin and yairchu. They both gave the same results; however, on my data set, J Machin's version scales roughly quadratically with data size, while yairchu's is linear. Following are the execution times as measured by IPython's <em>%timeit</em> (in microseconds):</p> <pre><code>data size   J Machin   yairchu
10          90.2       55.6
50          930        258
100         3080       514
500         64700      2660
1000        253000     5390
2000        952000     11500
</code></pre> <p>Thank you all for the help.</p>
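For reference, a linear-time, iterator-based sliding-window mean can be sketched with <code>collections.deque</code>: keep a running sum of the readings inside the window and evict anything older than 100 seconds as each new reading arrives. This is a minimal illustration of the general technique, not the actual code from either answer; the function name <code>smooth</code> and the <code>window</code> parameter are my own.

```python
from collections import deque

def smooth(readings, window=100.0):
    """Yield (t, mean) for each (t, value) pair, where mean averages the
    current value and all values within the previous `window` seconds.
    Assumes readings are sorted by time; O(n) overall, O(window) memory."""
    buf = deque()   # (t, value) pairs currently inside the window
    total = 0.0     # running sum of the values in buf
    for t, value in readings:
        buf.append((t, value))
        total += value
        # evict readings strictly older than t - window
        while buf[0][0] < t - window:
            _, old_value = buf.popleft()
            total -= old_value
        yield t, total / len(buf)
```

Because both the input and the output are iterators, this streams through the file without loading it into RAM; only the readings inside the current 100-second window are held at any time.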
 
