<blockquote> <p>Now according to wiki, I don't need to go through all data points and I can stop when the error is small enough. Is it true?</p> </blockquote>

<p>This is especially true when you have a really huge training set and going through all the data points is expensive. In that case, you would check the convergence criterion after every K stochastic updates (i.e. after processing K training examples). While it's possible, it doesn't make much sense to do this with a small training set. Another thing people do is randomize the order in which training examples are processed, to avoid having too many correlated examples in a row, which may result in "fake" convergence.</p>

<blockquote> <p>I don't understand what should be the stopping criterion here. If anyone can help with this that would be great.</p> </blockquote>

<p>There are a few options. I recommend trying several of them and deciding based on empirical results.</p>

<ol>
<li>Stop when the difference in the objective function on the training data is smaller than a threshold.</li>
<li>Stop when the difference in the objective function on held-out data (a.k.a. development data, validation data) is smaller than a threshold. The held-out examples should NOT include any of the examples used for training (i.e. for stochastic updates), nor any of the examples in the test set used for evaluation.</li>
<li>Stop when the total absolute difference in the parameters w is smaller than a threshold.</li>
<li>In 1, 2, and 3 above, instead of specifying an absolute threshold, you could specify a percentage. For example, a reasonable stopping criterion is to stop training when |squared_error(w) - squared_error(previous_w)| &lt; 0.01 * squared_error(previous_w).</li>
<li>Sometimes, we don't care whether we have the optimal parameters; we just want to improve the parameters we originally had. In that case, it's reasonable to preset a number of iterations over the training data and stop after that, regardless of whether the objective function actually converged.</li>
</ol>

<blockquote> <p>With this formula - which I used in a for loop - is it correct? I believe <strong>(w.x_i - y_i) * x_t</strong> is my ∆Q(w).</p> </blockquote>

<p>It should be <strong>2 * (w.x_i - y_i) * x_t</strong>, but it's not a big deal given that you're multiplying by the learning rate <strong>alpha</strong> anyway.</p>
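<p>To make the pieces concrete, here is a minimal Python sketch of SGD for linear regression on squared error that combines several of the ideas above: shuffled example order, a convergence check every K updates, the percentage-based criterion from option 4, and a preset cap on iterations from option 5. The function and parameter names (<code>sgd_linear_regression</code>, <code>check_every</code>, <code>rel_tol</code>) are illustrative, not from any particular library.</p>

```python
import random

def sgd_linear_regression(X, y, alpha=0.01, check_every=100,
                          rel_tol=0.01, max_epochs=1000):
    """SGD on squared error with a relative-change stopping criterion."""
    n, d = len(X), len(X[0])
    w = [0.0] * d

    def squared_error(w):
        # Full-data objective; only evaluated every `check_every` updates.
        return sum((sum(wj * xj for wj, xj in zip(w, xi)) - yi) ** 2
                   for xi, yi in zip(X, y))

    prev = squared_error(w)
    order = list(range(n))
    updates = 0
    for _ in range(max_epochs):          # option 5: preset iteration cap
        random.shuffle(order)            # avoid correlated examples in a row
        for i in order:
            pred = sum(wj * xj for wj, xj in zip(w, X[i]))
            grad_scale = 2 * (pred - y[i])   # gradient of (w.x - y)^2 is 2*(w.x - y)*x
            w = [wj - alpha * grad_scale * xj for wj, xj in zip(w, X[i])]
            updates += 1
            if updates % check_every == 0:   # check after K stochastic updates
                cur = squared_error(w)
                # option 4: stop when relative improvement is under rel_tol
                if abs(cur - prev) < rel_tol * prev:
                    return w
                prev = cur
    return w
```

<p>On exactly linear data such as y = 2x, this recovers a weight close to 2; swapping <code>squared_error</code> to evaluate a held-out split instead of the training data gives criterion 2.</p>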