Resilient backpropagation neural network - question about gradient
First I want to say that I'm really new to neural networks and I don't understand them very well ;)

I've written my first C# implementation of a backpropagation neural network. I've tested it on XOR and it looks like it works.

Now I would like to change my implementation to use resilient backpropagation (Rprop - http://en.wikipedia.org/wiki/Rprop).

The definition says: "Rprop takes into account only the sign of the partial derivative over all patterns (not the magnitude), and acts independently on each weight."

Could somebody tell me what the partial derivative over all patterns is? And how should I compute this partial derivative for a neuron in a hidden layer?

Thanks a lot

UPDATE:

My implementation is based on this Java code: www_.dia.fi.upm.es/~jamartin/downloads/bpnn.java

My backPropagate method looks like this:

```csharp
public double backPropagate(double[] targets)
{
    double error, change;

    // calculate error terms for output
    double[] output_deltas = new double[outputsNumber];
    for (int k = 0; k < outputsNumber; k++)
    {
        error = targets[k] - activationsOutputs[k];
        output_deltas[k] = Dsigmoid(activationsOutputs[k]) * error;
    }

    // calculate error terms for hidden
    double[] hidden_deltas = new double[hiddenNumber];
    for (int j = 0; j < hiddenNumber; j++)
    {
        error = 0.0;
        for (int k = 0; k < outputsNumber; k++)
        {
            error = error + output_deltas[k] * weightsOutputs[j, k];
        }
        hidden_deltas[j] = Dsigmoid(activationsHidden[j]) * error;
    }

    // update output weights
    for (int j = 0; j < hiddenNumber; j++)
    {
        for (int k = 0; k < outputsNumber; k++)
        {
            change = output_deltas[k] * activationsHidden[j];
            weightsOutputs[j, k] = weightsOutputs[j, k] + learningRate * change
                + momentumFactor * lastChangeWeightsForMomentumOutpus[j, k];
            lastChangeWeightsForMomentumOutpus[j, k] = change;
        }
    }

    // update input weights
    for (int i = 0; i < inputsNumber; i++)
    {
        for (int j = 0; j < hiddenNumber; j++)
        {
            change = hidden_deltas[j] * activationsInputs[i];
            weightsInputs[i, j] = weightsInputs[i, j] + learningRate * change
                + momentumFactor * lastChangeWeightsForMomentumInputs[i, j];
            lastChangeWeightsForMomentumInputs[i, j] = change;
        }
    }

    // calculate total squared error
    error = 0.0;
    for (int k = 0; k < outputsNumber; k++)
    {
        error = error + 0.5 * (targets[k] - activationsOutputs[k])
            * (targets[k] - activationsOutputs[k]);
    }
    return error;
}
```

So can I use the `change = hidden_deltas[j] * activationsInputs[i]` variable as the gradient (partial derivative) for checking the sign?
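UPDATE 2:

To make the question concrete, here is a sketch of how I currently understand the Rprop update, based only on the Wikipedia description - this is a guess, not working code. The names `inputGradients`, `prevGradients`, `stepSizes` and the constants `EtaPlus`, `EtaMinus`, `StepMin`, `StepMax` are my own; the idea would be to first sum `change = hidden_deltas[j] * activationsInputs[i]` over all training patterns of an epoch into `inputGradients`, then apply a sign-based update per weight:

```csharp
// Per-weight Rprop-style update, applied once per epoch (sketch only).
// inputGradients[i, j] is assumed to hold the sum of
// hidden_deltas[j] * activationsInputs[i] over ALL patterns of the epoch;
// stepSizes[i, j] would start at e.g. 0.1. The constants are the values
// commonly quoted for Rprop.
const double EtaPlus = 1.2, EtaMinus = 0.5;
const double StepMin = 1e-6, StepMax = 50.0;

void RpropUpdateInputWeights(double[,] inputGradients,
                             double[,] prevGradients,
                             double[,] stepSizes)
{
    for (int i = 0; i < inputsNumber; i++)
    {
        for (int j = 0; j < hiddenNumber; j++)
        {
            double signChange = inputGradients[i, j] * prevGradients[i, j];
            if (signChange > 0)
            {
                // Same sign as in the previous epoch: grow the step size.
                stepSizes[i, j] = Math.Min(stepSizes[i, j] * EtaPlus, StepMax);
            }
            else if (signChange < 0)
            {
                // Sign flipped: the last step jumped over a minimum,
                // so shrink the step and skip the update this epoch.
                stepSizes[i, j] = Math.Max(stepSizes[i, j] * EtaMinus, StepMin);
                prevGradients[i, j] = 0.0;
                continue;
            }
            // In my code's convention `change` already points downhill
            // (the deltas use targets - outputs), so move WITH its sign.
            weightsInputs[i, j] += Math.Sign(inputGradients[i, j]) * stepSizes[i, j];
            prevGradients[i, j] = inputGradients[i, j];
        }
    }
}
```

Is that what "the partial derivative over all patterns" means - the per-pattern `change` summed over the whole epoch, whose sign I then check against the previous epoch?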
 
