Regarding the number of neurons in the hidden layer: for such a small example, two are more than enough. The only way to know the optimum for sure is to test with different numbers. This FAQ contains a rule of thumb that may be useful: http://www.faqs.org/faqs/ai-faq/neural-nets/

For the output function, it is often useful to compute it in steps:

First, given the input vector **x**, compute the weighted input to the hidden layer: **y** = f(**x**) = **x**^T **w** + **b**, where **w** is the weight matrix from the input neurons to the hidden layer and **b** is the bias vector.

Second, apply the network's activation function g to the result of the previous step: **z** = g(**y**).

Finally, the output is the dot product h(**z**) = **z** · **v** + n, where **v** is the weight vector from the hidden layer to the output neuron and n is its bias. If there is more than one output neuron, repeat this last step for each one.

I've never used the MATLAB MLP functions, so I don't know how to get the weights in this case, but I'm sure the network stores them somewhere. Edit: searching the documentation, I found these properties:

- `net.IW`: numLayers-by-numInputs cell array of input weight values
- `net.LW`: numLayers-by-numLayers cell array of layer weight values
- `net.b`: numLayers-by-1 cell array of bias values
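As a sketch of how those pieces fit together, here is a minimal MATLAB snippet that reproduces the forward pass by hand from the stored weights, for a network with one hidden layer and one output layer. It assumes a tansig hidden activation and a linear output (common defaults, but the actual functions depend on how your network was created), and it ignores any input/output preprocessing. Note that MATLAB stores the input weights as a hiddenSize-by-inputSize matrix, so the first step reads **w** **x** + **b** rather than **x**^T **w** + **b**; the computation is the same.

```matlab
% Minimal sketch: compute the network output manually from the stored weights.
W = net.IW{1,1};  % weights from the input to the hidden layer
b = net.b{1};     % hidden-layer biases
v = net.LW{2,1};  % weights from the hidden layer to the output layer
n = net.b{2};     % output-layer bias

y   = W * x + b;   % step 1: weighted input to the hidden layer
z   = tansig(y);   % step 2: apply the activation function g (assumed tansig here)
out = v * z + n;   % step 3: weighted sum at the output neuron(s)

% Sanity check: sim(net, x) should give approximately the same value,
% provided the network applies no preprocessing such as mapminmax.
```

If the values disagree, the most likely cause is that the network normalizes its inputs and outputs; in that case you would have to apply the same mappings around the manual computation.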
 
