Channel: Neural network activation/output - Stack Overflow

Neural network activation/output


Here is my code (for a neuron in a MLP network):

double summation = 0;
for (int i = 0; i < weights.length; i++) {
    summation += inputs[i] * weights[i];
}
double normalized = Math.tanh(summation);
if (normalized > 0.9 || normalized < -0.9) {
    activated = 1;
} else {
    activated = 0;
}

I think it is incorrect. Is the neuron's output supposed to be the normalized (tanh) value itself, or should it always be thresholded to 0 or 1?
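For context, the conventional setup in an MLP is that a hidden neuron passes the continuous tanh value (a number in (-1, 1)) straight on to the next layer, with no 0/1 thresholding. A minimal sketch of that convention below; the class and field names (`TanhNeuron`, `activate`) are illustrative, not from the code above:

```java
// Sketch of a conventional MLP neuron: the output IS the tanh of the
// weighted sum, kept continuous so it can feed the next layer and so
// gradients exist for backpropagation. Names here are hypothetical.
public class TanhNeuron {
    private final double[] weights;

    public TanhNeuron(double[] weights) {
        this.weights = weights;
    }

    // Weighted sum of inputs, then tanh; returns a value in (-1, 1).
    public double activate(double[] inputs) {
        double summation = 0.0;
        for (int i = 0; i < weights.length; i++) {
            summation += inputs[i] * weights[i];
        }
        return Math.tanh(summation); // continuous output, not 0/1
    }
}
```

Thresholding to 0/1 is typically reserved for interpreting the final output layer (e.g. turning a score into a class label), not for hidden-layer activations.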

