Input layer:
Hidden layer 1: none / ReLU / Sigmoid
Output layer: none / ReLU / Sigmoid
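The ReLU and Sigmoid options above are the standard activation functions; a minimal sketch of both (the function names here are illustrative, not the app's internals):

```javascript
// ReLU: passes positive values through unchanged, clamps negatives to zero
const relu = (x) => Math.max(0, x);

// Sigmoid: squashes any real input into the open range (0, 1)
const sigmoid = (x) => 1 / (1 + Math.exp(-x));
```

ReLU is the usual choice for hidden layers; Sigmoid is handy on the output layer when targets lie in [0, 1], as the inputs here do.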
Eta (learning rate): 0.1
Cycle time: 100 ms
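Eta scales how far each weight moves per training step; a minimal gradient-descent sketch, assuming the app uses the usual update rule (the weight and gradient values are made up for illustration):

```javascript
const eta = 0.1;      // learning rate, as set above
let w = 0.5;          // a single network weight
const grad = 0.2;     // hypothetical gradient of the error w.r.t. w

// step downhill: larger eta means bigger, faster (but less stable) steps
w = w - eta * grad;
```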
trainFunc = function(inputs) {
    // write your function here
    // all inputs are between 0 and 1, modify as needed
    // return the output as an array
    // example & default:
    return [Math.sin(inputs[0])];
}
Start training
Error: 0
Stop training
Iterations: 0