1D Regression
Network Architecture
Input Layer (1 neuron)
x-coordinate (single number)
Hidden Layer 1 (10 neurons)
Dense layer with ReLU activation
Transforms input into 10 features
Hidden Layer 2 (10 neurons)
Dense layer with ReLU activation
Further processes the 10 features
Output Layer (1 neuron)
Outputs: predicted y-coordinate
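The demo does not state which framework it is built on; as a minimal sketch only, the same 1 → 10 → 10 → 1 architecture could be written in PyTorch like this (the layer sizes and ReLU activations come from the description above, everything else is an illustrative assumption):

```python
import torch.nn as nn

# Sketch of the architecture described above: 1 input -> 10 -> 10 -> 1 output.
model = nn.Sequential(
    nn.Linear(1, 10),   # Hidden Layer 1: x-coordinate in, 10 features out
    nn.ReLU(),          # ReLU activation
    nn.Linear(10, 10),  # Hidden Layer 2: further processes the 10 features
    nn.ReLU(),          # ReLU activation
    nn.Linear(10, 1),   # Output Layer: predicted y-coordinate
)
```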
Training Status:
Training Points: 0
Epochs Completed: 0
Current Loss: 0.000000
Total MSE (All Points): 0.000000
Optimizer: ADAM
Status: Stopped
Speed: Slow
Training
During training, the network proceeds as follows (a code sketch follows the steps below):
For each epoch:
For each training point (x, y):
1. Feed x to input neuron
2. Get prediction ŷ from output neuron
3. Calculate MSE = (ŷ - y)²
4. Adjust network weights to reduce this error
5. Move to next point
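A rough sketch of this per-point loop in PyTorch is shown below. The Adam optimizer and the per-point squared error match the panel above; the learning rate, epoch count, and the example training points are hypothetical placeholders (in the demo, points come from clicks on the graph):

```python
import torch
import torch.nn as nn

# Same 1 -> 10 -> 10 -> 1 network as in the architecture sketch above.
model = nn.Sequential(nn.Linear(1, 10), nn.ReLU(),
                      nn.Linear(10, 10), nn.ReLU(),
                      nn.Linear(10, 1))

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # lr is an assumption
loss_fn = nn.MSELoss()

# Hypothetical training points (x, y); the demo collects these from graph clicks.
points = [(0.1, 0.40), (0.5, 0.95), (0.9, 0.30)]

for epoch in range(200):                  # "Epochs Completed" counter
    for x, y in points:                   # one training point at a time
        x_t = torch.tensor([[x]])         # 1. feed x to the input neuron
        y_t = torch.tensor([[y]])
        y_hat = model(x_t)                # 2. prediction ŷ from the output neuron
        loss = loss_fn(y_hat, y_t)        # 3. MSE = (ŷ - y)²
        optimizer.zero_grad()
        loss.backward()                   # 4. backpropagate the error...
        optimizer.step()                  #    ...and adjust the weights to reduce it
                                          # 5. the loop then moves to the next point
```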
Click on the graph to add training points
W & B
Weights and biases will appear here once training has begun
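As an illustration of what this panel would show, the sketch below (reusing the same hypothetical PyTorch model as above) lists each layer's weight matrix and bias vector; the exact layout in the demo may differ:

```python
import torch.nn as nn

# Same 1 -> 10 -> 10 -> 1 network as in the sketches above.
model = nn.Sequential(nn.Linear(1, 10), nn.ReLU(),
                      nn.Linear(10, 10), nn.ReLU(),
                      nn.Linear(10, 1))

# Print each layer's weights and biases, roughly the information shown
# in the W & B panel once training has begun.
for name, param in model.named_parameters():
    print(name, tuple(param.shape))   # e.g. "0.weight (10, 1)", "0.bias (10,)"
    print(param.detach().numpy())     # the current values
```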