Forward pass

The forward pass, leading up to the ReLU activation function, can be summed up by the following steps, where z is the sum of the weighted inputs plus the bias (z = xw0 + xw1 + xw2 + b). A short code sketch follows the steps.

Step 1: Multiply the inputs (x) by the weights (w)

Step 2: Group the weighted inputs (xw0, xw1, xw2) and the bias (b)

Step 3: Add them all together

Step 4: Feed the result (z) to the ReLU activation function
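
Here is a minimal sketch of these steps for a single neuron with three inputs. The input, weight, and bias values are made up for illustration, and ReLU is written out directly as max(z, 0).

```python
# Example inputs, weights, and bias for a single neuron (values are illustrative)
x = [1.0, -2.0, 3.0]      # inputs
w = [-3.0, -1.0, 2.0]     # weights
b = 1.0                   # bias

# Step 1: multiply each input by its weight
xw0 = x[0] * w[0]
xw1 = x[1] * w[1]
xw2 = x[2] * w[2]

# Steps 2-3: group the weighted inputs with the bias and add them all together
z = xw0 + xw1 + xw2 + b

# Step 4: feed the result to the ReLU activation function
y = max(z, 0.0)

print(z, y)  # -3.0 + 2.0 + 6.0 + 1.0 = 6.0, and ReLU(6.0) = 6.0
```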
