Forward Propagation
1. Basic Neural Network Concepts
- Started with single input, single output networks
- Learned about weights and predictions
- Basic formula:
prediction = input * weight
2. Simple Neural Network Code
weight = 0.1
def neural_network(input, weight):
    prediction = input * weight
    return prediction
3. Multiple Inputs Neural Network
- Learned how to handle multiple inputs
- Used weighted sums for predictions
- Introduced the concept of matrix operations
import numpy as np

weights = np.array([0.1, 0.2, 0])
def neural_network(input, weights):
    pred = w_sum(input, weights)
    return pred
4. Vector Operations
- Learned about vector dot products
- Weighted sum implementation:
def w_sum(a, b):
    assert len(a) == len(b)
    output = 0
    for i in range(len(a)):
        output += a[i] * b[i]
    return output
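A quick worked example of the weighted sum, using the game-1 values from the dataset at the end of these notes:

```python
def w_sum(a, b):
    # dot product of two equal-length vectors
    assert len(a) == len(b)
    output = 0
    for i in range(len(a)):
        output += a[i] * b[i]
    return output

weights = [0.1, 0.2, 0.0]
input = [8.5, 0.65, 1.2]  # [toes, wlrec, nfans] for game 1
pred = w_sum(input, weights)
print(pred)  # 8.5*0.1 + 0.65*0.2 + 1.2*0.0 ≈ 0.98
```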
5. Multiple Outputs
- Extended to multiple output predictions
- Used elementwise multiplication
def ele_mul(number, vector):
    output = [0, 0, 0]
    assert len(output) == len(vector)
    for i in range(len(vector)):
        output[i] = number * vector[i]
    return output
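A usage sketch: one shared input multiplied by a vector of three weights, one per output. The weight values here are assumed for illustration; the input is the game-1 win/loss record from the dataset below:

```python
def ele_mul(number, vector):
    # scale every element of vector by the same number
    output = [0] * len(vector)
    for i in range(len(vector)):
        output[i] = number * vector[i]
    return output

weights = [0.3, 0.2, 0.9]  # assumed example weights, one per output
wlrec_game1 = 0.65         # single shared input
pred = ele_mul(wlrec_game1, weights)
print([round(p, 3) for p in pred])  # [0.195, 0.13, 0.585]
```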
6. Multiple Inputs and Outputs
- Combined everything into a full network
- Matrix multiplication for predictions
ih_wgt = np.array([
    [0.1, 0.2, -0.1],  # hid[0]
    [-0.1, 0.1, 0.9],  # hid[1]
    [0.1, 0.4, 0.1]]).T

# weights = [ih_wgt, hp_wgt], where hp_wgt is a second
# (hidden -> prediction) matrix of the same shape
def neural_network(input, weights):
    hid = input.dot(weights[0])
    pred = hid.dot(weights[1])
    return pred
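A self-contained run of the two-layer network; since the notes only show `ih_wgt`, the hidden→prediction matrix `hp_wgt` below is assumed for illustration to complete the `weights` list:

```python
import numpy as np

ih_wgt = np.array([
    [0.1, 0.2, -0.1],    # hid[0]
    [-0.1, 0.1, 0.9],    # hid[1]
    [0.1, 0.4, 0.1]]).T  # input -> hidden

# assumed hidden -> prediction matrix, same 3x3 shape
hp_wgt = np.array([
    [0.3, 1.1, -0.3],
    [0.1, 0.2, 0.0],
    [0.0, 1.3, 0.1]]).T

weights = [ih_wgt, hp_wgt]

def neural_network(input, weights):
    hid = input.dot(weights[0])   # step 1: input -> hidden layer
    pred = hid.dot(weights[1])    # step 2: hidden -> output layer
    return pred

input = np.array([8.5, 0.65, 1.2])  # [toes, wlrec, nfans], game 1
print(neural_network(input, weights))  # [0.2135 0.145  0.5065]
```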
7. Hidden Layers
- Introduced hidden layer concepts
- Learned about layer stacking
- Two-step prediction process:
- Input → Hidden layer
- Hidden layer → Output
8. Important Concepts Learned
- Weight sensitivity affects predictions
- Neural networks can handle negative numbers
- Matrix orientation (transpose) matters
- Input data normalization is important
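On the normalization point above: the notes don't name a method, so here is a minimal sketch assuming min-max scaling, which maps each feature into [0, 1] so no single input dominates the weighted sum:

```python
import numpy as np

def minmax_normalize(x):
    # rescale a vector into [0, 1]; assumes x.max() > x.min()
    return (x - x.min()) / (x.max() - x.min())

toes = np.array([8.5, 9.5, 9.9, 9.0])
norm = minmax_normalize(toes)
print(norm)  # smallest value maps to 0.0, largest to 1.0
```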
9. Dataset Structure Examples
toes = np.array([8.5, 9.5, 9.9, 9.0])
wlrec = np.array([0.65, 0.8, 0.8, 0.9])
nfans = np.array([1.2, 1.3, 0.5, 1.0])
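Each array holds one feature across four games, so the network's input for a single game is one value from each array:

```python
import numpy as np

toes  = np.array([8.5, 9.5, 9.9, 9.0])   # average toes per player
wlrec = np.array([0.65, 0.8, 0.8, 0.9])  # win/loss record
nfans = np.array([1.2, 1.3, 0.5, 1.0])   # fan count (millions)

# input vector for the first game: one entry per feature
input = np.array([toes[0], wlrec[0], nfans[0]])
weights = np.array([0.1, 0.2, 0.0])
pred = input.dot(weights)
print(round(pred, 2))  # 0.98
```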
Then watched 3Blue1Brown's video on neural networks; most probably will start with some C/Assembly tomorrow.