minami

Forward Propagation

1. Basic Neural Network Concepts

2. Simple Neural Network Code

weight = 0.1
def neural_network(input, weight):
    # single input, single weight: the prediction is just the scaled input
    prediction = input * weight
    return prediction
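
A quick check with a sample input value:

number_of_toes = 8.5                          # sample input
pred = neural_network(number_of_toes, weight)
print(pred)                                   # 0.85 = 8.5 * 0.1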

3. Multiple Inputs Neural Network

import numpy as np

weights = np.array([0.1, 0.2, 0])
def neural_network(input, weights):
    # weighted sum of all inputs (w_sum is defined in section 4 below)
    pred = w_sum(input, weights)
    return pred

4. Vector Operations

def w_sum(a, b):
    # weighted sum: multiply two equal-length vectors element by element and add up
    assert(len(a) == len(b))
    output = 0
    for i in range(len(a)):
        output += a[i] * b[i]
    return output
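
With w_sum defined, the multiple-input network from section 3 can make a prediction; the input values below are just sample numbers:

input = [8.5, 0.65, 1.2]              # sample input, one value per feature
pred = neural_network(input, weights) # weights = [0.1, 0.2, 0] from section 3
print(pred)                           # 0.98 = 8.5*0.1 + 0.65*0.2 + 1.2*0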

5. Multiple Outputs

def ele_mul(number, vector):
    # elementwise multiply: scale every entry of vector by a single number
    output = [0, 0, 0]
    assert(len(output) == len(vector))
    for i in range(len(vector)):
        output[i] = number * vector[i]
    return output
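
A network with one input and three outputs can be built directly from ele_mul; the input and weights below are sample numbers, not values taken from anywhere in particular:

def neural_network(input, weights):
    # one input, several outputs: the single input is scaled by every weight
    pred = ele_mul(input, weights)
    return pred

wlrec_sample = 0.65                   # sample single input
out_weights = [0.3, 0.2, 0.9]         # sample weights, one per output
print(neural_network(wlrec_sample, out_weights))  # [0.195, 0.13, 0.585]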

6. Multiple Inputs and Outputs

ih_wgt = np.array([
    [0.1, 0.2, -0.1],    # input -> hid[0]
    [-0.1, 0.1, 0.9],    # input -> hid[1]
    [0.1, 0.4, 0.1]]).T  # input -> hid[2]

# hidden-to-prediction weights (example values filled in so the network runs)
hp_wgt = np.array([
    [0.3, 1.1, -0.3],
    [0.1, 0.2, 0.0],
    [0.0, 1.3, 0.1]]).T

weights = [ih_wgt, hp_wgt]

def neural_network(input, weights):
    hid = input.dot(weights[0])   # layer 1: input -> hidden
    pred = hid.dot(weights[1])    # layer 2: hidden -> prediction
    return pred
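
Running the two-layer network on one sample input vector (both the input values and hp_wgt are example numbers):

input = np.array([8.5, 0.65, 1.2])     # one sample input vector
print(neural_network(input, weights))  # [0.2875, 2.604, -0.135] with the example weights above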

7. Hidden Layers

8. Important Concepts Learned

9. Dataset Structure Examples

toes  = np.array([8.5, 9.5, 9.9, 9.0])   # average toes per player, per game
wlrec = np.array([0.65, 0.8, 0.8, 0.9])  # win/loss record (fraction of games won)
nfans = np.array([1.2, 1.3, 0.5, 1.0])   # fan count (in millions)
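
Each array holds one feature recorded over four games; the input vector for a single game takes one entry from each column (a sketch using the sample data above):

# input vector for the first game
input = np.array([toes[0], wlrec[0], nfans[0]])  # array([8.5, 0.65, 1.2])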

Then I watched a 3b1b video on neural networks, and most probably tomorrow I will start with some C/Assembly.