Simple Affine Layer with NumPy
This post explains the affine layer as I understand it. The outline is as follows:
- What is an Affine Layer
- Forward Pass
- Backward Pass
- Practice
What is an Affine Layer
An affine layer, also called a fully connected layer or dense layer, is a layer in which each input signal is multiplied by a weight, the products are summed, and a bias is added. In an affine layer, every node is connected to every node of the subsequent layer. Affine layers are commonly used in convolutional neural networks.
An affine layer consists of a forward and a backward pass. Let's look at the affine layer code in Python. The code is from 'Deep Learning from Scratch'.
Forward Pass
In Figure 1, y represents the output, x the input, and W the weights used for the linear combination. h() represents the sigmoid function, a the value after multiplication and addition, and z the output of the sigmoid function.

The input to a node is a linear combination of the outputs of the previous layer with an added bias (represented as b). The output of the node is then calculated by passing this value through an activation function.
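The two steps above can be sketched with NumPy. The values of x, W, and b here are toy numbers I chose for illustration, not ones from the post:

```python
import numpy as np

def sigmoid(a):
    # h(): squashes the pre-activation value a into (0, 1)
    return 1 / (1 + np.exp(-a))

# assumed toy values, chosen only for illustration
x = np.array([1.0, 0.5])        # input
W = np.array([[0.1, 0.3],
              [0.2, 0.4]])      # weights
b = np.array([0.1, 0.2])        # bias

a = np.dot(x, W) + b            # linear combination plus bias -> [0.3, 0.7]
z = sigmoid(a)                  # output of the activation function
```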

We can simply compute the output of the forward pass in Python as the dot product of the inputs and weights plus the bias.
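A minimal sketch of that forward pass, assuming x has shape (N, D), W has shape (D, M), and b has shape (M,) (the function name affine_forward and the sample values are my own, not from the book):

```python
import numpy as np

def affine_forward(x, W, b):
    # out = x . W + b; the bias b is broadcast across the N rows of the batch
    return np.dot(x, W) + b

# assumed toy values for illustration
x = np.array([[1.0, 2.0]])          # one sample, two features
W = np.array([[1.0, 3.0, 5.0],
              [2.0, 4.0, 6.0]])
b = np.array([1.0, 1.0, 1.0])

out = affine_forward(x, W, b)       # -> [[ 6., 12., 18.]]
```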
Backward Pass
Since the forward pass of the affine layer uses only multiplication and addition, the backward pass follows directly from those operations: the addition node passes the upstream derivative through unchanged, so the derivative of the bias inherits the previous derivative.

The derivative with respect to the input is the upstream derivative multiplied by the transposed weights, and the derivative with respect to the weights is the transposed input multiplied by the upstream derivative. The derivative of the bias is the sum of the upstream derivatives over the batch.
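Putting the forward and backward passes together gives a layer class in the style of 'Deep Learning from Scratch'. This is my sketch of it, and the small usage values at the bottom are assumptions for illustration:

```python
import numpy as np

class Affine:
    def __init__(self, W, b):
        self.W = W
        self.b = b
        self.x = None    # cached input, filled in by forward()
        self.dW = None
        self.db = None

    def forward(self, x):
        self.x = x                          # cache input for the backward pass
        return np.dot(x, self.W) + self.b

    def backward(self, dout):
        dx = np.dot(dout, self.W.T)         # dx: upstream derivative times transposed weights
        self.dW = np.dot(self.x.T, dout)    # dW: transposed input times upstream derivative
        self.db = np.sum(dout, axis=0)      # db: sum of upstream derivatives over the batch
        return dx

# assumed toy usage
W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([0.5, 0.5])
layer = Affine(W, b)

x = np.array([[1.0, 1.0]])
out = layer.forward(x)          # -> [[4.5, 6.5]]
dout = np.ones_like(out)        # pretend the upstream derivative is all ones
dx = layer.backward(dout)       # -> [[3., 7.]]
```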
Practice
Let's practice using the affine layer code. Suppose we have the simple inputs (x), weights (W), and bias (b) below:
Find the values of dx, dW, and db.
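The original values of x, W, and b are not reproduced here, so the following worked example uses toy numbers I chose myself; the method, not the specific values, is the point:

```python
import numpy as np

# assumed toy values (the post's original figure is not reproduced here)
x = np.array([[1.0, 2.0]])            # input, shape (1, 2)
W = np.array([[0.0, 0.0, 0.0],
              [10.0, 10.0, 10.0]])    # weights, shape (2, 3)
b = np.array([1.0, 2.0, 3.0])         # bias, shape (3,)

out = np.dot(x, W) + b                # forward pass -> [[21., 22., 23.]]
dout = np.ones_like(out)              # assume an upstream derivative of ones

dx = np.dot(dout, W.T)                # -> [[ 0., 30.]]
dW = np.dot(x.T, dout)                # -> [[1., 1., 1.], [2., 2., 2.]]
db = np.sum(dout, axis=0)             # -> [1., 1., 1.]
```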
I would be happy if you find this post helpful.