Assignment #2 - Convolutional Neural Networks
Download [attachment]
Late Policy
- You have 8 free late days.
- You can use late days on assignments; each late day extends the deadline by 24 hours.
- Once you have used all 8 late days, each additional late day incurs a 10% penalty.
Goals
In this assignment you will practice writing backpropagation code and training Neural Networks and Convolutional Neural Networks. The goals of this assignment are as follows:
- Understand Neural Networks and how they are arranged in layered architectures.
- Understand and be able to implement (vectorized) backpropagation.
- Implement various update rules used to optimize Neural Networks.
- Implement Batch Normalization and Layer Normalization for training deep networks.
- Implement Dropout to regularize networks.
- Understand the architecture of Convolutional Neural Networks and get practice with training them.
- Gain experience with a major deep learning framework, PyTorch.
Q1: Multi-Layer Fully Connected Neural Networks
The notebook FullyConnectedNets.ipynb
will have you implement fully connected
networks of arbitrary depth. To optimize these models you will implement several
popular update rules.
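One of the popular update rules you will implement is SGD with momentum. A minimal sketch of the idea in NumPy (the function name and `config` keys here are illustrative, not the assignment's exact API):

```python
import numpy as np

def sgd_momentum(w, dw, config):
    """One SGD-with-momentum step.

    config holds 'learning_rate', 'momentum', and a running 'velocity'
    (initialized to zeros on the first call).
    """
    v = config.get("velocity", np.zeros_like(w))
    # Accumulate a velocity that smooths the gradient direction over steps.
    v = config["momentum"] * v - config["learning_rate"] * dw
    next_w = w + v
    config["velocity"] = v
    return next_w, config
```

The velocity term lets updates build up speed along directions of consistent gradient, which typically converges faster than vanilla SGD.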
Q2: Batch Normalization
In notebook BatchNormalization.ipynb
you will implement batch normalization, and use it to train deep fully connected networks.
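The core of the batch normalization forward pass is normalizing each feature over the minibatch, then applying a learned scale and shift. A minimal train-time sketch (the notebook's real version also tracks running statistics for test time and caches values for the backward pass):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # x: (N, D) minibatch; gamma, beta: (D,) learned scale and shift.
    mu = x.mean(axis=0)                 # per-feature mean over the batch
    var = x.var(axis=0)                 # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta
```

Layer normalization differs only in which axis is normalized: it computes the statistics per example (over features) rather than per feature (over the batch).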
Q3: Dropout
The notebook Dropout.ipynb
will help you implement dropout and explore its effects on model generalization.
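A common formulation is inverted dropout, sketched below. Here `p` is taken to be the probability of *keeping* a unit (conventions vary, so check the notebook's definition); scaling the mask by `1/p` at train time keeps the expected activation unchanged, so the test-time forward pass needs no rescaling:

```python
import numpy as np

def dropout_forward(x, p, train=True):
    """Inverted dropout: keep each unit with probability p at train time."""
    if not train:
        return x  # test time: identity, thanks to the 1/p scaling below
    mask = (np.random.rand(*x.shape) < p) / p
    return x * mask
```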
Q4: Convolutional Neural Networks
In the notebook ConvolutionalNetworks.ipynb
you will implement several new layers that are commonly used in convolutional networks.
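The heart of a convolutional layer is sliding a filter over the input and taking a dot product at each position. A deliberately naive single-channel sketch (the notebook's version adds batches, channels, padding, and bias, and is later vectorized):

```python
import numpy as np

def conv_forward_naive(x, w, stride=1):
    """Naive 2D convolution (cross-correlation) of image x with filter w, no padding."""
    H, W = x.shape
    HH, WW = w.shape
    H_out = (H - HH) // stride + 1
    W_out = (W - WW) // stride + 1
    out = np.zeros((H_out, W_out))
    for i in range(H_out):
        for j in range(W_out):
            # Dot product of the filter with the receptive field at (i, j).
            patch = x[i*stride:i*stride+HH, j*stride:j*stride+WW]
            out[i, j] = np.sum(patch * w)
    return out
```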
Q5: PyTorch on CIFAR-10
For this part, you will be working with PyTorch, a popular and powerful deep learning framework.
Open up PyTorch.ipynb. There, you will learn how the framework works, culminating in training a convolutional network of your own design on CIFAR-10 to get the best performance you can.
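To give a flavor of what the notebook builds toward, here is a minimal sketch of one PyTorch training step on CIFAR-10-shaped data. The tiny network, hyperparameters, and dummy batch are illustrative only, not a recommended design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleConvNet(nn.Module):
    """A deliberately tiny baseline: one conv layer, then a linear classifier."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16 * 32 * 32, num_classes)

    def forward(self, x):
        x = F.relu(self.conv(x))
        return self.fc(x.flatten(start_dim=1))

model = SimpleConvNet()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

# One training step on a dummy CIFAR-10-shaped batch (N, 3, 32, 32);
# in the notebook this runs inside a loop over a real DataLoader.
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = F.cross_entropy(model(images), labels)
loss.backward()
optimizer.step()
```

The key pattern to internalize is the four-step loop: zero the gradients, compute the loss, call `backward()`, and step the optimizer.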