Reproducing "Deep Neural Nets: 33 years ago and 33 years from now"

In this sequence of experiments, you will reproduce a machine learning result from 1989 - from a paper that is possibly the earliest real-world application of a neural network trained end-to-end with backpropagation. (In earlier neural networks, some weights were actually hand-tuned!) You'll go a few steps further, though - after reproducing the results of the original paper, you'll get to use some modern 'tricks' to try to improve the performance of the model without changing its underlying architecture (i.e. no change in inference time!)
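
To make the idea of "training-time tricks with no inference-time cost" concrete, here is a minimal, illustrative PyTorch sketch. It is not the exact configuration from the 1989 paper or from Karpathy's repository - the layer sizes and the choice of AdamW are assumptions for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

# A small convnet for 16x16 digit images, roughly in the spirit of the
# 1989 network. Layer sizes here are illustrative, not the paper's exact ones.
class TinyConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 12, kernel_size=5, stride=2, padding=2)
        self.conv2 = nn.Conv2d(12, 12, kernel_size=5, stride=2, padding=2)
        self.fc1 = nn.Linear(12 * 4 * 4, 30)
        self.fc2 = nn.Linear(30, 10)

    def forward(self, x):
        x = torch.tanh(self.conv1(x))
        x = torch.tanh(self.conv2(x))
        x = torch.tanh(self.fc1(x.flatten(1)))
        return self.fc2(x)

model = TinyConvNet()

# Example of a "modern trick": swapping plain SGD for AdamW. This changes
# only how the model is trained - the deployed network, and therefore its
# inference time, stays exactly the same.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

x = torch.randn(8, 1, 16, 16)   # a fake batch of 16x16 digit images
y = torch.randint(0, 10, (8,))  # fake labels
loss = F.cross_entropy(model(x), y)
loss.backward()
optimizer.step()

The tricks in these experiments follow the same pattern: they modify the training procedure, not the network that runs at inference time.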

You can find these materials at: https://github.com/teaching-on-testbeds/deep-nets-reproducing

Attribution: This sequence of notebooks is adapted from the ICLR 2022 blog post "Deep Neural Nets: 33 years ago and 33 years from now" (https://iclr-blog-track.github.io/2022/03/26/lecun1989/) by Andrej Karpathy, and the associated Github repository (https://github.com/karpathy/lecun1989-repro).

Launch on Chameleon

Launching this artifact will open it within Chameleon’s shared Jupyter experiment environment, which is accessible to all Chameleon users with an active allocation.

Request daypass

If you do not have an active Chameleon allocation, or would prefer not to use your allocation, you can request a temporary one from the PI of the project that this artifact belongs to.

Download Archive

Download an archive containing the files of this artifact.

Download with git

Clone the git repository for this artifact, and check out this version's commit.

git clone https://github.com/teaching-on-testbeds/deep-nets-reproducing
cd deep-nets-reproducing
git checkout 50e4deb6b5c566c434da84a0dd7d7d87e4ee0b56
Feedback

Submit feedback through GitHub issues
