Reproducing "Deep Neural Nets: 33 years ago and 33 years from now"
In this sequence of experiments, you will reproduce a machine learning result from 1989: a paper that is possibly the earliest real-world application of a neural network trained end-to-end with backpropagation. (In earlier neural networks, some weights were actually hand-tuned!) You'll go a few steps further, though: after reproducing the results of the original paper, you'll get to use some modern "tricks" to try to improve the performance of the model without changing its underlying architecture (i.e., no change in inference time!).
You can find these materials at: https://github.com/teaching-on-testbeds/deep-nets-reproducing
Attribution: This sequence of notebooks is adapted from the ICLR 2022 blog post "Deep Neural Nets: 33 years ago and 33 years from now" (https://iclr-blog-track.github.io/2022/03/26/lecun1989/) by Andrej Karpathy, and the associated Github repository (https://github.com/karpathy/lecun1989-repro).
Launching this artifact will open it within Chameleon’s shared Jupyter experiment environment, which is accessible to all Chameleon users with an active allocation.
Request daypass: If you do not have an active Chameleon allocation, or would prefer not to use your allocation, you can request a temporary one from the PI of the project this artifact belongs to.
Download archive: Download an archive containing the files of this artifact.
Download with git
Clone the git repository for this artifact, and check out the commit for this version:
git clone https://github.com/teaching-on-testbeds/deep-nets-reproducing
cd deep-nets-reproducing
git checkout bdf2ce7b5687154a1d2409f318b746a6999614c1
Submit feedback through GitHub issues