Re: On Warm Starting Neural Network Training

This project is part of the UCSC OSPO Summer of Reproducibility fellowship and aims to create an interactive notebook for teaching undergraduate and graduate students about different levels of reproducibility in machine learning.

The project is based on the paper "On Warm-Starting Neural Network Training" by Jordan T. Ash and Ryan P. Adams, which was successfully replicated, with the replication published in ReScience C.

The paper investigates training neural networks on incrementally arriving data and shows that warm-starting, i.e., initializing the network with the weights from a previous training run on a subset of the data, often leads to worse generalization than random initialization, even when the final training losses are similar. The paper proposes a simple remedy, "shrink and perturb": shrink the previous weights toward zero and add a small amount of noise before retraining. It demonstrates that this trick can close the generalization gap and reduce training time in several scenarios.
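To make the trick concrete, here is a minimal sketch of shrink-and-perturb for a PyTorch model. The function name shrink_perturb and the values shrink=0.5 and noise_std=0.01 are illustrative choices for this sketch, not the paper's tuned settings; the notebooks in this artifact follow the paper's own configuration.

import torch

def shrink_perturb(model, shrink=0.5, noise_std=0.01):
    # Shrink existing weights toward zero and add Gaussian noise.
    # `shrink` and `noise_std` are illustrative hyperparameters, not
    # the values used in the paper.
    with torch.no_grad():
        for param in model.parameters():
            param.mul_(shrink)                               # shrink: lambda * theta
            param.add_(noise_std * torch.randn_like(param))  # perturb: + noise
    return model

# Usage sketch: reuse a previously trained model's weights, but
# shrink-and-perturb them before continuing training on the full dataset.
# model = shrink_perturb(previously_trained_model)

The intuition in the paper is that shrinking preserves some of what was learned on the earlier data, while the added noise restores enough plasticity that the retrained network generalizes comparably to one trained from a random initialization.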

Launch on Chameleon

Launching this artifact will open it within Chameleon’s shared Jupyter experiment environment, which is accessible to all Chameleon users with an active allocation.

Download Archive

Download an archive containing the files of this artifact.

Download with git

Clone the git repository for this artifact and check out this version's commit:

git clone https://github.com/mohammed183/re_warm_start_nn
# cd into the created directory
cd re_warm_start_nn
git checkout f65ee3bbaf3e131acc916d69c374d0019f719c3a
Feedback

Submit feedback through GitHub issues.
