Commit 52658b87 authored by Neta Zmora

Word-level language model compression

Added an implementation of Baidu’s RNN pruning scheme:
Narang, Sharan & Diamos, Gregory & Sengupta, Shubho & Elsen, Erich. (2017).
    Exploring Sparsity in Recurrent Neural Networks.
    (https://arxiv.org/abs/1704.05119)
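
A minimal sketch of the gradual-threshold magnitude pruning described in Narang et al. (2017): weights whose magnitude falls below a monotonically increasing threshold are zeroed. The hyperparameter names (q, freq, start_itr, ramp_itr, end_itr) follow the paper, but the threshold formula is paraphrased from it and the functions below are illustrative, not the API added in this commit.

```python
import torch

def pruning_threshold(itr, start_itr, ramp_itr, end_itr, freq, q):
    """Monotonically increasing magnitude threshold (paraphrased from the paper):
    ramp with slope theta until ramp_itr, then faster with phi = 1.5 * theta
    until end_itr. q is a small-magnitude percentile of the initial weights."""
    theta = 2.0 * q * freq / (2 * (ramp_itr - start_itr) + 3 * (end_itr - ramp_itr))
    phi = 1.5 * theta
    if itr < start_itr:
        return 0.0
    if itr < ramp_itr:
        return theta * (itr - start_itr + 1) / freq
    return (theta * (ramp_itr - start_itr + 1) + phi * (itr - ramp_itr + 1)) / freq

def prune_recurrent_weights(weight, threshold):
    """Zero every weight whose magnitude is below the current threshold.
    In practice the mask is kept and re-applied after each optimizer step."""
    mask = (weight.abs() >= threshold).float()
    return weight * mask, mask
```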

Added an example of word-level language model compression.
The language model is based on PyTorch’s example:
https://github.com/pytorch/examples/tree/master/word_language_model
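
For context, a simplified sketch of the word-level LSTM language model used in that PyTorch example; the layer names (encoder/rnn/decoder) mirror the example, but the sizes here are placeholders and details such as weight tying are omitted.

```python
import torch.nn as nn

class RNNModel(nn.Module):
    """Embedding -> multi-layer LSTM -> linear decoder over the vocabulary."""
    def __init__(self, ntoken, ninp=650, nhid=650, nlayers=2, dropout=0.5):
        super().__init__()
        self.drop = nn.Dropout(dropout)
        self.encoder = nn.Embedding(ntoken, ninp)   # token ids -> embeddings
        self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout)
        self.decoder = nn.Linear(nhid, ntoken)      # hidden state -> vocab logits

    def forward(self, input, hidden):
        emb = self.drop(self.encoder(input))
        output, hidden = self.rnn(emb, hidden)
        decoded = self.decoder(self.drop(output))
        return decoded, hidden
```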

Added an AGP pruning schedule and an RNN pruning schedule to demonstrate
compression of the language model.
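
A minimal sketch of the sparsity ramp behind an AGP (Automated Gradual Pruning, Zhu & Gupta 2017) schedule: sparsity grows cubically from an initial to a final level over a pruning window. The function below is illustrative of the formula only, not the scheduling code added in this commit.

```python
def agp_sparsity(epoch, initial_sparsity, final_sparsity, start_epoch, end_epoch):
    """Cubic ramp: s_t = s_f + (s_i - s_f) * (1 - progress)^3,
    held at final_sparsity once the pruning window ends."""
    if epoch < start_epoch:
        return initial_sparsity
    if epoch >= end_epoch:
        return final_sparsity
    progress = (epoch - start_epoch) / (end_epoch - start_epoch)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3
```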
parent 47732093
Showing 45738 additions and 0 deletions