LSTM: Modular implementation + Post-Train Quantization Sample (#196)
- Introduce a modular, Python-level implementation of LSTM/LSTMCell using existing PyTorch nn.Modules as building blocks.
- This allows quantization of the weights and internal activations of LSTM layers using the existing Quantizer. (In the PyTorch implementation of RNN/LSTM, only the weights are exposed at the Python level; the internal activations are "hidden" in C++ code.)
- Supports stacked (multi-layer) and bi-directional LSTM.
- Implemented conversion functions from the PyTorch LSTM module to our LSTM module and vice versa.
- Tests for correctness of the modular implementation and of the conversions.
- Jupyter notebook showing post-training quantization of a language model.
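The core idea above, building the cell out of plain nn.Modules so that every internal activation becomes a Python-level op a quantizer can wrap, can be sketched roughly as follows. This is an illustrative sketch, not the actual code in `distiller/modules/rnn.py`; the class and attribute names here are hypothetical:

```python
import torch
import torch.nn as nn


class ModularLSTMCell(nn.Module):
    """Hypothetical sketch of an LSTM cell assembled from nn.Modules.

    Unlike nn.LSTMCell (whose gate math runs in C++), every gate
    projection and element-wise activation here is its own submodule,
    so a quantizer that wraps nn.Modules can see and quantize them.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Gate pre-activations as Linear modules: weights are exposed
        self.fc_gate_x = nn.Linear(input_size, 4 * hidden_size)
        self.fc_gate_h = nn.Linear(hidden_size, 4 * hidden_size)
        # Element-wise activations as modules: quantizable by wrapping
        self.act_gate = nn.Sigmoid()
        self.act_candidate = nn.Tanh()
        self.act_output = nn.Tanh()

    def forward(self, x, h, c):
        # Fused projection for the i/f/g/o gates, then split
        gates = self.fc_gate_x(x) + self.fc_gate_h(h)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = self.act_gate(i), self.act_gate(f), self.act_gate(o)
        c_next = f * c + i * self.act_candidate(g)
        h_next = o * self.act_output(c_next)
        return h_next, c_next
```

A stacked or bi-directional LSTM can then be composed from such cells, and conversion to/from `nn.LSTM` amounts to copying the gate weights between the fused PyTorch parameters and the per-module `nn.Linear` layers.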
Showing 9 changed files:

- distiller/modules/__init__.py (5 additions, 2 deletions)
- distiller/modules/grouping.py (29 additions, 0 deletions)
- distiller/modules/rnn.py (402 additions, 0 deletions)
- distiller/utils.py (1 addition, 0 deletions)
- examples/word_language_model/manual_lstm_pretrained_stats.yaml (566 additions, 0 deletions)
- examples/word_language_model/manual_lstm_pretrained_stats_new.yaml (566 additions, 0 deletions)
- examples/word_language_model/model.py (40 additions, 1 deletion)
- examples/word_language_model/quantize_lstm.ipynb (833 additions, 0 deletions)
- tests/test_lstm_impl.py (93 additions, 0 deletions)