LSTM: Modular implementation + Post-Train Quantization Sample (#196) · a3c8d86f
Lev Zlotnik authored
    * Introduce a modular, Python-level implementation of LSTM/LSTMCell
      using existing PyTorch nn.Modules as building blocks
    * This allows quantization of weights and internal activations of
      LSTM layers using the existing Quantizer. 
      (In the PyTorch implementation of RNN/LSTM only the weights are 
      exposed at the Python level, whereas the internal activations are 
      "hidden" in C++ code.)
    * Supports stacked (multi-layer) and bi-directional LSTM
    * Implemented conversion functions from PyTorch LSTM module to
      our LSTM module and vice-versa
    * Tests for modular implementation correctness and for conversions
    * Jupyter notebook showing post-training quantization of a language
      model
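The core idea of the modular implementation described above can be sketched as a Python-level LSTM cell whose gate computations are built from `nn.Linear` modules and explicit element-wise ops, so the internal activations pass through nameable PyTorch modules instead of being hidden in C++ code. This is an illustrative sketch only; the class and attribute names here are assumptions, not the actual API introduced by this commit.

```python
import torch
import torch.nn as nn


class ModularLSTMCell(nn.Module):
    """Illustrative LSTM cell built from nn.Linear building blocks.

    Because every gate is an nn.Module, a quantizer that wraps or
    replaces modules can reach the weights *and* the intermediate
    activations, unlike torch.nn.LSTM where the cell math lives in C++.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Single fused projection for the 4 gates (i, f, g, o),
        # split into an input path and a recurrent path.
        self.fc_gate_x = nn.Linear(input_size, 4 * hidden_size)
        self.fc_gate_h = nn.Linear(hidden_size, 4 * hidden_size)

    def forward(self, x, h, c):
        # All intermediate tensors below are visible at the Python level.
        gates = self.fc_gate_x(x) + self.fc_gate_h(h)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        c_next = f * c + i * g            # new cell state
        h_next = o * torch.tanh(c_next)   # new hidden state
        return h_next, c_next
```

Stacked and bi-directional LSTMs can then be composed from such cells, and a conversion routine only has to copy the fused gate weights between `torch.nn.LSTM`'s flat parameters and the `nn.Linear` submodules.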