  1. May 27, 2019
  2. May 19, 2019
  3. May 02, 2019
  4. Apr 08, 2019
  5. Apr 01, 2019
    • Quantizer: Specify # bias bits + custom overrides (BREAKING) (#178) · 5271625a
      Lev Zlotnik authored
      * Bias handling:
        * Add 'bits_bias' parameter to explicitly specify # of bits for bias,
          similar to weights and activations.
        * BREAKING: Remove the now redundant 'quantize_bias' boolean parameter
      * Custom overrides:
        * Expand the semantics of the overrides dict to allow overriding of
          other parameters in addition to bit-widths
        * Functions registered in the quantizer's 'replacement_factory' can
          define keyword arguments. Non bit-width entries in the overrides
          dict will be checked against the function signature and passed
        * BREAKING:
          * Changed the name of 'bits_overrides' to simply 'overrides'
          * Bit-width overrides must now be defined using the full parameter
            names - 'bits_activations/weights/bias' instead of the short-hands
            'acts' and 'wts' which were used so far.
        * Added/updated relevant tests
        * Modified all quantization YAMLs under 'examples' to reflect 
          these changes
        * Updated docs
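The new overrides semantics described above can be illustrated with a minimal Python sketch. The full parameter names (`bits_activations` / `bits_weights` / `bits_bias`) come from the commit message; the `split_override` helper and the `clip_acts` key are hypothetical, for illustration only:

```python
from collections import OrderedDict

# After this change, bit-width overrides use the full parameter names
# ('bits_activations' / 'bits_weights' / 'bits_bias') instead of the
# old short-hands 'acts' and 'wts'. Non-bit-width keys may also appear
# and are checked against the replacement function's signature.
overrides = OrderedDict([
    ('conv1', {'bits_activations': 8, 'bits_weights': 8, 'bits_bias': 32}),
    ('fc.*',  {'bits_activations': 4, 'bits_weights': 4,
               'clip_acts': True}),  # 'clip_acts' is an illustrative extra kwarg
])

def split_override(entry):
    """Separate bit-width settings from extra keyword arguments
    (illustrative helper, not Distiller code)."""
    bit_keys = {'bits_activations', 'bits_weights', 'bits_bias'}
    bits = {k: v for k, v in entry.items() if k in bit_keys}
    kwargs = {k: v for k, v in entry.items() if k not in bit_keys}
    return bits, kwargs

bits, kwargs = split_override(overrides['fc.*'])
```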
  6. Feb 26, 2019
  7. Feb 11, 2019
    • Post-train quant based on stats + additional modules quantized (#136) · 28a8ee18
      Guy Jacob authored
      Summary of changes:
      (1) Post-train quantization based on pre-collected statistics
      (2) Quantized concat, element-wise addition / multiplication and embeddings
      (3) Move post-train quantization command line args out of sample code
      (4) Configure post-train quantization from YAML for more fine-grained control
      
(See PR #136 for more detailed change descriptions)
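Item (1) above can be sketched in a few lines of Python: instead of measuring ranges during the quantization pass, scale and zero-point are derived from min/max values recorded in an earlier profiling run. The stats layout below is made up for the example and is not Distiller's actual file format:

```python
def linear_quant_params(sat_min, sat_max, num_bits):
    """Asymmetric linear quantization: map [sat_min, sat_max]
    onto the integer range [0, 2**num_bits - 1]."""
    n_levels = 2 ** num_bits - 1
    sat_min = min(sat_min, 0.0)   # make sure zero is representable
    sat_max = max(sat_max, 0.0)
    scale = n_levels / (sat_max - sat_min)
    zero_point = round(-sat_min * scale)
    return scale, zero_point

# Statistics as they might be pre-collected offline (hypothetical layout)
stats = {'conv1': {'output': {'min': -1.0, 'max': 3.0}}}

scale, zp = linear_quant_params(stats['conv1']['output']['min'],
                                stats['conv1']['output']['max'],
                                num_bits=8)
```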
  8. Jan 24, 2019
  9. Jan 23, 2019
  10. Dec 04, 2018
    • Range-Based Linear Quantization Features (#95) · 907a6f04
      Guy Jacob authored
      * Asymmetric post-training quantization (only symmetric quantization was supported until now)
      * Quantization aware training for range-based (min-max) symmetric and asymmetric quantization
      * Per-channel quantization support in both training and post-training
      * Added tests and examples
      * Updated documentation
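The per-channel support mentioned above means each output channel of a weight tensor gets its own scale, computed from that channel's range, rather than one scale for the whole tensor. A pure-Python sketch (Distiller itself operates on PyTorch tensors):

```python
def per_channel_scales(weights, num_bits):
    """weights: list of per-channel value lists. Symmetric (min-max)
    quantization: each channel's scale comes from its max absolute
    value, so large channels don't dominate small ones."""
    n = 2 ** (num_bits - 1) - 1   # e.g. 127 for 8 bits
    return [n / max(abs(v) for v in channel) for channel in weights]

w = [[0.5, -1.0, 0.25],   # channel 0: max |w| = 1.0
     [2.0, -0.5, 1.0]]    # channel 1: max |w| = 2.0
scales = per_channel_scales(w, num_bits=8)
```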
  11. Jul 22, 2018
    • PACT quantizer (#30) · df9a00ce
      Gal Novik authored
      * Adding PACT quantization method
      * Move the logic that modifies the optimizer to account for changes made by the quantizer into the Quantizer class itself
      * Updated documentation and tests
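A minimal sketch of the PACT activation (Choi et al., 2018), the method this commit adds: the activation is clipped at a learnable upper bound alpha instead of ReLU's unbounded range, then linearly quantized to k bits. This shows the forward computation only; the actual method learns alpha via backpropagation during training:

```python
def pact_forward(x, alpha, num_bits):
    """PACT forward pass: clip to [0, alpha], then linearly
    quantize to 2**num_bits - 1 levels."""
    y = min(max(x, 0.0), alpha)               # clipped activation
    n = 2 ** num_bits - 1
    return round(y * n / alpha) * alpha / n   # quantize / dequantize

# Input above the clipping bound saturates at alpha
out = pact_forward(1.3, alpha=1.0, num_bits=4)
```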
  12. Jul 17, 2018
    • Quantizer tests, fixes and docs update · 6b166cec
      Guy Jacob authored
      * Add Quantizer unit tests
      * Require 'bits_overrides' to be OrderedDict to support overlapping
        patterns in a predictable manner + update documentation to reflect this
      * Quantizer class cleanup
        * Use "public" nn.Module APIs instead of protected attributes
        * Call the builtins set/get/delattr instead of the class special methods
          (__***__)
        * Fix issues reported in #24
      * Fix bug in RangeLinearQuantParamLayerWrapper - add explicit override of
        pre_quantized_forward accepting a single input (#15)
      * Add DoReFa test to full_flow_tests
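The OrderedDict requirement for 'bits_overrides' can be shown with a small sketch: when several patterns match the same layer name, the first matching entry wins, so insertion order makes overlapping patterns predictable. The matching logic below is illustrative, not Distiller's actual code:

```python
from collections import OrderedDict
import re

# Exact layer name first, broader pattern afterwards: with a plain
# dict (pre-3.7) the iteration order would be unpredictable, so an
# overlapping pattern could shadow the specific one.
bits_overrides = OrderedDict([
    ('conv1',  {'wts': 8, 'acts': 8}),
    ('conv.*', {'wts': 4, 'acts': 4}),
])

def lookup(layer_name):
    """Return the first override whose pattern fully matches."""
    for pattern, cfg in bits_overrides.items():
        if re.fullmatch(pattern, layer_name):
            return cfg
    return None

first = lookup('conv1')   # matched by the specific entry
other = lookup('conv2')   # falls through to the broad pattern
```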