  1. Nov 25, 2018
    • Activation statistics: fix computation of Channel-wise APoZ · 6fb0fed5
      Neta Zmora authored
      Instead of the average percentage of zeros, we were returning the
      average percentage of non-zeros. So this inverts the results, and also
      multiplies by 100, because the name of the statistic says "percentage",
      not "fraction" (not very important, but still...).
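      The corrected statistic can be sketched as follows — a minimal NumPy
      sketch, not the project's actual PyTorch implementation; the function
      name and shapes are assumptions:

      ```python
      import numpy as np

      def channel_apoz(activation):
          """Per-channel Average Percentage of Zeros (APoZ).

          activation: array of shape (batch_size, num_channels, H, W).
          Returns one value per channel, scaled to a percentage (0..100).
          """
          b, c = activation.shape[0], activation.shape[1]
          flat = activation.reshape(b, c, -1)
          # Count zeros (not non-zeros), average over batch and spatial dims,
          # then multiply by 100 because the name says "percentage".
          zero_fraction = (flat == 0).mean(axis=(0, 2))
          return 100.0 * zero_fraction
      ```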
  2. Nov 24, 2018
    • Fix activation stats for Linear layers · 22e3ea8b
      Neta Zmora authored
      Thanks to Dan Alistarh for bringing this issue to my attention.
      The activations of Linear layers have shape (batch_size, output_size),
      while those of Convolution layers have shape (batch_size, num_channels,
      width, height); this distinction in shape was not handled correctly.
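      One way to handle that distinction is to normalize both shapes to
      (batch, channels, -1) before computing statistics. A hedged sketch —
      the helper name is hypothetical, not Distiller's API:

      ```python
      import numpy as np

      def to_bc_view(act):
          """Reshape an activation to (batch, channels, -1).

          Linear layers yield (batch_size, output_size): treat each output
          unit as a 'channel' with a single element. Convolution layers
          yield (batch_size, num_channels, H, W): flatten the spatial dims.
          """
          if act.ndim == 2:                       # Linear layer activation
              return act.reshape(act.shape[0], act.shape[1], 1)
          return act.reshape(act.shape[0], act.shape[1], -1)  # Conv activation
      ```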
      
      This commit also fixes sparsity computation for very large activations,
      as seen in VGG16, which could exhaust memory. One workaround is to use
      smaller batch sizes, but this commit instead counts zeros “manually”,
      using less space.
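      The space-saving idea can be illustrated like this: instead of
      materializing one boolean mask for the whole activation tensor, count
      zeros one sample at a time. This is a sketch under assumed names; the
      real commit operates on PyTorch tensors:

      ```python
      import numpy as np

      def sparsity_percent(act):
          """Percentage of zero elements in a potentially huge activation.

          Iterating over the batch dimension keeps the temporary
          (sample == 0) mask batch_size times smaller than a
          whole-tensor mask would be.
          """
          zeros = 0
          for sample in act:          # one sample's mask at a time
              zeros += int(np.count_nonzero(sample == 0))
          return 100.0 * zeros / act.size
      ```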
      
      Also in this commit:
      - Added a “caveats” section to the documentation.
      - Added more tests.
  3. Jun 13, 2018
  4. Apr 25, 2018
  5. Apr 24, 2018