Neta Zmora authored
Added an implementation of:

"Dynamic Network Surgery for Efficient DNNs", Yiwen Guo, Anbang Yao, Yurong Chen.
NIPS 2016, https://arxiv.org/abs/1608.04493.

- Added SplicingPruner: a pruner that both prunes and splices connections
  (a sketch of the prune-and-splice rule follows this list).
- Included an example schedule on ResNet20 CIFAR.
- New features for compress_classifier.py:
   1. Added the "--masks-sparsity" argument which, when enabled, logs the
      sparsity of the weight masks during training (a sketch of the
      computation follows this list).
   2. Added a new command-line argument to report the top N best accuracy
      scores, instead of just the highest score. This is sometimes useful
      when pruning a pre-trained model that reaches its best Top1 accuracy
      in the first few pruning epochs (see the sketch after this list).
- New features for PruningPolicy:
   1. The pruning policy can use two copies of the weights: one is used during
      the forward pass, the other during the backward pass. This is controlled
      by the "mask_on_forward_only" argument (a sketch of this mechanism
      follows this list).
   2. If we enable "mask_on_forward_only", we probably want to apply the mask
      permanently at some point (usually once the pruning phase is done).
      This is controlled by the "keep_mask" argument (also covered in the
      sketch below).
   3. We introduce a first implementation of scheduling at training-iteration
      granularity (i.e., at mini-batch granularity). Until now, pruning could
      only be scheduled at epoch granularity. This is controlled by the
      "mini_batch_pruning_frequency" argument (disabled by setting it to
      zero); see the sketch after this list.
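
SplicingPruner follows the prune-and-splice rule of the DNS paper: connections
whose magnitude falls below a low threshold are pruned, and pruned connections
whose magnitude grows back above a high threshold are spliced back in. A
minimal sketch of that mask update, with a hypothetical function name and
threshold parameters (not the actual Distiller API):

    import torch

    def update_splicing_mask(weight, mask, t_low, t_high):
        # Illustrative sketch: t_low/t_high are hypothetical thresholds.
        # Prune connections whose magnitude dropped below t_low, splice back
        # (re-activate) connections whose magnitude grew above t_high, and
        # leave the mask unchanged for magnitudes in between.
        abs_w = weight.abs()
        new_mask = mask.clone()
        new_mask[abs_w < t_low] = 0.
        new_mask[abs_w > t_high] = 1.
        return new_mask

Because splicing is decided from the current magnitude of the dense weights,
pruned weights must keep receiving gradient updates, which is what
"mask_on_forward_only" (sketched below) enables.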
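
The "--masks-sparsity" log amounts to the fraction of zeros in each weight
mask. A minimal sketch of such a computation; the helper name and the
name-to-mask dictionary layout are illustrative:

    def masks_sparsity(masks):
        # masks: dict mapping parameter name -> mask tensor (None if unmasked).
        report = {}
        for name, mask in masks.items():
            if mask is None:
                continue
            zeros = float((mask == 0).sum())
            report[name] = 100. * zeros / mask.numel()
        return report  # per-parameter mask sparsity, in percent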
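
Reporting the top N best accuracy scores can be done with a small bounded
min-heap. A sketch, with hypothetical class and argument names:

    import heapq

    class BestScoreTracker:
        def __init__(self, num_best_scores):
            self.num_best_scores = num_best_scores
            self._heap = []  # min-heap of (top1, epoch) pairs

        def update(self, top1, epoch):
            heapq.heappush(self._heap, (top1, epoch))
            if len(self._heap) > self.num_best_scores:
                heapq.heappop(self._heap)  # drop the lowest retained score

        def best(self):
            # The N highest Top1 scores seen so far, best first.
            return sorted(self._heap, reverse=True)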
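
"mask_on_forward_only" means the mask is applied to the weights only for the
forward pass, while the optimizer updates the dense copy, so pruned weights
keep training and can later be spliced back. A minimal sketch of this
mechanism, with hypothetical hook names (not the actual PruningPolicy hooks):

    import torch

    class ForwardOnlyMasker:
        def __init__(self, mask):
            self.mask = mask
            self._dense_weight = None

        def before_forward(self, weight):
            # Stash the dense weights; run the forward pass on the masked copy.
            self._dense_weight = weight.detach().clone()
            weight.data.mul_(self.mask)

        def before_optimizer_step(self, weight):
            # Restore the dense weights so the gradient update (computed
            # through the masked weights) reaches pruned connections too.
            weight.data.copy_(self._dense_weight)

        def keep_mask(self, weight):
            # "keep_mask": permanently bake the mask into the weights once
            # the pruning phase is done.
            weight.data.mul_(self.mask)
            self._dense_weight = None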
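
Mini-batch-granularity scheduling amounts to re-running the pruner every
"mini_batch_pruning_frequency" training iterations instead of once per epoch.
An illustrative sketch; the hook name and the pruner entry point are
assumptions, not the actual API:

    class IterationScheduledPruning:
        def __init__(self, pruner, mini_batch_pruning_frequency):
            self.pruner = pruner
            self.frequency = mini_batch_pruning_frequency  # 0 disables this
            self.mini_batches_seen = 0

        def on_minibatch_begin(self, model):
            # Recompute the masks every `frequency` mini-batches.
            if self.frequency and self.mini_batches_seen % self.frequency == 0:
                self.pruner.update_masks(model)  # hypothetical entry point
            self.mini_batches_seen += 1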

   Some of the abstractions may have leaked from PruningPolicy to CompressionScheduler.
   Need to reexamine this in the future.