- Oct 23, 2019
Neta Zmora authored
Fix the command-line examples in the AMC notebooks, some of which were incorrect (as documented in issue #395). Also fix some bugs that were introduced with the refactoring of the low-level pruning API.
- Oct 07, 2019
Neta Zmora authored
- Sep 27, 2019
Neta Zmora authored
Move these files to their true location instead of using soft links. Also add a short README file to the distiller/examples/baseline_networks directory.
- Sep 06, 2019
Neta Zmora authored
Integrate the code for the DDPG agent from https://github.com/mit-han-lab/amc-release. The instructions for cloning HAN's code and then making changes to fit Distiller were too complicated, so we added the integrated files to distiller/examples/auto_compression/amc/rl_lib/hanlab.
- Sep 02, 2019
Neta Zmora authored
Mainly: moved NetworkWrapper to a separate file.
- Sep 01, 2019
Neta Zmora authored
FMReconstructionChannelPruner: add support for nn.Linear layers.
utils.py: add non_zero_channels() (a sketch follows this list).
thinning: support removing channels from FC layers that precede Conv layers.
test_pruning.py: add test_row_pruning().
scheduler: support initialization from a dictionary of Maskers.
coach_if.py: fix the imports of Clipped-PPO and TD3.
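A hedged sketch of what a utility like non_zero_channels() might look like; the exact signature and semantics in Distiller's utils.py may differ:

```python
import torch

# Illustrative sketch: return the indices of the input channels of a 4-D
# convolution weight (out_ch, in_ch, kh, kw) that contain at least one
# non-zero element.
def non_zero_channels(weight):
    in_ch = weight.size(1)
    channel_l1 = weight.transpose(0, 1).reshape(in_ch, -1).abs().sum(dim=1)
    return torch.nonzero(channel_l1, as_tuple=False).squeeze(1)
```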
- Aug 13, 2019
Neta Zmora authored (three commits)
- Aug 11, 2019
Neta Zmora authored (three commits)
- Aug 07, 2019
Neta Zmora authored
- Aug 06, 2019
Neta Zmora authored
* An implementation of AMC (the previous implementation code has moved to a new location, under /distiller/examples/auto_compression/amc). AMC is aligned with the ‘master’ branch of Coach.
* compress_classifier.py is refactored. The base code moved to /distiller/apputils/image_classifier.py, and further refactoring will follow. We want to provide a simple and small API to the basic features of a classifier-compression application. This will help applications that want to use the main features of a classifier-compression application without the standard training regimen. AMC is one example of a stand-alone application that needs to leverage the capabilities of a classifier-compression application but is currently coupled to `compress_classifier.py`; `multi-finetune.py` is another example.
* ranked_structures_pruner.py:
** Added support for grouping channels/filters. Sometimes we want to prune a group of structures, e.g. groups of 8 channels. This feature does not force the groups to be adjacent, so it is more like a set of structures. For example, when pruning channels from a 64-channel convolution, grouped by 8 channels, we will prune exactly one of 0/8/16/24/32/40/48/56 channels, i.e. always a multiple of 8 channels, excluding the set of all 64 channels (see the rounding sketch after this list).
** Added FMReconstructionChannelPruner: channel pruning that uses L1 magnitude to rank and select the channels to remove, and feature-map reconstruction to improve resilience to the pruning (see the sketch after this list).
* Added a script to run multiple instances of an experiment in different processes: examples/classifier_compression/multi-run.py.
* Set the seed value even when it is not specified by the command-line arguments, so that we can try to recreate the session.
* Added pruning ranking noise: Gaussian noise is introduced when ranking channels/filters using the Lp-norm. The noise is applied using the epsilon-greedy methodology, where ranking by the exact Lp-norm is considered the greedy action (see the sketch after this list).
* Added configurable rounding of the pruning level: choose whether to round up or down when rounding the number of structures to prune (rounding is always to an integer).
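A minimal sketch of the grouped rounding described above, combining the channel-grouping and configurable-rounding items; the function name and signature are illustrative assumptions, not Distiller's actual API:

```python
import math

# Illustrative sketch, not Distiller's API: choose how many channels to
# prune so the count is a multiple of `group_size`, rounding up or down
# as configured, and never pruning every channel in the layer.
def num_structures_to_prune(n_channels, fraction, group_size=8, round_up=False):
    rounding = math.ceil if round_up else math.floor
    n_prune = rounding(fraction * n_channels / group_size) * group_size
    # Exclude the degenerate case of pruning the entire layer.
    return max(0, min(n_prune, n_channels - group_size))
```

For a 64-channel convolution with group_size=8, this returns exactly one of 0/8/16/24/32/40/48/56, matching the example above.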
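The reconstruction step behind FMReconstructionChannelPruner can be pictured as a least-squares fit. This is a hedged sketch under assumed inputs (a calibration matrix X of per-channel inputs and the original outputs Y, already collected); the helper name is hypothetical, and torch.linalg.lstsq requires PyTorch 1.9 or later:

```python
import torch

# Hypothetical helper, not Distiller's actual code: after choosing which
# input channels survive (ranked by L1 magnitude), re-fit the layer's
# weights by least squares so that the pruned layer approximates the
# original output feature map.
# X: (n_samples, n_channels) inputs collected from a calibration batch.
# Y: (n_samples, n_outputs) the original layer's outputs on that batch.
def reconstruct_weights(X, Y, keep_idx):
    X_kept = X[:, keep_idx]  # keep only the surviving channels
    # Solve min_W ||X_kept @ W - Y||_F^2.
    return torch.linalg.lstsq(X_kept, Y).solution
```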
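Finally, a minimal sketch of the epsilon-greedy ranking noise; the function name, signature, and noise scaling are assumptions, not Distiller's exact implementation:

```python
import torch

# Hypothetical sketch, not Distiller's API: rank the input channels of a
# conv weight (out_ch, in_ch, kh, kw) by Lp-norm magnitude. With
# probability `epsilon` the magnitudes are perturbed with Gaussian noise
# before ranking; otherwise the exact norms are used (the greedy action).
def rank_channels(weight, p=1, epsilon=0.1, noise_std=0.05):
    in_ch = weight.size(1)
    mags = weight.transpose(0, 1).reshape(in_ch, -1).norm(p=p, dim=1)
    if torch.rand(1).item() < epsilon:
        mags = mags + torch.randn_like(mags) * noise_std * mags.mean()
    return torch.argsort(mags)  # ascending: weakest channels first
```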