AMC and other refactoring - large merge (#339)
    Neta Zmora authored
* An implementation of AMC (the previous implementation
code has moved to a new location under
/distiller/examples/auto_compression/amc).  AMC is aligned
with the ‘master’ branch of Coach.
* compress_classifier.py is refactored.  The base code moved
to /distiller/apputils/image_classifier.py.  Further refactoring
will follow.
We want to provide a simple and small API to the basic features of
a classifier-compression application (see the sketch after this item).
This will help applications that want to use the main features of a
classifier-compression application, without the standard training
regimen.
AMC is one example of a stand-alone application that needs to leverage
the capabilities of a classifier-compression application, but is currently
coupled to `compress_classifier.py`.
`multi-finetune.py` is another example.
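Below is a hypothetical usage sketch of the small API that this refactoring works toward. The names used here (`ClassifierCompressor`, `init_classifier_compression_arg_parser`, `train_one_epoch`, `validate_one_epoch`) are assumptions for illustration, not a documented interface:

```python
# Hypothetical driver for the refactored classifier-compression API.
# All names below are assumed for illustration; the real module is
# distiller/apputils/image_classifier.py, but its interface may differ.
from distiller.apputils.image_classifier import (
    ClassifierCompressor, init_classifier_compression_arg_parser)

args = init_classifier_compression_arg_parser().parse_args()
app = ClassifierCompressor(args, script_dir=".")
for epoch in range(args.epochs):
    app.train_one_epoch(epoch)      # one training pass over the data
    app.validate_one_epoch(epoch)   # evaluate on the validation set
```

A stand-alone application such as AMC could then drive training and evaluation through calls like these, instead of being coupled to `compress_classifier.py`.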
* ranked_structures_pruner.py:
** Added support for grouping channels/filters.
Sometimes we want to prune structures in groups: e.g. groups of
8 channels.  This feature does not force the groups to be adjacent,
so it is more like a set of structures.  E.g. when pruning
channels from a 64-channel convolution, grouped by 8 channels, we
will prune exactly one of 0/8/16/24/32/40/48/56 channels, i.e.
always a multiple of 8 channels, excluding the set of all 64 channels
(see the grouping sketch after this item).
** Added FMReconstructionChannelPruner: channel pruning that uses
L1-magnitude to rank and select the channels to remove, and
feature-map reconstruction to improve resilience to the pruning
(see the reconstruction sketch after this item).
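The grouping rule can be illustrated with a small sketch (my own illustrative code, not the pruner's implementation; it assumes the count is rounded down, although rounding is configurable, as noted below):

```python
# Snap the number of pruned channels to a multiple of the group size,
# and never allow pruning the whole layer (e.g. all 64 channels).
def grouped_prune_count(n_channels, fraction, group_size):
    n = int(n_channels * fraction)
    n -= n % group_size                      # round down to a multiple of group_size
    return min(n, n_channels - group_size)   # exclude "prune everything"

assert grouped_prune_count(64, 0.50, 8) == 32
assert grouped_prune_count(64, 1.00, 8) == 56  # all 64 channels is excluded
```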
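The reconstruction step can be sketched as follows (illustrative code with invented names, not the FMReconstructionChannelPruner API): rank input channels by the L1 magnitude of their weights, drop the weakest, then re-fit the surviving weights by least squares so the layer's output feature maps approximate the originals.

```python
import numpy as np

def prune_and_reconstruct(W, X, Y, n_keep):
    """W: (out, in) layer weights; X: (samples, in) layer inputs;
    Y: (samples, out) original outputs; keep n_keep input channels."""
    l1 = np.abs(W).sum(axis=0)                # L1 magnitude per input channel
    keep = np.sort(np.argsort(l1)[-n_keep:])  # indices of strongest channels
    # Least-squares reconstruction: find W_new with X[:, keep] @ W_new.T ~= Y
    W_new, *_ = np.linalg.lstsq(X[:, keep], Y, rcond=None)
    return keep, W_new.T                      # (out, n_keep) re-fitted weights
```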
* Added a script to run multiple instances of an
experiment in different processes:
examples/classifier_compression/multi-run.py
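A minimal sketch of the idea (the actual multi-run.py differs; the command-line flags shown are illustrative):

```python
# Launch n independent copies of an experiment, each in its own process.
import subprocess
import sys

def run_instances(n, base_args):
    procs = [subprocess.Popen([sys.executable, "compress_classifier.py"]
                              + base_args + ["--name", f"run_{i}"])
             for i in range(n)]
    for p in procs:
        p.wait()   # block until every instance finishes
```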
* Set the seed value even when it is not specified by the command-line
arguments, so that we can try to recreate the session.
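A minimal sketch of this behavior (hypothetical helper, not the exact Distiller code):

```python
import random
import numpy as np
import torch

def set_seed(seed=None):
    if seed is None:                        # no seed given on the command line:
        seed = np.random.randint(0, 2**31)  # draw one anyway, and log it
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    return seed   # record this value to recreate the session later
```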
* Added pruning ranking noise:
Ranking noise introduces Gaussian noise when ranking channels/filters
using the Lp-norm.  The noise is introduced using the epsilon-greedy
methodology, where ranking by the exact Lp-norm is considered the greedy choice.
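A minimal sketch of the epsilon-greedy ranking (parameter names are illustrative):

```python
import torch

def noisy_ranking(magnitudes, epsilon, noise_std):
    """magnitudes: per-structure Lp-norms.  With probability epsilon, rank
    with added Gaussian noise; otherwise rank exactly (the greedy choice)."""
    if torch.rand(1).item() < epsilon:
        return torch.argsort(magnitudes + torch.randn_like(magnitudes) * noise_std)
    return torch.argsort(magnitudes)   # greedy: exact Lp-norm ranking
```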
* Added configurable rounding of the pruning level: choose whether to
round up or down when rounding the number of structures to prune
(rounding is always to an integer).
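A sketch of the configurable rounding (illustrative):

```python
import math

def structures_to_prune(n_structures, fraction, round_up=False):
    raw = n_structures * fraction
    return math.ceil(raw) if round_up else math.floor(raw)

structures_to_prune(64, 0.33)                 # 21 (21.12 rounded down)
structures_to_prune(64, 0.33, round_up=True)  # 22
```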