Commit d0ee3762 authored by Neta Zmora

examples/hybrid - fix README
The examples in this directory show hybrid pruning schedules in which we combine several different pruning strategies.
1. [alexnet.schedule_agp_2Dreg.yaml](https://github.com/NervanaSystems/distiller/blob/master/examples/hybrid/alexnet.schedule_agp_2Dreg.yaml)
<br>
This example presents a pruning schedule that performs element-wise (fine-grain) pruning,
with 2D group (kernel) regularization. The regularization "pushes" 2D kernels towards zero, while
the pruning attends to individual weight coefficients. The pruning schedule is driven by AGP.
2. [alexnet.schedule_sensitivity_2D-reg.yaml](https://github.com/NervanaSystems/distiller/blob/master/examples/hybrid/alexnet.schedule_sensitivity_2D-reg.yaml)
<br>
This example also presents a pruning schedule that performs element-wise (fine-grain) pruning,
with 2D group (kernel) regularization. However, the pruner is a `distiller.pruning.SensitivityPruner`, which is
driven by the tensors' [sensitivity](https://nervanasystems.github.io/distiller/algo_pruning.html#sensitivity-pruner) instead of AGP.
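
The combination described above can be sketched as a Distiller schedule YAML. This is a minimal illustrative fragment, not a copy of either linked file: the layer name (`module.features.3.weight`), sparsity targets, epochs, and the regularization strength are made-up placeholders — consult the linked `.yaml` files for the actual values used in these experiments.

```yaml
pruners:
  fine_pruner:
    class: AutomatedGradualPruner       # AGP drives element-wise pruning
    initial_sparsity: 0.05              # placeholder targets
    final_sparsity: 0.70
    weights: [module.features.3.weight] # hypothetical layer name

regularizers:
  kernel_regularizer:
    class: GroupLassoRegularizer        # pushes whole 2D kernels towards zero
    reg_regims:
      module.features.3.weight: [0.0002, '2D']  # placeholder strength

policies:
  - pruner:
      instance_name: fine_pruner
    starting_epoch: 0
    ending_epoch: 30
    frequency: 2
  - regularizer:
      instance_name: kernel_regularizer
    starting_epoch: 0
    ending_epoch: 30
    frequency: 1
```

Swapping the pruner `class` (e.g. to `SensitivityPruner`, with its own parameters) while keeping the regularizer policy gives the second, sensitivity-driven variant.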
<br>
| Experiment | Model | Sparsity | Top1 | Baseline Top1 |
| :---: | --- | :---: | ---: | ---: |
......