Commit ab35fed7 authored by Thomas Fan's avatar Thomas Fan Committed by Neta Zmora

DOC: Fix (#10)

Reviewed and looking good.  We have to set a convention for naming files.
parent a029c9b0
@@ -263,7 +263,7 @@ $ pip3 install -r doc-requirements.txt
 To build the project documentation run:
 ```
-$ cd distiller/docs
+$ cd distiller/docs-src
 $ mkdocs build --clean
 ```
 This will create a folder named 'site' which contains the documentation website.
File moved
@@ -32,13 +32,13 @@ We welcome new ideas and implementations of Jupyter.
 Roughly, the notebooks can be divided into three categories.
 ### Theory
-- [jupyter/L1-regularization.ipynb](localhost:8888/notebooks/jupyter/L1-regularization.ipynb): Experience hands-on how L1 and L2 regularization affect the solution of a toy loss-minimization problem, to get a better grasp on the interaction between regularization and sparsity.
-- [jupyter/alexnet_insights.ipynb](localhost:8888/notebooks/alexnet_insights.ipynb): This notebook reviews and compares a couple of pruning sessions on Alexnet. We compare distributions, performance, statistics and show some visualizations of the weights tensors.
+- [jupyter/L1-regularization.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/L1-regularization.ipynb): Experience hands-on how L1 and L2 regularization affect the solution of a toy loss-minimization problem, to get a better grasp of the interaction between regularization and sparsity.
+- [jupyter/alexnet_insights.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/alexnet_insights.ipynb): This notebook reviews and compares a couple of pruning sessions on Alexnet. We compare distributions, performance, and statistics, and show some visualizations of the weight tensors.
 ### Preparation for compression
-- [jupyter/model_summary.ipynb](localhost:8888/notebooks/jupyter/model_summary.ipynb): Begin by getting familiar with your model. Examine the sizes and properties of layers and connections. Study which layers are compute-bound, and which are bandwidth-bound, and decide how to prune or regularize the model.
-- [jupyter/sensitivity_analysis.ipynb](localhost:8888/notebooks/jupyter/sensitivity_analysis.ipynb): If you performed pruning sensitivity analysis on your model, this notebook can help you load the results and graphically study how the layers behave.
-- [jupyter/interactive_lr_scheduler.ipynb](localhost:8888/notebooks/jupyter/interactive_lr_scheduler.ipynb): The learning rate decay policy affects pruning results, perhaps as much as it affects training results. Graph a few LR-decay policies to see how they behave.
-- [jupyter/jupyter/agp_schedule.ipynb](localhost:8888/notebooks/jupyter/agp_schedule.ipynb): If you are using the Automated Gradual Pruner, this notebook can help you tune the schedule.
+- [jupyter/model_summary.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/model_summary.ipynb): Begin by getting familiar with your model. Examine the sizes and properties of layers and connections. Study which layers are compute-bound and which are bandwidth-bound, and decide how to prune or regularize the model.
+- [jupyter/sensitivity_analysis.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/sensitivity_analysis.ipynb): If you performed pruning sensitivity analysis on your model, this notebook can help you load the results and graphically study how the layers behave.
+- [jupyter/interactive_lr_scheduler.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/interactive_lr_scheduler.ipynb): The learning rate decay policy affects pruning results, perhaps as much as it affects training results. Graph a few LR-decay policies to see how they behave.
+- [jupyter/agp_schedule.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/agp_schedule.ipynb): If you are using the Automated Gradual Pruner, this notebook can help you tune the schedule.
 ### Reviewing experiment results
-- [jupyter/compare_executions.ipynb](localhost:8888/notebooks/jupyter/compare_executions.ipynb): This is a simple notebook to help you graphically compare the results of executions of several experiments.
-- [jupyter/compression_insights.ipynb](localhost:8888/notebooks/compression_insights.ipynb): This notebook is packed with code, tables and graphs to us understand the results of a compression session. Distiller provides *summaries*, which are Pandas dataframes, which contain statistical information about you model. We chose to use Pandas dataframes because they can be sliced, queried, summarized and graphed with a few lines of code.
+- [jupyter/compare_executions.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/compare_executions.ipynb): This is a simple notebook to help you graphically compare the results of executions of several experiments.
+- [jupyter/compression_insights.ipynb](https://github.com/NervanaSystems/distiller/blob/master/jupyter/compression_insights.ipynb): This notebook is packed with code, tables and graphs to help us understand the results of a compression session. Distiller provides *summaries*, which are Pandas dataframes containing statistical information about your model. We chose Pandas dataframes because they can be sliced, queried, summarized and graphed with a few lines of code.
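The bulk of this change swaps local Jupyter-server links for permanent GitHub links. A minimal sketch of how such a rewrite could be automated (the `fix_notebook_links` helper and its regex are hypothetical illustrations, not part of Distiller):

```python
import re

# Assumed base URL for the repository's notebooks on GitHub.
GITHUB_BASE = "https://github.com/NervanaSystems/distiller/blob/master"

def fix_notebook_links(markdown: str) -> str:
    """Rewrite localhost Jupyter-server links to GitHub blob URLs.

    A link such as [jupyter/foo.ipynb](localhost:8888/notebooks/jupyter/foo.ipynb)
    already carries the repo-relative path in its link text, so that text
    is reused as the path component of the GitHub URL.
    """
    pattern = re.compile(r"\[([^\]]+\.ipynb)\]\(localhost:8888/notebooks/[^)]*\)")
    return pattern.sub(lambda m: f"[{m.group(1)}]({GITHUB_BASE}/{m.group(1)})", markdown)

line = "- [jupyter/model_summary.ipynb](localhost:8888/notebooks/jupyter/model_summary.ipynb): summary"
print(fix_notebook_links(line))
```

Note that this reuse of the link text only works when the text matches the real repo path; entries with a typoed text (such as the doubled `jupyter/jupyter/` above) would still need a manual fix.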