Unverified Commit 4b3994f2 authored by Neta Zmora, committed by GitHub

update README

Remove warning regarding Distiller release 0.3 (breaking backward compat)
parent 4ad16ef0
@@ -35,11 +35,6 @@
 Network compression can reduce the memory footprint of a neural network, increase its inference speed, and save energy. Distiller provides a [PyTorch](http://pytorch.org/) environment for prototyping and analyzing compression algorithms, such as sparsity-inducing methods and low-precision arithmetic.
-#### Note on Release 0.3 - Possible BREAKING Changes
-As of release 0.3, we've moved some code around to enable proper packaging and installation of Distiller. In addition, we updated Distiller to support PyTorch 1.X, which might also cause older code to break due to some API changes.
-If updating from an earlier revision of the code, please follow the instructions in the [install](#install-the-package) section to ensure proper installation of Distiller and all dependencies.
 ## Table of Contents
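For context on the "sparsity-inducing methods" the overview paragraph mentions, below is a minimal sketch of magnitude-based weight pruning in plain PyTorch. It is illustrative only and does not use Distiller's own API (Distiller drives its pruners through compression-schedule configuration); the function name `magnitude_prune` and the at-or-below-threshold rule are assumptions made for this sketch.

```python
# A minimal sketch of magnitude-based weight pruning in plain PyTorch.
# Illustrative only -- not Distiller's actual pruning API.
import torch
import torch.nn as nn

def magnitude_prune(module: nn.Module, sparsity: float) -> None:
    """Zero out the smallest-magnitude weights so roughly `sparsity`
    fraction of each weight tensor becomes zero."""
    for param in module.parameters():
        if param.dim() < 2:  # skip biases and other 1-D parameters
            continue
        flat = param.abs().flatten()
        k = int(sparsity * flat.numel())
        if k == 0:
            continue
        # k-th smallest magnitude; everything at or below it is zeroed
        threshold = flat.kthvalue(k).values
        with torch.no_grad():
            param.mul_((param.abs() > threshold).float())

model = nn.Linear(64, 10)
magnitude_prune(model, sparsity=0.5)
print(f"zeros: {(model.weight == 0).float().mean().item():.2%}")
```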