We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/NervanaSystems/distiller/tags).
...
...
@@ -317,7 +307,7 @@ This project is licensed under the Apache License 2.0 - see the [LICENSE.md](LIC
- Pascal Bacchus, Robert Stewart, Ekaterina Komendantskaya.<br>
*[Accuracy, Training Time and Hardware Efficiency Trade-Offs for Quantized Neural Networks on FPGAs](https://link.springer.com/chapter/10.1007/978-3-030-44534-8_10)*,<br>
In Applied Reconfigurable Computing. Architectures, Tools, and Applications. ARC 2020. Lecture Notes in Computer Science, vol 12083. Springer, Cham
...
...
@@ -427,7 +417,12 @@ Any published work is built on top of the work of many other people, and the cre
* The Python and PyTorch developer communities have shared many invaluable insights, examples and ideas on the Web.
* The authors of the research papers implemented in the [Distiller model-zoo](https://nervanasystems.github.io/distiller/model_zoo.html) have shared their research ideas, theoretical background and results.
### Built With
* [PyTorch](http://pytorch.org/) - The tensor and neural network framework used by Distiller.

Distiller is released as reference code for research purposes. It is not an official Intel product, and the level of quality and support may not be what you would expect from an official product. Additional algorithms and features are planned for the library. Feedback and contributions from the open-source and research communities are more than welcome.