@@ -399,9 +399,12 @@ This project is licensed under the Apache License 2.0 - see the [LICENSE.md](LIC
-[DeGirum Pruned Models](https://github.com/DeGirum/pruned-models) - a repository containing pruned models and related information.
### Research papers citing Distiller:
- Ziheng Wang, Jeremy Wohlwend, Tao Lei.<br>
*[Structured Pruning of Large Language Models](https://arxiv.org/abs/1910.04732)*,<br>
arXiv:1910.04732, 2019.
- Soroush Ghodrati, Hardik Sharma, Sean Kinzer, Amir Yazdanbakhsh, Kambiz Samadi, Nam Sung Kim, Doug Burger, Hadi Esmaeilzadeh.<br>
*[Mixed-Signal Charge-Domain Acceleration of Deep Neural networks through Interleaved Bit-Partitioned Arithmetic](https://arxiv.org/abs/1906.11915)*,<br>