diff --git a/README.md b/README.md
index e4149b29b712f82a566f97c5680d3324686883b8..26d9a8b2848ed1e8c2cc70f74b4aa69aff04c2ca 100755
--- a/README.md
+++ b/README.md
@@ -399,9 +399,12 @@ This project is licensed under the Apache License 2.0 - see the [LICENSE.md](LIC
 - [DeGirum Pruned Models](https://github.com/DeGirum/pruned-models) - a repository containing pruned models and related information.
 
 ### Research papers citing Distiller:
+- Ziheng Wang, Jeremy Wohlwend, Tao Lei.<br>
+*[Structured Pruning of Large Language Models](https://arxiv.org/abs/1910.04732)*,<br>
+arXiv:1910.04732, 2019.
 - Soroush Ghodrati, Hardik Sharma, Sean Kinzer, Amir Yazdanbakhsh, Kambiz Samadi, Nam Sung Kim, Doug Burger, Hadi Esmaeilzadeh.<br>
-*[Mixed-Signal Charge-Domain Acceleration of Deep Neural networks through Interleaved Bit-Partitioned Arithmetic]( https://arxiv.org/abs/1906.11915)*,<br>
+*[Mixed-Signal Charge-Domain Acceleration of Deep Neural networks through Interleaved Bit-Partitioned Arithmetic](https://arxiv.org/abs/1906.11915)*,<br>
 arXiv:1906.11915, 2019.
 - Gil Shomron, Tal Horowitz, Uri Weiser.<br>