diff --git a/README.md b/README.md
index 77ed092dd0d414305da0ab59c0b947d305c92d29..863ec065f272fa577a373340c07a4931dd8a534e 100755
--- a/README.md
+++ b/README.md
@@ -152,7 +152,7 @@ Beware.
  - Group Lasso and group variance regularization
 * **Quantization**
   - Automatic mechanism to transform existing models to quantized versions, with customizable bit-width configuration for different layers. No need to re-write the model for different quantization methods.
-  - Post-training quantization of trained full-precision models
+  - Post-training quantization of trained full-precision models, both dynamic and static (statistics-based)
   - Support for [quantization-aware training](https://nervanasystems.github.io/distiller/quantization/index.html#quantization-aware-training) in the loop
 * **Knowledge distillation**
   - Training with [knowledge distillation](https://nervanasystems.github.io/distiller/knowledge_distillation/index.html), in conjunction with the other available pruning / regularization / quantization methods.