diff --git a/README.md b/README.md
index 0ac783f8e8c02e85eb55d7a80e0e6cf477ca6e9c..b13c1e8b5fbc7085c0fcc89ac90d54ac306dccc8 100755
--- a/README.md
+++ b/README.md
@@ -37,13 +37,50 @@ Network compression can reduce the memory footprint of a neural network, increas
 
 <details><summary><b>What's New in October?</b></summary>
 <p>
-We've added collection of activation statistics! 
+<b><i>We've added two new Jupyter notebooks:</i></b>
+
+- The [first notebook](https://github.com/NervanaSystems/distiller/blob/master/jupyter/what_are_you_looking_at.ipynb) contrasts what sparse and dense versions of ResNet50 "look at".
+<center> <img src="imgs/sparse_dense_cmaps.png"></center>
+
+- The [second notebook](https://github.com/NervanaSystems/distiller/blob/master/jupyter/truncated_svd.ipynb) shows a simple application of Truncated SVD to the linear layer in ResNet50 (sketched below).
+</p>
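+
+For readers who want the gist without opening the notebook, here is a minimal, standalone PyTorch
+sketch of truncated SVD applied to a fully-connected layer (the layer shape and the choice of rank
+are illustrative, not taken from the notebook):
+
+```python
+import torch
+
+# Low-rank approximation of a fully-connected layer via truncated SVD.
+# `fc` stands in for ResNet50's final Linear(2048, 1000); `rank` is an arbitrary example value.
+fc = torch.nn.Linear(2048, 1000)
+rank = 400
+
+W = fc.weight.data                    # (1000, 2048)
+U, S, V = torch.svd(W)                # W ~= U @ diag(S) @ V.t()
+U_r, S_r, V_r = U[:, :rank], S[:rank], V[:, :rank]   # keep the top-`rank` components
+
+# Replace the single large layer with two smaller ones whose product approximates W.
+first = torch.nn.Linear(2048, rank, bias=False)
+second = torch.nn.Linear(rank, 1000, bias=True)
+first.weight.data = V_r.t().contiguous()   # (rank, 2048)
+second.weight.data = U_r * S_r             # (1000, rank) == U_r @ diag(S_r)
+second.bias.data = fc.bias.data
+low_rank_fc = torch.nn.Sequential(first, second)
+```
+
+The two smaller layers hold rank × (2048 + 1000) weights instead of 2048 × 1000, so a small enough
+rank trades a little accuracy for fewer parameters and multiply-accumulates.
+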
+<p>
+<b>We've added the collection of activation statistics!</b>
 
 Activation statistics can be leveraged to make pruning and quantization decisions, and so
 we added support to collect these data.
-Two types of activation statistics are supported: summary statistics, and detailed records 
+Two types of activation statistics are supported: summary statistics, and detailed records
 per activation.
-Currently we support the following summaries: 
+Currently we support the following summaries:
 - Average activation sparsity, per layer
 - Average L1-norm for each activation channel, per layer
 - Average sparsity for each activation channel, per layer
diff --git a/imgs/sparse_dense_cmaps.png b/imgs/sparse_dense_cmaps.png
new file mode 100755
index 0000000000000000000000000000000000000000..eaf0d6687991f624aca42c64762c5a754f1c17c2
Binary files /dev/null and b/imgs/sparse_dense_cmaps.png differ