Revert "ModelSummary: adapt sparsity accounting to correctly account for "weight tying""
This reverts commit ecade1b2. This simply does not work, so I am reverting until we find a correct solution. For example, in the language model the encoder and decoder weights are tied and share the same memory, and yet I can't see how to determine that they are the same parameter.
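For context, a minimal sketch of what weight tying looks like and why naive per-module counting breaks, assuming PyTorch; the toy `TinyLM` model and its sizes are illustrative, not from this repository or the reverted commit:

```python
# Illustrative sketch (PyTorch assumed): weight tying means two modules hold
# the *same* Parameter object, so summing parameter counts module-by-module
# counts the shared weight twice.
import torch.nn as nn

class TinyLM(nn.Module):
    def __init__(self, vocab_size=100, dim=16):
        super().__init__()
        self.encoder = nn.Embedding(vocab_size, dim)
        self.decoder = nn.Linear(dim, vocab_size, bias=False)
        # Tie the decoder weight to the encoder embedding matrix.
        self.decoder.weight = self.encoder.weight

model = TinyLM()

# Naive per-module accounting double-counts the tied weight ...
naive = sum(p.numel()
            for m in (model.encoder, model.decoder)
            for p in m.parameters())

# ... whereas deduplicating by object identity counts it once.
unique = sum(p.numel()
             for p in {id(p): p
                       for m in (model.encoder, model.decoder)
                       for p in m.parameters()}.values())

print(naive, unique)  # 3200 vs. 1600 for this toy model
```

Deduplicating by object identity only works when the summary code has access to the live parameter objects; it does not solve the accounting problem the reverted commit ran into, it only illustrates why the per-module totals disagree.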