Commit be97de23 authored by Neta Zmora

Revert "ModelSummary: adapt sparsity accounting to correctly account for "weight tying"

This reverts commit ecade1b2.
This simply does not work, so reverting until we find a correct solution.
For example, in the language model the encoder and decoder weights are tied and share the
same memory, yet I can't see how to determine that they are the same parameter.
parent 8de6223e
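
Below is a minimal PyTorch sketch of the tying described above (not Distiller's actual ModelSummary code; the model and its names are illustrative). The decoder reuses the encoder's embedding matrix, so any accounting that walks the modules one at a time counts that tensor twice, even though at the whole-model level an identity or data_ptr() comparison shows it is a single parameter.

import torch
import torch.nn as nn

class TinyTiedLM(nn.Module):
    """Illustrative language model with tied encoder/decoder weights."""
    def __init__(self, vocab_size=10, emb_size=4):
        super().__init__()
        self.encoder = nn.Embedding(vocab_size, emb_size)
        self.decoder = nn.Linear(emb_size, vocab_size, bias=False)
        # Weight tying: the decoder now shares the encoder's parameter tensor
        # (one underlying memory buffer for both modules).
        self.decoder.weight = self.encoder.weight

model = TinyTiedLM()

# Per-module accounting sees two weight tensors and counts the shared one twice...
per_module_total = sum(m.weight.numel() for m in (model.encoder, model.decoder))

# ...while the tied weights are literally the same object backed by the same storage.
same_object = model.encoder.weight is model.decoder.weight
same_storage = model.encoder.weight.data_ptr() == model.decoder.weight.data_ptr()

print(per_module_total, same_object, same_storage)  # 80 True True

The identity/data_ptr() check only works when comparing parameters across the whole model; as the message above notes, that information is not readily available where the summary does its per-layer accounting, which is why the change was reverted.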