Unverified commit 148b7474, authored by Neta Zmora, committed by GitHub

Thinning: fix param_name_2_layer_name

This fix does not change the behavior. 
The previous code worked correctly because 'weights' and '.weight' have the same length.
parent 98b54695
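The commit message's claim is easy to check: both suffixes are seven characters long, so the negative slice removes the same number of characters either way. A minimal sketch (the parameter name here is hypothetical, not taken from the repository):

# Both suffixes have length 7, so both slices cut at the same boundary.
assert len('weights') == len('.weight') == 7

param_name = 'features.conv1.weight'  # hypothetical parameter name
# The old and new expressions are therefore equivalent:
assert param_name[:-len('weights')] == param_name[:-len('.weight')] == 'features.conv1'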
@@ -84,7 +84,14 @@ def get_normalized_recipe(recipe):
 
 
 def param_name_2_layer_name(param_name):
-    return param_name[:-len('weights')]
+    """Convert a weights tensor's name to the name of the layer using the tensor.
+
+    By convention, PyTorch modules name their weights parameters as self.weight
+    (see for example: torch.nn.modules.conv) which means that their fully-qualified
+    name when enumerating a model's parameters is the module's name followed by '.weight'.
+    We exploit this convention to convert a weights tensor name to the fully-qualified
+    module name."""
+    return param_name[:-len('.weight')]
 
 
 def directives_equal(d1, d2):
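For illustration, a minimal sketch of how the fixed helper maps parameter names back to module names; the model and its index-based module names are assumptions for the example, not taken from the repository:

import torch.nn as nn

def param_name_2_layer_name(param_name):
    return param_name[:-len('.weight')]

# Names yielded by named_parameters() follow the '<module name>.weight' convention.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Linear(16, 10))
for name, _ in model.named_parameters():
    if name.endswith('.weight'):
        print(name, '->', param_name_2_layer_name(name))
# Prints:
# 0.weight -> 0
# 2.weight -> 2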