Commit 11402988 authored by Neta Zmora

compress_classifier.py: add an option to load a model in serialized mode

By default, when we create a model we wrap it with DataParallel to benefit
from data-parallelism across GPUs (mainly for the convolution layers).

But sometimes we don't want the sample application to do this: for example,
when we receive a model that was trained serially. This commit adds a new
argument to the application to prevent the use of DataParallel.
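
A minimal sketch of how such a flag can gate the wrapping step, assuming a
PyTorch/argparse setup similar to compress_classifier.py's. The flag name
--load-serialized and the resnet18 stand-in model are illustrative only and
are not taken from the commit itself.

import argparse
import torch
import torch.nn as nn
import torchvision.models as models

parser = argparse.ArgumentParser(description='toy classifier driver')
parser.add_argument('--load-serialized', action='store_true',
                    help='load the model without DataParallel wrapping '
                         '(e.g. for a model trained serially)')
args = parser.parse_args()

model = models.resnet18()  # stand-in for whatever model the application loads
if torch.cuda.is_available():
    model = model.cuda()
    if not args.load_serialized and torch.cuda.device_count() > 1:
        # Default behavior: wrap the model so the convolution layers run
        # data-parallel across the available GPUs.
        model = nn.DataParallel(model)

With the flag set, the model is kept as-is, so its state_dict keys are not
prefixed with "module." and it matches a serially trained checkpoint.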
parent 3876a912