Remove single worker limitation in deterministic mode (#227)
Also:
* The single-worker limitation is no longer needed; the underlying issue has been fixed in PyTorch since v0.4.0 (https://github.com/pytorch/pytorch/pull/4640).
* compress_classifier.py: if run in evaluation mode (--eval), enable deterministic mode.
* Call utils.set_deterministic when the data loaders are created if the deterministic argument is set (don't assume the user calls it elsewhere); see the sketch below.
* Disable CUDNN benchmark mode in utils.set_deterministic (https://pytorch.org/docs/stable/notes/randomness.html#cudnn).
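To illustrate the intent of these changes, here is a minimal, hypothetical sketch of a `set_deterministic`-style helper and a loader factory that calls it at creation time. It is not the actual Distiller code: the function names, the default seed of 0, and the `make_loader` wrapper are assumptions for illustration; only the cuDNN flags and seeding calls mirror what the commit message describes.

```python
import random

import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset


def set_deterministic(seed=0):
    # Hypothetical sketch of a set_deterministic helper; the real
    # distiller.utils.set_deterministic may differ in details.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # Force cuDNN to use deterministic algorithms and disable benchmark
    # mode, which otherwise auto-tunes (and may change) the chosen
    # algorithms from run to run.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


def make_loader(dataset, batch_size, num_workers, deterministic=False):
    # Sketch of enabling determinism at loader-creation time instead of
    # relying on the caller. num_workers no longer has to be forced to 1,
    # since multi-worker loading is reproducible in PyTorch >= 0.4.0.
    if deterministic:
        set_deterministic()
    return DataLoader(dataset, batch_size=batch_size, shuffle=True,
                      num_workers=num_workers)


if __name__ == '__main__':
    data = TensorDataset(torch.randn(64, 3, 32, 32),
                         torch.randint(0, 10, (64,)))
    loader = make_loader(data, batch_size=16, num_workers=4,
                         deterministic=True)
    print(next(iter(loader))[1])  # same shuffle order on every run
```

With the seed fixed before the `DataLoader` is built, the shuffle order produced by its sampler is identical across runs even with multiple workers, which is why the single-worker restriction can be dropped.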
Showing 5 changed files with 34 additions and 21 deletions
- distiller/apputils/data_loaders.py (10 additions, 1 deletion)
- distiller/utils.py (12 additions, 7 deletions)
- examples/classifier_compression/compress_classifier.py (7 additions, 8 deletions)
- examples/classifier_compression/parser.py (1 addition, 1 deletion)
- tests/full_flow_tests.py (4 additions, 4 deletions)