- Apr 28, 2020 (Guy Jacob)
- Aug 08, 2019 (Lev Zlotnik)
- Jul 03, 2019 (Guy Jacob)
- May 20, 2019 (Guy Jacob)
  This NCF implementation is based on the implementation found in the MLPerf Training GitHub repository, specifically on the last revision of the code before the switch to the extended dataset. See: https://github.com/mlperf/training/tree/fe17e837ed12974d15c86d5173fe8f2c188434d5/recommendation/pytorch

  We've made several modifications to the code:
  * Removed all MLPerf-specific code, including logging
  * In ncf.py:
    * Added calls to Distiller compression APIs
    * Added progress indication in the training and evaluation flows
  * In neumf.py:
    * Added an option to split the final FC layer
    * Replaced all functional calls with modules so they can be detected by Distiller (see the sketch after this list)
  * In dataset.py:
    * Sped up data loading - on the first run, data is loaded from the CSVs and then pickled; on subsequent runs the pickle is loaded. This is much faster than the original implementation, but still very slow.
    * Added progress indication during the data load process
  * Removed some irrelevant content from README.md
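  As an illustration of the "functional calls replaced with modules" point above, here is a minimal, hypothetical sketch (not the actual neumf.py code) of the general PyTorch pattern: expressing an activation as an nn.Module instance makes it part of the module hierarchy, which is what tools like Distiller traverse when deciding which operations to compress or quantize.

  ```python
  # Hypothetical sketch of replacing a functional call with a module so that
  # module-traversal tools (e.g. Distiller) can see the operation.
  import torch
  import torch.nn as nn


  class TinyMLP(nn.Module):
      def __init__(self, in_features: int, hidden: int, out_features: int):
          super().__init__()
          self.fc1 = nn.Linear(in_features, hidden)
          # Module-based ReLU instead of torch.nn.functional.relu(), so it
          # appears in named_modules() and can be wrapped or replaced.
          self.relu = nn.ReLU()
          self.fc2 = nn.Linear(hidden, out_features)

      def forward(self, x: torch.Tensor) -> torch.Tensor:
          # Before: x = F.relu(self.fc1(x))  -- invisible to module traversal
          x = self.relu(self.fc1(x))
          return self.fc2(x)


  if __name__ == "__main__":
      model = TinyMLP(8, 16, 1)
      # The activation is now listed alongside the Linear layers.
      print([name for name, _ in model.named_modules() if name])
  ```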
- Apr 03, 2019 (Guy Jacob)
- Feb 26, 2019 (Lev Zlotnik)
  Not backward compatible - re-installation is required
  * Fixes for PyTorch==1.0.0
  * Refactored the folder structure
  * Updated the installation section in the docs
- Dec 02, 2018 (Guy Jacob)
- Jun 21, 2018
- May 07, 2018 (Guy Jacob)