Commit 4385084a authored by Guy Jacob

NCF scripts with Distiller integration

This NCF implementation is based on the one found in the MLPerf
Training GitHub repository, specifically on the last revision of the
code before the switch to the extended dataset. See:
https://github.com/mlperf/training/tree/fe17e837ed12974d15c86d5173fe8f2c188434d5/recommendation/pytorch

We've made several modifications to the code:
* Removed all MLPerf-specific code, including logging
* In ncf.py:
  * Added calls to Distiller compression APIs
  * Added progress indication in the training and evaluation flows
    (both are shown in the first sketch after this list)
* In neumf.py:
  * Added an option to split the final FC layer (see the second
    sketch after this list)
  * Replaced all functional calls with modules so they can be detected
    by Distiller (see the third sketch after this list)
* In dataset.py:
  * Sped up data loading: on the first run, data is loaded from the
    CSVs and then pickled; on subsequent runs the pickle is loaded
    instead (see the caching sketch after this list). This is much
    faster than the original implementation, but still very slow.
  * Added progress indication during the data load process
* Removed some irrelevant content from README.md
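
For reference, below is a minimal sketch of how Distiller's
compression-scheduler callbacks are typically wrapped around a PyTorch
training loop, with tqdm used for progress indication. The model, loss
and sizes are dummies standing in for the real NeuMF flow in ncf.py,
and in practice the scheduler is usually built from a YAML file via
distiller.file_config() rather than constructed empty as it is here.

    import torch
    import torch.nn as nn
    import distiller
    from tqdm import tqdm

    model = nn.Linear(8, 1)                 # dummy stand-in for NeuMF
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = nn.BCEWithLogitsLoss()
    scheduler = distiller.CompressionScheduler(model)

    steps_per_epoch = 4
    for epoch in range(2):
        scheduler.on_epoch_begin(epoch)
        # tqdm wraps the batch loop to show training progress
        for step in tqdm(range(steps_per_epoch), desc='train'):
            x = torch.randn(16, 8)
            y = torch.randint(0, 2, (16, 1)).float()
            scheduler.on_minibatch_begin(epoch, step, steps_per_epoch,
                                         optimizer)
            loss = criterion(model(x), y)
            # Lets compression policies (e.g. regularizers) adjust the
            # loss before the backward pass
            loss = scheduler.before_backward_pass(epoch, step,
                                                  steps_per_epoch,
                                                  loss, optimizer)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            scheduler.on_minibatch_end(epoch, step, steps_per_epoch,
                                       optimizer)
        scheduler.on_epoch_end(epoch, optimizer)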
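
The FC-split option isn't detailed in the message, but the basic idea
can be sketched as follows: a single Linear layer applied to the
concatenated MF and MLP vectors is algebraically equivalent to two
smaller Linear layers applied to each branch and summed, provided only
one of them carries the bias. Dimensions below are illustrative, not
NeuMF's actual sizes.

    import torch
    import torch.nn as nn

    mf_dim, mlp_dim = 8, 16

    # Unsplit: one layer over the concatenated vector
    final_fc = nn.Linear(mf_dim + mlp_dim, 1)

    # Split: one layer per branch; only one carries the bias
    final_fc_mf = nn.Linear(mf_dim, 1, bias=False)
    final_fc_mlp = nn.Linear(mlp_dim, 1)

    mf_vec = torch.randn(4, mf_dim)
    mlp_vec = torch.randn(4, mlp_dim)
    unsplit_out = final_fc(torch.cat([mf_vec, mlp_vec], dim=1))
    split_out = final_fc_mf(mf_vec) + final_fc_mlp(mlp_vec)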
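
The functional-to-module change matters because Distiller discovers
layers by traversing a model's nn.Module hierarchy; ops invoked through
torch.nn.functional inside forward() have no module identity and are
invisible to it. A hedged before/after sketch (layer sizes made up):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MlpBefore(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(16, 8)

        def forward(self, x):
            # Functional calls: not visible to Distiller
            return torch.sigmoid(F.relu(self.fc(x)))

    class MlpAfter(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(16, 8)
            self.relu = nn.ReLU()        # module: discoverable
            self.sigmoid = nn.Sigmoid()  # module: discoverable

        def forward(self, x):
            return self.sigmoid(self.relu(self.fc(x)))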
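
The caching scheme in dataset.py can be sketched roughly as follows;
the helper name, paths and the use of pandas are assumptions for
illustration, not the actual code:

    import os
    import pickle
    import pandas as pd

    def load_ratings(csv_path, cache_path=None):
        # Hypothetical helper: cache the parsed CSV as a pickle
        cache_path = cache_path or csv_path + '.pkl'
        if os.path.exists(cache_path):
            # Fast path: load the previously parsed data
            with open(cache_path, 'rb') as f:
                return pickle.load(f)
        # Slow path (first run): parse the CSV, then cache the result
        data = pd.read_csv(csv_path)
        with open(cache_path, 'wb') as f:
            pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
        return data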