Unverified commit 6e223f85, authored by Neta Zmora, committed by GitHub

Add NCF example (#350)

* NCF scripts with Distiller integration

This NCF implementation is based on the implementation found in the MLPerf
Training GitHub repository, specifically on the last revision of the code
before the switch to the extended dataset. See:
https://github.com/mlperf/training/tree/fe17e837ed12974d15c86d5173fe8f2c188434d5/recommendation/pytorch

We've made several modifications to the code:
* Removed all MLPerf-specific code, including logging
* In ncf.py:
  * Added calls to Distiller compression APIs (the scheduler wiring is
    sketched after this list)
  * Added progress indication in training and evaluation flows
* In neumf.py:
  * Added an option to split the final FC layer (see the split-FC sketch below)
  * Replaced all functional calls with modules so they can be detected
    by Distiller (see the before/after sketch below)
* In dataset.py:
  * Sped up data loading: on the first run, data is loaded from the CSVs
    and then pickled; on subsequent runs the pickle is loaded instead. This
    is much faster than the original implementation, but still very slow.
    A sketch of this pattern follows the list.
  * Added progress indication during the data load process
* Removed some irrelevant content from README.md
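
The Distiller calls added to ncf.py follow the library's usual CompressionScheduler pattern. Here is a minimal sketch of that wiring, assuming a YAML schedule path and stand-in `model`/`optimizer`/`train_loader` objects; none of these names are taken from this commit's code:

```python
import torch
import torch.nn as nn
import distiller

# Illustrative stand-ins for the real NCF objects (not this commit's code).
model = nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.BCEWithLogitsLoss()
train_loader = [(torch.randn(4, 8), torch.rand(4, 1)) for _ in range(10)]
num_epochs, steps_per_epoch = 2, len(train_loader)
schedule_path = None  # e.g. a YAML schedule file passed via --compress

# With a schedule file, distiller.file_config builds a CompressionScheduler
# whose policies (pruning, regularization, ...) hook into the loop below.
scheduler = distiller.file_config(model, optimizer, schedule_path) if schedule_path else None

for epoch in range(num_epochs):
    if scheduler:
        scheduler.on_epoch_begin(epoch)
    for step, (inputs, labels) in enumerate(train_loader):
        if scheduler:
            scheduler.on_minibatch_begin(epoch, step, steps_per_epoch)
        loss = criterion(model(inputs), labels)
        if scheduler:
            # Policies may add terms (e.g. regularizers) to the loss.
            loss = scheduler.before_backward_pass(epoch, step, steps_per_epoch, loss)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        if scheduler:
            scheduler.on_minibatch_end(epoch, step, steps_per_epoch)
    if scheduler:
        scheduler.on_epoch_end(epoch)
```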
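
On the final-FC split: NeuMF's head applies a single Linear layer to the concatenation of the GMF and MLP branch outputs, so it can be replaced by two per-branch Linear layers whose outputs are summed. The two forms are mathematically equivalent, and the split gives tools like Distiller a separate module (and hence, e.g., separate quantization parameters) per branch. A sketch of the identity, with illustrative dimensions:

```python
import torch
import torch.nn as nn

d = 8
gmf = torch.randn(4, d)  # GMF branch output (illustrative)
mlp = torch.randn(4, d)  # MLP branch output (illustrative)

# Single head: one Linear over the concatenated branches.
fc = nn.Linear(2 * d, 1)
y = fc(torch.cat([gmf, mlp], dim=1))

# Split head: one Linear per branch; only one of them carries the bias.
fc_gmf = nn.Linear(d, 1, bias=False)
fc_mlp = nn.Linear(d, 1)
fc_gmf.weight.data = fc.weight.data[:, :d].clone()
fc_mlp.weight.data = fc.weight.data[:, d:].clone()
fc_mlp.bias.data = fc.bias.data.clone()

assert torch.allclose(y, fc_gmf(gmf) + fc_mlp(mlp), atol=1e-6)
```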
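
Distiller discovers operations by walking a model's named submodules, so purely functional calls (torch.cat, `*`, torch.sigmoid and the like) are invisible to it. A before/after sketch of the kind of replacement described above, using the wrappers shipped in distiller.modules; the toy model is illustrative, not NeuMF's actual code:

```python
import torch
import torch.nn as nn
import distiller.modules

class Before(nn.Module):
    """Functional ops: Distiller's module traversal cannot see them."""
    def __init__(self, d):
        super().__init__()
        self.fc = nn.Linear(2 * d, 1)

    def forward(self, user, item):
        x = torch.cat([user * item, user], dim=-1)  # functional mul + cat
        return torch.sigmoid(self.fc(x))            # functional sigmoid

class After(nn.Module):
    """Same computation, expressed as named modules Distiller can detect."""
    def __init__(self, d):
        super().__init__()
        self.mult = distiller.modules.EltwiseMult()
        self.concat = distiller.modules.Concat(dim=-1)
        self.fc = nn.Linear(2 * d, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, user, item):
        return self.sigmoid(self.fc(self.concat(self.mult(user, item), user)))
```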
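
A minimal sketch of the load-once-then-pickle pattern described above; the file naming and the (user, item, rating) CSV layout are assumptions, not the example's actual format:

```python
import csv
import os
import pickle

def load_ratings_csv(path):
    # Hypothetical parser: one (user, item, rating) triple per row.
    with open(path, newline='') as f:
        return [(int(u), int(i), float(r)) for u, i, r, *_ in csv.reader(f)]

def load_dataset(csv_path):
    cache_path = csv_path + '.pkl'
    if os.path.exists(cache_path):
        # Subsequent runs: deserialize the cached dataset (fast).
        with open(cache_path, 'rb') as f:
            return pickle.load(f)
    # First run: parse the raw CSV (slow), then cache it for next time.
    data = load_ratings_csv(csv_path)
    with open(cache_path, 'wb') as f:
        pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
    return data
```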

* ncf.py: fix wrong nb_users/nb_items values

* NCF changes to make it compatible with the latest changes in master

* Pass the 'sigmoid' flag in NeuMF.forward as a bool tensor instead of
  a plain Python boolean. This is required to make the model traceable;
  see the sketch after this list. (It'd be better to not have it as an
  argument of forward at all, but we're keeping changes to a minimum.)
* Call prepare_model with dummy_input (sketched below)
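
Why the tensor flag matters: torch.jit.trace drives the model with example *tensors*, so a plain Python bool in forward's signature cannot be passed as a trace input. A minimal sketch of the idea; the Head module is an illustrative stand-in, not NeuMF itself, and note that tracing bakes in whichever branch the example flag selects:

```python
import torch
import torch.nn as nn

class Head(nn.Module):
    """Illustrative stand-in for NeuMF's output stage."""
    def __init__(self, d):
        super().__init__()
        self.fc = nn.Linear(d, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x, sigmoid=torch.tensor(False)):
        y = self.fc(x)
        if sigmoid:  # a 0-dim bool tensor still works in an `if`
            y = self.sigmoid(y)
        return y

model = Head(8)
# The flag is passed as a tensor so it can serve as a trace example input.
traced = torch.jit.trace(model, (torch.randn(2, 8), torch.tensor(True)))
```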
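
And the prepare_model call, in the style of Distiller's quantizer API; the quantizer choice, model, and input shape below are illustrative assumptions:

```python
import torch
import torch.nn as nn
import distiller.quantization as quantization

model = nn.Linear(8, 1)          # stand-in for the NCF model
dummy_input = torch.randn(1, 8)  # illustrative shape

quantizer = quantization.PostTrainLinearQuantizer(model)
# prepare_model traces and transforms the model, which is why forward()
# had to be traceable and why a correctly-shaped dummy input is passed.
quantizer.prepare_model(dummy_input)
```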

* Cleanup code, update reqs, add stats files, add details to README

* Update NCF README.md

* Remove debug comments

* Remove unnecessary script + NCF QAT scheduler (for now)

Parents: b41c4d2d, 8ca422db
Showing 2032 additions and 1 deletion