      NCF scripts with Distiller integration · 4385084a
      Guy Jacob authored · May 20, 2019
      This NCF implementation is based on the one found in the MLPerf
      Training GitHub repository, specifically on the last revision of the
      code before the switch to the extended dataset. See:
      https://github.com/mlperf/training/tree/fe17e837ed12974d15c86d5173fe8f2c188434d5/recommendation/pytorch
      
      We've made several modifications to the code:
      * Removed all MLPerf-specific code, including logging
      * In ncf.py:
        * Added calls to Distiller compression APIs (see the training-loop
          sketch after this list)
        * Added progress indication in training and evaluation flows
      * In neumf.py:
        * Added an option to split the final FC layer (sketched after this
          list)
        * Replaced all functional calls with equivalent modules so they can
          be detected by Distiller (see the before/after sketch after this
          list)
      * In dataset.py:
        * Sped up data loading - on the first run, data is loaded from the
          CSVs and then pickled; on subsequent runs the pickle is loaded
          instead. This is much faster than the original implementation,
          but still very slow (see the caching sketch after this list).
        * Added progress indication during data load process
      * Removed some irrelevant content from README.md
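
      The Distiller calls in ncf.py follow the library's standard
      CompressionScheduler callback pattern. Below is a minimal sketch of
      where those callbacks slot into a training loop; the function name,
      loop variables and YAML schedule path are hypothetical, not the
      actual ncf.py code:

          import distiller

          def train_with_compression(model, criterion, optimizer,
                                     train_loader, num_epochs, schedule_path):
              # Build a scheduler from a YAML compression schedule
              # (pruning / regularization / quantization policies)
              scheduler = distiller.file_config(model, optimizer, schedule_path)
              steps_per_epoch = len(train_loader)
              for epoch in range(num_epochs):
                  scheduler.on_epoch_begin(epoch)
                  for step, (inputs, targets) in enumerate(train_loader):
                      scheduler.on_minibatch_begin(epoch, step,
                                                   steps_per_epoch, optimizer)
                      loss = criterion(model(inputs), targets)
                      # The scheduler may add regularizer terms to the loss
                      loss = scheduler.before_backward_pass(
                          epoch, step, steps_per_epoch, loss, optimizer)
                      optimizer.zero_grad()
                      loss.backward()
                      optimizer.step()
                      # Pruning masks are typically re-applied after the
                      # weight update
                      scheduler.on_minibatch_end(epoch, step,
                                                 steps_per_epoch, optimizer)
                  scheduler.on_epoch_end(epoch, optimizer)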
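
      The commit doesn't spell out what splitting the final FC layer means.
      One plausible reading, sketched below under that assumption, is
      replacing the single Linear layer applied to the concatenated MF and
      MLP vectors with two per-branch Linear layers whose outputs are
      summed, which computes the same result (the class and attribute names
      are hypothetical):

          import torch.nn as nn

          # Unsplit: score = final(torch.cat([mf_vec, mlp_vec], dim=-1))
          # Split: one Linear per branch, summed; the bias lives on one side
          class SplitFinalFC(nn.Module):
              def __init__(self, mf_dim, mlp_dim):
                  super().__init__()
                  self.fc_mf = nn.Linear(mf_dim, 1, bias=False)
                  self.fc_mlp = nn.Linear(mlp_dim, 1)

              def forward(self, mf_vec, mlp_vec):
                  return self.fc_mf(mf_vec) + self.fc_mlp(mlp_vec)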
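
      The functional-to-module replacement matters because Distiller
      discovers layers by traversing the model's module tree, which only
      contains nn.Module instances. A before/after sketch of the pattern
      (the block is illustrative, not an actual neumf.py class):

          import torch.nn as nn
          import torch.nn.functional as F

          # Before: the activation is a functional call, invisible to
          # tools that walk model.modules()
          class BlockBefore(nn.Module):
              def __init__(self, n_in, n_out):
                  super().__init__()
                  self.fc = nn.Linear(n_in, n_out)

              def forward(self, x):
                  return F.relu(self.fc(x))

          # After: the same computation, but the activation is an
          # nn.Module that Distiller can detect (and, e.g., wrap)
          class BlockAfter(nn.Module):
              def __init__(self, n_in, n_out):
                  super().__init__()
                  self.fc = nn.Linear(n_in, n_out)
                  self.relu = nn.ReLU()

              def forward(self, x):
                  return self.relu(self.fc(x))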
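
      The dataset.py speed-up is a parse-once-then-pickle cache. A
      self-contained sketch of the pattern (the helper name and the use of
      pandas are assumptions, not the actual dataset.py code):

          import os
          import pickle

          import pandas as pd

          def load_ratings(csv_path):
              # On the first run, parse the CSV and cache the result as a
              # pickle next to it; later runs load the much faster pickle
              cache_path = csv_path + '.pkl'
              if os.path.exists(cache_path):
                  with open(cache_path, 'rb') as f:
                      return pickle.load(f)
              data = pd.read_csv(csv_path)
              with open(cache_path, 'wb') as f:
                  pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
              return data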