Unverified commit 4b6b5b19 authored by Guy Jacob, committed by GitHub

QAT: Better handling of optimizer and of creation of fp32 weights copy (#399)

* Create the float copy such that the actual tensor being learned stays the same
* This way the optimizer doesn't have to be re-created; we only need to add parameter groups if the algorithm requires them (e.g. PACT); a sketch of this idea follows below
* This also means we no longer care about pre-existing parameter groups, as opposed to the previous implementation, which assumed a single existing group
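
The gist of the change, sketched below under assumed names (`wrap_for_qat`, `quantize_weights`, `quantized_weight` and `clip_val` are illustrative, not Distiller's actual API): keep the original `nn.Parameter` as the fp32 tensor the optimizer already references, fake-quantize into a separate buffer, and extend the existing optimizer with `add_param_group` only when the algorithm introduces new learnable parameters.

```python
import torch
import torch.nn as nn


def wrap_for_qat(module, optimizer, extra_params=None, extra_lr=1e-3):
    """Hypothetical sketch: keep module.weight as the learned fp32 tensor,
    store the quantized values in a separate buffer, and extend the existing
    optimizer instead of rebuilding it."""
    # The original nn.Parameter remains the tensor the optimizer updates,
    # so existing optimizer state (momentum etc.) and param groups stay valid.
    module.register_buffer('quantized_weight', module.weight.detach().clone())

    # If the quantization algorithm introduces new learnable parameters
    # (e.g. PACT clipping thresholds), append them as a new param group
    # instead of re-creating the optimizer.
    if extra_params:
        optimizer.add_param_group({'params': extra_params, 'lr': extra_lr})


def quantize_weights(module, num_bits=8):
    """Fake-quantize the fp32 weight into the buffer used for inference."""
    with torch.no_grad():
        w = module.weight
        scale = w.abs().max() / (2 ** (num_bits - 1) - 1)
        module.quantized_weight.copy_(torch.round(w / scale) * scale)


# Usage: the optimizer is created once, before quantization is applied,
# and keeps working unchanged afterwards.
model = nn.Linear(16, 4)
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
clip_val = nn.Parameter(torch.tensor(6.0))  # hypothetical PACT-style parameter
wrap_for_qat(model, opt, extra_params=[clip_val])
quantize_weights(model)
```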
parent 3710c464