# General API (Level 1)
- Knob:
  - A name -- constant
  - kwargs (knob parameters, for example the 25% in "25% perforation") -- constant
  - Whether it can coexist with another knob or not -- method
    - Useful when we want to support multiple knobs in an op
- Application:
  - List of knobs for each operator -- constant
    - (We provide a knob value for "baseline" and sneak that into each layer)
  - Argparser extra arguments -- static method
  - How to measure QoS & performance given a configuration -- method
    - Input? Don't care.
    - Empirical or modeled? Don't care.
  - This is a minimal interface for interacting with `opentuner`; nothing ApproxTuner-specific yet. (A sketch of both interfaces follows this list.)
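A minimal sketch of what these two Level 1 interfaces could look like. All names here (`ApproxKnob`, `Application`, `coexists_with`, `add_cli_args`, `measure_qos_perf`) are illustrative assumptions, not the actual API.

```python
import abc
from typing import Dict, List, Tuple


class ApproxKnob:
    """Hypothetical knob base class; names are assumptions, not the real API."""

    def __init__(self, name: str, **kwargs):
        self.name = name      # constant: knob identifier, e.g. "perforation_25"
        self.kwargs = kwargs  # constant: knob parameters, e.g. {"rate": 0.25}

    def coexists_with(self, other: "ApproxKnob") -> bool:
        # Method: can this knob be applied together with `other` on one op?
        # Default to mutually exclusive; subclasses may override.
        return False


class Application(abc.ABC):
    """Hypothetical Level-1 application interface for driving opentuner."""

    # Constant: knobs applicable to each operator; a "baseline" knob is
    # implicitly appended to every operator's list.
    op_knobs: Dict[str, List[ApproxKnob]]

    @staticmethod
    def add_cli_args(parser) -> None:
        # Static method: extra argparse arguments this application wants.
        pass

    @abc.abstractmethod
    def measure_qos_perf(self, config: Dict[str, ApproxKnob]) -> Tuple[float, float]:
        # Method: given an operator -> knob configuration, return (QoS, perf).
        # Empirical measurement or a model -- the tuner doesn't care.
        ...
```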
# Predictive Tuning (Level 2)
## Performance model + QoS model (P1, P2)
- Performance model (linear combination of operator cost * speedup of knob; see the sketch after this list)
  - Requires a list of operator costs
  - Requires a table of per-layer knob speedups
- QoS P1 (linear in tensor output)
  - Requires the tensor output of each single knob
- QoS P2 (linear in QoS)
  - Requires the QoS of each single knob
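A sketch of all three models under assumed data structures (none of these names come from the real code): `op_cost` maps operator name to baseline cost, `knob_speedup` maps operator and knob to the measured speedup on that layer, and `knob_tensor` / `knob_qos` hold the output tensor / QoS measured with only that single knob enabled.

```python
import numpy as np


def predict_speedup(config, op_cost, knob_speedup):
    # Performance model: each operator's baseline cost, scaled by the speedup
    # of the knob chosen for it; overall speedup is baseline / predicted cost.
    baseline = sum(op_cost.values())
    predicted = sum(op_cost[op] / knob_speedup[op][knob] for op, knob in config.items())
    return baseline / predicted


def predict_qos_p1(config, baseline_tensor, knob_tensor, qos_of_tensor):
    # QoS P1: assume the output tensor is roughly linear in the per-knob
    # deviations from the baseline tensor, then score the predicted tensor.
    pred = np.copy(baseline_tensor)
    for op, knob in config.items():
        pred += knob_tensor[op][knob] - baseline_tensor
    return qos_of_tensor(pred)


def predict_qos_p2(config, baseline_qos, knob_qos):
    # QoS P2: assume QoS itself is roughly linear in the per-knob QoS deltas.
    return baseline_qos + sum(knob_qos[op][knob] - baseline_qos for op, knob in config.items())
```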
- Predict-tunable application
  - The user should inherit from a number of interfaces: inheriting `P1Interface` -> can use P1, etc.
  - Detects its own capabilities using `isinstance(self, ...)` and offers argparser extra arguments (or config file entries) accordingly, as in the sketch below.
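A sketch of the capability check; the interface names follow the notes, while the method names and `--qos-model` flag are hypothetical.

```python
class P1Interface:
    # Marker/mixin: subclasses must be able to produce per-knob tensor outputs.
    def get_knob_tensor_outputs(self):
        raise NotImplementedError


class P2Interface:
    # Marker/mixin: subclasses must be able to produce per-knob QoS values.
    def get_knob_qos(self):
        raise NotImplementedError


class PredTunableApp:
    def available_qos_models(self):
        # Detect own capability via isinstance checks on the marker interfaces.
        models = ["none"]
        if isinstance(self, P1Interface):
            models.append("p1")
        if isinstance(self, P2Interface):
            models.append("p2")
        return models

    def add_cli_args(self, parser):
        # Only offer the QoS models this application actually supports.
        parser.add_argument("--qos-model", choices=self.available_qos_models())
```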
# Predictive Tuning for PyTorch Modules (Level 3)
## Automate some definitions using inspection on the module

- PyTorch application
  - *Provides the list of operators and their cost in FLOPs (see the inspection sketch after this list)
  - *Provides a function for running inference on the module and combining outputs
  - *Implements P2
  - *Implements P1 if the output is a tensor
  - Requires a PyTorch module
  - Requires an input dataset
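A sketch of how the operator list and FLOPs cost could be extracted from a `torch.nn.Module` via forward hooks. Only `Conv2d` and `Linear` are handled here and the FLOPs formulas are simplified; the function name is an assumption.

```python
import torch
from torch import nn


def list_op_costs(module: nn.Module, input_shape):
    """Return {operator name: approximate FLOPs} by running one forward pass."""
    costs = {}

    def make_hook(name):
        def hook(layer, inputs, output):
            if isinstance(layer, nn.Conv2d):
                # Roughly: output elements * (kernel area * input channels) MACs.
                k = layer.kernel_size[0] * layer.kernel_size[1] * layer.in_channels
                costs[name] = output.numel() * k
            elif isinstance(layer, nn.Linear):
                costs[name] = layer.in_features * layer.out_features
        return hook

    handles = [
        layer.register_forward_hook(make_hook(name))
        for name, layer in module.named_modules()
        if isinstance(layer, (nn.Conv2d, nn.Linear))
    ]
    with torch.no_grad():
        module(torch.zeros(*input_shape))
    for handle in handles:
        handle.remove()
    return costs
```

For example, `list_op_costs(torchvision.models.resnet18(), (1, 3, 224, 224))` would yield a per-layer cost table in the shape the Level 2 performance model expects.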
# API usage
- User defines classes for knobs
  - Perforation, sampling...
- User defines the application
  - Most general: just `Application`. Useful for tuning a binary, for example. (See the usage sketch below.)
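An illustrative end-to-end usage sketch, reusing the hypothetical `ApproxKnob` / `Application` names from the Level 1 sketch above; the binary path and its output format are made up.

```python
import subprocess


class PerforationKnob(ApproxKnob):
    # User-defined knob class: 25% perforation would be PerforationKnob(0.25).
    def __init__(self, rate: float):
        super().__init__(f"perforation_{int(rate * 100)}", rate=rate)


class BinaryApp(Application):
    # "Most general" case: tune an external binary by re-running it per config.
    op_knobs = {"conv1": [PerforationKnob(0.25), PerforationKnob(0.5)]}

    def measure_qos_perf(self, config):
        # Hypothetical: serialize `config` where the binary can read it, run
        # the binary, and parse "<qos> <time>" from its stdout.
        out = subprocess.run(["./my_binary"], capture_output=True, text=True)
        qos, runtime = map(float, out.stdout.split())
        return qos, runtime
```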