Commit 77f702d7 authored by Yifan Zhao
Replaced AlexNet2 with VGG16

parent bee122b3
@@ -3,14 +3,14 @@ Getting Started
This tutorial covers the basic usage of all components in HPVM
(components listed :doc:`here </components/index>`).
We will translate a DNN model, VGG16 (for the CIFAR10 dataset), into HPVM code, compile it with HPVM,
perform autotuning on the compiled binary to find approximation choices (configurations),
and profile the selected configurations to get real performance on device.
The result will be a figure showing the accuracy-performance tradeoff of VGG16 over the
(pre-defined) approximations, plus the configurations themselves in a few formats.
Please check that ``test/dnn_benchmarks/model_params/`` exists and contains
``vgg16_cifar10/`` and ``pytorch/vgg16_cifar10.pth.tar``,
which may not be the case if you opted out of the model parameter download in the installer.
In that case, you may run the installer again to download the parameters.
It will not rebuild everything from scratch.
@@ -28,8 +28,8 @@ for easier access to ``test/dnn_benchmarks/model_params/``.
You can also symlink it to other locations -- don't move it: it's used in test cases --
and adjust the paths below accordingly.
First, prepare 2 datasets for VGG16: one for autotuning and one for testing.
These datasets are provided as ``model_params/vgg16_cifar10/{tune|test}_{input|labels}.bin``,
where the ``tune`` and ``test`` prefixes signify the tuning and testing sets.
.. code-block:: python
@@ -37,13 +37,13 @@ where the ``tune`` and ``test`` prefixes signify the tuning and testing sets.
   from torch2hpvm import BinDataset
   from pathlib import Path

   data_dir = Path("model_params/vgg16_cifar10")
   dataset_shape = 5000, 3, 32, 32  # NCHW format.
   tuneset = BinDataset(data_dir / "tune_input.bin", data_dir / "tune_labels.bin", dataset_shape)
   testset = BinDataset(data_dir / "test_input.bin", data_dir / "test_labels.bin", dataset_shape)
`BinDataset` is a utility `torch2hpvm` provides for creating datasets over binary files.
Any instance of `torch.utils.data.Dataset` can be used here.
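For instance, a minimal sketch using torchvision's CIFAR10 loader in place of `BinDataset` -- torchvision and the bare `ToTensor` transform are assumptions here, and the transform may not match the preprocessing the VGG16 checkpoint was trained with:

.. code-block:: python

   # Hedged sketch: any torch.utils.data.Dataset can stand in for BinDataset.
   # Assumes torchvision is installed; the transform is illustrative only and
   # should be checked against the preprocessing the checkpoint expects.
   from torchvision import datasets, transforms

   to_tensor = transforms.ToTensor()
   tuneset = datasets.CIFAR10("./data", train=False, download=True, transform=to_tensor)
   testset = datasets.CIFAR10("./data", train=False, download=True, transform=to_tensor)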
*Note* that each `module` is bound to 2 datasets: a "tune" and a "test" set.
The generated binary accepts an argument that is either the string "tune" or "test",
@@ -57,8 +57,8 @@ Create a DNN `module` and load the checkpoint:
   import torch
   from torch.nn import Module
   import dnn  # Defined at `hpvm/test/dnn_benchmarks/pytorch`

   model: Module = dnn.VGG16()
   checkpoint = "model_params/pytorch/vgg16_cifar10.pth.tar"
   model.load_state_dict(torch.load(checkpoint))
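As the text below notes, any `torch.nn.Module` can stand in for the pre-defined `dnn.VGG16`. A minimal hypothetical sketch -- `MyNet` and its layers are invented for illustration, and the exporter presumably supports only a subset of operators, so verify yours before relying on this:

.. code-block:: python

   # Hypothetical sketch: a user-defined torch.nn.Module in place of dnn.VGG16().
   # MyNet is invented for illustration; its checkpoint would be one you trained.
   import torch
   from torch import nn

   class MyNet(nn.Module):
       def __init__(self):
           super().__init__()
           self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
           self.fc = nn.Linear(16 * 32 * 32, 10)

       def forward(self, x):
           x = torch.relu(self.conv(x))
           return self.fc(torch.flatten(x, 1))

   model = MyNet()  # then load a matching state_dict as shown above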
Any `torch.nn.Module` can be similarly used,
@@ -72,11 +72,11 @@ Now we are ready to export the model. The main functioning class of `torch2hpvm`
   from torch2hpvm import ModelExporter

   output_dir = Path("./vgg16_cifar10")
   build_dir = output_dir / "build"
   target_binary = build_dir / "vgg16_cifar10"
   batch_size = 500
   conf_file = "hpvm-c/benchmarks/vgg16_cifar10/data/tuner_confs.txt"
   exporter = ModelExporter(model, tuneset, testset, output_dir, config_file=conf_file)
   exporter.generate(batch_size=batch_size).compile(target_binary, build_dir)
@@ -89,13 +89,13 @@ and path to the compiled binary respectively.
  This file decides what approximations the binary will use during inference.
  This path is hardcoded into the binary and is only read when the binary starts,
  so it's fine to have `conf_file` point to a non-existent path.
  An example can be found at ``hpvm-c/benchmarks/vgg16_cifar10/data/tuner_confs.txt``.
* `exporter.generate` generates the HPVM-C code, while `exporter.compile` is
  a helper that invokes the HPVM compiler for you (see the sketch below).
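Given the chained call in the export step, the two stages can presumably also be run separately. A minimal sketch, assuming only what the chaining itself implies (that `generate` returns an object exposing `compile`):

.. code-block:: python

   # Sketch: splitting the chained exporter.generate(...).compile(...) call.
   # The variable name `codegen` is ours; the chain above implies `generate`
   # returns an object with a `compile` method.
   codegen = exporter.generate(batch_size=batch_size)  # emit HPVM-C code
   codegen.compile(target_binary, build_dir)           # invoke the HPVM compiler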
Now there should be a binary at ``./vgg16_cifar10/build/vgg16_cifar10``.
Try running ``./vgg16_cifar10/build/vgg16_cifar10 test`` for inference over the test set.
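To drive the same binary from Python rather than a shell, a small sketch using only the standard library ("test" selects the test set; "tune" would select the tuning set, as noted above):

.. code-block:: python

   # Run the compiled binary from Python. The "tune"/"test" argument is the
   # one described in the datasets section above.
   import subprocess

   subprocess.run(["./vgg16_cifar10/build/vgg16_cifar10", "test"], check=True)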
Compiling a Tuner Binary
------------------------
@@ -111,13 +111,13 @@ It also doesn't define a `conf_file`.
   from torch2hpvm import ModelExporter

   tuner_output_dir = Path("./vgg16_cifar10_tuner")
   tuner_build_dir = tuner_output_dir / "build"
   tuner_binary = tuner_build_dir / "vgg16_cifar10"
   exporter = ModelExporter(model, tuneset, testset, tuner_output_dir, target="hpvm_tensor_inspect")
   exporter.generate(batch_size=500).compile(tuner_binary, tuner_build_dir)
This binary is generated at ``vgg16_cifar10_tuner/build/vgg16_cifar10``.
It waits for an autotuner signal and doesn't run on its own, so don't run it yourself.
Instead, import and use the tuner `predtuner`:
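The tuning driver itself falls in a stretch of the file elided from this diff. As a rough, from-memory sketch of what a `predtuner` session can look like -- every name and argument below (`PipedBinaryApp`, `config_pylogger`, the metadata-file argument, the `tune` signature) is an assumption to verify against the predtuner documentation:

.. code-block:: python

   # Rough sketch only, not the tutorial's verbatim code (that part is elided
   # from this diff); verify all names and arguments against the predtuner docs.
   from predtuner import PipedBinaryApp, config_pylogger

   config_pylogger(output_dir="/tmp", verbose=True)  # assumed logging helper
   # Wrap the tuner binary; the exporter also emits a metadata file describing
   # the approximation knobs (the attribute name here is an assumption).
   app = PipedBinaryApp("TestHPVMApp", str(tuner_binary), exporter.metafile)
   tuner = app.get_tuner()
   tuner.tune(100, 3.0)  # e.g. 100 iterations, 3.0 QoS-drop threshold (assumed)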
@@ -210,7 +210,7 @@ we obtained in the tuning step.
   from hpvm_profiler import profile_config_file, plot_hpvm_configs

   # Set `target_binary` to the path of the plain binary.
   target_binary = "./vgg16_cifar10/build/vgg16_cifar10"
   # Set `config_file` to the config file produced in tuning, such as "hpvm_confs.txt".
   config_file = "hpvm_confs.txt"
   out_config_file = "hpvm_confs_profiled.txt"
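The call site consuming these variables is elided from this diff; presumably the two imported functions are invoked along these lines (the argument order is an assumption to check against `hpvm_profiler`):

.. code-block:: python

   # Assumed continuation (the actual call site is elided from this diff):
   # profile each configuration on-device, then plot the tradeoff curve.
   profile_config_file(target_binary, config_file, out_config_file)
   plot_hpvm_configs(out_config_file, "configs_profiled.png")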
@@ -222,7 +222,7 @@ while ``configs_profiled.png`` shows the final performance-accuracy tradeoff cur
An example of ``configs_profiled.png`` looks like this (the proportions of your image may differ):

.. image:: _static/vgg16_cifar10.png
-----------------------