Re-compile your model
What we are going to do
In this tutorial, we will skip the tuning process and directly compile an AI model. This can be useful in scenarios where you need to retrieve an optimized model without repeating the tuning process, such as after accidentally deleting the results or revisiting the process later.
Optimium can perfectly tune and optimize your AI models for your target device, but this process can be time-consuming. To address this, Optimium allows users to leverage tuning history to skip the tuning process and obtain the optimized model more quickly.
Specifically, we will demonstrate how to use the tuning history to speed up the optimization process.
Requirements
We assume that you are already familiar with using Optimium, including setup and optimization steps. If not, please refer to the previous tutorials.
Concepts
When you run Optimium, a maintenance directory is created in $WORKING_DIR. This directory contains all tuning history recorded by Optimium.
Optimium automatically loads the tuning history when optimizing, allowing you to skip the tuning process.
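If you want to confirm that tuning history is available before skipping the tuning step, a quick check like the one below can help. This is only an illustrative sketch based on the description above, not part of the Optimium API; the exact directory layout may differ on your system.
import os

# Locate the maintenance directory that holds the tuning history
# Optimium loads automatically during optimization.
working_dir = os.environ.get("WORKING_DIR", ".")
maintenance_dir = os.path.join(working_dir, "maintenance")

if os.path.isdir(maintenance_dir):
    print("Tuning history found at", maintenance_dir)
else:
    print("No tuning history found; Optimium will need to tune from scratch.")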
Optimium's process consists of two main steps: 1) Optimize and 2) Deploy. This tutorial focuses only on the optimization step.
Optimize
Setting up Optimium to skip the tuning process is straightforward. The steps are almost identical to those for standard optimization with tuning, except for changes to the user_arguments.json file.
Run Python and enter the following commands to create the JSON file:
import optimium
optimium.create_args_template()
It will prompt you to specify the number of threads to use.
Enter "no" when asked "Enable hardware-specific auto-tuning?"
Alternatively, if you have already created user_arguments.json, you can directly set the value of enable_tuning to false to skip the tuning process. Note that you should keep opt_log_key unchanged to reuse the optimization history.
Note: The opt_log_key must match the key used during the initial tuning process. For example, if you previously ran Optimium with "opt_log_key": "MyOptKey", you must keep that same key value to reuse the optimization history.
{
  ...
  "optimization": {
    "opt_log_key": "MyOptKey",
    "enable_tuning": false
  },
  ...
}
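If you prefer to update the file programmatically rather than editing it by hand, a small sketch like the following works with Python's standard json module. The file name and key layout follow the example above; adapt them to your own configuration.
import json

# Load the existing arguments file created by optimium.create_args_template().
with open("user_arguments.json") as f:
    args = json.load(f)

# Disable tuning, but leave opt_log_key ("MyOptKey" in this tutorial) untouched
# so the previously recorded tuning history is reused.
args["optimization"]["enable_tuning"] = False

with open("user_arguments.json", "w") as f:
    json.dump(args, f, indent=2)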