PaddleX Edge Deployment Demo Usage Guide¶
- PaddleX Edge Deployment Demo Usage Guide
- Installation Process and Usage
- Reference Materials
- Feedback Section
This guide explains how to run the PaddleX edge deployment demo on the Android shell. It applies to the following 8 models across 6 modules:
| Module | Specific Model | CPU | GPU |
|---|---|---|---|
| Object Detection | PicoDet-S | ✅ | ✅ |
| Object Detection | PicoDet-L | ✅ | ✅ |
| Layout Area Detection | PicoDet_layout_1x | ✅ | ✅ |
| Semantic Segmentation | PP-LiteSeg-T | ✅ | ✅ |
| Image Classification | PP-LCNet_x1_0 | ✅ | ✅ |
| Image Classification | MobileNetV3_small_x1_0 | ✅ | ✅ |
| Text Detection | PP-OCRv4_mobile_det | ✅ | |
| Text Recognition | PP-OCRv4_mobile_rec | ✅ | |
Note
- GPU refers to mapping computations to GPU execution using OpenCL to fully utilize GPU hardware computing power and improve inference performance.
Installation Process and Usage¶
Environment Preparation¶
1. Install the CMake build tool locally and download the required version of the NDK software package from the Android NDK official website. For example, if developing on a Mac, download the NDK package for the Mac platform from the Android NDK official website.

   Environment requirements:

   - `CMake >= 3.10` (minimum version not verified; 3.20 and above recommended)
   - `Android NDK >= r17c` (minimum version not verified; r20b and above recommended)

   Environment used for testing in this guide:

   - `cmake == 3.20.0`
   - `android-ndk == r20b`
2. Prepare an Android phone and enable USB debugging mode. To enable it: `Phone Settings -> Find Developer Options -> Turn on Developer Options and USB Debugging Mode`.

3. Install the ADB tool on your computer for debugging. ADB installation methods:
3.1. For Mac:
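A common way to install ADB on macOS is via Homebrew; the command below assumes Homebrew is already installed and is one typical approach rather than the only one:

```shell
# Install ADB (part of Android platform-tools) with Homebrew
brew install --cask android-platform-tools
```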
3.2. For Linux:
```shell
# Debian-based Linux distributions
sudo apt update
sudo apt install -y wget adb

# Red Hat-based Linux distributions
sudo yum install adb
```
3.3. For Windows:
Install ADB by downloading the ADB software package from Google's Android platform: Link
Open a terminal, connect your phone to the computer, and enter the following in the terminal:
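The check here presumably uses `adb devices`, the standard ADB command for listing connected devices:

```shell
# List connected Android devices; your phone should show up with the "device" state
adb devices
```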
If your device appears in the output, the installation was successful.
Material Preparation¶
1. Clone the `feature/paddle-x` branch of the `Paddle-Lite-Demo` repository into the `PaddleX-Lite-Deploy` directory (an example clone command is sketched after this list).
2. Fill out the survey to download the compressed package, place it in the specified unzip directory, switch to that directory, and execute the unzip command. Below is an example of the unzip operation for object_detection; refer to the table below for the other pipelines.
```shell
# 1. Switch to the specified unzip directory
cd PaddleX-Lite-Deploy/object_detection/android/shell/cxx/picodet_detection

# 2. Execute the unzip command
unzip object_detection.zip
```
| Pipeline Name | Unzip Directory | Unzip Command |
|---|---|---|
| Object Detection | `PaddleX-Lite-Deploy/object_detection/android/shell/cxx/picodet_detection` | `unzip object_detection.zip` |
| Semantic Segmentation | `PaddleX-Lite-Deploy/semantic_segmentation/android/shell/cxx/semantic_segmentation` | `unzip semantic_segmentation.zip` |
| Image Classification | `PaddleX-Lite-Deploy/image_classification/android/shell/cxx/image_classification` | `unzip image_classification.zip` |
| OCR | `PaddleX-Lite-Deploy/ocr/android/shell/ppocr_demo` | `unzip ocr.zip` |
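As a sketch of the clone in step 1, the command might look like the following; the repository URL assumes the official PaddlePaddle organization on GitHub:

```shell
# Clone the feature/paddle-x branch into a directory named PaddleX-Lite-Deploy
git clone -b feature/paddle-x https://github.com/PaddlePaddle/Paddle-Lite-Demo.git PaddleX-Lite-Deploy
```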
Deployment Steps¶
1. Switch the working directory to `PaddleX-Lite-Deploy/libs` and run the `download.sh` script to download the required Paddle Lite prediction library. This step only needs to be executed once; it supports all of the demos.

2. Switch the working directory to `PaddleX-Lite-Deploy/{Pipeline_Name}/assets` and run the `download.sh` script to download the models optimized by the paddle_lite_opt tool, test images, label files, etc.

3. Switch the working directory to `PaddleX-Lite-Deploy/{Pipeline_Name}/android/shell/cxx/{Demo_Name}` and run the `build.sh` script to compile the executable file.

4. Switch the working directory to `PaddleX-Lite-Deploy/{Pipeline_Name}/android/shell/cxx/{Demo_Name}` and run the `run.sh` script to perform prediction on the edge device.

Note:

- `{Pipeline_Name}` and `{Demo_Name}` are placeholders; refer to the table at the end of this section for specific values.
- `download.sh` and `run.sh` accept a model name as an argument to specify the model; if none is given, the default model is used. Refer to the `Model_Name` column in the table at the end of this section for the currently supported models.
- To use your own trained model, refer to the Model Conversion Method to obtain the `.nb` model and place it in the `PaddleX-Lite-Deploy/{Pipeline_Name}/assets/{Model_Name}` directory, where `{Model_Name}` is the model name, e.g., `PaddleX-Lite-Deploy/object_detection/assets/PicoDet-L`.
- Before running the `build.sh` script, change the path specified by `NDK_ROOT` to the actual installed NDK path (see the sketch after these notes).
- Keep ADB connected when running the `build.sh` script.
- On Windows systems, you can use Git Bash to execute the deployment steps.
- If compiling on a Windows system, set `CMAKE_SYSTEM_NAME` to `windows` in `CMakeLists.txt`.
- If compiling on a Mac system, set `CMAKE_SYSTEM_NAME` to `darwin` in `CMakeLists.txt`.
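As a sketch of the `NDK_ROOT` change mentioned in the notes above; the exact variable location in `build.sh` may differ, and the path below is only an example:

```shell
# In build.sh: point NDK_ROOT at your locally installed Android NDK (example path)
NDK_ROOT=/opt/android-ndk-r20b
```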
Below is an example for object_detection. For other demos, change the directories used in steps 2 and 3 according to the table at the end of this section.
```shell
# 1. Download the required Paddle Lite prediction library
cd PaddleX-Lite-Deploy/libs
sh download.sh

# 2. Download the models optimized by the paddle_lite_opt tool, test images, and label files
cd ../object_detection/assets
sh download.sh
# A model name can be passed to specify the model to download. Refer to the Model_Name column in the table at the end of this section for supported models.
# sh download.sh PicoDet-L

# 3. Compile the executable file
cd ../android/shell/cxx/picodet_detection
sh build.sh

# 4. Prediction
sh run.sh
# A model name can be passed to specify the prediction model. Refer to the Model_Name column in the table at the end of this section for supported models.
# sh run.sh PicoDet-L
```
The run results are shown below, and a result image named `dog_picodet_detection_result.jpg` is generated:
```text
======= benchmark summary =======
input_shape(s) (NCHW): {1, 3, 320, 320}
model_dir:./models/PicoDet-S/model.nb
warmup:1
repeats:10
power_mode:1
thread_num:0
*** time info(ms) ***
1st_duration:320.086
max_duration:277.331
min_duration:272.67
avg_duration:274.91
====== output summary ======
detection, image size: 768, 576, detect object: bicycle, score: 0.905929, location: x=125, y=1
```
This section describes the deployment steps applicable to the demos listed in the following table:
| Pipeline | Pipeline_Name | Module | Demo_Name | Specific Model | Model_Name |
|---|---|---|---|---|---|
| General Object Detection | object_detection | Object Detection | picodet_detection | PicoDet-S | PicoDet-S (default)<br>PicoDet-S_gpu |
| General Object Detection | object_detection | Object Detection | picodet_detection | PicoDet-L | PicoDet-L<br>PicoDet-L_gpu |
| General Object Detection | object_detection | Layout Area Detection | picodet_detection | PicoDet_layout_1x | PicoDet_layout_1x<br>PicoDet_layout_1x_gpu |
| General Semantic Segmentation | semantic_segmentation | Semantic Segmentation | semantic_segmentation | PP-LiteSeg-T | PP-LiteSeg-T (default)<br>PP-LiteSeg-T_gpu |
| General Image Classification | image_classification | Image Classification | image_classification | PP-LCNet_x1_0 | PP-LCNet_x1_0 (default)<br>PP-LCNet_x1_0_gpu |
| General Image Classification | image_classification | Image Classification | image_classification | MobileNetV3_small_x1_0 | MobileNetV3_small_x1_0<br>MobileNetV3_small_x1_0_gpu |
| General OCR | ocr | Text Detection | ppocr_demo | PP-OCRv4_mobile_det | PP-OCRv4_mobile_det |
| General OCR | ocr | Text Recognition | ppocr_demo | PP-OCRv4_mobile_rec | PP-OCRv4_mobile_rec |
Note
- Currently, there is no demo for deploying the Layout Area Detection module on the edge, so the `picodet_detection` demo is reused to deploy the `PicoDet_layout_1x` model.
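Based on the model-name convention described in the Deployment Steps, running the layout model through the object detection demo would presumably look like the following sketch (not verified on a device):

```shell
# Download the layout model assets, then build and run it through the picodet_detection demo
cd PaddleX-Lite-Deploy/object_detection/assets
sh download.sh PicoDet_layout_1x
cd ../android/shell/cxx/picodet_detection
sh build.sh
sh run.sh PicoDet_layout_1x
```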
Reference Materials¶
This guide only introduces the basic installation and usage of the edge deployment demo. For more detailed information, such as code walkthroughs, updating models, updating input and output preprocessing, and updating the prediction library, please refer to the corresponding documentation in the `Paddle-Lite-Demo` repository.
Feedback Section¶
Edge deployment capabilities are being continuously optimized. You are welcome to submit issues to report problems and requirements; we will follow up promptly.