I use the Caffe framework and run the mvNCProfile command below to convert a Caffe deploy.prototxt for Movidius:
mvNCProfile deploy.prototxt -s 12
I found that if a PReLU, BatchNorm, or Scale layer uses the same "bottom" and "top" name (in-place computation), the conversion produces a wrong result.
ReLU, on the other hand, can use the same "bottom" and "top" name without problems.
- Example 1. deploy.prototxt - using the same "bottom" and "top" names
name: "Net V1.03"
input: "data"
input_shape {
dim: 1
dim: 3
dim: 200
dim: 200
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 6
kernel_size: 5
stride: 2
pad: 2
weight_filler {
type: "xavier"
}
}
}
layer {
name: "relu_conv1"
type: "PReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "bn_conv1"
type: "BatchNorm"
bottom: "conv1"
top: "conv1"
batch_norm_param {
use_global_stats: true
}
include {
phase: TEST
}
}
layer {
name: "scale_conv1"
type: "Scale"
bottom: "conv1"
top: "conv1"
scale_param {
bias_term: true
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "conv1"
top: "conv2"
convolution_param {
num_output: 12
kernel_size: 3
pad: 1
weight_filler {
type: "xavier"
}
}
}
layer {
name: "relu_conv2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool_final"
type: "Pooling"
bottom: "conv2"
top: "pool_final"
pooling_param {
pool: AVE
global_pooling: true
}
}
=> The conversion result from mvNCProfile:
drive.google.com/file/d/1MmzeQfGwrbz_P3eB8CuJfiR1TuCGhLhV/view?usp=sharing
The layer connections are wrong: the relu_conv1 layer ends up disconnected, outside the main graph.
(The correct connection is conv1 -> relu_conv1 -> bn_conv1 -> scale_conv1 -> conv2 -> …)
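A quick way to spot the problematic layers ahead of time is to scan the prototxt for in-place PReLU, BatchNorm, or Scale layers. Below is a minimal sketch, assuming pycaffe's protobuf bindings (caffe.proto.caffe_pb2) are available; the function name and suspect-type list are my own choices:

# Flag layers that compute in place (top == bottom) and are of a type
# that mvNCProfile seems to mis-handle, per the observation above.
from caffe.proto import caffe_pb2
from google.protobuf import text_format

SUSPECT_TYPES = {"PReLU", "BatchNorm", "Scale"}  # in-place ReLU seems fine

def find_inplace_suspects(prototxt_path):
    net = caffe_pb2.NetParameter()
    with open(prototxt_path) as f:
        text_format.Merge(f.read(), net)
    for layer in net.layer:
        # An in-place layer shares at least one blob name between top and bottom.
        if layer.type in SUSPECT_TYPES and set(layer.top) & set(layer.bottom):
            print("in-place %s layer: %s" % (layer.type, layer.name))

find_inplace_suspects("deploy.prototxt")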
- Example 2. deploy.prototxt - using different "bottom" and "top" names
name: "Net V1.03"
input: "data"
input_shape {
dim: 1
dim: 3
dim: 200
dim: 200
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 6
kernel_size: 5
stride: 2
pad: 2
weight_filler {
type: "xavier"
}
}
}
layer {
name: "relu_conv1"
type: "PReLU"
bottom: "conv1"
top: "relu_conv1"
}
layer {
name: "bn_conv1"
type: "BatchNorm"
bottom: "relu_conv1"
top: "bn_conv1"
batch_norm_param {
use_global_stats: true
}
include {
phase: TEST
}
}
layer {
name: "scale_conv1"
type: "Scale"
bottom: "bn_conv1"
top: "scale_conv1"
scale_param {
bias_term: true
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "scale_conv1"
top: "conv2"
convolution_param {
num_output: 12
kernel_size: 3
pad: 1
weight_filler {
type: "xavier"
}
}
}
layer {
name: "relu_conv2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool_final"
type: "Pooling"
bottom: "conv2"
top: "pool_final"
pooling_param {
pool: AVE
global_pooling: true
}
}
=> The conversion result from mvNCProfile:
https://drive.google.com/file/d/1FBsXzyOasqPvtL9q6Oifxsi_bblQU-p8/view?usp=sharing
When I change the "top" and "bottom" names so they differ, the layer connections look correct.
Is changing the "top" and "bottom" names the right way to handle this situation, or does anybody have another suggestion?
- Tags:
- Caffe
@miily This seems to be the way to do it from what I've been seeing.
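If you have many layers to edit, the renaming can be scripted instead of done by hand. Here is a minimal sketch, assuming pycaffe's protobuf bindings; it rewrites every in-place top to the layer's own name and rewires downstream bottoms, producing a graph shaped like Example 2. The function name is my own choice. Renaming blobs should be safe with respect to the trained .caffemodel, since Caffe matches weights by layer name, not blob name:

# Rewrite in-place tops (top == bottom) to unique names and rewire the
# layers downstream accordingly.
from caffe.proto import caffe_pb2
from google.protobuf import text_format

def make_tops_unique(in_path, out_path):
    net = caffe_pb2.NetParameter()
    with open(in_path) as f:
        text_format.Merge(f.read(), net)

    renamed = {}  # original blob name -> its current alias
    for layer in net.layer:
        orig_bottoms = list(layer.bottom)
        # Point each bottom at the latest alias of that blob.
        for i, b in enumerate(layer.bottom):
            layer.bottom[i] = renamed.get(b, b)
        for i, t in enumerate(layer.top):
            if t in orig_bottoms:
                # Layer wrote in place: name the output after the layer.
                layer.top[i] = layer.name
                renamed[t] = layer.name
            elif t in renamed:
                # A fresh top redefines this blob name; drop the stale alias.
                del renamed[t]

    with open(out_path, "w") as f:
        f.write(text_format.MessageToString(net))

make_tops_unique("deploy.prototxt", "deploy_unique.prototxt")

Run on Example 1 above, this renames the tops of relu_conv1, bn_conv1, and scale_conv1 (and, harmlessly, relu_conv2) and rewires conv2 to read from scale_conv1, which matches the graph that converted correctly.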