Yukihiro_Tanaka

VPU Plugin Fusing : Eltwise + ReLU → Eltwise (2020.R1)

Hello,

At the end of each residual module of ResNet50, only the first two [Eltwise + ReLU] pairs are fused; the others are not.

The [Eltwise + ReLU] patterns are described identically in the prototxt.

[Eltwise + ReLU] in ResNet50 prototxt:

..........

layer {
        bottom: "res2a"
        bottom: "res2b_branch2c"
        top: "res2b"
        name: "res2b"
        type: "Eltwise"
}

layer {
        bottom: "res2b"
        top: "res2b"
        name: "res2b_relu"
        type: "ReLU"
}
..........

layer {
        bottom: "res3a"
        bottom: "res3b_branch2c"
        top: "res3b"
        name: "res3b"
        type: "Eltwise"
}

layer {
        bottom: "res3b"
        top: "res3b"
        name: "res3b_relu"
        type: "ReLU"
}
..........

benchmark_app log:

..........

Add1_10457/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 793       cpu: 0               execType: MyriadXHwOp
Add1_9953/Fused_Add_          EXECUTED       layerType: Convolution        realTime: 1035      cpu: 0               execType: Copy
res2a                         EXECUTED       layerType: Eltwise            realTime: 1259      cpu: 0               execType: Sum
Add1_9917/Fused_Add_          EXECUTED       layerType: Convolution        realTime: 492       cpu: 0               execType: MyriadXHwOp
Add1_10517/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 787       cpu: 0               execType: MyriadXHwOp
Add1_10037/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 1179      cpu: 0               execType: Copy
res2b                         EXECUTED       layerType: Eltwise            realTime: 1562      cpu: 0               execType: Sum
Add1_10025/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 454       cpu: 0               execType: MyriadXHwOp
Add1_10109/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 205       cpu: 0               execType: MyriadXHwOp

..........

Add1_10313/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 583       cpu: 0               execType: MyriadXHwOp
res3a                         EXECUTED       layerType: Eltwise            realTime: 356       cpu: 0               execType: Sum
res3a_relu                    EXECUTED       layerType: ReLU               realTime: 536       cpu: 0               execType: Relu
Add1_10253/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 374       cpu: 0               execType: MyriadXHwOp
Add1_10061/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 844       cpu: 0               execType: MyriadXHwOp
Add1_10445/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 570       cpu: 0               execType: MyriadXHwOp
res3b                         EXECUTED       layerType: Eltwise            realTime: 534       cpu: 0               execType: Sum
res3b_relu                    EXECUTED       layerType: ReLU               realTime: 524       cpu: 0               execType: Relu
Add1_10301/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 533       cpu: 0               execType: MyriadXHwOp
Add1_10097/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 915       cpu: 0               execType: MyriadXHwOp
Add1_10145/Fused_Add_         EXECUTED       layerType: Convolution        realTime: 574       cpu: 0               execType: MyriadXHwOp
..........

 

In the benchmark_app log, res2b_relu does not appear, so I believe it has been fused.
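To show exactly what I am checking: the snippet below (my own helper, not part of OpenVINO) scans per-layer lines like the log excerpt above and lists ReLU layers that still execute standalone immediately after an Eltwise, i.e. the [Eltwise + ReLU] pairs that were apparently not fused:

```python
import re

# Condensed excerpt of the benchmark_app per-layer log above.
LOG = """\
res2a       EXECUTED  layerType: Eltwise  realTime: 1259  cpu: 0  execType: Sum
res2b       EXECUTED  layerType: Eltwise  realTime: 1562  cpu: 0  execType: Sum
res3a       EXECUTED  layerType: Eltwise  realTime: 356   cpu: 0  execType: Sum
res3a_relu  EXECUTED  layerType: ReLU     realTime: 536   cpu: 0  execType: Relu
res3b       EXECUTED  layerType: Eltwise  realTime: 534   cpu: 0  execType: Sum
res3b_relu  EXECUTED  layerType: ReLU     realTime: 524   cpu: 0  execType: Relu
"""

def unfused_relus(log: str):
    """Return names of ReLU layers that ran standalone right after an
    Eltwise layer, i.e. [Eltwise + ReLU] pairs the plugin did not fuse."""
    # Each line: <name> EXECUTED layerType: <type> ...
    layers = re.findall(r"^(\S+)\s+EXECUTED\s+layerType:\s+(\S+)", log, re.M)
    return [name
            for (name, ltype), (_, prev_type) in zip(layers[1:], layers[:-1])
            if ltype == "ReLU" and prev_type == "Eltwise"]
```

Running this on the excerpt reports res3a_relu and res3b_relu as unfused, while no res2* ReLU shows up at all, which matches what I see in the full log.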

I would like to know:

  - whether this is a log reporting issue or the fusing really did not happen

  - how to fuse all the [Eltwise + ReLU] patterns

Thanks,
