Lee__Sungjin
Beginner

inference engine result

Hi, I'm trying to run inference on my TensorFlow lane detection model using OpenVINO.

 

In the original TF model, there are four output layers.

input_node = ['data/input_img']
output_nodes = ['model/softmax_output_bin',
                'model/softmax_output_type',
                'model/softmax_output_loc',
                'model/softmax_output_color']

 

I tried to visualize the lane detection using only the output of 'model/softmax_output_bin'.

 

In my XML file, the description of the 'model/softmax_output_bin' layer is:

<layer id="189" name="model/softmax_output_bin" precision="FP32" type="SoftMax">
			<data axis="3"/>
			<input>
				<port id="0">
					<dim>1</dim>
					<dim>2</dim>
					<dim>128</dim>
					<dim>256</dim>
				</port>
			</input>
			<output>
				<port id="1">
					<dim>1</dim>
					<dim>2</dim>
					<dim>128</dim>
					<dim>256</dim>
				</port>
			</output>
		</layer>

 

Because it is a SoftMax layer, the two channel values at each spatial coordinate must sum to 1.

For example, with (c, h, w) indexing: value(0, 0, 0) + value(1, 0, 0) = 1, and value(0, 127, 255) + value(1, 127, 255) = 1.

But the Inference Engine output doesn't satisfy this softmax property.

     ch1           ch2          sum
 6.44548e-05   0.0366291    0.0366936
 9.07284e-05   0.0619251    0.0620158
 0.000432348   0.00232028   0.00275263
 0.000214221   0.00220713   0.00242135
 0.000673349   0.00150115   0.00217449
 0.000294823   0.00152492   0.00181974

like this.

 

How can I solve this problem?

This is my code, which gets the output data and stores it in a pointer:

        Blob::Ptr bin_output_blob = infer_request.GetBlob("model/softmax_output_bin");
        float *outputData = static_cast<PrecisionTrait<Precision::FP32>::value_type*>(bin_output_blob->buffer());   

 

Shubha_R_Intel
Employee

Dear Lee, Sungjin,

Another forum poster has also noticed that the softmax values don't add up to 1 here. That poster observed it with FP16 (NCS2), and you are seeing it with FP32.

It seems to be a real issue. I will file a bug.

Sorry for the inconvenience. I will keep you posted!

Shubha

 

Lee__Sungjin
Beginner

Thanks for the answer!

 

In addition, I also ran inference with the FP16 IR model on MYRIAD (Neural Compute Stick 2).

The softmax results of the FP16 model also don't add up to 1:

   (x,   y)       ch1           ch2          sum
 (184, 124)   0             0.00418091   0.00418091
 (185, 124)   0             0.00320625   0.00320625
 (186, 124)   7.07507e-05   0.00298882   0.00305957
 (187, 124)   0             0.00250053   0.00250053
 (188, 124)   7.42674e-05   0.00341225   0.00348651
 (189, 124)   0             0.00243378   0.00243378
 (190, 124)   7.15256e-05   0.00343895   0.00351048
 (191, 124)   0             0.00252724   0.00252724

like this.

 

Also, if I heuristically adjust the detection threshold, the detection works correctly.

But I just don't understand why the softmax results don't add up to 1 in both the FP16 and FP32 models.

I'm looking forward to your answer. 

 

Thanks.

 

Sungjin

 

Shubha_R_Intel
Employee

Dear Sungjin,

Thanks for your additional info! Yes, your findings exactly match the other forum poster's on FP16/NCS2; he observed the same as you. I have filed a bug on the softmax issue. Can you clarify what you mean by this statement?

Also, if I heuristically adjust the detection threshold, the detection works correctly.

Thanks,

Shubha

Lee__Sungjin
Beginner

Hi!

In my model, the 'model/softmax_output_bin' layer detects many features besides the lane features.

Therefore, I have to raise the threshold so that only the strong (lane) features remain and I can detect the lane only. That threshold is about 0.98.

However, in the IR model inference, the threshold is about 0.02, because 0.02 is one of the strong values in the softmax output. I cannot understand why 0.02 counts as a strong value in a softmax output, but the detection works correctly.

 

This is all that I wanted to say.

 

Thanks.

Sungjin

Shubha_R_Intel
Employee

Dear Lee, Sungjin,

Thank you for your explanation. As mentioned above, I have filed a bug on the softmax issue.

Thanks,

Shubha