rongrong__wang
Beginner

OpenVINO lower precision

https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Int8Inference.html

In this document, I see the sentence:

This means that 8-bit inference can only be performed with the CPU plugin on the layers listed above. All other layers are executed in the format supported by the CPU plugin: 32-bit floating point format (fp32).

So, should I use the FP32 data format if I use the GPU, or for layers that are not in the list?
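For context, "8-bit inference" means the listed layers run on quantized integer values instead of FP32. The sketch below is a minimal, conceptual illustration of symmetric INT8 quantization in plain Python; it is not OpenVINO plugin code, and the function names are made up for this example:

```python
# Conceptual sketch of symmetric per-tensor INT8 quantization.
# Layers that support INT8 compute on integers like these; unsupported
# layers (or other plugins) fall back to the original FP32 values.

def quantize(values, num_bits=8):
    """Map float values to signed integers using a per-tensor scale."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for INT8
    scale = max(abs(v) for v in values) / qmax
    q = [max(-qmax, min(qmax, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the integers."""
    return [v * scale for v in q]

weights = [0.5, -1.25, 0.03, 2.0]
q, scale = quantize(weights)
print(q)                     # integer representation of the weights
print(dequantize(q, scale))  # close to, but not exactly, the FP32 values
```

The small round-off error visible after dequantization is the usual accuracy/performance trade-off of running a layer in INT8 rather than FP32.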

Thank you!

Shubha_R_Intel
Employee

Dear rongrong__wang,

INT8 inference is supported by the GPU plugin, but the performance is not good on our current GPUs, so I would advise against using it.

Thanks,

Shubha

 

rongrong__wang
Beginner

Thank you very much! I tried it, and it really does behave as you described.
