pgleb
Beginner

Implement and run the existing SSD network on the FPGA?

I have an SSD convolutional network that currently runs on an Android device. How can I run this network on an Intel FPGA? The network is in protobuf (TensorFlow) format.

As I understand it, the first step is to use the Model Optimizer script from OpenVINO to convert the model into the appropriate format. But what should I do next?

Does the Intel CNN library for FPGA support the SSD-MobileNet architecture?

JohnT_Intel
Employee

Hi, may I know whether your model was trained in FP16 or FP32 format? If so, you can use the Model Optimizer script directly to convert the model to be OpenVINO-compliant. Then you should be able to run it on the FPGA.
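For reference, a typical Model Optimizer invocation for a TensorFlow SSD frozen graph looks like the sketch below. The file names (`frozen_inference_graph.pb`, `pipeline.config`) and the install path are assumptions based on a standard TF Object Detection API export and a default OpenVINO install, not taken from this thread; adjust them to your setup.

```shell
# Sketch only: paths and file names are assumptions.
# Convert a frozen TensorFlow SSD graph into OpenVINO IR (.xml/.bin).
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --tensorflow_object_detection_api_pipeline_config pipeline.config \
    --transformations_config /opt/intel/openvino/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json \
    --data_type FP16 \
    --output_dir ir_model
```

The FPGA plugin generally works with an FP16 IR, so converting with `--data_type FP16` is common even when the model was trained in FP32; Model Optimizer performs that conversion during export.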
pgleb
Beginner

Thank you for the answer. My model's weights are in FP32 format. As I understand it, the Model Optimizer converts a TF model into an Intermediate Representation. But what should I do next to prepare it for the FPGA?

JohnT_Intel
Employee

Hi, after converting it, you can run it directly using the newly converted model (.xml) file. There is no difference compared to the existing model.
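To illustrate "no difference": loading the converted IR with the Inference Engine Python API looks the same for the FPGA as for any other device; only the device name changes. This is a sketch assuming the `IECore` API of OpenVINO releases from that period, with hypothetical file names; it requires the OpenVINO runtime and an initialized FPGA board with a loaded bitstream.

```python
# Sketch: file names are hypothetical; requires the OpenVINO runtime
# and a programmed FPGA board.
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="ssd_mobilenet.xml", weights="ssd_mobilenet.bin")

# HETERO lets any layers the FPGA plugin does not support
# fall back to the CPU plugin.
exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

# The inference call itself is identical to the CPU-only case:
input_name = next(iter(net.input_info))
# result = exec_net.infer({input_name: preprocessed_image})
```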
pgleb
Beginner

Thank you for your answer. Could you please give me a link to step-by-step instructions for deploying an SSD model on the FPGA?

pgleb
Beginner

Is it possible to emulate inference of the FPGA model representation on the CPU?

JohnT_Intel
Employee

Hi, no, you are not able to emulate it on the CPU. You may refer to https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Samples_Overview.html for how the samples and demo applications are developed.
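As one concrete example from those samples, the SSD object detection demo can be pointed at the FPGA through the `-d` device option. The demo name, paths, and model file below are assumptions based on a typical OpenVINO release layout, not taken from this thread.

```shell
# Sketch only: demo path, model file, and input are assumptions.
# Run the SSD demo on the FPGA, with CPU fallback for unsupported layers.
python3 /opt/intel/openvino/deployment_tools/open_model_zoo/demos/python_demos/object_detection_demo_ssd_async/object_detection_demo_ssd_async.py \
    -m ir_model/frozen_inference_graph.xml \
    -i input_video.mp4 \
    -d HETERO:FPGA,CPU
```

The same command with `-d CPU` runs the identical IR on the CPU plugin; that executes the network on the CPU, though it does not emulate the FPGA bitstream itself.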