n30
Beginner

TensorFlow Object Counting API with OpenVINO

Hi,

Is there a good guide or tutorial on how to use the TensorFlow Object Counting API with OpenVINO, ideally on a Raspberry Pi with the Intel Neural Compute Stick, and ideally for custom objects using a frozen model in the form of a .pb file?

I have really tried to find something, but encountered only solutions for parts of the pipeline, which then do not work together. If anyone has any links, please let me know.

Many thanks o_O

David_C_Intel
Employee

Hi n30,

Thanks for reaching out. We do not have an official guide or tutorial for that specific API, but you can refer to the Converting TensorFlow* Object Detection API Models documentation to try it and get an idea of how the conversion works. There are also other projects, such as this People Counter, that use the OpenVINO™ toolkit and the Intel® NCS2, which you can check out and modify to suit your needs.
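
For reference, a typical Model Optimizer command for a frozen TensorFlow Object Detection API model (an SSD in this sketch) targeting the Intel® NCS2 looks roughly like the following. This is only a sketch: the input .pb, the pipeline.config, and the support .json are placeholders that depend on your specific model and OpenVINO™ version, and FP16 is used because the MYRIAD plugin expects it:

python3 mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --tensorflow_object_detection_api_pipeline_config pipeline.config \
    --transformations_config <OPENVINO_DIR>/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json \
    --reverse_input_channels \
    --data_type FP16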

Best regards,

David C.


n30
Beginner

Dear David,

Thank you for your reply.

I followed the Udacity tutorial. However, when I try to convert my .pb file to the IR format using the Model Optimizer, I get the following error:

[ FRAMEWORK ERROR ] Cannot load input model: TensorFlow cannot read the model file: "retractedpath/model_optimizer/model.pb" is incorrect TensorFlow model file.
The file should contain one of the following TensorFlow graphs:
1. frozen graph in text or binary format
2. inference graph for freezing with checkpoint (--input_checkpoint) in text or binary format
3. meta graph

Make sure that --input_model_is_text is provided for a model in text format. By default, a model is interpreted in binary format. Framework error details: Wrong wire type in tag..
For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #43.

Could this be because I exported the .pb from the Google Cloud AutoML Vision API?

I used the "Export your model as a TF Saved Model to run on a Docker container." export option. Is there an extra step needed here? It is a normal .pb file. There is no export option for plain TensorFlow, only this one, TensorFlow.js, and TensorFlow Lite.

Thanks!


David_C_Intel
Employee

Hi n30,

Could you please answer the following:

  • Which topology is your model based on?
  • Could you share your .pb file and the command you used to convert it to IR format, so we can test it from our end?
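
In the meantime, one thing worth noting: the "Export your model as a TF Saved Model" option produces a SavedModel directory containing a saved_model.pb, which is a different protobuf format than a frozen graph, and feeding it to --input_model typically produces exactly the "Wrong wire type in tag" error you are seeing. Assuming a recent Model Optimizer version, a first thing to try is pointing it at the SavedModel directory instead; a sketch with placeholder paths:

python3 mo_tf.py \
    --saved_model_dir /path/to/exported_saved_model \
    --data_type FP16 \
    --output_dir ./ir_model

Alternatively, you could freeze the SavedModel into a frozen inference graph with TensorFlow before running the conversion.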

Regards,

David C.


David_C_Intel
Employee

Hi n30,

If you have any additional questions, please submit a new thread as this discussion will no longer be monitored.

Best regards,

David C.

