Intel® Compute Stick
Discussions Regarding Intel® Compute Sticks and Cards

Does Intel Compute Stick 2 support TensorFlow 2.2.0?

LuoHan
Beginner

Does Intel Compute Stick 2 support TensorFlow 2.2.0?

Victor_G_Intel
Moderator

Hello LuoHan,

 

Thank you for posting on the Intel® communities.

 

Can you please confirm if this is the product that you are contacting us about?


I look forward to hearing from you.

 

Regards,

 

Victor G.

Intel Technical Support Technician 


LuoHan
Beginner
Yes, the product that I'm asking about is the one in the link you sent.
Victor_G_Intel
Moderator

Hello LuoHan,


Thank you for posting on the Intel® communities.


We appreciate the confirmation. A specialist in your product will assist you as soon as possible, so please wait for their reply here in the forum. We appreciate your patience and business.


Regards,


Victor G. 

Intel Technical Support Technician


Peh_Intel
Moderator

Hi LuoHan,


Greetings to you.


The Neural Compute Stick 2 is meant to be used with the OpenVINO™ toolkit. The OpenVINO™ toolkit is a comprehensive toolkit for quickly developing applications and solutions that solve a variety of tasks. It consists of two main components: the Model Optimizer and the Inference Engine. The Model Optimizer optimizes and converts models into an Intermediate Representation (an .xml and a .bin file). The Intermediate Representation is the only format that the Inference Engine accepts and understands. The Inference Engine then executes the model on different devices and manages the libraries required to run the code properly on each platform.


You may refer to the official documentation to gain more understanding about OpenVINO™ workflow.

https://docs.openvinotoolkit.org/latest/index.html#introduction


The OpenVINO™ toolkit provides the MYRIAD plugin, which enables inference of neural networks on the Intel® Neural Compute Stick 2. The Inference Engine MYRIAD plugin supports many TensorFlow networks; the full list is available at the following link:

https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_supported_plugins_MYRIAD.html#supported_networks


OpenVINO™ officially supports TensorFlow 2 models in two model formats: SavedModel and Keras H5 (HDF5). However, OpenVINO™ does not support models with Keras RNN and Embedding layers. Furthermore, please note that OpenVINO™ support for TensorFlow 2 models is currently in preview (Beta), meaning it is limited and not yet of production quality.
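The two accepted formats are easy to tell apart on disk: a SavedModel is a directory containing a saved_model.pb file, while a Keras H5 model is a single .h5/.hdf5 file. A small stdlib-only helper, hypothetical and not part of any OpenVINO™ API, can illustrate the distinction:

```python
# Hypothetical helper (illustration only): classify a model path into
# the two TensorFlow 2 formats that OpenVINO's Model Optimizer accepts.
from pathlib import Path

def tf2_model_format(path: str) -> str:
    p = Path(path)
    # Keras H5 models are single files with an .h5 or .hdf5 extension.
    if p.suffix.lower() in {".h5", ".hdf5"}:
        return "keras_h5"
    # SavedModels are directories containing a saved_model.pb file.
    if (p / "saved_model.pb").exists() or p.is_dir():
        return "saved_model"
    return "unknown"
```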


You may refer to the following link for more information about converting a TensorFlow 2 model into IR format.

https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow.html#Convert_From_TF2X
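The two-stage workflow described above — convert once with the Model Optimizer, then load the resulting IR with the Inference Engine — can be sketched roughly as follows. This is a hedged sketch: the model paths are placeholders, the flags are taken from the linked docs for the 2021-era toolkit, and the commented Inference Engine calls only run where OpenVINO™ is actually installed.

```python
# Step 1: assemble the Model Optimizer command (paths are placeholders;
# flags per the linked 2021-era docs). Shown as a command string only.
import shlex

model = "saved_model_dir"           # placeholder TF2 SavedModel directory
mo_cmd = [
    "mo.py",                        # Model Optimizer entry point
    "--saved_model_dir", model,     # input: TensorFlow 2 SavedModel
    "--output_dir", "ir_out",       # output: .xml + .bin IR pair
    "--data_type", "FP16",          # MYRIAD (NCS2) expects FP16 models
]
print(shlex.join(mo_cmd))

# Step 2 (runs only where OpenVINO™ is installed; shown for shape):
# from openvino.inference_engine import IECore
# ie = IECore()
# net = ie.read_network(model="ir_out/model.xml", weights="ir_out/model.bin")
# exec_net = ie.load_network(network=net, device_name="MYRIAD")
```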



Regards,

Peh



Peh_Intel
Moderator

Hi LuoHan,


This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question. 



Regards,

Peh

