Does the Intel® Neural Compute Stick 2 support TensorFlow 2.2.0?
Hello LuoHan,
Thank you for posting on the Intel® communities.
Can you please confirm if this is the product that you are contacting us about?
I look forward to hearing from you.
Regards,
Victor G.
Intel Technical Support Technician
Hello LuoHan,
Thank you for posting on the Intel® communities.
We appreciate the confirmation. A specialist for your product will assist you as soon as possible, so please wait for their reply here in the forum. We appreciate your patience and your business.
Regards,
Victor G.
Intel Technical Support Technician
Hi LuoHan,
Greetings to you.
The Neural Compute Stick 2 is intended to be used with the OpenVINO™ toolkit, a comprehensive toolkit for quickly developing applications and solutions that solve a variety of tasks. The toolkit consists of two main components: the Model Optimizer and the Inference Engine. The Model Optimizer optimizes a trained model and converts it into the Intermediate Representation (IR) format, a pair of files (.xml for the network topology and .bin for the weights). The IR is the only format the Inference Engine accepts. The Inference Engine then executes the model on the target device and manages the libraries required to run the code properly on different platforms.
You may refer to the official documentation to gain more understanding about OpenVINO™ workflow.
https://docs.openvinotoolkit.org/latest/index.html#introduction
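As a rough sketch of that two-step workflow, the commands below convert a TensorFlow model and run the resulting IR on the NCS2. The file names are placeholders, and the Model Optimizer script path assumes a default OpenVINO™ installation on Linux; adjust both for your setup and toolkit version.

```shell
# Step 1: Model Optimizer converts a trained TensorFlow frozen graph
# into the IR format (produces frozen_inference_graph.xml and .bin).
# The script path below assumes a default /opt/intel install.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --output_dir ./ir_model

# Step 2: Inference Engine runs the IR on the Neural Compute Stick 2
# through the MYRIAD plugin, e.g. with the bundled benchmark_app sample.
benchmark_app -m ./ir_model/frozen_inference_graph.xml -d MYRIAD
```

The `-d MYRIAD` switch selects the NCS2 as the inference device; the same IR files can be run on CPU or GPU by changing only that flag, which is the point of the Inference Engine abstraction described above.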
The OpenVINO™ toolkit provides the MYRIAD plugin, which enables inference of neural networks on the Intel® Neural Compute Stick 2. The Inference Engine MYRIAD plugin supports many TensorFlow networks; you may refer to the following links:
OpenVINO™ officially supports TensorFlow 2 models in two formats, SavedModel and Keras H5 (HDF5), but it does not support models with Keras RNN or Embedding layers. Furthermore, please note that OpenVINO™ support for TensorFlow 2 models is currently in preview (beta), meaning it is limited and not yet of production quality.
You may refer to the following link for more information about converting TensorFlow2 model into IR formats.
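For example, a TensorFlow 2 model saved in SavedModel format can be converted to IR roughly as follows. The directory names here are placeholders, and the mo_tf.py path assumes a default OpenVINO™ installation; consult the conversion documentation for your toolkit version.

```shell
# In your TensorFlow 2 training script, export the model first, e.g.:
#   model.save('saved_model_dir')     # Keras SavedModel export
# Then point the Model Optimizer at the SavedModel directory:
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --saved_model_dir saved_model_dir \
    --output_dir ./ir_model
```

A Keras H5 model cannot be passed to the Model Optimizer directly; load it in TensorFlow 2 and re-export it as a SavedModel first, then convert as above.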
Regards,
Peh
Hi LuoHan,
This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.
Regards,
Peh
