This article is posted on behalf of Devang Aggarwal, Kumar Vishwesh, and Vibhu Bithar.
We all know that one size doesn’t fit all, right? So how about 270? Yes, that’s the number of models we support with OpenVINO™ integration with TensorFlow*.
To make life simple for you, Intel has integrated the OpenVINO™ toolkit and TensorFlow so tightly that the two almost feel like one.
TensorFlow is a well-known machine learning framework, and the Intel® Distribution of OpenVINO™ toolkit accelerates inference on Intel® hardware. To bring you the best of both worlds, we have integrated the OpenVINO™ toolkit with TensorFlow. You can now take your existing TensorFlow models and accelerate their inference performance on Intel® hardware with OpenVINO™ Integration with TensorFlow.
We know that the list is huge, and you’ll see some common names on it: EfficientNet, used for image classification; MobileNet, often used as a backbone for object detection; and BERT, used for natural language processing. Let’s look at some of the more interesting models supported by OpenVINO™ Integration with TensorFlow.
Universal Sentence Encoder
The Universal Sentence Encoder is one of the most interesting encoders you will come across. It encodes text into high-dimensional vectors that can be used for text classification, semantic similarity, clustering, and other natural language tasks. All you have to do is feed it text, and it returns an embedding for you.
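To make the idea concrete, here is a minimal sketch of how such embeddings are used for semantic similarity. The toy 4-dimensional vectors below are illustrative stand-ins for the 512-dimensional vectors the real encoder produces (loading the actual model from TensorFlow Hub is shown in the comments):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "embeddings" standing in for real Universal Sentence Encoder output.
# With tensorflow_hub installed, real vectors come from something like:
#   embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
#   vectors = embed(["a cat on a mat", "a small kitten"]).numpy()
emb_cat    = np.array([0.9, 0.1, 0.0, 0.2])  # "a cat on a mat"
emb_kitten = np.array([0.8, 0.2, 0.1, 0.3])  # "a small kitten"
emb_stock  = np.array([0.0, 0.9, 0.8, 0.1])  # "stock prices fell today"

# Semantically related sentences yield a higher cosine similarity.
print(cosine_similarity(emb_cat, emb_kitten) > cosine_similarity(emb_cat, emb_stock))  # True
```

This downstream math is the same whether the vectors come from the toy arrays above or from the real encoder.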
Mask R-CNN
An interesting object detection model is Mask R-CNN, a deep neural network designed for instance segmentation in machine learning and computer vision. In other words, it can separate the individual objects in an image or a video. You give it an image, and it gives you the object bounding boxes, classes, and masks.
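The post-processing of those outputs is straightforward. The sketch below uses made-up arrays in a common Mask R-CNN output layout (boxes, class IDs, scores, and per-instance masks; the values and class IDs are purely illustrative, not from a real model) and keeps only confident detections:

```python
import numpy as np

# Hypothetical outputs for one image: boxes as [y1, x1, y2, x2] in
# normalized coordinates, integer class IDs, confidence scores, and one
# low-resolution binary mask per detection. Values are illustrative only.
boxes   = np.array([[0.1, 0.2, 0.5, 0.6], [0.0, 0.0, 0.9, 0.9]])
classes = np.array([1, 3])            # e.g. 1 = person, 3 = car (illustrative)
scores  = np.array([0.92, 0.35])
masks   = np.zeros((2, 28, 28))       # one 28x28 mask per detection

# Keep only detections above a confidence threshold.
threshold = 0.5
keep = scores >= threshold
boxes, classes, scores, masks = boxes[keep], classes[keep], scores[keep], masks[keep]

print(len(boxes))  # 1 — only the 0.92-confidence detection survives
```

Real pipelines typically follow this filtering step with non-maximum suppression and with resizing each mask to its box, but the thresholding shown here is the common first step.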
Check out the complete list of models supported by OpenVINO™ Integration with TensorFlow here.
Are you ready to try out a few models from the list and accelerate performance with OpenVINO™ Integration with TensorFlow? Go ahead and follow the steps here.
Want to see how it works right away? You can go ahead and try some of our existing samples here.
Don’t have access to Intel® hardware? Sign up on Intel® DevCloud, a development sandbox with the latest Intel® hardware and software, and test out the OpenVINO™ Integration with TensorFlow sample Jupyter Notebooks. There you can run the Object Detection and Image Classification sample notebooks on Intel® CPUs, GPUs, and VPUs.
Whatever your use case, OpenVINO™ Integration with TensorFlow is there to help you boost performance with minimal effort on your side. All you have to do is bring your TensorFlow model, and OpenVINO™ Integration with TensorFlow will take care of the rest on Intel® hardware.
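In code, that "minimal effort" is the documented two-line pattern: import the package and pick a backend. The sketch below is guarded with a try/except so it also runs where the `openvino-tensorflow` package is not installed; the backend strings are the ones the project documents:

```python
# Enable OpenVINO acceleration for an existing TensorFlow script.
# Guarded import: if the package is absent, TensorFlow runs unmodified.
try:
    import openvino_tensorflow as ovtf
    ovtf.set_backend("CPU")  # other documented backends include "GPU" and "MYRIAD" (VPU)
    enabled = True
except ImportError:
    enabled = False  # plain TensorFlow; no OpenVINO acceleration

print("OpenVINO backend enabled:", enabled)
```

The rest of your TensorFlow code (model loading, `predict`, etc.) stays exactly as it was; the integration intercepts supported subgraphs and runs them through OpenVINO automatically.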
- Install OpenVINO™ Integration with TensorFlow from PyPI.
- Check out the OpenVINO™ Integration with TensorFlow GitHub repository for in-depth information and exciting samples.
- Sign up on Intel® DevCloud and test out the OpenVINO™ Integration with TensorFlow sample Jupyter Notebooks.
Notices & Disclaimers
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.
Performance results are based on testing as of dates shown in configurations and may not reflect all publicly available updates. See backup for configuration details. No product or component can be absolutely secure.
Your costs and results may vary.
Intel technologies may require enabled hardware, software or service activation.
© Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.