Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

RNN LSTM / GRU in CV SDK

nikos1
Valued Contributor I

Are LSTM / GRU RNNs supported in the CV SDK? I have not been able to find any samples yet; I think it is CNN-only support for now. Are there any plans to support RNNs in the next release?

1 Solution
18 Replies
Anna_B_Intel
Employee

Hi Nikos, 

You're absolutely right - RNNs are not supported in the current release. Could you please give us more details about your use case? Are you going to use it in CV tasks? This information will help us prioritize our plans. 

Best wishes, 

Anna 

nikos1
Valued Contributor I

Thank you for the confirmation on RNN support. Yes, this is mainly needed for typical CV tasks where we need to combine a CNN with an RNN. In some applications, for example, a CNN extracts the convolutional features we need from different video frames and sends them to an LSTM (or bidirectional LSTM) or GRU for further analysis, as in activity recognition tasks. In other cases we can use CNN+RNN on a single frame, as in many existing solutions that use a CNN and an RNN for typical OCR tasks. Your existing release with CNN support is fantastic, and we look forward to future releases that add RNN support.
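As an illustration of the CNN-to-RNN hand-off described above, here is a minimal toy sketch in plain NumPy. Nothing here is an OpenVINO API: `cnn_features` is a hypothetical stand-in for a real CNN backbone, and the GRU cell is a textbook implementation with randomly initialized weights.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cnn_features(frame):
    """Stand-in for a CNN backbone: here just global average pooling.
    In a real pipeline this would be the feature map of a conv network."""
    return frame.mean(axis=(0, 1))  # (H, W, C) -> (C,)

def gru_step(x, h, Wz, Wr, Wh):
    """Minimal GRU cell acting on the concatenation [h, x]."""
    hx = np.concatenate([h, x])
    z = sigmoid(Wz @ hx)                              # update gate
    r = sigmoid(Wr @ hx)                              # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([r * h, x]))  # candidate state
    return (1 - z) * h + z * h_tilde

# toy "video": 5 frames of 8x8 RGB; hidden size 4, feature size 3
rng = np.random.default_rng(1)
frames = rng.random((5, 8, 8, 3))
hid, feat = 4, 3
Wz = rng.standard_normal((hid, hid + feat))
Wr = rng.standard_normal((hid, hid + feat))
Wh = rng.standard_normal((hid, hid + feat))

h = np.zeros(hid)
for frame in frames:
    h = gru_step(cnn_features(frame), h, Wz, Wr, Wh)
# h now summarizes the clip and could feed a classifier head
```

The final hidden state `h` would then feed a classifier head for, e.g., activity recognition.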

Anna_B_Intel
Employee

Hi Nikos, 

Thanks for the positive feedback and the use-case explanation. I'll let you know when RNN support appears.

Best wishes, 

Anna

houbiao__liu
Beginner

Are LSTM / GRU RNNs supported on the Movidius Myriad X VPU (Movidius NCS 2)?

nikos1
Valued Contributor I

Hello Anna,

Just wondering if you could possibly provide an update on the current status of LSTM support in OpenVINO.

Based on release notes:

Extends neural network support to include LSTM (long short-term memory) from ONNX*, TensorFlow* & MXNet* frameworks, & 3D convolutional-based networks in preview mode (CPU-only) to support additional, new use cases beyond computer vision.

Does this mean LSTM is in preview mode and the only supported device is the CPU? If I understand this correctly, there is no LSTM support for GPU/NCS, correct?

Thanks,

Nikos

 

Anna B. (Intel) wrote:

Hi Nikos, 

Thanks for the positive feedback and the use-case explanation. I'll let you know when RNN support appears.

Best wishes, 

Anna

cbohr1
Beginner
Hello Anna, are LSTM / GRU RNNs supported on the Movidius Myriad X VPU (Movidius NCS 2)?
Shubha_R_Intel
Employee

Dear bohra, chandni,

According to the Supported Devices document, LSTM is supported on MYRIAD but GRU is not. RNN is also unsupported on VPU.

Hope it helps,

thanks,

Shubha

cbohr1
Beginner
Dear Shubha, in OpenVINO R2 the VPU does not support the Atanh layer, but the CPU does. Since the LSTM architecture uses tanh, how is it possible that it supports LSTM models? One more question: when I try to generate the xml and bin files of an LSTM using OpenVINO R2, it shows an error while running on the CPU:

RuntimeError: Cannot detect right dims for nodes lstm1/while/Const_4/Output_0/Data__const and lstm1/while/clip_by_value_2

How can I remove this error?
Shubha_R_Intel
Employee

Dear bohra, chandni

Where are you seeing that atanh is supported on CPU?

If atanh is an Activation layer, it's not one of the layers OpenVINO supports: see the IR Activation section. I see tanh support but not atanh.

And LSTM normally uses tanh, not atanh.
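For reference, a minimal LSTM cell step makes it clear where tanh (not atanh) enters: the candidate cell state and the final output both pass through tanh, while the gates use sigmoid. This is a generic textbook sketch, not OpenVINO code; the weight names and toy sizes are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step with the four gates stacked in z."""
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # stacked pre-activations
    i = sigmoid(z[0:hidden])              # input gate
    f = sigmoid(z[hidden:2*hidden])       # forget gate
    o = sigmoid(z[2*hidden:3*hidden])     # output gate
    g = np.tanh(z[3*hidden:4*hidden])     # candidate cell state -- tanh, not atanh
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # tanh again on the cell state
    return h, c

# toy sizes: input dim 3, hidden dim 2 (so 4*2 = 8 stacked gate rows)
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
h, c = np.zeros(2), np.zeros(2)
W = rng.standard_normal((8, 3))
U = rng.standard_normal((8, 2))
b = np.zeros(8)
h, c = lstm_cell_step(x, h, c, W, U, b)
```

Note that the output `h` is always bounded in (-1, 1) precisely because of the tanh on the cell state; atanh (the inverse function, which is unbounded) never appears in the cell equations.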

As for your runtime error above, it's very hard to say what happened without looking at your IR and your Inference Engine code.

Hope it helps,

Shubha

cbohr1
Beginner
In https://docs.openvinotoolkit.org/2019_R2/_docs_IE_DG_supported_plugins_Supported_Devices.html it is written that atanh is supported on the CPU - you can check that link.
Shubha_R_Intel
Employee

Dear bohra, chandni,

You got me there. Thanks for finding the documentation proof of CPU support for atanh (via custom kernels). atanh in this case is not an Activation layer, though - it's just a regular layer. Do you have an example model with atanh that fails OpenVINO either at the Model Optimizer stage or the Inference Engine stage? If so, I'd be happy to reproduce the issue and file a bug. Please attach your model and inference program to this ticket as a *.zip. Or, if you'd prefer, you can PM the package to me privately.

Looking forward to receiving your stuff,

Shubha

cbohr1
Beginner
Yes ma'am, I can share it with you privately - please provide your email ID. I have one LSTM model that has a tanh layer, and it's not working while doing inference.
ha__minh_quyet
Beginner

Shubha R. (Intel) wrote:

Dear bohra, chandni,

You got me there. Thanks for finding the documentation proof of CPU support for atanh (via custom kernels). atanh in this case is not an Activation layer, though - it's just a regular layer. Do you have an example model with atanh that fails OpenVINO either at the Model Optimizer stage or the Inference Engine stage? If so, I'd be happy to reproduce the issue and file a bug. Please attach your model and inference program to this ticket as a *.zip. Or, if you'd prefer, you can PM the package to me privately.

Looking forward to receiving your stuff,

Shubha

Dear Shubha R. ,

Thank you so much for fixing the memory corruption bug while loading the IR model with cpu_extension. However, I got the same issue as Mr. Bohra:
RuntimeError: Cannot detect right dims for nodes bidirectional_1/while/Const_4/Output_0/Data__const and bidirectional_1/while/clip_by_value_2

If you need to file a bug for the issue, I will send you my IR model and inference code. Thank you so much for your work on the program.

Best regards,

Minh Quyet

Shubha_R_Intel
Employee

Dear Minh ,

I got your PM and replied to it also - via PM. Sure, please send me your IR model and inference code which demonstrates this bug.

Thanks !

Shubha

cbohr1
Beginner
Hi Shubha, how can I share my model with you privately? Also, I want to know: can we use a framework like ONNX or Kaldi, which supports some layers that TensorFlow does not, by generating IR files for that model? That is, two models from TensorFlow and one model trained using a different framework.
Shubha_R_Intel
Employee

Dear bohra, chandni,

I have PM'd you, simply reply to this message with an attached *.zip file.

Looking forward to receiving your stuff -

Shubha

 

cbohr1
Beginner
Hi Shubha, I can share the model already shared by Neha Soni - we are working together. You are requesting the inference file, but that file will not work without three CNN models, which we can't share. I want to know whether we can run three TensorFlow models and one model from some other framework, because the documentation says LSTM on the NCS is supported via the ONNX framework. The documentation does not clearly state that LSTM supports TensorFlow models; it only gives per-device information, and nowhere in the layer support does it mention that LSTM supports a TensorFlow model using tanh as the activation function. Thanks
Shubha_R_Intel
Employee

Dear bohra, chandni,

In OpenVINO, since models are converted to IR, yes, you can use different frameworks for multiple models - i.e., one originally ONNX, another originally Caffe, and the final one originally TensorFlow. From OpenVINO's perspective it doesn't matter, as long as the models are converted to IR successfully.
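As a sketch, the workflow might look like the following, using the per-framework Model Optimizer entry points that ship with the 2019 releases (mo_tf.py, mo_onnx.py); the model file names and output directory here are purely illustrative:

```shell
# Convert each model with its framework-specific Model Optimizer entry point.
python3 mo_tf.py   --input_model cnn_a.pb        --output_dir ir/
python3 mo_tf.py   --input_model cnn_b.pb        --output_dir ir/
python3 mo_onnx.py --input_model lstm_model.onnx --output_dir ir/
# The resulting .xml/.bin IR pairs can all be loaded by the same
# Inference Engine application, regardless of the source framework.
```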

I think the document you're referring to is the Supported Devices Document which lists supported layers per hardware. However, there is also The Model Optimizer Supported Layers document.

If you cannot share your models, I definitely understand. However, in that case there is a limited amount of help I can offer unless the error is super obvious.

Hope it helps,

Thanks,

Shubha
