Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Custom stateful model

rawsock
Employee

Hi,

I have a PyTorch CNN model that needs to store and retrieve the last conv states between low-latency inferences for custom padding. ONNX does not support the Assign/ReadValue ops, and InferenceEngine::LowLatency() works only with LSTMs. What would be the recommended way to add Assign/ReadValue ops at arbitrary places so that the states I need are stored between inferences?
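For context, a minimal sketch in plain Python (not OpenVINO or PyTorch code; the class and shapes are purely illustrative) of the pattern being asked about: a streaming 1-D convolution that caches its last kernel-minus-one inputs between calls, so each new chunk is padded with real history instead of zeros.

```python
class StreamingConv1d:
    """Toy streaming convolution that keeps its left context as state."""

    def __init__(self, kernel):
        self.kernel = kernel
        # Cached left context; initialised to zero padding before the
        # first inference.
        self.state = [0.0] * (len(kernel) - 1)

    def __call__(self, chunk):
        # Prepend the cached samples, then run a plain valid convolution.
        x = self.state + chunk
        out = [
            sum(k * x[i + j] for j, k in enumerate(self.kernel))
            for i in range(len(chunk))
        ]
        # Keep the last (kernel - 1) inputs for the next inference;
        # this is the value that would live in an Assign/ReadValue pair.
        self.state = x[len(chunk):]
        return out
```

Run on two chunks, this produces the same output as convolving the concatenated signal in one pass, which is exactly what per-inference state in the IR is meant to achieve.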

Adli
Moderator

Hi rawsock,

 

Thank you for reaching out to us. We are investigating this issue and will get back to you soon.

 

Regards,

Adli


Adli
Moderator

Hi rawsock,

 

Thank you for waiting. Just a quick check: have you tried exporting the PyTorch model with a custom op to ONNX? Please refer to the following page. Then, please refer to the following article, which explains how to add support for an unsupported layer in the OpenVINO toolkit.

 

Regards,

Adli

 

 

rawsock
Employee

No, not yet, but it sounds like a lot of work. Is this the only workaround as of today?

Regards

Adli
Moderator

Hi rawsock,

 

Thank you for your prompt response. It was just a quick check; feel free to try it. I will update you with any info regarding this issue as soon as possible.

 

Regards,

Adli

 

Adli
Moderator

Hi rawsock,

 

OpenVINO contains a special API to simplify working with stateful networks. The state is automatically saved between inferences, and there is a way to reset the state when needed. You can also read the state, or set it to a new value, between inferences.

 

Inference Engine has the InferRequest::QueryState method to get the list of states from a network, and the IVariableState interface to operate on states. Please refer to the following link:

https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_network_state_intro.html#openvino_state_api

 

Using several threads is possible if you have several independent sequences; each sequence can then be processed in its own infer request. Please note that inference of one sequence across several infer requests is not recommended. You can refer to the following link for an example of stateful network inference:

https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_network_state_intro.html#example_of_stateful_network_inference

 

The GetState(), Reset(), and SetState() member functions might be helpful for your situation:

https://docs.openvinotoolkit.org/2021.2/classInferenceEngine_1_1IVariableState.html

 

Regards,

Adli

 

rawsock
Employee

My understanding was that GetState/SetState work as long as there are ReadValue/Assign ops in the proper places. And if that is the case, I do not need GetState/SetState anyway, since the state will be saved automatically.

Are you saying that GetState/SetState work on any node, without ReadValue/Assign ops present in IR?

Adli
Moderator

Hi rawsock,

 

My sincere apologies for the delayed response. We will verify this and get back to you as soon as possible.

 

Regards,

Adli


rawsock
Employee
Adli
Moderator

Hi rawsock,

 

Thank you for your patience. Based on the documentation:

  • GetState/SetState -> return the current value of the state / set a new value for the state
  • ReadValue/Assign -> return/assign the value

 

It is possible for GetState/SetState to work on any node without a ReadValue/Assign operation present in the IR. In the case of the low-latency transformation, the transformation will not involve the state; rather, use ReadValue/Assign to interact with the value.
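To make the relationship concrete, here is a toy model in plain Python (not OpenVINO code; all names are illustrative stand-ins for the concepts above): ReadValue feeds the stored value into the graph at the start of an inference, Assign writes the new value back at the end, and GetState/SetState/Reset act on the same storage from outside the graph.

```python
class VariableState:
    """Toy stand-in for one piece of network state."""

    def __init__(self, initial):
        self.default = initial
        self.value = initial

    # Graph-side ops (ReadValue/Assign analogues).
    def read_value(self):
        return self.value

    def assign(self, new_value):
        self.value = new_value

    # External API (IVariableState analogues).
    def get_state(self):
        return self.value

    def set_state(self, new_value):
        self.value = new_value

    def reset(self):
        self.value = self.default


def infer(state, x):
    # One "inference" that accumulates its input into the state:
    # read the stored value, compute, write the result back.
    y = state.read_value() + x
    state.assign(y)
    return y
```

Because the graph ops and the external API share the same storage, values set from outside via set_state are visible to the next inference, and reset restores the initial value.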

 

Regards,

Adli

 

Munesh_Intel
Moderator

Hi rawsock,

This thread will no longer be monitored since we have provided references and recommendations. If you need any additional information from Intel, please submit a new question.


Regards,

Munesh

