<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic OpenVINO 2022 - using multiple NCS2 asynchronously in Intel® Distribution of OpenVINO™ Toolkit</title>
    <link>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1404628#M28041</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm trying to update my stack to OpenVINO 2022 to benefit from the latest features, and I'm taking the opportunity to gain a little performance by using multiple NCS2 devices asynchronously.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;However, I'm a bit stuck despite reading a fair amount of documentation on your website and in the community. I found plenty of examples of this for the 2021 version, but I can't manage to port them to the 2022 2.0 API.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The closest example I found is this:&amp;nbsp;&lt;A href="https://github.com/openvinotoolkit/openvino/tree/master/samples/cpp/classification_sample_async" target="_blank"&gt;https://github.com/openvinotoolkit/openvino/tree/master/samples/cpp/classification_sample_async&lt;/A&gt;, but it doesn't handle multiple devices.&lt;BR /&gt;&lt;BR /&gt;Would you have a very basic example of a model loaded and called in batches on more than one NCS2? That would be immensely helpful.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Greg&lt;/P&gt;</description>
    <pubDate>Fri, 29 Jul 2022 16:33:42 GMT</pubDate>
    <dc:creator>gaudibert</dc:creator>
    <dc:date>2022-07-29T16:33:42Z</dc:date>
    <item>
      <title>OpenVINO 2022 - using multiple NCS2 asynchronously</title>
      <link>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1404628#M28041</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm trying to update my stack to OpenVINO 2022 to benefit from the latest features, and I'm taking the opportunity to gain a little performance by using multiple NCS2 devices asynchronously.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;However, I'm a bit stuck despite reading a fair amount of documentation on your website and in the community. I found plenty of examples of this for the 2021 version, but I can't manage to port them to the 2022 2.0 API.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The closest example I found is this:&amp;nbsp;&lt;A href="https://github.com/openvinotoolkit/openvino/tree/master/samples/cpp/classification_sample_async" target="_blank"&gt;https://github.com/openvinotoolkit/openvino/tree/master/samples/cpp/classification_sample_async&lt;/A&gt;, but it doesn't handle multiple devices.&lt;BR /&gt;&lt;BR /&gt;Would you have a very basic example of a model loaded and called in batches on more than one NCS2? That would be immensely helpful.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Greg&lt;/P&gt;</description>
      <pubDate>Fri, 29 Jul 2022 16:33:42 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1404628#M28041</guid>
      <dc:creator>gaudibert</dc:creator>
      <dc:date>2022-07-29T16:33:42Z</dc:date>
    </item>
    <item>
      <title>Re: OpenVINO 2022 - using multiple NCS2 asynchronously</title>
      <link>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1405016#M28045</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Hi Gaudibert,&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Thanks for reaching out to us.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;For your information, you can use multiple Intel® Neural Compute Stick 2 devices asynchronously as follows:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;1.&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Run the &lt;/SPAN&gt;&lt;A style="font-family: Helvetica, sans-serif; font-size: 16px;" href="https://github.com/openvinotoolkit/openvino/tree/2022.1.0/samples/python/hello_query_device#hello-query-device-python-sample-openvino_inference_engine_ie_bridges_python_sample_hello_query_device_readme" target="_blank" rel="noopener noreferrer"&gt;Hello Query Device Python Sample&lt;/A&gt;&lt;SPAN&gt; script, which is located in the following directory, to confirm that both devices are detected:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;cd &amp;lt;INSTALL_DIR&amp;gt;\samples\python\hello_query_device

python hello_query_device.py&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2 device.png" style="width: 999px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/32214i5CBFF7B87A3B9599/image-size/large?v=v2&amp;amp;px=999&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="2 device.png" alt="2 device.png" /&gt;&lt;/span&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;2.&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Use the Multi-Device plugin with the following command:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;python classification_sample_async.py -m "&amp;lt;path_to_model&amp;gt;\alexnet.xml" -i "path_to_input_1" "path_to_input_2" -d MULTI:MYRIAD.5.2-ma2480,MYRIAD.5.3-ma2480&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="multi.png" style="width: 999px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/32216i81BB0D33C05DCFBD/image-size/large?v=v2&amp;amp;px=999&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="multi.png" alt="multi.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;For more information, please refer to &lt;/SPAN&gt;&lt;A style="font-family: Helvetica, sans-serif; font-size: 16px;" href="https://docs.openvino.ai/latest/openvino_docs_OV_UG_Running_on_multiple_devices.html?sw_type=switcher-python#running-on-multiple-devices-simultaneously" target="_blank" rel="noopener noreferrer"&gt;Running on multiple devices simultaneously&lt;/A&gt;&lt;SPAN&gt; and &lt;/SPAN&gt;&lt;A style="font-family: Helvetica, sans-serif; font-size: 16px;" href="https://www.intel.sg/content/www/xa/en/support/articles/000055294/boards-and-kits/neural-compute-sticks.html" target="_blank" rel="noopener noreferrer"&gt;Multi-Device Plugin with the Intel® Neural Compute Stick 2&lt;/A&gt;&lt;SPAN&gt;.&lt;/SPAN&gt;&lt;/P&gt;
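&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;As a rough sketch (not an official sample), the same MULTI setup can also be driven directly from the 2022 2.0 Python API with an AsyncInferQueue. The model path, input shape, and device names below are placeholders taken from the command above; adjust them to your own model and the device names reported by Hello Query Device:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;import numpy as np
import openvino.runtime as ov

core = ov.Core()
model = core.read_model("alexnet.xml")  # placeholder model path

# Compile once for both sticks; the MULTI plugin schedules requests across them
compiled = core.compile_model(model, "MULTI:MYRIAD.5.2-ma2480,MYRIAD.5.3-ma2480")

# One in-flight request per device keeps both sticks busy
queue = ov.AsyncInferQueue(compiled, jobs=2)
results = {}

def on_done(request, frame_id):
    # Collect the first output tensor of each finished request
    results[frame_id] = request.get_output_tensor(0).data.copy()

queue.set_callback(on_done)

frames = np.random.rand(8, 3, 227, 227).astype(np.float32)  # placeholder inputs
for i, frame in enumerate(frames):
    queue.start_async({0: np.expand_dims(frame, 0)}, userdata=i)
queue.wait_all()&lt;/LI-CODE&gt;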
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Wan&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 01 Aug 2022 04:30:16 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1405016#M28045</guid>
      <dc:creator>Wan_Intel</dc:creator>
      <dc:date>2022-08-01T04:30:16Z</dc:date>
    </item>
    <item>
      <title>Re: OpenVINO 2022 - using multiple NCS2 asynchronously</title>
      <link>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1405061#M28046</link>
      <description>&lt;P&gt;Hello Wan,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you for your quick reply. That is exactly what I needed.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I should have looked inside the OpenVINO repository instead of only searching online; sorry about that!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Cheers,&lt;BR /&gt;Greg&lt;/P&gt;</description>
      <pubDate>Mon, 01 Aug 2022 08:59:43 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1405061#M28046</guid>
      <dc:creator>gaudibert</dc:creator>
      <dc:date>2022-08-01T08:59:43Z</dc:date>
    </item>
    <item>
      <title>Re: OpenVINO 2022 - using multiple NCS2 asynchronously</title>
      <link>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1407790#M28123</link>
      <description>&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;Hi Gaudibert,&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;Thanks for your question.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;This thread will no longer be monitored since we have provided the requested information.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;If you need any additional information from Intel, please submit a new question.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;Best regards,&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: Helvetica, sans-serif; font-size: 16px;"&gt;Wan&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 12 Aug 2022 03:08:45 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Openvino-2022-using-multiple-NCS2-asynchrounously/m-p/1407790#M28123</guid>
      <dc:creator>Wan_Intel</dc:creator>
      <dc:date>2022-08-12T03:08:45Z</dc:date>
    </item>
  </channel>
</rss>

