Hello,
It is now possible to use two graph files in V2, so I have a question about that.
Does each graph need its own FIFO, or can they share one? I ask because I have the same image data that should be inferenced first by one graph and then by the other, so I would prefer to send it only once.
Also, is there a mode where the device runs an inference as soon as it finds something in the input FIFO, so that you don't need to call "queueInference" but can just monitor the output FIFO queue?
Thanks,
Best regards
@metalMajor Each distinct graph should have its own FIFO; you can allocate multiple FIFOs on one device. If you're using two different graphs, you'll have to make two inference calls, one for each graph's input FIFO, and you'll get the results back one at a time, since the device can only process one inference at a time.
Instead of calling fifo.write_elem() and then graph.queue_inference(), you can use graph.queue_inference_with_fifo_elem(), which does the FIFO write and the queue_inference in one shot. Example: https://github.com/movidius/ncappzoo/blob/ncsdk2/apps/image-classifier/image-classifier.py#L90
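To make the flow concrete without needing an attached NCS device, here is a plain-Python model of the pattern above: the device does not start an inference on its own when data lands in the input FIFO, so you still queue one inference per graph, and the same image must be written into each graph's own input FIFO. This is just an illustrative sketch using `queue.Queue` and a worker thread per "graph"; none of the names below are part of the mvnc API.

```python
# Illustrative model only: each "graph" gets its own input/output FIFO
# pair, and a worker consumes one item per queued request. The stand-in
# "inference" simply scales the input tensor.
import queue
import threading

def make_graph_worker(name, scale):
    """Create a fake graph with its own input and output FIFOs."""
    fifo_in, fifo_out = queue.Queue(), queue.Queue()

    def worker():
        while True:
            tensor = fifo_in.get()
            if tensor is None:                       # sentinel: stop worker
                break
            result = [x * scale for x in tensor]     # stand-in "inference"
            fifo_out.put((name, result))

    threading.Thread(target=worker, daemon=True).start()
    return fifo_in, fifo_out

image = [1, 2, 3]                 # the same input data for both graphs
g1_in, g1_out = make_graph_worker("graph1", 10)
g2_in, g2_out = make_graph_worker("graph2", 100)

# The image still has to be written once into EACH graph's input FIFO:
for fin in (g1_in, g2_in):
    fin.put(image)

print(g1_out.get())   # ('graph1', [10, 20, 30])
print(g2_out.get())   # ('graph2', [100, 200, 300])
```

The real code follows the same shape: allocate a FIFO pair per graph (e.g. via graph.allocate_with_fifos()), write the tensor and queue the inference for each graph, then read each graph's output FIFO.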