Hi,
I am developing a videoconferencing application and I would like to understand how many decode sessions can be processed simultaneously. A call to Decoder->QueryIOSurf() returns NumFrameMin = NumFrameSuggested = 17 for a 640x480 frame (also 7 for 1280x720 and 9 for 320x240). The same values are passed to the Alloc() callback when Decoder->Init() is called, and any different value I try to pass in Request.NumFrameSuggested causes Decoder->Init() to fail. So one decoder session requires 17 * 640*480*1.5 = 7,833,600 bytes, and 10 simultaneous sessions would require about 78 MB. But I have only 64 MB of video memory on my PC. Is this a real limitation of 8 sessions in my case? What happens if I try to allocate more video memory than is available? Is there any way to reduce the number of frames allocated per decoder session?
Hi Andrew,
QueryIOSurf does return the number of surfaces required for decode. The number of surfaces required depends on many things, such as the profile, the GOP pattern, AsyncDepth, etc.
Unfortunately, there is no way to use fewer surfaces than QueryIOSurf suggests. If you need more sessions, you will have to make sure your machine has enough memory to support them.
Regards,
Petter