2 Replies
Hi Q.S,
One simple thing to consider:
- On each OS there are multiple software layers involved before decoding/encoding actually happens.
- Those layers are not equivalent between the two OSes.
- For example: D3D11 or D3D9 on Windows, versus libva on Linux.
Regards,
Peter.
Replying to SEUNGHYUK P. (Intel): so the performance issue is related to libva?

CPU: i7 6700
Command:
sample_multi_transcode -i::h264 a.h264 -o::h264 b.h264
Ubuntu 16.04.3 x64:
*** session 0 PASSED (MFX_ERR_NONE) 554.268 sec, 92583 frames
Windows 7 x64:
*** session 0 PASSED (MFX_ERR_NONE) 450.646 sec, 92583 frames
Should I change some parameters?
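For context, the raw session timings above can be converted into throughput. A minimal sketch; the frame count and durations are taken directly from the two log lines in this post:

```python
# Convert the reported sample_multi_transcode session timings
# into frames per second. All numbers come from the logs above.
frames = 92583

linux_sec = 554.268    # Ubuntu 16.04.3 x64
windows_sec = 450.646  # Windows 7 x64

linux_fps = frames / linux_sec
windows_fps = frames / windows_sec

print(f"Linux:   {linux_fps:.1f} fps")
print(f"Windows: {windows_fps:.1f} fps")
print(f"Windows is {(windows_fps / linux_fps - 1) * 100:.0f}% faster")
```

On these runs Windows comes out roughly 23% faster on the same clip, which is consistent with the reply above that the OS-level stacks (D3D11/D3D9 on Windows vs. libva on Linux) are not equivalent.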