I downloaded the Media SDK 2013. It comes with a precompiled app called the Intel Media SDK Browser, through which you can run sample_encode.exe.
This encodes test_stream.yuv into res.264, which plays fine in the sample media player (the GUI player app) launched through the SDK browser. No problems here: it shows a low-resolution river for a few seconds.
Now, import the sample_encode solution into VS2010, compile, run it with the exact same command-line arguments the example used, and get a .264 file out. Play that file. It is smeared garbage.
I am running Windows 7 64-bit on an i7-3820QM, so it is working in SW only.
Anybody else see this or know how to fix it?
Hi,
Building the sample code unmodified should produce a binary equivalent to 'sample_encode.exe'.
One way to debug what may be different is to use 'tracer.exe', found in the <installdir>\tools\mediasdk_tracer folder, to capture logs from the two applications and compare them.
Also, I do not understand your comment about "SW only". The application defaults to using software only. To use the hardware capabilities of your platform, you can add the "-hw" option and observe the code path this affects.
-Tony
You were right; it was my mistake. sample_encode seems to handle only very specific output dimensions correctly. The dimensions in the precanned package are 176x96; if I switch that to 176x90, bad things happen. Is that a limitation of the program, or something fundamental I don't understand about the formats?
Hi, it depends on how you 'switch that'.
The surface data is not stored as complete/packed pixels. The data is 'planar': it contains the 'luminance' (Y) for the full surface, followed by the 'chrominance' (U and V). In this case the .yuv file has a 'Y' plane of 176x96 pixels followed by the U and V planes, so you cannot read the same data at a different size.
Also, video encoding imposes some limits, and the mfxFrameInfo structure needs to be filled out with correct/usable alignments. The actual 'video' content may be smaller than the coded surface by using the 'crop' fields; for example, 1080p content must be encoded with a coded height of 1088. From the documentation on Width/Height:
"Coded width and height of the video frame in pixels; Width must be a multiple of 16. Height must be a multiple of 16 for progressive frame sequence and a multiple of 32 otherwise."
-Tony