Beginner

VPP - Composition - Windows

Hello,

I am trying to use the provided sample_vpp.exe with the par file below. It fails with "Return on error: error code 1,  src\sample_vpp.cpp      380". I added some prints inside the sample code and found that SyncOperation is failing with error code -17 (MFX_ERR_DEVICE_FAILED = -17,  /* device operation failure */). Is RGBA supported as the second surface on Windows? I have API version 1.16 (hw) and 1.17 (sw). Also, is per-pixel alpha supported on this platform and API version?

stream=camera_capture.nv12
width=640
height=480
cropx=0
cropy=0
cropw=640
croph=480
fourcc=nv12
dstx=0
dsty=0
dstw=640
dsth=480

stream=camera_capture.rgba
width=640
height=480
cropx=0
cropy=0
cropw=640
croph=480
dstx=0
dsty=0
dstw=160
dsth=120
fourcc=rgb4
GlobalAlphaEnable=0
GlobalAlpha=128

LumaKeyEnable=0
LumaKeyMin=250
LumaKeyMax=255

3 Replies
The sample_vpp par files can be tricky. There is a readme-vpp PDF that comes with the VPP sample and has more details and examples than are listed when you run the tool without parameters.

Alpha blending/luma keying is supported on NV12 and RGB4 surfaces. However, the primary stream, overlay stream, and output shouldn't mix color formats. For best results, convert all inputs to the same format before starting composition; this can be a separate VPP pass if you like. Using separate VPPs, instead of combining steps whose order may be ambiguous, is often a good approach.
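For instance, one way to follow the same-format advice for the par file in the question would be to pre-convert the RGBA overlay to NV12 first and then compose NV12 over NV12. A sketch (the overlay file name is hypothetical, and the alpha values are just examples):

stream=camera_capture.nv12
width=640
height=480
cropx=0
cropy=0
cropw=640
croph=480
fourcc=nv12
dstx=0
dsty=0
dstw=640
dsth=480

stream=overlay_640x480.nv12
width=640
height=480
cropx=0
cropy=0
cropw=640
croph=480
fourcc=nv12
dstx=0
dsty=0
dstw=160
dsth=120
GlobalAlphaEnable=1
GlobalAlpha=128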

Here is how I set up my experiments on a BDW (Broadwell) Windows machine with the latest Media SDK and samples:

 ffmpeg -s 1920x1080 -i park_joy_1080p50.yuv -s 480x320 -pix_fmt bgra input_480x320.rgb
 ffmpeg -s 1920x1080 -i park_joy_1080p50.yuv -s 720x480 -pix_fmt bgra input_720x480.rgb

(ffmpeg bgra pix_fmt corresponds to Media SDK's rgb4.)
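To make the byte layouts concrete: ffmpeg's bgra (Media SDK's RGB4) packs each pixel as 4 bytes in B, G, R, A order, while NV12 is a full-resolution Y plane followed by an interleaved half-resolution UV plane. A small pure-Python sketch of an NV12-to-BGRA conversion (integer BT.601 limited-range math; the function name is mine, not part of any SDK):

```python
def _clip(x):
    """Clamp to the 0..255 byte range."""
    return max(0, min(255, x))

def nv12_to_bgra(data, width, height):
    """Convert one raw NV12 frame to packed BGRA bytes (alpha forced to 255).

    NV12 layout: width*height Y bytes, then width*height//2 interleaved
    U,V bytes (one U,V pair per 2x2 block of luma samples).
    """
    y_plane = data[:width * height]
    uv_plane = data[width * height:]
    out = bytearray()
    for row in range(height):
        for col in range(width):
            y = y_plane[row * width + col]
            uv = (row // 2) * width + (col // 2) * 2  # offset of the U byte
            u, v = uv_plane[uv], uv_plane[uv + 1]
            # Integer BT.601 limited-range YUV -> RGB (a common fixed-point form)
            c, d, e = y - 16, u - 128, v - 128
            r = _clip((298 * c + 409 * e + 128) >> 8)
            g = _clip((298 * c - 100 * d - 208 * e + 128) >> 8)
            b = _clip((298 * c + 516 * d + 128) >> 8)
            out += bytes((b, g, r, 255))  # bgra / RGB4 byte order
    return bytes(out)

# A 2x2 mid-gray NV12 frame: four Y=128 samples plus one (U,V)=(128,128) pair
gray = nv12_to_bgra(bytes([128] * 4 + [128, 128]), 2, 2)
```

This is only meant for generating or inspecting small test frames; for real files, ffmpeg or a VPP color-conversion pass is the right tool.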

Par file:

primarystream=input_720x480.rgb
width=720
height=480
cropx=0
cropy=0
cropw=720
croph=480
dstx=0
dsty=0
dstw=720
dsth=480
fourcc=rgb4

stream=input_480x320.rgb
width=480
height=320
cropx=0
cropy=0
cropw=480
croph=320
dstx=100
dsty=100
dstw=320
dsth=240
fourcc=rgb4
GlobalAlphaEnable=1
GlobalAlpha=128
LumaKeyEnable=1
LumaKeyMin=128
LumaKeyMax=255

 

 

Running the test:

$ sample_vpp -lib hw -dcc nv12 -composite vpp3.par -o out_nv12.yuv
VPP Sample Version 6.0.0.142

Input format    RGB4
Resolution      720x480
Crop X,Y,W,H    0,0,720,480
Frame rate      30.00
PicStruct       progressive
Output format   NV12
Resolution      720x480
Crop X,Y,W,H    0,0,720,480
Frame rate      30.00
PicStruct       progressive

Video Enhancement Algorithms
Denoise         OFF
VideoAnalysis   OFF
ProcAmp         OFF
Detail          OFF
ImgStab         OFF

Memory type     system

MediaSDK impl   hw
MediaSDK ver    1.17

VPP started
Frame number: 500
VPP finished

 

To view output:

ffplay -pix_fmt nv12 -s 720x480 out_nv12.yuv

Beginner

As per the Intel Media SDK Reference Manual v1.17 (page 113, section on mfxExtVPPComposite):

"The only supported combinations of input and output color formats are:
RGB to RGB,
NV12 to NV12,
RGB and NV12 to NV12, for per pixel alpha blending use case."

Per the third option, one should be able to specify one stream as RGB and another as NV12 and get NV12 output. My specific use case: I have an NV12 stream, and I want to overlay content that arrives as RGB4 frames with an alpha channel, then encode the result.

Employee

Hi Ashim,

You are absolutely right: the RGB+NV12=NV12 use case is supported, but only for per-pixel alpha blending. Each RGB input surface can be blended by setting the PixelAlphaEnable value, which I can't see in your parameter list.
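For illustration, the overlay stream section of the par file in the question might then look like this (a sketch: only the alpha-related lines change, and PixelAlphaEnable is the parameter named above):

stream=camera_capture.rgba
width=640
height=480
cropx=0
cropy=0
cropw=640
croph=480
fourcc=rgb4
dstx=0
dsty=0
dstw=160
dsth=120
PixelAlphaEnable=1

with the primary stream left as NV12 and the output color format set to NV12 (e.g. -dcc nv12 on the sample_vpp command line).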

Best wishes,

Anna
