MFXVideoCORE_SyncOperation() keeps returning MFX_WRN_IN_EXECUTION

Diederick_H_

Hi,

I'm using the INDE Media SDK 6.0.0.388 encoder API for a real-time video streaming application. Encoding runs fine for about 15 minutes, but after that, calls to MFXVideoCORE_SyncOperation() keep returning MFX_WRN_IN_EXECUTION. When I use a wait value of 60000 ms, as in the tutorials (mediasdk-tutorials-0.0.3/simple_3_encode_vmem_async), the call blocks until it times out after 60 seconds. Waiting 60 seconds is of course not usable for a real-time stream, so I've set the wait parameter of MFXVideoCORE_SyncOperation() to 10 ms. With 10 ms, I keep receiving MFX_WRN_IN_EXECUTION.
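
To be explicit, the pattern I'm effectively trying to implement is a bounded poll on the syncpoint (a minimal sketch; the 10 ms per call and 100 ms total budget are just the values I'd like to use, not anything prescribed by the SDK):

    #include "mfxvideo.h"

    /* Minimal sketch of the bounded sync I'm aiming for. `session` and
       `syncpoint` are assumed to be valid; 10 ms per call and a 100 ms
       total budget are illustrative values only. */
    static mfxStatus sync_with_budget(mfxSession session, mfxSyncPoint syncpoint) {
      int budget_ms = 100;
      mfxStatus status = MFX_WRN_IN_EXECUTION;
      while (MFX_WRN_IN_EXECUTION == status && budget_ms > 0) {
        status = MFXVideoCORE_SyncOperation(session, syncpoint, 10);
        budget_ms -= 10;
      }
      return status; /* MFX_ERR_NONE, still MFX_WRN_IN_EXECUTION, or an error */
    }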

My encode function looks like this (I removed some redundant checks):

  int VideoEncoderQuickSync::encode(MediaPacket* pkt) {

    uint32_t size = 0;
    uint8_t* size_ptr = NULL;
    mfxStatus status = MFX_ERR_NONE;
    VideoEncoderQuickSyncTask* task = NULL;
    VideoEncoderQuickSyncSurface* surf = NULL;
    VideoFrameEncoded encoded_frame;
    

    if (0 != getFreeTask(&task)) {
      /*
      When there are no free tasks, we sync. Because we're doing
      a conference video stream we cannot wait forever; 10 ms is
      about the max.
      */
      task = tasks[task_dx];
      status = MFXVideoCORE_SyncOperation(session, task->syncpoint, 10);
         
      if (MFX_ERR_NONE != status) {
        /* Dropping frame; this is where we always get MFX_WRN_IN_EXECUTION. */
        pkt->makeFree();
        return 0;
      }
      else {

        task_dx = (task_dx + 1) % tasks.size();
        
        if (0 != writeEncodedTask(task)) {
          /* Never happens. */
          return -7;
        }

        /*
        Reset the bitstream and syncpoint. When we don't reset the
        bitstream, we get MFX_ERR_NOT_ENOUGH_BUFFER when calling
        MFXVideoENCODE_EncodeFrameAsync().
        */
        task->syncpoint = NULL;
        task->bitstream.DataLength = 0;
        task->bitstream.DataOffset = 0;
        task->bitstream.TimeStamp = 0;
        task->bitstream.DecodeTimeStamp = 0;

        /* At this point we've freed up a task, so getFreeTask() returns
           a free one and we can continue execution.
        */
        task = NULL;
        if (0 != getFreeTask(&task)) {
          SX_ERROR("Even after syncing we couldn't get a free task.");
          return -9;
        }
      }
    }
    
    /*
    Get a free surface. We have our own struct which has a surface and pkt member.
    The surface member is mfxFrameSurface1. The pkt member is a pointer to the 
    raw input data (NV12). We need to keep these together so we know when we 
    can reuse the pkt again. 
    */
    if (0 != getFreeSurface(&surf)) {
      SX_ERROR("Failed to get a free surface.");
      return -8;
    }
    
    if (NULL != surf->pkt) {
      /*
        The free surface may still reference a previously submitted pkt
        with NV12 data; mark it free so the application can reuse it.
      */
      surf->pkt->makeFree();
    }
      
    /*
    Wrap the input pkt into the surface.
    */
    surf->surface->Data.Y = pkt->video_plane[0];
    surf->surface->Data.UV = pkt->video_plane[1];
    surf->surface->Data.TimeStamp = pkt->pts;
    surf->surface->Data.Pitch = pkt->video_width[0]; /* @todo use PitchLow / PitchHigh */
    surf->pkt = pkt;

    for (;;) {

      status = MFXVideoENCODE_EncodeFrameAsync(session, NULL, surf->surface, &task->bitstream, &task->syncpoint);
      
      if (MFX_ERR_NONE < status && !task->syncpoint) {
        if (MFX_WRN_DEVICE_BUSY == status) {
          /* Sleep a bit here ... (never happens) */
          sleep_millis(1);
        }
      }
      else if (MFX_ERR_NONE < status && task->syncpoint) {
        status = MFX_ERR_NONE;
        break;
      }
      else if (MFX_ERR_NOT_ENOUGH_BUFFER == status) {
        SX_ERROR("Bitstream buffer size is insufficient.");
        break;
      }
      else {
        break;
      }
    }
    return 0;
  }
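
For context, here is a sketch of the helper structs as implied by the members used in encode() above (the actual definitions may differ):

    /* Sketch of the helper structs implied by the code above; the actual
       definitions may differ. MediaPacket is our own wrapper around a raw
       NV12 input buffer. */
    struct VideoEncoderQuickSyncTask {
      mfxBitstream bitstream;      /* receives the encoded data        */
      mfxSyncPoint syncpoint;      /* NULL while the task slot is free */
    };

    struct VideoEncoderQuickSyncSurface {
      mfxFrameSurface1* surface;   /* wraps the NV12 planes of pkt     */
      MediaPacket*      pkt;       /* owner of the raw input data; marked
                                      free (makeFree) before reuse     */
    };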

What am I doing wrong, and how can I solve this?

Thanks!

 

Diederick_H_

I've got a small update on this issue. I'm pretty sure it is caused by either a misunderstanding of the encoder parameters on my side or a bug in the encoder itself. When I change the encoder parameters to the "low latency" parameters used in the simple_encode_vmem_lowlat.cpp example, everything works great; I've tested streaming for 2 hours without any issues.

I'll paste the working settings below, but I would love to hear some feedback on this; I'm curious what was causing it.

 

    mfxVideoParam mfxEncParams;
    memset(&mfxEncParams, 0, sizeof(mfxEncParams));
    mfxEncParams.mfx.CodecId = MFX_CODEC_AVC;
    mfxEncParams.mfx.TargetUsage = MFX_TARGETUSAGE_BALANCED;
    mfxEncParams.mfx.TargetKbps = options.values.Bitrate;
    mfxEncParams.mfx.RateControlMethod = MFX_RATECONTROL_VBR;
    mfxEncParams.mfx.FrameInfo.FrameRateExtN = options.values.FrameRateN;
    mfxEncParams.mfx.FrameInfo.FrameRateExtD = options.values.FrameRateD;
    mfxEncParams.mfx.FrameInfo.FourCC = MFX_FOURCC_NV12;
    mfxEncParams.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    mfxEncParams.mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_PROGRESSIVE;
    mfxEncParams.mfx.FrameInfo.CropX = 0;
    mfxEncParams.mfx.FrameInfo.CropY = 0;
    mfxEncParams.mfx.FrameInfo.CropW = options.values.Width;
    mfxEncParams.mfx.FrameInfo.CropH = options.values.Height;
    mfxEncParams.mfx.FrameInfo.Width = MSDK_ALIGN16(options.values.Width);
    mfxEncParams.mfx.FrameInfo.Height =
        (MFX_PICSTRUCT_PROGRESSIVE == mfxEncParams.mfx.FrameInfo.PicStruct) ?
        MSDK_ALIGN16(options.values.Height) :
        MSDK_ALIGN32(options.values.Height);

    mfxEncParams.IOPattern = MFX_IOPATTERN_IN_VIDEO_MEMORY;

    // Configuration for low latency
    mfxEncParams.AsyncDepth = 1; 
    mfxEncParams.mfx.GopRefDist = 1;   

    mfxExtCodingOption extendedCodingOptions;
    memset(&extendedCodingOptions, 0, sizeof(extendedCodingOptions));
    extendedCodingOptions.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
    extendedCodingOptions.Header.BufferSz = sizeof(extendedCodingOptions);
    extendedCodingOptions.MaxDecFrameBuffering = 1;   
    mfxExtBuffer* extendedBuffers[1];
    extendedBuffers[0] = (mfxExtBuffer*)&extendedCodingOptions;
    mfxEncParams.ExtParam = extendedBuffers;
    mfxEncParams.NumExtParam = 1;
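
For completeness, these parameters then go to the encoder in the usual way (a minimal sketch; session creation and full error handling are omitted):

    // Sketch: validate and apply mfxEncParams once it is filled in.
    // Warnings from Query are positive status codes; hard errors are negative.
    mfxStatus sts = MFXVideoENCODE_Query(session, &mfxEncParams, &mfxEncParams);
    if (sts >= MFX_ERR_NONE) {
      sts = MFXVideoENCODE_Init(session, &mfxEncParams);
    }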

 

These are the parameters I changed to make encoding work for long periods:

    mfxEncParams.AsyncDepth = 1;    
    mfxEncParams.mfx.GopRefDist = 1;
    extendedCodingOptions.MaxDecFrameBuffering = 1;
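
One way to confirm that these values actually took effect (Init may adjust incompatible parameters and return a warning) is to read back the working parameter set, for example:

    /* Sketch: read back the parameters the encoder is actually using after
       MFXVideoENCODE_Init(), to confirm AsyncDepth and GopRefDist were
       accepted as set. */
    mfxVideoParam actualParams;
    memset(&actualParams, 0, sizeof(actualParams));
    mfxStatus sts = MFXVideoENCODE_GetVideoParam(session, &actualParams);
    if (MFX_ERR_NONE == sts) {
      printf("AsyncDepth=%u GopRefDist=%u\n",
             (unsigned)actualParams.AsyncDepth,
             (unsigned)actualParams.mfx.GopRefDist);
    }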

Any ideas what may have caused the issue I described in my previous post?

Thanks

D
