
We are observing error messages as mentioned in the log file. We are using the below code to get colour frames from a RealSense camera.

HShet1
Beginner

Code to get colour frames:

public static IEnumerable<System.Drawing.Image> realSenseColorFrames()
{
    log.Info("StreamApi::realSenseColorFrames Getting RealSense colour frames!!");

    Align align = new Align(Intel.RealSense.Stream.Color);
    try
    {
        while (true)
        {
            // Wait up to 10000 ms for a coherent set of frames from the pipeline.
            using (var frames = pipe.WaitForFrames(10000))
            {
                log.Info("StreamApi::WaitForFrames!!");

                // Align the other streams to the colour stream; the aligned
                // frameset is disposed together with the original one.
                var frames2 = align.Process(frames).DisposeWith(frames);

                using (VideoFrame vf = frames2.ColorFrame)
                {
                    Console.WriteLine("Inside bitmap");
                    log.Info("StreamApi::VideoFrame!!");
                    Bitmap bmpfile = FrameToBitmap(vf);
                    yield return bmpfile;
                }
            }
        }
    }
    finally
    {
        log.Debug("StreamApi::realSenseColorFrames finally block called ");
        align.Dispose();   // release the Align processing block when the enumerator is disposed
    }
}
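
(FrameToBitmap is not shown above; a minimal sketch of what such a helper could look like, assuming the colour stream is configured as a 24-bit RGB/BGR format; adjust the PixelFormat, and the channel order if needed, to match the actual stream profile.)

    private static Bitmap FrameToBitmap(VideoFrame frame)
    {
        // Wrap the native frame buffer (no copy is made here) ...
        using (var wrapper = new Bitmap(frame.Width, frame.Height, frame.Stride,
                                        System.Drawing.Imaging.PixelFormat.Format24bppRgb,
                                        frame.Data))
        {
            // ... then clone it, so the returned Bitmap no longer points at
            // memory that is freed when the caller disposes the VideoFrame.
            return new Bitmap(wrapper);
        }
    }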

MartyG
Honored Contributor III

Focusing on the error message 'User didn't release frame resource' in your log, this seems to be generated when the number of published frames exceeds the maximum frame queue size.

 

Typically, the brackets after WaitForFrames do not contain a value and are written just as WaitForFrames() - what happens if you write it like this without '10000' in the bracket, please?
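
For reference, that change would look like this; librealsense's default timeout for WaitForFrames is 5000 ms, so the call still blocks for up to five seconds before throwing if no frames arrive.

    // Rely on the default timeout instead of passing 10000 explicitly.
    using (var frames = pipe.WaitForFrames())
    {
        // ... same processing as in the original loop ...
    }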

HShet1
Beginner

Thanks for the response. What is the default maximum frame queue size?

 

"Typically, the brackets after WaitForFrames do not contain a value and are written just as WaitForFrames() - what happens if you write it like this without '10000' in the bracket, please?"

Harsha: Since the default timeout is 5000 ms, we wanted to wait up to 10000 ms in case frames didn't arrive within the default. Will try with WaitForFrames() and let you know the result.

 

MartyG
Honored Contributor III

I should provide the disclaimer that RealSense programming is not one of my specialist areas and I am a generalist in this regard, so others can offer more specialist advice about coding.

 

The frame buffering documentation suggests setting the queue size to '1' if using one stream, and to '2' if using two streams. These are the settings for optimal performance though, and the script in the link below sets frame queue capacity to '10'.

 

https://github.com/IntelRealSense/librealsense/wiki/Frame-Buffering-Management-in-RealSense-SDK-2.0#latency-vs-performance

HShet1
Beginner

Thanks for the reply, Marty.

Since I am using the pipeline I am not using a frame queue directly, and in the link provided above I found this:

"The pipeline is a synchronized platform, it contains a frame queue that buffers the received frames."

 

"The pipeline does not expose an API for setting the frame queue capacity, if the user wish to set the capacity to a value that is more compatible for his needs it can be done by changing the source code:"

MartyG
Honored Contributor III

Reading back over your script at the start of this conversation, could you please explain why you are using Align if you are only using the color stream? It does not seem as though there are any other streams (such as depth) being generated by the script to align the color with.
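
For comparison, if only the color stream were needed, the body of the while loop could shrink to something like this (no Align block created or applied), reusing the same pipe and FrameToBitmap from your code:

    // Color-only variant: take the color frame straight from the frameset.
    using (var frames = pipe.WaitForFrames(10000))
    using (VideoFrame vf = frames.ColorFrame)
    {
        yield return FrameToBitmap(vf);
    }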

 

Here's an example of how someone else approached the problem of writing the color frames to bmp.

 

https://github.com/IntelRealSense/librealsense/issues/3026

 

Otherwise, the best course of action may be to post a help request on the RealSense GitHub by visiting the link below and clicking the New Issue button. Sorry I couldn't be of more help in this particular case.

 

https://github.com/IntelRealSense/librealsense/issues

HShet1
Beginner

"Could you please explain why you are using Align if you are only using the color stream?"

 

Answer: I am also using depth frames to get the distance, but I am streaming only the colour frames over HTTP, and the streaming of colour frames gets stuck after some time.
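
(For reference, a sketch of that combination in the same loop, keeping the Align usage from the code above: the depth frame aligned to colour is read for the distance, and only the colour bitmap is yielded. The pixel coordinates are placeholders.)

    using (var frames = pipe.WaitForFrames(10000))
    {
        // Align depth to the colour stream so GetDistance(x, y) is in colour pixel coordinates.
        var aligned = align.Process(frames).DisposeWith(frames);

        using (VideoFrame vf = aligned.ColorFrame)
        using (DepthFrame df = aligned.DepthFrame)
        {
            // Placeholder: distance (in metres) at the centre of the image.
            float metres = df.GetDistance(vf.Width / 2, vf.Height / 2);
            log.Info("StreamApi::distance at image centre = " + metres + " m");

            yield return FrameToBitmap(vf);
        }
    }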

MartyG
Honored Contributor III

Ok, thank you very much for the clarification. As suggested above, seeking specialist coding advice on the GitHub may be the best course of action, as this topic is unfortunately outside of my programming knowledge. Good luck!

HShet1
Beginner

Thank you. Sure, I will raise this issue on the GitHub forum.
