Code to get colour frames:

public static IEnumerable<System.Drawing.Image> realSenseColorFrames()
{
    log.Info("StreamApi::realSenseColorFrames Getting RealSense color frames");
    Align align = new Align(Intel.RealSense.Stream.Color);
    try {
        while (true) {
            using (var frames = pipe.WaitForFrames(10000)) {
                var frames2 = align.Process(frames).DisposeWith(frames);
                using (VideoFrame vf = frames2.ColorFrame) {
                    Bitmap bmpfile = FrameToBitmap(vf);
                    yield return bmpfile;
                }
            }
        }
    } finally {
        log.Debug("StreamApi::realSenseColorFrames finally block called");
    }
}
Focusing on the error message 'User didn't release frame resource' in your log, this seems to be generated when the number of published frames exceeds the maximum frame queue size.
Typically, the brackets after WaitForFrames do not contain a value and are written just as WaitForFrames() - what happens if you write it like this without '10000' in the bracket, please?
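Regarding the 'User didn't release frame resource' message: one possibility (a sketch, not a confirmed diagnosis) is that FrameToBitmap wraps the SDK-owned frame buffer instead of copying it, which forces the frame to be kept alive while the bitmap is in use. A hypothetical version of that helper which copies the pixels out, so the frame can be disposed immediately, might look like this. It assumes the C# wrapper's VideoFrame.CopyTo(IntPtr) overload, a 24-bit BGR color profile, and that the frame stride matches the bitmap stride:

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using Intel.RealSense;

// Hypothetical helper: copies the frame's pixel data into a new Bitmap
// so the RealSense frame can be released as soon as the call returns.
static Bitmap FrameToBitmap(VideoFrame vf)
{
    var bmp = new Bitmap(vf.Width, vf.Height, PixelFormat.Format24bppRgb);
    var data = bmp.LockBits(new Rectangle(0, 0, vf.Width, vf.Height),
                            ImageLockMode.WriteOnly, bmp.PixelFormat);
    try
    {
        // Copy out of the SDK-owned buffer (assumes matching strides).
        vf.CopyTo(data.Scan0);
    }
    finally
    {
        bmp.UnlockBits(data);
    }
    return bmp; // caller now owns a copy independent of the frame
}
```

With a copy like this, the using block around the VideoFrame can dispose it immediately after FrameToBitmap returns, so the frame queue should not fill up with unreleased frames.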
Thanks for the response. What is the default maximum frame queue size?
"Typically, the brackets after WaitForFrames do not contain a value and are written just as WaitForFrames() - what happens if you write it like this without '10000' in the bracket, please?"
Harsha: Since the default timeout is 5000 ms, we wanted to wait for 10000 ms in case frames didn't arrive within the default. Will try with WaitForFrames() and let you know the result.
I should provide the disclaimer that RealSense programming is not one of my specialist areas and I am a generalist in this regard, so others can offer more specialist advice about coding.
The frame buffering documentation suggests setting the queue size to '1' if using one stream, and to '2' if using two streams. These are the settings for optimal performance though, and the script in the link below sets frame queue capacity to '10'.
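For reference, when a standalone frame queue is used in the C# wrapper, its capacity is set in the constructor. A minimal sketch of the recommendation above (assuming the Intel.RealSense FrameQueue API, with the sensor setup elided):

```csharp
using Intel.RealSense;

// Capacity 1 for a single stream ('2' if using two streams, per the
// frame buffering documentation referenced above).
var queue = new FrameQueue(1);

// A sensor opened directly (outside the pipeline) can deliver into it:
// sensor.Open(profile); sensor.Start(queue);

// Drain and dispose frames promptly so the queue never overflows.
if (queue.PollForFrame(out Frame frame))
{
    using (frame)
    {
        // process the frame here
    }
}
```

Note this only applies when managing a FrameQueue directly; as discussed below, the pipeline owns its own internal queue.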
Thanks for the reply Marty.
Since I am using the pipeline, I am not using any frame queue, and as per the link provided above I found this:
"The pipeline is a synchronized platform, it contains a frame queue that buffers the received frames."
"The pipeline does not expose an API for setting the frame queue capacity, if the user wish to set the capacity to a value that is more compatible for his needs it can be done by changing the source code:"
Reading back over your script at the start of this conversation, could you explain why you are using Align if you are only using the color stream? There do not seem to be any other streams (such as depth) being generated by the script to align the color with.
Here's an example of how someone else approached the problem of writing the color frames to bmp.
Otherwise, the best course of action may be to post a help request on the RealSense GitHub by visiting the link below and clicking the New Issue button. Sorry I couldn't be of more help in this particular case.
could you explain please about why you are using Align if you are only using the color stream?
Answer: I am also using depth frames to get distance, but I am streaming only the colour frames over HTTP, and the streaming of colour frames gets stuck after some time.
Ok, thank you very much for the clarification. As suggested above, seeking specialist coding advice on the GitHub may be the best course of action, as this topic is unfortunately outside of my programming knowledge. Good luck!
Thank you. Sure, I will raise this issue on the GitHub forum.