Hello all, I am testing a recently developed dev kit from my workplace. One function of the dev kit is to accept an analog video signal and feed it through a Nios II video pipeline that uses several of the VIP MegaCore IPs before outputting the signal via a DVI port. I am running into some issues with the Deinterlacer IP, however. Both the decoder and the encoder on the dev kit have been tested successfully and show correct outputs.

For testing purposes, I am using the VIP test pattern generator to produce a 4:2:2 YCbCr signal. The pipeline is as follows: test pattern generator --> clipper --> sequencer --> resampler --> csc --> deinterlacer --> clocked video output.

The problem I am seeing is in the control packet output by the deinterlacer (which is configured for motion-adaptive triple buffering, with output frame rate equal to input frame rate, F0 synchronized). The clipper correctly resizes my 720x500 test-pattern image to 640x480, and the control packets passed from clipper to sequencer to resampler to csc all show a 640x480 width/height in my SignalTap waveforms. However, the output control packet of the deinterlacer shows a width/height of 49344x33152, progressive. Does anyone have any input or feedback as to why/how the deinterlacer is generating a control packet with such absurd values?

The video I am seeing on the output is out of sync and has incorrect colors. I am not sure that correcting the deinterlacer control packet will solve both of these issues, but it seems like a good place to start. I'd appreciate any input on the matter. Thanks!!
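For context on why the values can come out so absurd: as I understand the VIP user guide, an Avalon-ST Video control packet carries width and height as four 4-bit nibbles each (MSB first), followed by an interlacing nibble, so a packet that is misaligned or built from stale state decodes to garbage numbers like these. A rough Python sketch of the decode (the helper is hypothetical, just to show the nibble layout as I read the spec, not actual VIP code):

```python
def decode_control(nibbles):
    """Decode the body of an Avalon-ST Video control packet
    (symbols after the type-identifier symbol), assuming the
    layout: width = nibbles 0-3, height = nibbles 4-7,
    interlacing = nibble 8, all MSB-first."""
    width = 0
    for n in nibbles[0:4]:
        width = (width << 4) | (n & 0xF)
    height = 0
    for n in nibbles[4:8]:
        height = (height << 4) | (n & 0xF)
    interlacing = nibbles[8] & 0xF
    return width, height, interlacing

# A well-formed 640x480 progressive control packet body:
# 640 = 0x0280, 480 = 0x01E0, interlacing nibble 0
print(decode_control([0x0, 0x2, 0x8, 0x0, 0x0, 0x1, 0xE, 0x0, 0x0]))
# → (640, 480, 0)
```

With this encoding, even a single dropped or duplicated symbol shifts every later nibble, which is one way sane input dimensions can turn into values like 49344x33152 downstream.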
Hi,

Correcting the deinterlacer control packets will probably not help, because those packets also indicate that the core is not handling the input video as it should. This is just a guess, but I think that if your TPG is set for 720x500 interlaced, then you should set your clipper to 640x240 (the clipper clips the interlaced fields, not the frames). All control packets between the clipper and the deinterlacer should then have height=240.
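To spell out the arithmetic behind that suggestion (plain Python, just illustrating the field-vs-frame point; the dimensions are the ones from this thread):

```python
# With interlaced input, each Avalon-ST Video packet carries one
# field, whose height is half the full frame height. The clipper
# therefore sees (and must be configured for) field dimensions.
target_w, target_h = 640, 480   # desired progressive output frame

field_h = target_h // 2         # each interlaced field is half-height
clipper_setting = (target_w, field_h)
print(clipper_setting)          # → (640, 240)

# The deinterlacer weaves/interpolates two 640x240 fields back into
# one 640x480 progressive frame, so only its *output* control packet
# should report height 480.
```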