I'm having trouble integrating an HD SDI transceiver into my existing Stratix IV GX video design. I've ended up stripping it down to a single video path and integrating it into the AN600 design to debug the issue, but I'm still unable to get valid video through the system.

My SOPC system is as follows: CVI -> CRS -> CSC -> Deint (Bob, no buffering) -> CPR (parallel -> serial) -> Clipper (no clipping) -> Scaler (1920x1080 to 640x480) -> VFB (triple buffer) -> CPR (serial -> parallel) -> CVO

The conversion from parallel to serial and vice versa is an artifact of the pre-existing design. The clipper is currently software configurable but is not clipping any of the frame. I have the AN600 Rx-only transceiver driving the CVI, and the Tx-only transceiver with the test pattern generator hard-coded for HD SDI (tx_std = "01").

Using SignalSpy, I can see the data port on the Tx and Rx sides and everything looks good. I'm seeing valid Vsync, Hsync and Field flags, and Rx_status is 0x01D, indicating it has achieved PLL, alignment, TRS and frame lock.

Looking at the CVI with SignalSpy, I'm seeing the overflow flag set constantly, even though the Vsync, Hsync and Field flags look valid. The active_line_count_f0 is 0x0C48 and active_line_count_f1 is 0x0439?? The total_line_count_f0 is 0x0465 and total_line_count_f1 is 0x0467. total_sample_count is 0x1131?? The CVI internal locked flag periodically deasserts for a full video line period. When I configured the Clocked Video Input module, I used the SDI 1080i60 preset.

Does anyone have any input on what could be wrong here? Any advice on what to look at?
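As a sanity check on those register readings, here's a quick sketch (my own decode, not from the CVI documentation; the "expected" column assumes the registers report per-field active lines, per-frame total lines, and total samples per line, per nominal SMPTE 274M 1080i timing):

```python
# Decode the CVI status registers read via SignalSpy and compare them
# against nominal 1080i60 (SMPTE 274M) timing. The register semantics
# assumed here (per-field / per-frame / per-line) are my interpretation.
readings = {
    "active_line_count_f0": 0x0C48,  # -> 3144
    "active_line_count_f1": 0x0439,  # -> 1081
    "total_line_count_f0":  0x0465,  # -> 1125
    "total_line_count_f1":  0x0467,  # -> 1127
    "total_sample_count":   0x1131,  # -> 4401
}

nominal_1080i = {
    "active_line_count_f0": 540,     # active lines per field
    "active_line_count_f1": 540,
    "total_line_count_f0":  1125,    # total lines per frame
    "total_line_count_f1":  1125,
    "total_sample_count":   2200,    # total samples per line at 74.25 MHz
}

for name, value in readings.items():
    print(f"{name}: read {value}, nominal ~{nominal_1080i[name]}")
```

Interestingly, under that reading the total line counts (1125/1127) are roughly on target, while total_sample_count (4401) is almost exactly double the nominal 2200 samples per line.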
I would guess that it's the parallel/serial conversions that are causing the problem. 1080i input @74.xx MHz, de-interlaced to 148.xx MHz, then you are re-sequencing 3 planes in parallel to serial? How fast is your SOPC clock?
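As a rough back-of-envelope check of my own (assuming only active pixels cross the Avalon-ST interfaces, one sample per clock after the parallel-to-serial CPR), the serialized post-deinterlace stream alone would need well over 150 MHz:

```python
# Back-of-envelope Avalon-ST rate estimate for the serialized,
# bob-deinterlaced stream. Assumes one color sample per clock beat
# after the parallel->serial CPR, active pixels only.
width, height, fps = 1920, 1080, 30      # 1080i60 input (30 frames/s)
deint_fps = fps * 2                      # bob deinterlacing doubles frame rate
planes = 3                               # serialized: one plane per beat

active_pixel_rate = width * height * deint_fps      # pixels/s after deint
serial_beat_rate = active_pixel_rate * planes       # beats/s on Avalon-ST

print(f"pixel rate after deint: {active_pixel_rate / 1e6:.1f} Mpix/s")
print(f"serialized beat rate:   {serial_beat_rate / 1e6:.1f} Mbeats/s")
# ~373 Mbeats/s on average -- far beyond a 150 MHz core clock, even
# before arbitration overhead or burstiness during blanking.
```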
Yeah, that was my suspicion as well. My internal clock is running at 150 MHz. I pulled out the CPR blocks and reconfigured the other blocks for parallel color planes. It's a step in the right direction: now I'm seeing what appears to be three sets of color bars across the screen in a scrolling zig-zag pattern.

The CVI no longer loses lock, but it still indicates overflow. The Avalon-ST ready signals between the blocks cycle between asserted and deasserted at a very regular frequency; they look like a clock with a period equal to the incoming video line rate. It still appears to be a backpressure issue.
I should point out that I had originally started development with a TPG inserted in the system in place of the CVI, until we received an HD SDI camera. That worked fine, but I assume the TPG responded to the backpressure without overflowing like the CVI does, so the effective frame rate was likely lower than 30 fps. I have not confirmed this in SignalSpy, though.

I attempted to simulate the design, but the transceiver models apparently don't simulate the dynamic reconfiguration properly. I see the dynamic reconfiguration complete, but as soon as the transceiver requests reconfiguration, the PLL loses lock and the recovered clock stops transitioning (as expected). However, after reconfiguration has completed, the transceiver fails to lock to the incoming signal and the recovered clock never starts back up. I submitted a service request, but the response was: "i regret that the reference design is just used to show the capability of the products. we don't provide the support if customer want to run the rtl simulation in modelsim.
i hope you could understand.
thank you and have a nice day." Seriously? :confused:
Sounds like you're making some progress. 150 MHz might be cutting it a bit too close, considering the overhead of the Avalon bus; I'd try upping it if possible.

Also, how big is the FIFO in the CVI? It might not be big enough to cope with the backpressure from the Scaler. You might also want to experiment with putting frame buffers before and after the scaler (only one of them needs to be triple buffered) if you can afford the bandwidth; I've had success with that in the past. If you're only ever going to scale down to 640x480, you may be able to get away with just a large FIFO in the CVI and a frame buffer downstream of the scaler.
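To put rough numbers on the FIFO question (my own sketch, not from the VIP docs): if the downstream ready signal stalls for roughly one video line at a time, as described earlier in the thread, the CVI keeps receiving samples at the 74.25 MHz SDI rate the whole time:

```python
# Rough worst-case CVI FIFO occupancy while downstream backpressure
# stalls the output. The one-line stall duration is an assumption
# based on the ready signal toggling at the video line rate.
sdi_clock_hz = 74.25e6
samples_per_line = 2200                  # 1080i60, total samples per line

line_period_s = samples_per_line / sdi_clock_hz   # ~29.63 us
stall_s = line_period_s                           # assume a one-line stall

# With the output fully stalled, everything written during the stall
# accumulates in the FIFO.
accumulated = round(sdi_clock_hz * stall_s)
print(f"line period: {line_period_s * 1e6:.2f} us")
print(f"samples accumulated per one-line stall: {accumulated}")
# ~2200 samples per stalled line: two back-to-back stalled lines would
# already overrun a 4000-deep FIFO.
```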
Good timing. I had just come to that conclusion a few minutes before your post and kicked off a build with my core clock increased to 200 MHz.

The CVI FIFO is set to 4000, per the UDX4.1 reference design. The current design only outputs VGA resolution, but the next generation will move to an HD output, likely XGA or 1080i. Thanks for the help!
So that didn't completely fix the problem. The Avalon-ST ready signals now look like a clock with a 70-30 duty cycle, and the video still looks like three sets of color bars with the scrolling zig-zag.

I've moved the frame buffer between the CVI and the CRS to see if I can at least get a full frame buffered. Interestingly, the UDX4.1 reference design uses a 148.5 MHz video core clock with the following video pipeline: CVI -> AFD Extractor -> Switch -> Clip -> Snoop -> MA Deint -> AFD Clip -> Scaler II -> VFB -> CRS -> CSC -> ........ There has to be something else up in my system. I'm going to work on getting a sim up and running.
Nope, not yet.

I had to spend some time getting the simulation up and running properly. It's running now, albeit rather slowly. At ~3 ms in, I'm not seeing any problems: no CVI overflow, and all of the Avalon-ST handshaking looks good. There appears to be plenty of bandwidth in that clock domain, especially now that it's running at 200 MHz. I have another build running in Quartus to test it in hardware again.
So the simulation looked great, and I just couldn't figure out why on earth it wasn't working on the board. I've been having problems with smart recompile not actually "taking" updates, so I blew away the db folder and rebuilt. It worked. <sigh>

On another note, I'm pretty sure serializing the 1080p30 stream is a problem. HD SDI with 2200 samples per line @ 74.25 MHz gives a line period of ~29.63 us and a frame period of 33.33 ms. To maintain that line rate while serializing the data stream, that means 6600 samples per line in that same 29.63 us, which works out to a minimum core clock rate of 222.75 MHz.
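The arithmetic above can be double-checked quickly (this is just my sketch of the same calculation):

```python
# Verify the serialized-stream clock requirement for HD SDI timing:
# 2200 samples/line at 74.25 MHz, tripled by the parallel->serial CPR.
sdi_clock_hz = 74.25e6
samples_per_line = 2200
lines_per_frame = 1125
planes = 3                                           # parallel -> serial CPR

line_period_s = samples_per_line / sdi_clock_hz      # ~29.63 us
frame_period_s = line_period_s * lines_per_frame     # ~33.33 ms
serial_samples_per_line = samples_per_line * planes  # 6600
min_core_clock_hz = serial_samples_per_line / line_period_s

print(f"line period:    {line_period_s * 1e6:.2f} us")
print(f"frame period:   {frame_period_s * 1e3:.2f} ms")
print(f"min core clock: {min_core_clock_hz / 1e6:.2f} MHz")
# 6600 samples / 29.63 us = 222.75 MHz, matching the figure above.
```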
The CVO needs two clocks: one at the system rate, and one at the output clock rate of the video system. If the video clock is too slow, or the CVO buffer is too small, then you will drop frames, and the backpressure signals will look like a clock. If the output clock is not synchronous, or the CVO has the wrong H or V sync timing, then the video will tear. It sounds as if you might have both of these problems.