Valued Contributor III

VIP Newbie

We need to support S-Video for the next 10 years, and graphics cards no longer have S-Video output. I've been tasked with coming up with a DVI-to-S-Video conversion circuit, but I have no prior video conversion experience, so I have a huge learning curve.

 

It looks like the VIP Suite has everything I need (except for an ADV7179-type device at the end of the chain).

 

I'm hung up on the input side: 

Can an Arria II GX take in DVI/TMDS signals directly? I didn't think the signal levels were compatible with the FPGA's I/O, but the CVI core lists DVI as an input option?

 

Thanks for any info
3 Replies
Valued Contributor III

Your project should be pretty straightforward with the VIP Suite. :)

 

Indeed, I don't believe FPGA I/Os are compatible with the actual DVI I/O standard. There is an HSMC card you should take a look at; it has DVI input and output. As I remember, its DVI interface chip breaks the DVI SerDes stream out into a parallel bus for interfacing to the FPGA.
Valued Contributor III

The easiest thing for you to do is to buy an external DVI/HDMI receiver IC (TI and Silicon Image are good sources), then use a parallel interface to the FPGA (a Cyclone family device may be sufficient). Use the VIP for the video conversion, and of course use your video DAC on the output.

 

I've yet to see a direct TMDS solution using FPGA transceivers. The signals would almost certainly have to go through a cable equalizer first.

 

Jake
Valued Contributor III

Hi mikenap, 

 

Yes, you would have to use an HDMI/DVI receiver chip. Even if the FPGA could directly interface to the DVI signals, it would still need to do a lot of decoding, since the DVI stream is much more than just the serialized video data. All HDMI receivers should be able to decode DVI. The output of such a chip is typically a BT1120 (or BT656 for SD) video stream, which can be fed directly to the Clocked Video Input core in the VIP toolkit.
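For reference, the embedded syncs in those BT656/BT1120 streams are the FF 00 00 XY timing-reference sequences; the XY byte packs the field/vertical/horizontal flags plus four protection bits. A quick sketch of the bit layout from the BT.656 spec (plain Python, just for illustration):

```python
def xy_code(f: int, v: int, h: int) -> int:
    """Build the BT.656/BT.1120 timing-reference status byte
    (the XY in the FF 00 00 XY sequence).
    f = field bit, v = 1 during vertical blanking, h = 1 for EAV, 0 for SAV."""
    p3 = v ^ h          # protection bits per BT.656, allow 1-bit error correction
    p2 = f ^ h
    p1 = f ^ v
    p0 = f ^ v ^ h
    return (0x80 | (f << 6) | (v << 5) | (h << 4)
            | (p3 << 3) | (p2 << 2) | (p1 << 1) | p0)

# Field 1, active video: SAV = 0x80, EAV = 0x9D
```

These are the codes the Clocked Video Input core locks onto when configured for embedded syncs.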

 

The FPGA would have to convert the BT1120 stream to an S-Video compatible stream. Afaik (I could be wrong), S-Video is only defined for PAL/NTSC standard-definition video (i.e. 4:2:2 at a 13.5/27 MHz pixel clock), while DVI sources often run at much higher resolutions and may use other frame rates as well. Converting the resolution can be done with a scaler from the VIP, but frame rate conversion is trickier. A quick-and-dirty approach is to use the frame buffer with triple buffering to drop/repeat frames as required. This may or may not be acceptable in your application, since it can lead to "jerkiness".
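To illustrate the drop/repeat idea (a rough software sketch of the effect, not the actual Frame Buffer implementation):

```python
def select_frames(n_out: int, fps_in: float, fps_out: float) -> list:
    """For each output frame instant, pick the latest available input frame.
    A repeated index means a frame is shown twice; a skipped index means a
    frame is dropped -- roughly what triple buffering does for you."""
    return [int(k * fps_in / fps_out) for k in range(n_out)]

# 60 Hz DVI source down to a 30 Hz SD output: every second frame is dropped
# select_frames(4, 60, 30) -> [0, 2, 4, 6]
```

The irregular repeat/drop pattern you get for non-integer ratios (e.g. 24 in, 30 out) is exactly where the "jerkiness" comes from.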

 

On the output you need a video encoder which takes a BT656 stream and converts it to analog S-Video signals. There are many from AD/TI and others. The Clocked Video Output core provides a BT656 / BT1120 compatible output stream (with external or embedded syncs). 
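As a sanity check on the rates involved on that encoder interface (standard 525-line component timing; numbers from the BT.601/BT.656 specs):

```python
# 525/60 (NTSC-rate) component timing: 858 luma samples per line,
# 525 lines per frame, at the 30/1.001 Hz frame rate.
total_samples = 858
total_lines = 525
frame_rate = 30000 / 1001                              # ~29.97 Hz

luma_rate = total_samples * total_lines * frame_rate   # 13.5 MHz pixel clock
interface_rate = 2 * luma_rate                         # 27 MHz Cb/Y/Cr/Y mux

print(luma_rate, interface_rate)
```

So an 8-bit BT656 bus into the video encoder runs at 27 MHz, which is comfortable for any current FPGA I/O.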

 

So, at least three chips then. 

 

Regards, 

Niki