
source synchronous interface or virtual clock?

Altera_Forum
Honored Contributor II

My FPGA is connected to an ADC. There are three interface signals between them: SCLK and DOUT, which are outputs from the FPGA, and DIN, which is an input to the FPGA driven by the ADC.

 

SCLK is generated by dividing the FPGA's system clock by 4.

 

(I do not use SCLK directly as a clock to sample DIN; the logic in the FPGA is all synchronized to the system clock.)

 

How do I constrain these three signals?  

 

Shall I use the source-synchronous interface style? That is, use create_generated_clock on SCLK, and specify an output delay on DOUT with respect to SCLK? In that case, how do I constrain the input delay on DIN if I want to take the board delay on SCLK into account?

 

Or, shall I create a virtual clock with the same frequency as SCLK, and constrain input/output delay on DIN/DOUT with respect to this virtual clock? 

 

Thank you very much for your advice!
Altera_Forum
Honored Contributor II

The path to SCLK is relevant on both sides, so you want to constrain the output in source-synchronous style. For the input, you also use SCLK as the set_input_delay -clock option, where -max and -min are the longest and shortest round-trip delays from the time SCLK leaves the FPGA to the data coming back.
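
For example, here is a rough sketch only, with placeholder port names and delay values; plug in your own board numbers and the converter's data-sheet timing:

# SCLK output port, divided down by 4 from the system clock (placeholder source)
create_generated_clock -name sclk_out -source [get_ports sys_clk] -divide_by 4 [get_ports SCLK]

# data the FPGA launches toward the ADC, referenced to the forwarded clock
set_output_delay -clock sclk_out -max 10.0 [get_ports DOUT]
set_output_delay -clock sclk_out -min -2.0 [get_ports DOUT]

# data coming back from the ADC: -max/-min are the longest/shortest round trip
# (board delay out + ADC clock-to-out + board delay back)
set_input_delay -clock sclk_out -max 25.0 [get_ports DIN]
set_input_delay -clock sclk_out -min 3.0 [get_ports DIN]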

What is the clock rate? I've never heard of an ADC not sending a clock with its data. I assume it is quite slow, as that's not a very fast way to make an interface.
Altera_Forum
Honored Contributor II

Thank you for the reply, Rysc. 

 

Yes, the max clock rate of the ADC is only 20 MHz (it is the AD7928 from Analog Devices). I assume that in this case, using the source-synchronous style or a virtual clock will not make a big difference in the timing analysis results, but the source-synchronous style is more correct in concept. Am I right in saying so? 

 

I compared the timing analysis results of the two methods. Under the same constraints, TimeQuest seems to estimate a smaller clock uncertainty on SCLK with the source-synchronous style than with the virtual clock.
Altera_Forum
Honored Contributor II

At 20 MHz, you can do it either way and close timing. That being said, one is correct and one isn't, depending on how your design is laid out. Since you're sending a clock to the ADC and getting back data, it is source-synchronous, and the delay for the internal clock to get out of the FPGA is used in all the calculations. If the ADC were being clocked by an external clock, then a virtual clock would be the correct method, since the delay to get the clock off chip would have no bearing on timing analysis. You basically want the constraints to represent what's occurring in hardware as closely as you can.
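
Just to contrast with the other case (not your design): if the converter were running from its own external oscillator, you'd describe that oscillator as a virtual clock and reference the I/O delays to it. A minimal sketch, with made-up names and numbers:

# virtual clock: no target port, it only models the external 20 MHz oscillator
create_clock -name adc_ext_clk -period 50.000
set_input_delay -clock adc_ext_clk -max 30.0 [get_ports DIN]
set_input_delay -clock adc_ext_clk -min 5.0 [get_ports DIN]

In that situation no clock-output delay from the FPGA belongs in the analysis, which is exactly what a virtual clock gives you.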

Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

Since you're sending a clock to the ADC and getting back data, it is source-synchronous, 

--- Quote End ---  

 

I thought 'source synchronous' meant that the data and the clock are travelling in the same direction, i.e. one receives the sending clock along with the sent data?
Altera_Forum
Honored Contributor II

Josyb, you are right about the definition of "source synchronous", but here Rysc is just referring to my application, where the FPGA sends SCLK and DOUT to the ADC, and the ADC uses SCLK as its clock to send its data DIN back to the FPGA. SCLK and DOUT form the "source synchronous" interface. 

 

Rysc, thank you for clarifying the concept for me.  

 

So I think that in my application, I need to constrain the ADC interface with the following (a rough .sdc sketch is below): 

 

(1) use create_generated_clock to constrain SCLK  

(2) set_output_delay to DOUT with respect to SCLK 

(3) use create_generated_clock to create a virtual clock which is generated from SCLK 

(4) use set_clock_latency on this virtual clock to estimate board delay/skew of SCLK as it reaches ADC 

(5) set_input_delay to DIN with respect to this virtual clock 
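
In .sdc terms, something like this is what I have in mind (port, clock, and delay values are only placeholders):

# (1) generated clock on the SCLK output port
create_generated_clock -name sclk_adc -source [get_ports clk_sys] -divide_by 4 [get_ports SCLK]

# (2) output delay on DOUT relative to SCLK
set_output_delay -clock sclk_adc -max 10.0 [get_ports DOUT]
set_output_delay -clock sclk_adc -min -2.0 [get_ports DOUT]

# (3) virtual clock generated from SCLK (no target port)
create_generated_clock -name sclk_adc_virt -source [get_ports SCLK]

# (4) board delay/skew of SCLK on its way to the ADC
set_clock_latency -source 1.0 [get_clocks sclk_adc_virt]

# (5) input delay on DIN relative to the virtual clock
set_input_delay -clock sclk_adc_virt -max 30.0 [get_ports DIN]
set_input_delay -clock sclk_adc_virt -min 2.0 [get_ports DIN]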

 

Is that right? Sorry for asking about such details. 

 

Thank you all!
Altera_Forum
Honored Contributor II

Close, but: 

(3) You can't do a virtual clock based on a generated clock. I tried the same thing and the .sdc syntax doesn't allow it. So you just use the generated clock on SCLK for your input. 

(4) Since you can't do (3), you can't do this one either. 

(5) Your set_input_delay will have its -clock option based on SCLK. The -max value is the maximum external round-trip delay, i.e. the board delay from the FPGA's SCLK pin to the ADC, plus the max Tco of the ADC, plus the board delay back to the FPGA. The -min value is the sum of the minimums of those same delays. 
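
As a concrete sketch, using the same placeholder names as before (sclk_out is the generated clock on your SCLK output port; the numbers are placeholders, not the AD7928 data-sheet values):

# -max = max board delay out + max ADC Tco + max board delay back
#      = 1.0 + 40.0 + 1.0 = 42.0 ns   (placeholder numbers)
# -min = min board delay out + min ADC Tco + min board delay back
#      = 0.5 + 5.0 + 0.5 = 6.0 ns   (placeholder numbers)
set_input_delay -clock sclk_out -max 42.0 [get_ports DIN]
set_input_delay -clock sclk_out -min 6.0 [get_ports DIN]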

Though your interface is really slow, are you doing anything on the output side to center-align the clock and data, i.e. are you inverting the clock going out or something? (I'm assuming this is single data rate.) Take a look at my source-synchronous document on the alterawiki site, as it talks about different approaches for this.
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

(3) You can't do a virtual clock based on a generated clock. I tried the same thing and the .sdc syntax doesn't allow it. So you just use the generated clock on SCLK for your input. 

--- Quote End ---  

 

 

Really? I used the following constraints and they are accepted: 

 

create_generated_clock -name SCLK_ADC1 -source [get_pins {inst15|altpll_component|auto_generated|pll1|clk[0]}] -divide_by 4 [get_ports {SCLK_ADC1}] 

 

create_generated_clock -name SCLK_ADC1_virt_in -source [get_ports {SCLK_ADC1}]  

 

Here is the Clocks Summary in TimeQuest: 

 

http://www.alteraforum.com/forum/attachment.php?attachmentid=5011&stc=1&d=1320735423
Altera_Forum
Honored Contributor II

Sorry, you can create a virtual generated clock, but I believe you get a warning that it couldn't find a connection between the generated clock and this clock, and so the latency is not included. If you run report_timing -detail full_path, you'll see that the delays to SCLK_ADC1 are not included, so it's not really doing what you want. At that point it's the same as a regular virtual clock: it gets its waveform from the generated clock, but not the delays, which are the important part for a source-synchronous interface.
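
A quick way to check: report one of the DIN input paths in full detail and look at the launch clock section. Assuming the input port is named DIN_ADC1 (substitute your actual port name):

# report one input setup path in full detail (the port name is an assumption)
report_timing -setup -from [get_ports {DIN_ADC1}] -npaths 1 -detail full_path -panel_name {ADC1 input setup}

If the delays from the PLL output to the SCLK_ADC1 port don't show up there, the clock latency isn't being applied.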

Altera_Forum
Honored Contributor II

Indeed, I was wondering why set_clock_latency didn't take effect on my generated virtual clock. Now I see the point. 

 

Thank you, Rysc!  

 

I am reading your source-synchronous wiki document. Good material. It gives me a deeper understanding of how to constrain this kind of interface. Thanks again!