Intel® Quartus® Prime Software
Intel® Quartus® Prime Design Software, Design Entry, Synthesis, Simulation, Verification, Timing Analysis, System Design (Platform Designer, formerly Qsys)

Bidirectional IO timing constraints

Altera_Forum
Honored Contributor II

Hi all, I've created some output delays for a bidirectional data bus, but I wanted to make sure I was getting the input delays set right as well. 

 

max board delay = 6.04 ns 

min board delay = 1.34 ns 

setup = 11 ns 

hold = 6 ns 

min clock delay = 1.65 ns 

max clock delay = 6.35 ns 

 

That gave me the values below: 

 

set_output_delay -clock CPU_O_CLK40 -max 15.39 [get_ports "CPU_IO_DATA[31]"] 

set_output_delay -clock CPU_O_CLK40 -min -11.01 [get_ports "CPU_IO_DATA[31]"] 

 

set_input_delay -clock CPU_O_CLK40 -max 23.39 [get_ports "CPU_IO_DATA[31]"] 

set_input_delay -clock CPU_O_CLK40 -min 8.99 [get_ports "CPU_IO_DATA[31]"] 
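
 

Here's how I arrived at those numbers (a quick Python check; the formulas are my own working, and the input-delay ones in particular are what I'm unsure about):

```python
# Board/datasheet numbers from above, all in ns.
max_board, min_board = 6.04, 1.34   # data trace + buffer delay
setup, hold = 11.0, 6.0             # external device datasheet
min_clk, max_clk = 1.65, 6.35       # clock trace + buffer delay

# Output constraints (FPGA driving the external device):
out_max = max_board + setup - min_clk   # latest launch vs. setup
out_min = min_board - hold - max_clk    # earliest change vs. hold

# Input constraints (the part I'm least sure of -- I may have
# mixed up where setup vs. hold belongs here):
in_max = max_board + setup + max_clk
in_min = min_board + hold + min_clk

print(round(out_max, 2), round(out_min, 2))  # 15.39 -11.01
print(round(in_max, 2), round(in_min, 2))    # 23.39 8.99
```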

 

Do these seem correct to everyone? 

 

Thanks!
4 Replies
Altera_Forum
Honored Contributor II

No. 

You're constraining a single pin, bit 31, yet the max and min board delays are 6.04 ns and 1.34 ns. I don't know how delays could vary that much across different pins unless there's huge board skew, but a single bit couldn't possibly vary that much on its own. 

For setup and hold, I assume those are specs of the device the FPGA is talking to. Hold is inverted when it becomes an output delay, so a hold of 6 ns becomes set_output_delay -min -6. (By saying the external delay is -6 ns, the FPGA must hold the data 6 ns longer, which gets you back to the 0 ns hold requirement.) 

I'm not sure what math you did to get the input delays. I'm also not sure what your topology is, i.e. is a board clock driving both the FPGA and the external chip, is the FPGA sending a clock to the external chip, or vice versa? I don't know what the clock delays even represent.
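
In SDC terms, the hold inversion above looks like this (a sketch using the port from your post):

```tcl
# The external device's 6 ns hold time is negated when it becomes
# an output-delay constraint on the FPGA side:
set_output_delay -clock CPU_O_CLK40 -min -6.0 [get_ports {CPU_IO_DATA[31]}]
```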
Altera_Forum
Honored Contributor II

Hi Rysc, sorry for the lack of info; let me see if I can clear some of this up. 

 

The wide difference between the min and max board delays is because the CPU_IO_DATA bus, and CPU_O_CLK40, pass through 5 V tolerant buffers on their way to the CPU. I believe these buffers have a minimum delay of about 1 ns and a maximum of roughly 5.7 ns; I added those values to the calculated board trace delays. That is also why the clock delays have such a wide variance.  

 

The CPU_O_CLK40 (40MHz clock) is produced inside the FPGA and supplied to the CPU. I got the setup and hold times from its datasheet. 

 

Of course, I can't find my reference for the input-delay formula I used. But as I recall, for max it was: max board delay + setup + max clock delay; for min: min board delay + setup + min clock delay. 

 

 

I really appreciate your help on this!
Altera_Forum
Honored Contributor II

That's a big variance, but I guess it is what it is. 

For your set_output_delay, that looks correct except for the sign of the hold, which was already mentioned. 

For the set_input_delay, you didn't say but I'm guessing you're sending the clock to the other device and then getting data back? If so, you don't care about the setup and hold of the external device. It's really just the max and min roundtrip delay. So something like: 

set_input_delay -max [expr $max_clk_FPGA2Extdev + $Tco_max_Extdev + $max_data_delay_Extdev2FPGA]... 

set_input_delay -min [expr $min_clk_FPGA2Extdev + $Tco_min_Extdev + $min_data_delay_Extdev2FPGA]
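
Filled in with the clock and data-path numbers from earlier in the thread, that would look something like this (a sketch; the external device's Tco values are placeholders, since they weren't given):

```tcl
# Round-trip input constraint: FPGA sends the clock out, data comes back.
set max_clk_FPGA2Extdev 6.35
set min_clk_FPGA2Extdev 1.65
set Tco_max_Extdev      0.0   ;# placeholder -- take from the device datasheet
set Tco_min_Extdev      0.0   ;# placeholder
set max_data_delay_Extdev2FPGA 6.04
set min_data_delay_Extdev2FPGA 1.34

set_input_delay -clock CPU_O_CLK40 -max \
    [expr {$max_clk_FPGA2Extdev + $Tco_max_Extdev + $max_data_delay_Extdev2FPGA}] \
    [get_ports {CPU_IO_DATA[31]}]
set_input_delay -clock CPU_O_CLK40 -min \
    [expr {$min_clk_FPGA2Extdev + $Tco_min_Extdev + $min_data_delay_Extdev2FPGA}] \
    [get_ports {CPU_IO_DATA[31]}]
```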
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

That's a big variance, but I guess it is what it is. 

For your set_output_delay, that looks correct except for the sign of the hold, which was already mentioned. 

For the set_input_delay, you didn't say but I'm guessing you're sending the clock to the other device and then getting data back? If so, you don't care about the setup and hold of the external device. It's really just the max and min roundtrip delay. So something like: 

set_input_delay -max [expr $max_clk_FPGA2Extdev + $Tco_max_Extdev + $max_data_delay_Extdev2FPGA]... 

set_input_delay -min [expr $min_clk_FPGA2Extdev + $Tco_min_Extdev + $min_data_delay_Extdev2FPGA] 

--- Quote End ---  

 

 

Hi Rysc, sorry for the long delay in responding; I've been out of town for a while. 

 

Okay, so if the FPGA is supplying the clock to a device, then that device's setup and hold times can be ignored for the data coming back. That simplifies things. 

 

I have a couple questions regarding signals to devices that don't have clocks: 

I have some output-enable and direction controls going to buffers. The enables turn on once the FPGA is configured and stay on, and the few direction signals that do toggle don't do so rapidly. Is there value in constraining these signals in the SDC file? If so, what would I take into account to determine their min and max? 
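
For what it's worth, my first instinct for these quasi-static signals is to cut them out of timing analysis entirely (a sketch; the port names are made up):

```tcl
# Enables and direction controls toggle rarely and are not
# timing-critical, so exclude them from analysis
# (hypothetical port names):
set_false_path -to [get_ports {BUF_OE* BUF_DIR*}]
```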

 

There are a few interrupt signals that come into the FPGA from off-board (we have a code-based DMA controller inside); they are edge-activated and can happen at any time. Is the trace-length delay the only thing that should be taken into account for these signals' input delays? 
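
Or would it make more sense to synchronize these inside the FPGA and cut the input path, along these lines (a sketch; the port name is made up)?

```tcl
# Interrupts are asynchronous to any FPGA clock; synchronize them
# internally and exclude the input path from timing analysis
# (hypothetical port name):
set_false_path -from [get_ports {IRQ_IN*}]
```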

 

Finally, I have an asynchronous serial interface that communicates with another unit. One signal is transmit, the other is receive. The interface transmits a constant idle pattern, but data/commands can occur at any time. It's NRZ, so a transition = 1 and no transition = 0. The FPGA generates a single-ended 60 MHz signal for this, which connects to a PECL converter. The differential signal travels along a cable of varying length (it depends on the unit our board is placed in) to reach its destination. To receive, the differential signal enters our board, gets converted to single-ended, and is latched into the FPGA with a 240 MHz clock to ensure all edges are observed correctly. I assume trace length to the PECL converters is where I'd start for these two signals. 
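
 

To be clear about the encoding I described (transition = 1, no transition = 0), here's a minimal Python sketch of the decode side, just to illustrate the scheme:

```python
def nrzi_decode(levels):
    """Decode a list of per-bit-period line levels: a transition
    between consecutive bit periods is a 1, no transition is a 0."""
    return [1 if cur != prev else 0
            for prev, cur in zip(levels, levels[1:])]

print(nrzi_decode([0, 1, 1, 0, 1]))  # [1, 0, 1, 1]
```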

 

Thank you very much for all your help!  

 

Oh, and I'm looking at taking Altera's online Timing Analysis courses; do you feel those are a good investment?