
Source Synchronous Interface Input and Output Delays

Altera_Forum
Honored Contributor II

Hi Everyone, 

 

This is my first post, and I did look for a solution before asking here. I am new to static timing analysis and have been studying source synchronous interfaces and how to constrain them. In some of the examples in Altera's own resources, a virtual clock is used to set the output delays. In other examples, a generated clock defined on the output clock port of the design is used instead.

 

So, is there any difference between using a virtual clock and a generated clock on the output port for setting output delays?

 

The same question applies to the input delays on the input ports: should we reference a virtual clock or the clock defined on one of the input ports? Does it matter? If not, why not?

 

Please let me know if my question is unclear.
1 Reply
Altera_Forum
Honored Contributor II

For a source synchronous input, a virtual clock models the launch clock inside the "upstream" device, and the clock that device forwards to the FPGA is constrained as a base clock on the FPGA clock input pin (the latch clock, or a PLL clock derived from that incoming base clock).
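As a rough sketch (the port and clock names and the delay numbers here are made up, not from your design), the input side might look something like this:

# Base clock: the forwarded clock arriving on the FPGA clock input pin
# (this is the latch clock inside the FPGA)
create_clock -name clk_in -period 10.000 [get_ports clk_in]

# Virtual clock: models the clock in the upstream device that launches
# the data; same period, but not tied to any port
create_clock -name virt_clk_in -period 10.000

# Input delays on the data pins are referenced to the virtual (launch) clock
set_input_delay -clock virt_clk_in -max 2.000 [get_ports data_in]
set_input_delay -clock virt_clk_in -min 0.500 [get_ports data_in]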

 

For a source synchronous output, the launch clock is the clock you use to clock the data out, usually a PLL output. For the latch clock, you create a generated clock on the clock output port of the FPGA. This is the only I/O interface situation where there is no virtual clock.
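A sketch of the output side, again with made-up names and placeholder numbers. For simplicity the generated clock's source is the reference clock port; in a real design it would typically be the PLL output pin that actually drives the forwarded clock:

# Launch clock: the clock driving the output registers
create_clock -name ref_clk -period 10.000 [get_ports ref_clk]

# Latch clock: a generated clock on the FPGA's forwarded-clock output port;
# no virtual clock is needed for this case
create_generated_clock -name clk_out -source [get_ports ref_clk] [get_ports clk_out]

# Output delays on the data pins are referenced to the generated clock
set_output_delay -clock clk_out -max 1.500 [get_ports data_out]
set_output_delay -clock clk_out -min -0.500 [get_ports data_out]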

 

See this online training for details: 

 

https://www.altera.com/support/training/course/ocss1000.html