Novice

How to define timing constraints for inputs and outputs


If I look at the Timing Analyzer user guide, it recommends using a virtual clock to define timing constraints for inputs and outputs. But I don't know the difference between using a virtual clock and a clock defined on an FPGA port. Could you use the example below to explain a little? Thanks a lot!

 


Accepted Solutions
Moderator

A virtual clock is used for I/O timing analysis.  It's created with the create_clock constraint and describes the clock that launches data from an upstream device (for an input to the FPGA) or the clock that latches data at a downstream device (for an output from the FPGA).  It's called virtual because it has no target: the clock never actually enters the FPGA itself.  Your set_input_delay and set_output_delay constraints should always reference the virtual clock (except for some unique circumstances like source synchronous clocks).
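As a minimal sketch of what that looks like in an SDC file (the 100 MHz period, the delay values, and the port names sys_clk, data_in, data_out are all placeholders, not from your design):

```tcl
# Clock that actually enters the FPGA on a port
create_clock -name sys_clk -period 10.0 [get_ports sys_clk]

# Virtual clock: same create_clock command, but with no target.
# It models the clock at the external upstream/downstream device.
create_clock -name virt_clk -period 10.0

# I/O delays reference the virtual clock, not the FPGA port clock
set_input_delay  -clock virt_clk -max 3.0 [get_ports data_in]
set_input_delay  -clock virt_clk -min 0.5 [get_ports data_in]
set_output_delay -clock virt_clk -max 2.0 [get_ports data_out]
set_output_delay -clock virt_clk -min -0.5 [get_ports data_out]
```

The -max/-min values would come from the external device's datasheet plus board trace delays; the point of the sketch is only that the virtual clock, not sys_clk, anchors the I/O delays.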

To learn more, see this online training:

https://www.intel.com/content/www/us/en/programmable/support/training/course/odsw1118.html

If you are talking about source synchronous interfaces, where the clock is generated by the upstream device or the FPGA generates the clock to be sent to a downstream device, see these trainings, depending on whether you're talking about single data rate or double data rate:

https://www.intel.com/content/www/us/en/programmable/support/training/course/ocss1000.html

https://www.intel.com/content/www/us/en/programmable/support/training/course/oddr1000.html

 


Novice

Thanks for your feedback. Could you look at my attached picture?

So there is no difference whether I define adc_clock on the FPGA port or just use a virtual clock, right?

I am just thinking a virtual clock makes the analysis easier, but nothing else is different.

Please correct me if I am wrong.

Moderator

So this is a more complicated design.  It's referred to as a data feedback design, so there is no virtual clock here.  The constraints you need are:

1) Input clock (base clock constraint with create_clock)

2) Generated clock output of PLL (use derive_pll_clocks)

3) Generated clock on the device's clock output port (create_generated_clock with -source pointing to the output clock pin of the PLL and the target pointing to [get_ports adc_clock])

4) False path on output clock path (so output clock path is not analyzed as a data path)

5) Input delay constraint (references the generated output clock in place of a virtual clock; your constraint is close, but it needs to reference the clock created in #3 above, not the output clock port)
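The five steps above can be sketched in SDC roughly as follows. The 100 MHz period, the delay numbers, the PLL instance path, and the port names clk_in and adc_data are placeholders I've assumed for illustration; the exact PLL output pin name depends on your device family and PLL instance:

```tcl
# 1) Base clock on the input clock port
create_clock -name clk_in -period 10.0 [get_ports clk_in]

# 2) Generated clocks on the PLL outputs
derive_pll_clocks

# 3) Generated clock on the clock output port, with -source pointing at
#    the PLL output clock pin (hypothetical hierarchy path shown here)
create_generated_clock -name adc_clock \
    -source [get_pins {pll_inst|altpll_component|auto_generated|pll1|clk[0]}] \
    [get_ports adc_clock]

# 4) Cut the output clock path so it is not analyzed as a data path
set_false_path -to [get_ports adc_clock]

# 5) Input delays on the returning ADC data, referencing the generated
#    clock from step 3 (not the output clock port)
set_input_delay -clock adc_clock -max 4.0 [get_ports adc_data[*]]
set_input_delay -clock adc_clock -min 1.0 [get_ports adc_data[*]]
```

The max/min input delays would come from the ADC's clock-to-output specs plus the round-trip board delays on the clock and data traces.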

Novice

Thanks a lot!
