This is a question about how to constrain a bidirectional bus relative to a single input clock.

I have an 8-bit bidirectional bus that is clocked by a single input clock. In the input direction, the minimum required setup time is 4.0 ns and the minimum hold time is 0.0 ns relative to clk. In the output direction, the clock-to-output delay ranges from 1.0 ns (min) to 4.0 ns (max). What is the best way to constrain it? I am using the following commands in my SDC. For input setup/hold:

set_input_delay -rise -max -clock clk 4.00 [get_ports my_bi_dir_port]
set_input_delay -rise -min -clock clk 0.00 [get_ports my_bi_dir_port]

For the output delay:

set_output_delay -clock clk -max 4.00 [get_ports my_bi_dir_port]
set_output_delay -clock clk -min 1.00 [get_ports my_bi_dir_port]

Are these correct? Assume my clock runs at 50 MHz (20 ns period). A minimum 4.0 ns input setup time then means the data must arrive at the port no later than 16 ns (assuming it was launched at 0 ns at the source). Can the input delay commands above be interpreted to give this result? The actual TimeQuest I/O timing report didn't show this relationship: its I/O setup report only covers the path from the I/O register to the internal registers, not the path between the input clock and the I/O register. Did I miss anything here?
1 Reply
See the Quartus handbook, Volume 3, Section II, Chapter 7, Table 7-4.
For tsu and max tco, the "latch - launch" term in the table is the 20 ns clock period in your case, so set_input_delay -max and set_output_delay -max should each be 20 - 4 = 16 ns. For th and min tco, "latch - launch" is zero in your case. Note the minus sign in the table, which makes set_output_delay -min equal to -1.0 ns.
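Applying those table formulas to the numbers in the question gives the following SDC. This is a sketch, not a definitive constraint file: it assumes zero board delay, that the 4.0 ns / 0.0 ns / 1.0-4.0 ns figures are requirements at the FPGA pins, and that the clock arrives on a port named clk (my_bi_dir_port is taken from the question).

```tcl
# 50 MHz external clock on the clk input port (assumed port name)
create_clock -name clk -period 20.0 [get_ports clk]

# Input direction (tsu = 4.0 ns, th = 0.0 ns):
#   set_input_delay -max = (latch - launch) - tsu = 20 - 4 = 16 ns
#   set_input_delay -min = -th = 0 ns
set_input_delay -clock clk -max 16.0 [get_ports my_bi_dir_port]
set_input_delay -clock clk -min  0.0 [get_ports my_bi_dir_port]

# Output direction (tco = 1.0 ns min, 4.0 ns max):
#   set_output_delay -max = (latch - launch) - tco(max) = 20 - 4 = 16 ns
#   set_output_delay -min = -tco(min) = -1.0 ns  (note the minus sign)
set_output_delay -clock clk -max 16.0 [get_ports my_bi_dir_port]
set_output_delay -clock clk -min -1.0 [get_ports my_bi_dir_port]
```

With these values, TimeQuest checks that the data launched by the external clock edge can arrive up to 16 ns late and still leave 4 ns of setup margin before the next (latching) edge, which matches the arithmetic in the question.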