I'm trying to improve input/output timing on several signals but am confused about the best way to go about it. Is it better to use set_input_delay or set_output_delay in the TimeQuest .sdc file, or to force the signal into an input/output register with a .qsf assignment such as the following?

set_instance_assignment -name FAST_INPUT_REGISTER ON -to signal_in

If the goal is to minimize the delay from the pin to the first input register, it doesn't seem as if you can do better than forcing it into the input register as shown above, without going through all the work of setting up virtual clocks, determining board routing delays, etc. that set_input_delay requires. Do experienced developers have a preference, and if so, why? Also, if using a .qsf assignment like the one above, is it recommended to then declare a false path so the tool won't flag the input (or output) path as unconstrained? Thanks, Grady
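For reference, a rough sketch of the two approaches I'm comparing (the port name signal_in, the virtual clock period, and the delay numbers below are just placeholders, not real board values):

# .qsf route: pack the input flip-flop into the I/O cell
set_instance_assignment -name FAST_INPUT_REGISTER ON -to signal_in

# .sdc route: virtual clock modeling the external device, plus input delays
create_clock -name ext_clk_virt -period 10.000
set_input_delay -clock ext_clk_virt -max 2.000 [get_ports signal_in]
set_input_delay -clock ext_clk_virt -min 0.500 [get_ports signal_in]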
2 Replies
The two are not alternatives. For I/O timing you need to set the delays so that TimeQuest knows what timing to analyze and report. The fast I/O registers then help you achieve that timing once the delays have been entered.
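For example, the output side would look something like this (the port and clock names and the numbers are illustrative only; the actual delay values come from your board routing and the external device's setup/hold requirements):

# .sdc: constrain the output so TimeQuest can analyze and report it
create_clock -name ext_clk_virt -period 10.000
set_output_delay -clock ext_clk_virt -max 1.500 [get_ports signal_out]
set_output_delay -clock ext_clk_virt -min -0.500 [get_ports signal_out]

# .qsf: help meet that timing by packing the register into the I/O cell
set_instance_assignment -name FAST_OUTPUT_REGISTER ON -to signal_out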
OK, thanks. It looks like if I do both, set_input_delay will have no effect on placement or routing (the register is already forced into the I/O cell), but it will report the timing margin of the data path into the I/O cell input flip-flop.
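Something like this in the TimeQuest Tcl console should show that margin after compiling (port name is again just a placeholder):

# report the worst setup path from the pin to the register it feeds
report_timing -from [get_ports signal_in] -setup -npaths 1 -detail full_path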
