Hi,
I'm involved in an international open source Software Defined Radio project that uses Altera FPGAs (http://www.openhpsdr.org/). We are currently encouraging our members to contribute to FPGA development. Many of our members are very experienced C/C++/C# programmers but have not been exposed to FPGA coding before. One issue we find is the steep learning curve such programmers face when starting to use TimeQuest. We have therefore written a user guide, aimed at beginners, called "A standardized procedure for closing timing on openHPSDR FPGA firmware designs". You can obtain a copy of the document here: http://www.k5so.com/timingclosurefieldguide.pdf

We would welcome peer reviews by experienced TimeQuest users in order to continually improve this document.

Regards,
philh
11 Replies
--- Quote Start --- We have written a user guide, aimed at beginners, called "A standardized procedure for closing timing on openHPSDR FPGA firmware designs". [...] We would welcome peer reviews by experienced TimeQuest users in order to continually improve this document. --- Quote End ---

Obviously none of us are happy about the volume of Altera documentation on TimeQuest. However, one has to be careful when writing or reading second-hand summaries or guides, as they just add to the pool of words and may contain errors.

I skimmed a bit through your summary. I wasn't happy with your account of set_input_delay and set_output_delay, as you imply that you can just enter some arbitrary values and check whether they are OK with the timing report. These values have to be derived from the physical relationship between data and clock at your I/O. The purpose is not just to get a clean pass, but to enter correct values and then pass. The tool is not aware of the I/O relationship and needs your entry.
Hi Kaz,
Thanks for the feedback - much appreciated. Could you please suggest a simple example that explains this issue, which we could include in the documentation?
--- Quote Start --- Could you please suggest a simple example that explains this issue, which we could include in the documentation? --- Quote End ---

Rather than examples, one should first be clear about the meanings (definitions). Here is my understanding:

set_input_delay
- min is the minimum data offset from its clock launch edge
- max is the maximum data offset from its clock launch edge
- zero means data and clock are aligned

So, given the tCO of the input device and the board delay: min = min tCO and max = max tCO of the external device, plus the board delay effect on each path of data and clock from the device to the I/O. This is straightforward.

set_output_delay
- min is the minimum allowed offset of data from its clock latch edge, negative by convention
- max is the maximum allowed offset of data from its clock latch edge, i.e. a transition is allowed between min and max (but not between max and min)
- zero means no requirement (but this is not realistic)

So, given the tSU/tH of the external device and the board delay: min = -tH and max = tSU of the external device, plus the board effect on both data and clock from the I/O to the device. This is not straightforward, as it is not symmetrical in definition with set_input_delay, due to the negativity of min, which apparently comes from the min offset being seen as after the edge while max is before the edge.
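The definitions above can be sketched as SDC constraints. The clock names, port names, and timing numbers below are hypothetical; in a real design they come from the external device's datasheet and the board layout:

```tcl
# Hypothetical external-device datasheet values (ns):
# tCO(min) = 1.0, tCO(max) = 3.5, tSU = 2.0, tH = 0.5
# Board trace delays assumed equal on data and clock, so they cancel.

# Virtual clock representing the external device's clock
create_clock -name ext_clk -period 20.000

# set_input_delay: offset of arriving data from its launch edge
# min = min tCO, max = max tCO of the external device
set_input_delay -clock ext_clk -min 1.0 [get_ports {data_in}]
set_input_delay -clock ext_clk -max 3.5 [get_ports {data_in}]

# set_output_delay: min = -tH, max = tSU of the external device
set_output_delay -clock ext_clk -min -0.5 [get_ports {data_out}]
set_output_delay -clock ext_clk -max 2.0 [get_ports {data_out}]
```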
Hi Kaz,
Thanks for your continued assistance. Let's take set_input_delay first. Assume we have an SPI interface that is sending a clock at 10 MHz and a single data bit, so we have two input pins, SPI_clock and SPI_data. Assume the clock and data are synchronous and the PCB delays to each input are equal. We define the clocks as:

create_clock -name SPI_clock -period 100 [get_ports {SPI_clock}]
create_clock -name virt_SPI_clock -period 100

We want to clock the data in the middle of a bit; what set_input_delay do we use?
--- Quote Start --- We want to clock the data in the middle of a bit; what set_input_delay do we use? --- Quote End ---

From the FPGA's perspective you don't clock an input; the FPGA receives it as it is. You pass this information to the tool. If you are receiving data transitions right in the middle of the clock period, then min = max = 50 ns.

It is at the FPGA outputs that you have control over the data/clock relation, but even then the tool stops once timing is achieved and does not give you the choice of optimising it any further.
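For this SPI example, the complete constraints might look like the following sketch (using the clock and port names from the posts above, and assuming, as stated, that data transitions sit exactly mid-period relative to the launch edge):

```tcl
# 10 MHz SPI clock arriving on a pin, plus a virtual clock
# representing the external device that launches SPI_data
create_clock -name SPI_clock -period 100.000 [get_ports {SPI_clock}]
create_clock -name virt_SPI_clock -period 100.000

# Data transitions occur mid-period relative to the launch edge,
# so min = max = 50 ns (half of the 100 ns period)
set_input_delay -clock virt_SPI_clock -min 50.000 [get_ports {SPI_data}]
set_input_delay -clock virt_SPI_clock -max 50.000 [get_ports {SPI_data}]
```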
In my example the clock and data are synchronous, but within the FPGA I wish to sample the data at the middle of the clock period. In that case, what set_input_delay settings do I use, please?
--- Quote Start --- In my example the clock and data are synchronous, but within the FPGA I wish to sample the data at the middle of the clock period. In that case, what set_input_delay settings do I use? --- Quote End ---

Your question is wrong. It seems you are thinking like this: I have data and clock coming into the FPGA (their relationship unknown), and I want the first FPGA I/O register to sample the data in the middle of the clock period by inserting some delay, without knowing the received data's offset from its clock launch edge.

The purpose of set_input_delay is to give the tool information about the data/clock offset as received at the I/O; you then leave it to the fitter to insert a suitable delay and get timing right at the I/O register. You can narrow down your offset if you want stricter control, up to a limit, but once the tool meets timing it does not care about positioning the data eye as you prescribe. There are other constraints (absolute delays) with which you decide the delay rather than the tool, but ultimately you want to pass timing rather than put the data eye somewhere nice.
OK - thank you. So assume my clock and data have zero timing error between them when applied to the input pins of the FPGA; I don't need any set_input_delay then? If I want to sample in the middle of the data eye, then I write FPGA code to do that.
But if they don't have zero timing error, say due to different PCB track lengths, then can I use set_input_delay to compensate?
--- Quote Start --- But if they don't have zero timing error, say due to different PCB track lengths, then can I use set_input_delay to compensate? --- Quote End ---

No, no... you always need to tell the tool the I/O data/clock offset, otherwise it has no way to "see" that offset. Only then can it decide timing at that register. How could timing pass without that knowledge?
So I measure the relationship between the clock and the data with a scope and use that value for the set_input_delay?
--- Quote Start --- So I measure the relationship between the clock and the data with a scope and use that value for the set_input_delay? --- Quote End ---

Correct, and that is the ideal. Alternatively, if you assume both board traces (data and clock) are identical and equal in length, then you can ignore the board delay and deduce the offset from the external device's tCO. Then min = min tCO and max = max tCO.
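As a worked sketch of that alternative: suppose the external device's datasheet gives a tCO between 1.2 ns and 4.8 ns (hypothetical numbers, as are the clock and port names below), and the data and clock traces are matched:

```tcl
# Virtual clock for the external launching device
# (hypothetical 100 ns period)
create_clock -name virt_ext_clk -period 100.000

# Matched traces: board delay cancels, so the offsets are simply the
# external device's datasheet tCO range
set_input_delay -clock virt_ext_clk -min 1.2 [get_ports {data_in}]
set_input_delay -clock virt_ext_clk -max 4.8 [get_ports {data_in}]
```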
