Hi,

I am interfacing to a DDR SRAM (not SDRAM), and am clocking read data with a clock generated in the FPGA, rather than with a source-synchronous echo clock generated by the SRAM. The SRAM is operating with a single input clock k (actually a complementary pair, k and kn), generated by the FPGA. SRAM clock (k) to data-out time (tCO) = 0.45 ns max.

I want to add an SDC constraint which specifies the timing between the k clock FPGA pin (an FPGA output) and the input data FPGA pin dq, i.e. the path from the FPGA k clock output pin, over the k clock trace, through the SRAM (tCO) to the dq outputs, then over the dq trace back to the FPGA input pins.

1) Is this a legal SDC constraint? Or can I only specify relationships between output clock and output data, or input clock and input data?
2) Can I specify the timing relative to the FPGA clock output pin, as opposed to the internal clock which drives the pad?
3) What SDC statement should I use? set_input_delay?
4) Where can I find documentation on this? I've gone through the source-synchronous app notes, but unfortunately they aren't really applicable, as this is not a source-synchronous clocking scheme.

Thanks,
Chris
5 Replies
Put a create_generated_clock assignment on the clock being driven by the FPGA (the k clock); its -source should be whatever drives it, most likely a PLL output.

Put another create_generated_clock assignment on the clock coming back into the FPGA; its -source should be the output port k. Finally, add set_clock_latency assignments to the clock coming back in, with -early and -late values that represent the min and max round-trip delay:

set_clock_latency -source -early 3.2 dqs
set_clock_latency -source -late 5.8 dqs

(I made up the name and numbers.) Your clock coming back in will now trace the whole path around with max and min values. Now, for your set_input_delay constraints, you will probably use the clock going off chip, i.e. k, for the -clock option.
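Putting the pieces of this suggestion together, a minimal sketch might look like the following. All cell/pin names (pll|clk[0]) and the delay numbers are placeholders for illustration; substitute the actual PLL output pin and your own board delays:

```tcl
# Generated clock on the k output port, derived from whatever drives it
# (assumed here to be a PLL output; the pin name is a placeholder).
create_generated_clock -name k_out \
    -source [get_pins {pll|clk[0]}] \
    [get_ports k]

# Generated clock on the strobe returning to the FPGA, sourced from the k port.
create_generated_clock -name dqs_in \
    -source [get_ports k] \
    [get_ports dqs]

# Early/late source latency modeling the min/max round-trip delay
# (made-up values, as in the post above).
set_clock_latency -source -early 3.2 [get_clocks dqs_in]
set_clock_latency -source -late  5.8 [get_clocks dqs_in]
```

The returning clock then carries the full round-trip delay, and set_input_delay constraints on the data pins can reference the off-chip clock k_out.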
Hi Rysc
Thanks for the quick reply. I may not have explained my situation well: I don't have a clock coming back to the FPGA, just data. The input data is clocked in the FPGA by a phase-shifted version of the k output clock.

Thanks,
Chris
Put a generated clock on the output sending the clock off chip. Use that clock for the -clock option of your set_input_delay constraints. The -max and -min values will be your max and min roundtrip delays.
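As a hedged sketch of this approach, using the k and dq names from the thread (the -source pin and the delay values are placeholders to replace with your actual PLL output and measured board/tCO numbers):

```tcl
# Generated clock on the port sending k off chip. The -source pin name is a
# placeholder for whatever actually drives the k output.
create_generated_clock -name k_ext \
    -source [get_pins {pll|clk[0]}] \
    [get_ports k]

# Input delays on dq, referenced to the off-chip clock. The values model the
# round trip: k trace delay + SRAM tCO + dq trace delay (numbers made up).
set_input_delay -clock k_ext -max 2.1 [get_ports dq[*]]
set_input_delay -clock k_ext -min 0.6 [get_ports dq[*]]
```

Since this is a DDR interface, the set_input_delay lines would typically be repeated with -clock_fall and -add_delay to constrain the data launched on the falling edge of k as well.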
Thanks - I'll do that.
One last question. I'm not very familiar with the tool (or SDC constraints), so forgive me if this is a dumb question: if I create a clock for the output sending the clock off chip, will that reference the clock at the pin, or will it be the internal version of the clock?

To be more specific: I'm generating the k clock from the output of an ALTDDIO_OUT component. There will be a delay through the pad (call it tPAD) and a trace delay (tTRACE) from the FPGA k output pin to the SRAM clock input pin. Thus if TimeQuest uses the pin clock as its reference, the delay from the clock in the create_generated_clock statement to the SRAM pin will be tTRACE; however, if TimeQuest references the internal version of that clock, the total delay to the SRAM input pin would be tPAD + tTRACE.

So is there a way to actually distinguish between the clock on the internal or external side of the pad (or pin)? Or does TimeQuest simply use one or the other? I hope I haven't confused the issue; without graphics I'm never sure I'm communicating this clearly.

Thanks,
Chris
I'm not fully getting it, but just put the generated clock on the output port and it will calculate all the way to the signal going off the device (i.e. to the external side), which is what you want.