
Did I set up input timing constraints correctly?

JSmit123
Beginner
1,096 Views

I have both the clock and data coming from an external chip: input_clock and input_data. The datasheet says setup time 3 ns, hold time 1 ns.

 

Can you please check whether these constraints are correct? I believe they are, but when I test on the real device I receive incorrect data.

 

create_clock -name input_clock -period 20 [get_ports input_clock]

set_input_delay -clock [get_clocks input_clock] -min 1 [get_ports input_data] 

set_input_delay -clock [get_clocks input_clock] -max 17 [get_ports input_data] 

 

10 Replies
IDeyn
New Contributor III
665 Views

Hi JSmit123!

 

First of all, it's strange that you mention the datasheet's setup and hold parameters, because as I understand it your FPGA is the receiver in this case. If you want somebody to help you, you need to add a detailed description of your system.

For now I can say that your constraints are wrong.

 

I think you have an external device (say, an ADC) that outputs source-synchronous data, so to properly latch the data in the FPGA you need to use one of the source-synchronous constraint methods.

 

You can use a PLL in source-synchronous mode, for example.
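
As a rough sketch only (not a complete solution): if you feed input_clock into a PLL and capture the data with one of the PLL output clocks, the SDC would at minimum contain something like this:

create_clock -name input_clock -period 20 [get_ports input_clock]

derive_pll_clocks          # creates generated clocks for every PLL output

derive_clock_uncertainty   # adds the default clock uncertainty values

plus set_input_delay constraints on input_data.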

 

Hope that helps.

 

--

Best Regards,

Ivan

JSmit123
Beginner
665 Views

What would be the correct constraints? I added more details below.

IDeyn
New Contributor III
665 Views

Hi JSmit123!

 

I agree with sstrell's last post: it is not a simple task, so you will need to do some reading to get a deeper understanding of source-synchronous interface constraints.

In my opinion, the best resource on that topic is https://fpgawiki.intel.com/wiki/index.php?title=File:Source_Synchronous_Timing.pdf by Ryan Scoville.

 

To write proper SDC constraints for your case, you can choose from a number of methods.

 

Hope that helps.

 

--

Best Regards,

Ivan

 

sstrell
Honored Contributor III
665 Views

This is closer to what you should have:

 

# virtual clock representing the launch clock of the "upstream" device, assumed to be the same as the clock that device outputs

create_clock -name vir_clock_in -period 20

 

# input clock to the FPGA; remove the -waveform option if data enters the FPGA edge-aligned with the clock

create_clock -name input_clock -period 20 [get_ports input_clock] -waveform {10.0 20.0}

 

# if you're using a PLL

derive_pll_clocks

 

# input delays calculated based on upstream device datasheet and board delays (if board trace is not perfectly matched); typically device spec would be tcomax and tcomin

set_input_delay -clock vir_clock_in -max <max_delay> [get_ports input_data] 

set_input_delay -clock vir_clock_in -min <min_delay> [get_ports input_data] 
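
As a worked example (the numbers here are made up purely to show the arithmetic; substitute your device's actual values): if the upstream device specified tco_max = 7 ns and tco_min = 2 ns relative to its output clock, and the board traces were perfectly matched, you would get:

# max input delay = tco_max + data_trace_max - clk_trace_min = 7 + 0 - 0 = 7 ns

# min input delay = tco_min + data_trace_min - clk_trace_max = 2 + 0 - 0 = 2 ns

set_input_delay -clock vir_clock_in -max 7 [get_ports input_data]

set_input_delay -clock vir_clock_in -min 2 [get_ports input_data]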

 

See this online training for details, especially on the calculations for input delay max/min:

 

https://www.intel.com/content/www/us/en/programmable/support/training/course/ocss1000.html

 

#iwork4intel

JSmit123
Beginner
665 Views

Your vir_clock_in is not related to input_clock. How does it work?

 

And, according to Altera/Intel, it's not mandatory to create a virtual clock. I still can't understand what is wrong with my SDC file.

JSmit123
Beginner
665 Views

I use two input pins on fpga: input_clock, input_data.

Both signals are generated by the same external chip (ADC).

Data arrives on the input pin 3 ns before the clock arrives and stays valid for 1 ns after the clock edge. This is according to the datasheet of the external chip, and it is what I observe on the scope.

Clock frequency is 50 MHz. PCB traces are very short and equal length.

I'm not using PLL, I just use clock from the clock pin directly.

 

So, the calculations are very simple:

period = 1 / 50 MHz = 20 ns

min = 1 ns

max = 20 ns - 3 ns = 17 ns

 

SDC file:

 

create_clock -name input_clock -period 20 [get_ports input_clock]

set_input_delay -clock [get_clocks input_clock] -min 1 [get_ports input_data] 

set_input_delay -clock [get_clocks input_clock] -max 17 [get_ports input_data] 

 

RPDM
New Contributor I
665 Views

On the assumption that the constraints are correct, have you checked the Timing report file to ensure that it was able to meet your constraints? (I.e. no negative slack shown in red for that clock?)
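
For example (assuming the Timing Analyzer / TimeQuest Tcl console with the timing netlist updated, and the port names from your post), something like this would show the worst setup and hold paths starting at that input:

report_timing -setup -from [get_ports input_data] -npaths 10 -detail full_path

report_timing -hold -from [get_ports input_data] -npaths 10 -detail full_path

Negative slack on either report means the constraint was not met.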

sstrell
Honored Contributor III
665 Views

You asked if your constraints were correct, and as I mentioned, they're not. Unfortunately, they're not that simple. This is a source synchronous interface so you have to do things a bit differently.

 

As I said, you need a virtual clock to correctly set up the relationship between launch and latch edge. That may be optional for other types of interfaces (though it's highly recommended), but with source synch, it's basically required to set up the relationship correctly. What are the properties of the clock that is driving this ADC device? And what is the relationship between that device's input clock and its output clock?
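
As a rough sketch, reusing the names from my earlier post (<max_delay> is a placeholder):

create_clock -name vir_clock_in -period 20                              # launch clock: the edge inside the upstream device that launches the data

create_clock -name input_clock -period 20 [get_ports input_clock]       # latch clock: the copy of that clock arriving at the FPGA pin

set_input_delay -clock vir_clock_in -max <max_delay> [get_ports input_data]   # delays are referenced to the launch edge of the virtual clock, not to the FPGA pin clock

Without the virtual clock, the analyzer treats the clock at the FPGA pin as the launch clock as well, which is not what happens physically.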

 

It sounds like the datasheet is giving you a tco or skew specification between the clock and data, but it is odd that the data is launched before the clock, unless this is meant to be a center-aligned clock (though not truly center-aligned, since the clock edge is only 3 ns past the data edge instead of 10). What is the name of the spec it is giving you for this?

 

I highly recommend you check out that online training. Also, seeing a timing report of what you're getting with your existing constraints may help.

 

#iwork4intel

JSmit123
Beginner
665 Views

There is no clock driving the ADC. The ADC generates the clock itself.
