Programmable Devices
CPLDs, FPGAs, SoC FPGAs, Configuration, and Transceivers

Importance of set_input_delay

Dear all,


I'm trying to create a source-synchronous input/output design that receives data from and sends data to a connected device over a 32-bit interface bus.


The manufacturer of the device provided the following output delay constraint for use with the FPGA:


set_output_delay -add_delay -rise -max -clock 6.000  
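The command as given above looks truncated to me: -clock normally takes a clock name, and the target ports are missing. For illustration, this is what I believe a complete command would look like (clk_out and data_out[*] are placeholder names I made up, not names from the manufacturer):

```tcl
# Placeholder names: clk_out = forwarded interface clock, data_out[*] = data bus.
# -max 6.000 would then budget 6 ns of external delay plus receiver setup time
# after the launching rising edge of clk_out.
set_output_delay -add_delay -rise -max 6.000 \
    -clock [get_clocks clk_out] [get_ports {data_out[*]}]
```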


But I'm missing the input delay values. When I use the design, for example, to loop back a 1024-byte data stream, I do not get the correct data back. In some cases it works; in other cases it fails and one bit is set to zero instead of one. When I look at such a failing run in SignalTap, I can see that the error happens while streaming the data over the interface into the FPGA: the FIFO implemented in the FPGA already holds a wrong value before it is read out.


I suspect that set_input_delay has to be set correctly. I have already read the TimeQuest guide and am now reading the Intel Quartus Prime Timing Analyzer Cookbook, but I still cannot work out the correct value for set_input_delay.


The manufacturer provided setup and hold times, see attachment 1. I now want to apply set_input_delay using the formula from the Timing Analyzer Cookbook:


set_input_delay -clock virt_clk \
    -max ... \
    [get_ports {...}]

set_input_delay -clock virt_clk \
    -min $tH \
    [get_ports {...}]
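As far as I understand the cookbook pattern for a device that guarantees setup/hold times at the FPGA pins, the delays are derived from the clock period. A sketch with made-up placeholder numbers and port names (to be replaced by the datasheet values) would be:

```tcl
# Placeholder numbers for illustration only; substitute the datasheet values.
set period 10.0   ;# interface clock period in ns
set tSU    2.0    ;# setup time the external device guarantees at the FPGA pin
set tH     1.0    ;# hold time the external device guarantees at the FPGA pin

# Virtual clock modelling the clock at the external device
create_clock -name virt_clk -period $period

# Latest data arrival after the launch edge: period minus the guaranteed setup time
set_input_delay -clock virt_clk -max [expr {$period - $tSU}] [get_ports {data_in[*]}]
# Earliest the data may change after the edge: the guaranteed hold time
set_input_delay -clock virt_clk -min $tH [get_ports {data_in[*]}]
```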


But I'm not sure which values in the provided diagram correspond to tSU and tH.

The manufacturer also provided a diagram for the output data to the interface, see attachment 2. Even here I do not quite understand why 6 ns is used for set_output_delay.


Can you provide me some insight? The clock is created in the FPGA and output via a DDR register. The FPGA is an Altera Cyclone V.
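Since the clock is generated inside the FPGA and forwarded through a DDR output register, I assume the constraints first need to describe that forwarded clock with create_generated_clock so the input/output delays can reference it. A sketch with placeholder pin and clock names I made up:

```tcl
# Placeholder names throughout; adapt to the actual design hierarchy.
# clk_core: the internal clock that feeds the DDR output register.
create_clock -name clk_core -period 10.0 [get_ports clk_in]

# The forwarded clock on the output pin, derived from clk_core through the
# DDR output register (same frequency as the internal clock).
create_generated_clock -name clk_out -source [get_ports clk_in] \
    [get_ports clk_out]
```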

What exactly happens when I set these delays? Does the FPGA create logic to delay these signals?