Hi. I have been thinking about this problem for days and days, and I really need your help. I want my design's maximum operating frequency to be at least 100 MHz, so I added a D register in the middle of my RTL code to push Fmax above 100 MHz, and I wrote an .sdc file constraining the clock to 100 MHz. After compiling, I confirmed that the reported maximum operating frequency was 113 MHz in the Slow 1200mV/85C model. But in gate-level simulation, the output is captured late: the simulation picks up a delayed version of the output data, not the correct part of the output. So I changed a compiler setting, the optimization mode (from Balanced to Performance). At first it captured the correct part of the output, but the signal turns red (unknown/X) in the middle of the simulation. Why is this happening? How can I get a correct gate-level simulation result? I think this problem is related to a compiler setting. I am using Quartus Prime Lite Edition 16.0, and the same thing happens with Quartus Standard Edition 16.0. Do I need to add more D registers to my RTL code?
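For context, the 100 MHz constraint in my .sdc looks roughly like the following (simplified sketch; `clk` stands in for my actual clock port name):

```tcl
# Constrain the top-level clock input to 100 MHz (10 ns period).
create_clock -name clk -period 10.000 [get_ports clk]

# Let Quartus derive PLL output clocks and clock uncertainty automatically.
derive_pll_clocks
derive_clock_uncertainty
```

The `derive_pll_clocks` and `derive_clock_uncertainty` calls are the usual boilerplate so that any PLL outputs and jitter/uncertainty are constrained without listing them by hand.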
Hi Solatidore, what device are you using? If you are using a V-series device, gate-level timing simulation is not supported. Best Regards, kentan (This message was posted on behalf of Intel Corporation)
Can you post a screenshot of the signal that turns red in the middle? And does your RTL simulation show the correct value in the first place? One thing to note: we don't encourage users to run gate-level timing simulation in ModelSim, because the analysis takes a long time. If possible, run timing analysis in TimeQuest instead to see if there are any violations. If you can, post your design's .qar file here so we can have a look. Thanks
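As an example, a quick setup check can be run from the TimeQuest Tcl console after compilation (a sketch; the number of paths reported is arbitrary):

```tcl
# In the TimeQuest (Timing Analyzer) Tcl console, after a full compile:
create_timing_netlist    ;# build the timing netlist from the compiled design
read_sdc                 ;# read the project's .sdc constraints
update_timing_netlist

# Report the 10 worst setup paths; any negative slack is a violation.
report_timing -setup -npaths 10 -detail full_path
```

If the worst setup slack is positive here but the gate-level simulation still shows X's, the problem is more likely an unconstrained path or a testbench timing issue than a real Fmax failure.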