

KBill3

Beginner


09-25-2019 11:15 AM

276 Views

FPGA calculation time vs ModelSim simulation time

Hello everyone,

Suppose I have a function that calculates, for example, the square root of an integer. When this function is simulated in ModelSim, the time required to receive the output result is approximately 1 ns (as an example).

Does that mean the calculation time needed when this function is implemented on an FPGA is also equal to 1 ns? In other words, the time needed to get the output result after the input value is applied to the function.

PS: I am not using a clock in the function.
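To make the question concrete, the kind of unclocked integer square root described above is often realized with the bitwise (digit-by-digit) algorithm, whose loop structure maps naturally onto unrolled combinational hardware. A minimal Python sketch of that algorithm (illustrative only; the original post does not show its actual implementation):

```python
def isqrt(x: int) -> int:
    """Floor integer square root via the bitwise (digit-by-digit)
    method: one trial-subtract step per result bit, which is the
    style of logic typically unrolled into clock-free hardware."""
    if x < 0:
        raise ValueError("x must be non-negative")
    bit = 1
    while bit * 4 <= x:     # highest power of four <= x
        bit <<= 2
    result = 0
    while bit:
        if x >= result + bit:
            x -= result + bit
            result = (result >> 1) + bit
        else:
            result >>= 1
        bit >>= 2
    return result

print(isqrt(16))  # -> 4
print(isqrt(10))  # -> 3
```

In an FPGA, each iteration of the inner loop would become one layer of subtract-and-compare logic, so the total propagation delay grows with the input width.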

Thanks in advance.

Billel


1 Reply

RichardTanSY_Intel

Employee


09-30-2019 09:17 AM

54 Views
