Intel® Fortran Compiler

Hardware requirements for ifort

Nick3
New Contributor I

We have several developers who use the following:

 

laptop with 16 GB RAM, I think one of the Microsoft Surface laptops

Visual Studio 2022, latest version

Intel Fortran Compiler Classic and Intel Fortran Compiler for Windows 2023.2.1

FEE patch (posted on this Forum)

Intel oneAPI Math Kernel Library for Windows 2023.2.0

 

All complain of sluggish performance while debugging, including text that fails to render, leaving vast blank areas in Visual Studio, until they eventually give up and complain to me.  The last person I talked to showed 97% RAM usage in Task Manager.

 

My question after reading this page,

https://www.intel.com/content/www/us/en/developer/articles/system-requirements/oneapi-fortran-compiler-system-requirements.html

is this: you specify 8 GB of RAM as recommended.  Is that really true?  Does anyone actually use VS + the Fortran debugger with 8 GB of RAM?  Or even 16?  I have 32, and I can barely manage with a few Internet windows and a handful of different solutions being debugged.  What specs are actually a reasonable minimum?

 

Because whatever you say the minimum is, I would expect the boss to buy not much more.

 

17 Replies
andrew_4619
Honored Contributor III

I have no issues debugging Fortran with VS2022 and ifort with 16 GB of RAM on W10 and W11; however, that might depend on how memory-hungry your application is.

jimdempseyatthecove
Honored Contributor III

>>All complain of sluggish performance while debugging

As @andrew_4619 mentions, if your application's memory allocations approach the available resident user space, less the memory required by MSVS and the memory used by other applications, then you will enter a paging situation.
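Jim's paging threshold is just subtraction; here is a back-of-envelope sketch with purely illustrative numbers (none of these are measured values from the thread):

```python
# When the app's working set exceeds this headroom, Windows starts
# paging and the debugger feels sluggish. All figures are assumed GB.
total_ram   = 16.0   # laptop RAM
os_services = 4.0    # Windows + background services (assumed)
msvs        = 2.0    # one Visual Studio instance (assumed)
other_apps  = 6.0    # browser, Office, etc. (assumed)

headroom = total_ram - os_services - msvs - other_apps
print(f"app headroom before paging: ~{headroom:.1f} GB")
```

With these assumed figures, a debuggee only needs a ~4 GB working set before the machine starts paging.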

 

This said, I've been noticing recently (after a Windows update) that the disk pegs at 100% usage (without MSVS running) for a very long period of time.

After googling around, I found a link that indicated one of the Microsoft Windows applications could be the root of the problem.

I am sorry that I cannot locate the link or remember exactly what it is, but it is something like

 

"User Experience". You may be able to catch it in Task Manager (the name will be there), but it may not show high disk activity (as a service is doing the work for the application). The web page had several suggestions for how to shut it down.

 

The name, whatever it is/was, sounds innocuous, or at least like something you may want running. I suggest using Task Manager, sorted by disk usage, to see if you can locate something (the screen bounces around). If you find something, google the name and see if there is a report of excessive disk usage by that process or service. If so, see if it is safe to disable or remove.

 

On my system, the behavior was that it took an excessively long time to load a solution (20 minutes or longer).

Couldn't get to the point of debugging.

 

Jim Dempsey

 

 

Barbara_P_Intel
Employee

The 8GB memory mentioned in the System Requirements is to do the install. The amount of memory required to build/debug/run applications varies by the application.

 

Nick3
New Contributor I

Our leading theory is that ... the solution with 4 MB worth of source code can live and be developed on a laptop with 16 GB of RAM, while the solution with 17 MB of source code needs 32 GB of RAM.  I'm a little surprised - the latter only uses on the order of a few hundred MB in task manager when running.  We may have a case to order laptops with 32 GB of RAM.

andrew_4619
Honored Contributor III

I am not convinced that the amount of source code is a good metric; a small program can demand massive amounts of dynamic memory. It depends on the code. Is there anything unusual about it, for example thousands of source files, large quantities of block data, or some other unusual thing that imposes a resource constraint on VS memory management? If your solutions have huge numbers of projects, maybe that is the problem? It is possible in Solution Explorer to unload projects that do not need to be active.

 

When building, do you build serially or in parallel? Do you have large build cascades? Build cascades used to slow my development down a lot until submodules became a thing....

 

I am making guesses because I do not think your experience is typical, based on the lack of whinging on this forum about this topic.

Nick3
New Contributor I

I honestly wonder if it's something about a recent update to VS or ifort.

This morning I found out that the 17 MB project developers were perfectly happy with the debugger about 5 months ago, but in recent weeks, when all the problems started, they became unhappy as well.  That project was modularized and modernized extensively.

The other one is about 1500 files, 1000 COMMON blocks, 20000 lines of block data (fixed format), 1000 lines of block data zeroing things out, all fixed format, no dynamic memory allocation.

I'm going to try to hunt down somebody who hasn't installed the recent versions of VS with the FEE update, and someone who has VS 2019; this could be interesting.

jimdempseyatthecove
Honored Contributor III

Did you see if the disk I/O was pegged at 100% while you were not interacting with the Debugger?

Have you instructed your antivirus program not to scan your development folder(s)?

And there are some IDE helpers (IntelliSense and a few others) that may generate excessive searching (for things they won't find).

 

Jim Dempsey

Nick3
New Contributor I

The disk I/O is showing about 17%: mostly low, with occasional spikes.  As for the antivirus, I know that the older one we used significantly degraded compile times, and that's all it did.  The developers I'm talking to right now don't know any better, and the boss gets really upset if you touch antivirus settings, so I learned to live with it.

I asked one of the devs with a 16 GB RAM laptop to open up Visual Studio ... RAM went from 11.9 GB to 15.6 GB (99%), and the laptop fans are spinning at full speed [with 25% CPU usage].

We found some temporary solutions until we can get laptops with more RAM.

JohnNichols
Valued Contributor III

Fans running flat out on a laptop can occur in the middle of the night when nothing is happening on the computer.  Mine lives in my bedroom, so I know this for a fact. A fast-running fan is not always a sign of load on the computer.

It is easy to go to Best Buy, get a couple of RAM chips, and replace the ones in your current computers; I do it all the time, as I have many old computers and a lot of spare RAM.  The chips are cheap.  Ten minutes' work if you are careful and do not lose the screws.

You need to give us a few more clues: what are the CPU models and what are your hard drives? It is not just RAM; it is all of the above.

Finally, there is a program discussed and provided in another post -

Runtime overhead of using CLASS and TYPE-BOUND PROCEDURE

Download the file and run it three times on your worst computer in debug mode, preferably both variants, 32 and the old man 64.  Take a screenshot of the timings.

Upload them and we can compare them to a few other computers on this site.

Or tell your boss to lash out 5000 USD on the latest Dell Precision with a Core i9 and the NVIDIA 4000.   Honestly, that is the cheapest solution in the long run considering the time value of money.

PS: you only need Windows antivirus; anything else is a Ponzi scheme for virus developers.

Nick3
New Contributor I

@JohnNichols wrote:

You need to give us a few more clues, what are the CPU models and what are your hard drives, it is not just ram it is all of the above. 


Either a Microsoft Surface 4 or 5 with a Core i7 and a 512 GB SSD.


@JohnNichols wrote:

Runtime overhead of using CLASS and TYPE-BOUND PROCEDURE


This is Fortran 77, fixed format, none of that.

 


@JohnNichols wrote:

 

Or tell you boss to lash out 5000 USD ... that is the cheapest solution in the long run considering the time value of money. 


Which is exactly what I want to do.  However, when I say "boss", I'm really talking about your typical corporate management structure.  If Intel, Microsoft, etc. said "developers of light-weight Fortran programs [insert definition] need __specs__, typical Fortran program developers need ___, while developers of heavier Fortran programs need ___", I would have actionable data.

 

 

 

andrew_4619
Honored Contributor III

11.9 GB seems like a lot of base-load RAM usage before you even open VS. Anyway, I made a test: I opened a solution with 10 projects, 8 of which are small. On opening, 9 projects were unloaded and VS was using 500 MB of RAM. I loaded the unloaded projects and RAM went to 1.8 GB. After working for a while I unloaded the 9 projects, but the RAM usage did not go down; it appears that having allocated resources, VS is in no hurry to release them. I would look at how you use VS: if you have solutions with many loaded but not really active projects, it may be a ball and chain.....  Or maybe just buy lots of RAM.....

Nick3
New Contributor I

A part of the problem could be that we're working on a handful of different things at the same time. I have 4 Visual Studios open with heavy projects, plus 3 light-weight Visual Studios, plus all the typical Office software and web sites.  I assume other developers are working on who-knows-what other projects as well.  I like the 2 GB number for a single Visual Studio that you provided.
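Using Andrew's roughly 2 GB per fully loaded Visual Studio instance, the workload above can be budgeted with simple arithmetic. The per-instance figure for light solutions and the non-VS baseline below are assumptions, not measurements:

```python
# Rough RAM budget for 4 heavy + 3 light VS instances plus everything
# else running on the laptop. Only gb_per_heavy comes from the thread
# (Andrew's observation); the other figures are assumed.
heavy_vs, light_vs = 4, 3
gb_per_heavy = 2.0    # Andrew's fully loaded solution
gb_per_light = 0.5    # assumed for a small solution
baseline     = 6.0    # OS + Office + browser (assumed)

total = heavy_vs * gb_per_heavy + light_vs * gb_per_light + baseline
print(f"~{total:.1f} GB needed")
```

Under these assumptions the budget already saturates a 16 GB laptop before any debuggee allocates a single array.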

JohnNichols
Valued Contributor III

That is a lot of programs running on one poor computer.  

Fortran 77 or Fortran 2023 gets translated into machine language in the end; the point of running the program was to compare the results to other computers of known speeds, so we get a comparison of your machine.

But if you are using a 512 GB SSD, that does not leave much disk space for working files.  Jim knows more about that than I do, but given your description, I would get the 2 TB Samsung EVOs, do a disk copy, and slip them in.

You really need an economic analysis of time and effort to justify the more expensive computers.  

Now, that I have done a lot of.

 

Nick3
New Contributor I

I'm just comparing what this version of the program calculates versus that version, hunting down bugs, and finding problems for somebody using a slightly different variant while the other two things are running to some debugger breakpoint ... I don't know, maybe I do have a heavier workload than most people.

Maybe I'll spend a little more time putting together realistic costs of running with under-spec machines [waiting, having to hunt down files to delete, being unable to debug multiple things simultaneously].  I'd appreciate any pointers.

JohnNichols
Valued Contributor III

Let us assume that for each programmer your wage costs are 60,000 a year.

To get the on-costs in the corporate environment, multiply by 3.3 -- I have done this for 40 years; that is the correct number unless you are in NY, London, or SF.

So your computer guy costs you - using the neutral-gender version of guy, as my female friends do - do not ask me, I do not understand English.

Jim, as an aside, I had to help my 16-year-old daughter with her English grammar homework; I got 20 out of 37 correct.

So you are spending 200,000 per year and they are "effective" for 1200 hours - trust me, if they are effective for 1200 hours you are lucky.  If you want to look good in a corporate environment, only hire ladies, for all the wrong economic reasons; you will make a boatload of money and your team will be a lot better.  -  bitter personal experience.

So each hour costs you 200 dollars.

A 5000-dollar machine is thus equivalent to 25 hours: 3 days of full time, or a week of actual effective hours.

Now show your machines lose more than 2.5% of their time being slow and you have an effective answer.

Does that make sense?
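John's arithmetic above can be sketched in a few lines; all inputs are his figures. Note the exact division gives 165 dollars per effective hour (198,000 over 1200 hours), which he rounds to 200, and the ~2.5% break-even fraction follows:

```python
# Sketch of the workstation break-even arithmetic from the post above.
wage = 60_000            # annual wage per programmer (USD)
on_cost_factor = 3.3     # fully loaded corporate multiplier
effective_hours = 1_200  # productive hours per year
machine = 5_000          # cost of the proposed workstation (USD)

annual_cost = wage * on_cost_factor       # ~198,000 -> "200,000 per year"
per_hour = annual_cost / effective_hours  # ~165 -> rounded to "200 dollars"
breakeven = machine / annual_cost         # fraction of a year's cost to recoup

print(f"${annual_cost:,.0f}/year, ${per_hour:.0f}/effective hour")
print(f"break-even: {breakeven:.1%} of a year's cost")
```

So if slow machines waste more than about 2.5% of a developer's effective time, the upgrade pays for itself within a year.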

 

JohnNichols
Valued Contributor III

On a sheet of paper, draw a table with 120 rows and seven columns.

Number each row from 1 to 120.

Each column after that is 0-15, 15-30, etc., up to 85-100.

Pick a friend who has a watch - a real watch.

Pick your best programmer, one who is efficient.

With the friend, sit behind them, and every 30 seconds mark in the box how effective they were with the computer - here you need to define what is effective and non-effective - the friend taps you on the shoulder at 30-second intervals. In your report you call it a traffic analysis.

How much time did they lose, and if the computer were faster, how much time would they gain? This is why you want to quote the rates determined for different computers on the Intel website in the forum, using the standard program - use a credible resource.

Draw a histogram.

Write a 3-page paper - intro, method, results, conclusion - showing you will save 20,000 per person per year.
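The sampling study above boils down to a tally and two ratios. A minimal sketch, using a hypothetical tally (the 96 below is invented for illustration) together with the session length and the 200-dollar hourly cost from the earlier post:

```python
# Work-sampling ("traffic analysis") estimate. Only the tally is
# hypothetical; the 120 samples (one hour at 30-second intervals)
# and the hourly cost come from the posts above.
samples = 120                     # rows in the table: one observed hour
effective = 96                    # hypothetical: samples marked "effective"
rate_per_hour = 200               # fully loaded cost (USD)
effective_hours_per_year = 1_200

lost_fraction = 1 - effective / samples   # 20% lost in this example
annual_loss = lost_fraction * rate_per_hour * effective_hours_per_year

print(f"time lost: {lost_fraction:.0%} -> ${annual_loss:,.0f} per person per year")
```

With this invented 20% figure, recovering even half the lost time with faster machines clears the 20,000-per-person savings target above.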

Or fire all your male employees, hire females and the wage saving will pay for it.  === This is not a joke it is a serious social comment on the issue of gender disparity in the workforce.  

 

 

Nick3
New Contributor I

Homework accepted ... I like your method of counting productivity.

As far as hiring and firing and all those topics, well ... I'll jump into the pits of debugging engineering equations written in assembly before I touch those issues.

 
