Hi,
I work at a company that develops simulation software for Geological Process Modelling. We use TBB to improve the performance of the software. This works fine on both Windows and Linux, except when we run simulations on Windows Server (e.g. 2012 and 2016). On those platforms the performance is poor and the CPU times are unpredictable. We see this both in internal testing and in client usage. Is this a known issue?
Is there anything in particular we should be aware of when running on Windows Server?
Regards,
Morten Gaupaas
Hi Morten,
It is difficult to say what is going wrong on Windows Server. From TBB's perspective, there is no significant difference between server and client editions of the OS. One possibility is that the server machines have a larger number of cores, and if the algorithm scales poorly, that can lead to performance problems. Have you tried any performance analyzers, e.g. Intel VTune Amplifier?
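One quick way to test that hypothesis is to cap TBB's parallelism and time the same workload at several thread limits. Below is a minimal sketch (not from your code; the loop body is just a placeholder for real per-cell work) using tbb::global_control; older TBB versions would use tbb::task_scheduler_init instead. If the times on Windows Server stop improving, or get worse, beyond some core count, poor scalability is a likely culprit.

```cpp
#include <tbb/global_control.h>
#include <tbb/parallel_for.h>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> data(1 << 24, 1.0);

    // Run the same parallel loop with different caps on TBB worker threads
    // and compare wall-clock times to see how the code scales.
    for (int threads : {1, 2, 4, 8, 16, 32}) {
        tbb::global_control limit(
            tbb::global_control::max_allowed_parallelism, threads);

        auto start = std::chrono::steady_clock::now();
        tbb::parallel_for(std::size_t(0), data.size(), [&](std::size_t i) {
            data[i] = data[i] * 1.000001 + 0.5;  // placeholder for real work
        });
        double elapsed = std::chrono::duration<double>(
            std::chrono::steady_clock::now() - start).count();

        std::printf("threads=%2d  elapsed=%.4f s\n", threads, elapsed);
    }
    return 0;
}
```

Running this on one of the affected Windows Server machines and on a Windows client machine with a similar core count should show whether the slowdown tracks the thread count or is specific to the server OS.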
Regards,
Alex
