In an attempt to discern the maximum throughput of my [100 Mbps] local network, and to a lesser extent to see how much overhead the operating system (Windows) introduces, I performed a crude test that simply involved pushing as much traffic onto the link as possible, destined for the next hop (the router). A rough sketch of this kind of blast test is shown below.
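For reference, here is a minimal Python sketch of what such a test could look like. The gateway address and port below are placeholders rather than values from my setup, and a raw UDP loop like this only approximates "as much traffic as possible"; it ignores send-buffer behaviour and protocol overhead.

```python
# Crude "push as much traffic as possible" test toward the next hop.
# GATEWAY and PORT are hypothetical placeholders -- substitute your own gateway.
import socket
import time

GATEWAY = "192.168.1.1"    # hypothetical next-hop (router) address
PORT = 9                   # UDP discard port; any unused port will do
PAYLOAD = b"\x00" * 1400   # stay under a typical 1500-byte Ethernet MTU
DURATION = 10              # seconds to run the test

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

sent_bytes = 0
start = time.time()
while time.time() - start < DURATION:
    sock.sendto(PAYLOAD, (GATEWAY, PORT))
    sent_bytes += len(PAYLOAD)

elapsed = time.time() - start
print(f"Sent {sent_bytes / 1e6:.1f} MB in {elapsed:.1f} s "
      f"(~{sent_bytes * 8 / elapsed / 1e6:.1f} Mbit/s of payload, "
      f"excluding UDP/IP/Ethernet overhead)")
```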
Surprisingly, network utilization peaked at around 4 MB/s (roughly 32 Mbit/s, or ~33% of the 100 Mbps link); clearly very far from optimal.
Long story short, the problem turned out to be the interrupt moderation rate of the network adapter, which was set to 'Adaptive' during the initial test. After changing it to the lowest possible setting, network utilization held a stable ~99%. Oddly enough, Intel's recommendation is to use higher interrupt rates in high-load situations; in my case that actually had a negative effect, yielding under 12% utilization.
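If you want to check what your own adapter is currently using before touching Device Manager, something along these lines works on Windows. It just shells out to the built-in Get-NetAdapterAdvancedProperty cmdlet; the "Interrupt Moderation" display-name filter is a driver-specific assumption, so verify it against your adapter's advanced property list.

```python
# Windows-only sketch: list the adapter's interrupt-moderation-related
# advanced properties via PowerShell's Get-NetAdapterAdvancedProperty.
# The '*Interrupt Moderation*' display name is an assumption; names vary by driver.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-NetAdapterAdvancedProperty | "
     "Where-Object { $_.DisplayName -like '*Interrupt Moderation*' } | "
     "Format-Table Name, DisplayName, DisplayValue -AutoSize"],
    capture_output=True, text=True)

print(result.stdout)
```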