I've seen that most people here use CrystalDiskMark to measure performance. However, I've read several articles on anandtech.com (and elsewhere) where they use Iometer instead. I'm mostly interested in "4KB random read", which should roughly translate into startup time, both for the OS and for applications. With CrystalDiskMark, numbers around 20 MB/s are common, but according to the tests on Anandtech we should be getting around 60 MB/s! For example:
(I'm mostly concerned about the 160 GB G2 drive, but it seems to be the same for other devices as well)
Also, I should be getting higher "random read" numbers than "random write" numbers. That's not the case with CrystalDiskMark, where write performance is roughly twice the read performance. What's going on here; do the two programs do their testing in different ways? Does anyone know?
I found this:
CrystalDiskMark seems to be pretty broken in general. I wouldn't trust any data it produces.
The current version of CrystalDiskMark does not support queue depths higher than one. For an SSD like the X25-M, which relies heavily on queuing (NCQ) to hit its rated performance numbers, CDM is not the best tool for evaluating drive performance. I've heard a new version (3.x) of CDM that supports higher queue depths will be coming out early next year.
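To see why queue depth matters so much, here's a minimal Python sketch. It is not what CDM or Iometer actually do internally; it just illustrates the principle: at queue depth 1 each 4KB read must finish before the next is issued, while at higher depths the drive can overlap requests. (Note the caveats in the comments: the test file here is tiny and cached, so a real benchmark would use a multi-GB file and direct I/O to bypass the page cache. `os.pread` is Unix-only.)

```python
# Illustrative sketch of queue depth vs 4KB random-read throughput.
# Assumptions: Unix (os.pread), and a tiny temp file -- results will be
# dominated by the OS page cache, so this only shows the mechanism,
# not real drive numbers. Real tools use O_DIRECT and multi-GB files.
import os
import random
import tempfile
import time
from concurrent.futures import ThreadPoolExecutor

BLOCK = 4096                     # the 4 KB block size discussed above
FILE_SIZE = 8 * 1024 * 1024      # 8 MB test file (hypothetical size)

def make_test_file():
    """Create a temp file filled with random data and return its path."""
    fd, path = tempfile.mkstemp()
    os.write(fd, os.urandom(FILE_SIZE))
    os.close(fd)
    return path

def random_read(fd):
    """Read one random 4 KB block; pread is safe to call from threads."""
    off = random.randrange(FILE_SIZE // BLOCK) * BLOCK
    return len(os.pread(fd, BLOCK, off))

def bench(path, queue_depth, n_reads=2048):
    """Issue n_reads random 4KB reads with `queue_depth` outstanding
    at a time; return throughput in MB/s."""
    fd = os.open(path, os.O_RDONLY)
    t0 = time.perf_counter()
    with ThreadPoolExecutor(max_workers=queue_depth) as pool:
        total = sum(pool.map(lambda _: random_read(fd), range(n_reads)))
    elapsed = time.perf_counter() - t0
    os.close(fd)
    return total / elapsed / 1e6

if __name__ == "__main__":
    path = make_test_file()
    try:
        for qd in (1, 32):
            print(f"QD{qd}: {bench(path, qd):.1f} MB/s")
    finally:
        os.unlink(path)
```

On a real drive with direct I/O, the QD32 figure is where an X25-M posts its headline random-read numbers; a QD1 tool can never show them, which would explain the 20 vs 60 MB/s gap.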