I’ve been working on InfoWorld’s power and cooling benchmark suite (I call it “Greenmark,” but I haven’t officially cleared that name’s use) for several months. Why, you might wonder, haven’t you seen any published Greenmark results? Check out the comments thread following Ted Samson's Sustainable IT blog entry reporting Neal Nelson’s tests showing that AMD is markedly superior to Intel in server power efficiency. It’s remarkable how much emotion the commenters to Ted’s blog invest in their arguments against Nelson’s findings. Some of these people skip technical objections and go straight after Nelson’s reputation, alleging that AMD had purchased the positive outcome.
This seems an extreme reaction until you consider the impact that a headline like “AMD servers consume up to 44% less power than those based on Intel” can have. Energy-efficiency test reports will steer billions of dollars in IT spending and stock market investment over the next several years. A headline like that, run at an opportune time, can swing several points of market share over a period of several months, and if picked up by an analyst, a definitive finding such as Nelson’s could have an immediate and substantial impact on a vendor’s share price. I know that when InfoWorld’s Test Centre runs its first set of Greenmark results, there’s a fair chance the vendor that finds itself in second place will launch a concerted effort to tear the tests, the reviewer, and the magazine apart. For that matter, the vendor itself might not need to lift a finger. Self-interest among shareholders and lesser investors in the chips and systems deemed less efficient can motivate a response more energetic and far-reaching than any that the vendor’s marketing department could orchestrate. There is that much money at stake. The furore over Nelson’s findings proves my point.
There’s also a good deal of ass-covering there. Energy efficiency has rocketed to the top spot among server purchasing criteria. Management expects that everyone with input into server buying decisions has learned the ins and outs of power conservation, but there are some lazy IT folk who just fudge the rationales that justify major system purchases. Something like the Neal Nelson report can end up on the IT slacker’s desk with a Post-It reading, “Server criteria review meeting today @ 10:00. Bring your energy-efficiency research notes.”
The remarks in the Sustainable IT blog’s comments thread barely hint at the firestorms that energy benchmarks will trigger once they become commonplace. It took real courage for Neal Nelson to take the first arrows along this trail. As for the challenges to Nelson’s objectivity and professionalism, I’ll weigh in with the caveat that I won’t be looking at Nelson’s energy tests until I’ve finished developing my own. Neal Nelson has long experience and solid credentials in the performance testing business.
It’s been my experience that Nelson’s organisation creates fair and relevant tests, documents the process, makes itself accountable for the results, practices full disclosure, and responds to challenges to methodology and findings. A tester that provably meets those standards (and so very few do) earns the benefit of the doubt where integrity is concerned, and only test results published by independent testing organisations with a commitment to those standards can be taken seriously by investors of any stripe.
Challenges to a tester’s objectivity that aren’t backed by direct proof of bad faith have to be ignored, both by those who develop tests and by those who use test results as criteria for purchases. That makes room for the kind of discussion that leads to better tests and to the wiser application of their results.