Everyone needs an avocation they pursue with passion. For you, it may be cars, clothes, Greek food or detective novels. Now that Ahead of the Curve is reaching a larger audience online than it could reach in print, there are robot makers, solar power fanatics, and hard-core gamers among its readers. When your favourite pastime yields value while you're on the clock, you are a lucky person indeed. For me, science, most any science, is my passionate avocation, and I'm fortunate to have inhabited the untitled role of staff scientist for InfoWorld's Test Centre for several years.

Every few years, I thrash about for an adequately descriptive title that fits on my business card. I took up the title of chief technologist before it found use in IT. House geek, IT enthusiast, Property of Apple, jack-of-all-trades — so many ideas have been floated. But really, my gig is all about turning science into strategy for IT, and for the users and buyers of commercial and professional technology.

I'm a firm proponent of planning from the gut. However, the only people who can operate that way with any hope of good results are those who have applied science with success for so long that it has become instinct. I have been operating on gut for a while on systems, storage, networking, and the basic food groups of IT, reaching out to things like mobile devices for the challenges that keep me fired up. But while I've been tackling that and indulging a fascination for physics and mechanical engineering, one work-related topic that I had relegated to "got it down cold" gave me a rude bite by way of reminder that I'm never too smart or too experienced to go back to school.

Being of a scientific bent, it's natural that I've been a benchmark fanatic since I started writing about computers. There is a great deal to be learned from benchmarks once you understand the gears that spin to make the published numbers light up on the tote board.
You learn a lot about vendors, and about what they think of their customers, from their use of benchmarks in advertising. I did my time at the bottom of the benchmark food chain, writing benchmark code, but just before I started working for InfoWorld, I got the ultimate correspondence course in performance characterisation: a complete set of CDs from SPEC (the Standard Performance Evaluation Corporation), its entire library of killer benchmark tests. I carry those discs to this day, the originals in their white paper envelopes, everywhere I go. SPEC's software and the guidance that accompanies it form my bible when it comes to test methodologies, standards for quality coding and straightforward statistical analysis, but also organisational transparency and stringent ethics. To dip into the SPEC library, which has been my avocation, is to sit down with the sharpest minds in computer and statistical sciences. It is an awesome and endlessly varied code base, put to use for a very lofty purpose.

Thanks in large part to SPEC, I can hold my own with system, chip and development tool designers, where SPEC performance baselines are taken as understood and SPEC terms are part of the vernacular. I put the science and statistics of performance characterisation in plain language for readers, using lessons that I learned from SPEC. I added a practical angle to my scientific understanding of compiler optimisations, processor scheduling, CPU cache utilisation, comparative efficiency of message passing techniques, and other deep concepts by building, running, profiling and debugging SPEC software in on- and off-label capacities. Thanks to SPEC, I can speak expertly on matters of performance characterisation, its relevance in buying decisions, and its usefulness in tracking trends such as the migration of high-performance computing principles to IT. I made good use of the years I spent passionately exploring and dissecting that SPEC library.
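One piece of that straightforward statistical analysis is worth spelling out: SPEC's composite CPU scores are built from per-benchmark ratios of a reference machine's runtime to the measured runtime, combined with a geometric mean. Here's a minimal sketch in Python; the function name and the timing numbers are mine, purely illustrative, not anything from SPEC's tools.

```python
import math

def spec_style_score(ref_times, measured_times):
    """SPEC-style composite: geometric mean of (reference / measured) ratios.

    Each ratio says how many times faster the system under test ran a
    benchmark than the reference machine did.
    """
    ratios = [ref / meas for ref, meas in zip(ref_times, measured_times)]
    return math.prod(ratios) ** (1.0 / len(ratios))

# Hypothetical runtimes in seconds for three benchmarks.
reference = [1000.0, 1800.0, 1400.0]   # reference machine
measured  = [250.0, 600.0, 350.0]      # system under test

print(round(spec_style_score(reference, measured), 2))  # prints 3.63
```

The geometric mean, rather than the arithmetic mean, is what keeps a single outlier benchmark from dominating the composite — part of why a SPEC number travels so well between spec sheets.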
Readers' interest in benchmarks waned as x86 took over, and no one figured it made any difference whose brand was on the front of the box or on the chips inside. I took one last tilt at benchmark science back in 2005. I volunteered to use SPEC CPU2000, the 800-pound gorilla of benchmarks and just about as friendly, in a review. I thought I knew this suite cold, but I discovered that my discs, or my knowledge, or both had aged in five years to the point where I couldn't get the suite to compile any more. Whatever the reason, when I published the numbers I could get, apologising profusely to SPEC for violating its rules, SPEC was furious but readers didn't notice. I might as well have run baseball scores. After that, I couldn't set fire to a passion for benchmark science if I soaked it in gasoline. I resolved to keep weighing in on discussions of benchmarks as a seasoned snob, but it wasn't something I did for fun.

One must never assume. Apple carried benchmarks back into favour as a means of marketing to mainstream buyers, capitalising on the fact that most prospective buyers had no clue what the numbers meant and little motivation to find out. This caught on, and now you will find SPEC CPU results in retail brochures. This is good for SPEC, but how can people make decisions based on statistics they don't comprehend? I must rail against this, I must stand up, I must... I must find out what SPEC CPU2006 is before I go running my mouth about how it's being used and interpreted.

I got the benchmark from SPEC, hunched over an eight-core Barcelona server for which results had not been published, and hung in with the requisite pad, pencil, sweaty clothes and fast food. More than a month and 118 runs later, I can now hold my head high as a master of SPEC CPU2006. I learned that while I was relying on what I knew and what I'd read, everything changed.
Now that I can run and, more importantly, explain the tests that my readers are studying to make comparisons among vendors, I'm fired up about benchmarks. The trouble with the thrill of scientific discovery is that it's never enough. I look mournfully at my library of SPEC discs; they seem pretty tired now. The code in SPEC CPU2006 so outshines them, and has brought so many new insights to my mind, that I'm reminded of the slow, rough road to mastery that the original SPEC discs started me on. I'm relieved to find myself eager to start down that road again, even though it means discarding a lot of hard-earned knowledge that's become a bit overripe. As the banner in the school cafeteria says, science is fun. When you can bring that attitude to work and end up making smarter decisions with the new knowledge you've gathered from the most trustworthy source on the planet (you), science is good for your career, whatever it says on your business card.