Microsoft's Allchin backs a strategy that is all-inclusive

The day Microsoft launches Windows 2000 will be an important date in the life of Jim Allchin. For the past four years, Allchin has led the charge at the company for Windows NT 5.0, now Windows 2000. Recently promoted to group vice president of the Platforms Group, Allchin is one of the senior Microsoft executives who are "betting the company" on the next-generation operating system.

Allchin recently talked about Windows 2000 -- and Microsoft's expectations for the product -- with InfoWorld editor in chief Michael Vizard and associate news editor Bob Trott.

Because the experience with NT has been widely varied, how do you know Windows 2000 is going to be as reliable as you say?

We spent $160 million, give or take, just on the reliability aspects of the products, so we've taken it very seriously. That includes the tools we've focused on as well as the analysis. We went out to many different corporate sites, [and] Internet sites, and gathered and audited the data from their systems. We went and methodically addressed the problems that we found. But the proof is, 'Hey, try the product.'

How broad has the testing been outside of Microsoft?

There's a whole set of people -- both companies running it on their intranets and Internet sites. Customers like Barnes & Noble have been running the software for some time. They ran it through the Christmas season on their fulfillment system, I believe. There are a number of dot-coms that are running in production right now. And, of course, Microsoft [Windows 2000] is on well over 50,000 clients and probably 1,000 servers, or it actually may be above that. It's a large number of servers -- every one of our line-of-business applications, our Internet sites, all the pieces of Microsoft.com, they're all running on it. We've got a lot of exposure to this software.

Is Windows 2000 optimised around a client/server mentality, or a Web mentality, or does it matter?

You get benefits without even touching the Internet. If you touch the Internet, you'll find that our support of networking protocols is just superior. One of the things I'm super proud of in Windows 2000 is the level of Internet protocols and the flexibility there. Whether it's the quality of service or whether it's multicasting or the IP security, it's just an incredible array of support there. On the client we have very capable XML in the browsing part of the system, and on the server we really have beefed up the Web applications environment so you can write applications very fast, and they're more secure and they're more reliable than they've ever been in the past. It's hard to buy a platform with as much technology as we've put into it. We've thought about this problem in terms of the world the way it is today, and the world the way we think it's going to be in the future.
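As a loose illustration of what that multicast support looks like to an application, here is a minimal sketch of a multicast listener written with standard Python sockets; the group address and port are invented for the example, and nothing in it is specific to Windows 2000 itself.

    import socket
    import struct

    # Hypothetical multicast group and port, invented for this example.
    MULTICAST_GROUP = "224.1.1.1"
    PORT = 5007

    # Create a UDP socket and allow several listeners to share the port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(MULTICAST_GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    # Block until a single datagram addressed to the group arrives.
    data, sender = sock.recvfrom(1024)
    print(sender, data)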

Right now, the majority of dot-coms and ISVs seem to be leaning toward Linux and Solaris. How will Windows 2000 change their minds?

Linux is this 'handyman's special' operating system. You can tinker with it, and maybe the house won't be so straight when you're done, but it feels good pounding the nails in when you're building it. For small and maybe even embedded systems, it's a system that is competitive. But I don't think there's anything in Linux for the e-commerce space -- if you're going to run a reasonable-size business, I don't think somebody's going to consider it. Who knows what'll happen in the future, but that's what I see today. What you get from Microsoft Windows 2000 is a more integrated, holistically tested system that is capable as an e-commerce, mission-critical environment. Certainly, with all the facilities like transactions and the like tested, it's not a builder's special. On the Solaris side, [and] the Sun side, I think it's flat price-performance. The question is: Do you want a proprietary Sun solution that's dramatically more expensive, or do you want industry-standard hardware with Windows 2000 on it?

Now Bill [Gates] is working on something called Next Generation Windows Services. The description of that is it's a pure Internet operating system. I'm a little unclear what the relationship is between those two projects.

Conceptually, the way we attacked the Internet was to infuse Internet technology into our products. We infused it in all of our products, and we're a far cry from being done because there's always more to do. But there's another transition happening, and that's one of services. We want to infuse into our system the services the way we infused Internet technology. When the Internet started off, it was basically communications protocols -- all you could do was send bits from one place to another. Then it turned into presentation, so that you could get these screens sent to you, and that's the world the way it is today. But the screens are static; it's sort of a dumb terminal world the way it's being done today. Imagine the next generation, which is a programmatic way to use the data that's sent to you. Certainly XML is a key building block for doing that, but that's insufficient. You need rich schemas and you need the ability to have a programming model based on those schemas.
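To make the distinction concrete: a "screen" can only be read by a person, whereas structured XML can be consumed by a program. The short Python sketch below uses an invented order document -- the element names and values are hypothetical, not any actual schema Allchin refers to -- to show what "a programmatic way to use the data" might look like.

    import xml.etree.ElementTree as ET

    # A hypothetical order document; the schema is invented for illustration.
    order_xml = """
    <order id="1234">
      <customer>Example Books</customer>
      <item sku="BK-42" quantity="3" price="19.95"/>
      <item sku="BK-77" quantity="1" price="34.50"/>
    </order>
    """

    root = ET.fromstring(order_xml)

    # Because the data arrives as structured XML rather than a rendered page,
    # a program can compute with it directly instead of scraping a screen.
    total = sum(int(item.get("quantity")) * float(item.get("price"))
                for item in root.findall("item"))
    print(root.get("id"), root.find("customer").text, total)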

What kind of services in terms of training and migration tools do you have to help large corporate customers get to Windows 2000 from where they are today?

There are more pages of training materials written for this product than for anything we've ever done. There are more people trained in our product support organisations, in Microsoft consulting services, and at the third parties providing the service.

Is there any sense of timelines for upgrades to 2000 that people can plan on, or is there still very much a black art out there in terms of developing the OS?

I'm sure you've seen the code name Whistler -- that's one code name -- but even before Whistler we have Millennium coming. We always have new products under development -- 64-bit, Datacenter -- it's not like one release. There's a whole series of products that are going to come out.

Some of your competitors would argue that there's so much going into the OS, any one piece that breaks will have a disastrous effect across the board, and that putting everything into the operating system is inherently a bad design.

And if you didn't have all of the technology we have, what would you say? Exactly what they're saying. Of course the system's modular. Linux is a 30-year-old architecture. [It doesn't] even have asynchronous I/O, for heaven's sake. [Its] SMP [symmetric multiprocessing] is terrible. It's not about being modular; this is about integration and making it easier for customers. I'm a hard-core believer that by integrating things together, things get simpler.

What is it that Windows 2000 does with security that will make people's minds rest easier in terms of fighting off potential threats?

Security's a lot more than technology. We've spent quite a bit of time documenting the right processes to use. If you go to our security Web site, we have paper after paper on these processes. I believe if there's a commitment from us -- and there is a commitment from us -- it's a commitment just as much to notify users if we find anything, and also to help with educational materials about how to set up a Web site and do it in a simpler way. Over time, we'll come out with some scripts to customise a system in a particular model. We do have a security tool that can help you analyse your system today ... and give you a report back on the health of it. It depends on what level of security you want.

Are you tracking the ASP [application service provider] space? Is that a niche market or is that going to become the main market?

If somebody wants to run terminals into a system and run the apps all off the server, I say, 'Hallelujah! We have Windows Terminal Server' -- that'll let them do that. On the other hand, if they want to run a more balanced distributed model, then they can do that. If a company wants [to] completely outsource the management of their clients, and let a company like CenterBeam actually do the management, then I think we've got the facilities to do that. We're trying to give [a] flexible set of computing-model choices to somebody. In some cases they'll use it within their own company in two different ways. I believe it's the flexibility -- that's the core thing we're offering.

So when it comes to thick vs. thin, you're agnostic?

I'm a distributed computing guy. I don't believe centralised computing is the answer to scale. Although I do like the idea of e-sites being built out of commodity pieces being put together, so you can get scale that way. But that, even inherently, is distributed. I like a personal environment; my perfect world is I want to be able to roam anywhere and get information from services in the sky, but I want to be able to have my information with me so I can use it regardless of where I'm at. There's a set of people who believe we're going back to centralised computing, and I'm not one of them. I believe we can have the power of centralised computing, but do it in a decentralised way. I've called it logically centralised but physically decentralised. That, to me, is the future.

What's the status of Windows 2000 Datacenter as far as production -- who's it for, and why do they want it?

It's under development; we're waiting for hardware right at the moment. As soon as we get enough hardware that we feel good about our testing coverage of it, we'll ship it. In terms of true code development, it's done. In terms of who it's for, it's for the most mission-critical, high-end systems you can imagine.
