Given two start-ups, one selling itself on its parallel-processing nous and the other not emphasising that aspect, would investors be significantly more attracted to the former?
That was the question posed by Nicolás Erdödy to an expert panel at the Multicore World conference in Wellington last month.
Although parallel-processing expertise is hard to sell as a specific differentiator, Erdödy believes there is an opportunity to make New Zealand a centre of it.
"The bottom line is: customers don't care and they'll never care," says Intel's Tim Mattson.
"And no one's going to say 'I'm going to work with you because you will deliver multicore.' People want applications that excite them and solve problems." Behind those applications there may well be a multicore platform, but its design will be irrelevant to the user or the investor.
Intel, he says, is employing sociologists and ethnographers to help it understand "these messy things called human beings".
"You've got to go out and study the people in the identified user base and then build a product that will give them an experience they value. I'm sorry I can't produce a killer app [for parallelism], but that's the process I'd go through to find one."
Multicore will not be a selling point in itself, but it is a material factor in the improving capability of computers, says Poul-Henning Kamp, of Varnish Software. "Computers don't always get faster, but they get better.
"It changes the game, and whenever a game changes, there will be a differential between the people who understand the new rules and the people who don't."
Kamp also sees monitoring user needs as an essential counterpoint to having bright technological ideas. The dot-com boom and crash in the early 2000s was the product of a gap between ideas and consumer needs, he says.
"Any time you have an application which gives you more compute power there is potential for innovation," says Barbara Chapman of the University of Houston, Texas. Parallelism is relatively easy to exploit in such fields as imaging and digital signal processing, a key element of telecoms. She says that in the business sector an Italian developer applied parallel processing successfully to improve database performance severalfold. "It is a differentiating factor," Chapman says.
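Chapman's imaging example is easy to see in miniature: a per-pixel filter has no dependencies between pixels, so rows of an image can be handed to separate cores with a one-line change. The sketch below is illustrative only (the `brighten` filter and its parameters are invented for this example, not something discussed by the panel); it uses Python's standard process pool to spread rows across cores.

```python
from concurrent.futures import ProcessPoolExecutor

def brighten_row(row, delta=40):
    # Clamp each pixel independently; no pixel depends on any other,
    # which is what makes simple image filters so easy to parallelise.
    return [min(255, p + delta) for p in row]

def brighten(image, workers=4):
    # Each worker takes rows from the image; pool.map returns the
    # processed rows in their original order.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(brighten_row, image))
```

The domain logic lives entirely in `brighten_row`; whether it runs on one core or eight is a detail of the surrounding plumbing, which is Mattson's point that the multicore platform stays invisible to the user.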
Any boost in computing power, or the ability to provide the same power at lower cost, makes a difference because "it means you can afford to use computers for things you could never use them for before," says IBM's Paul McKenney. The first time that happened, with affordable personal computers, "we suddenly needed 10 to 100 times as many programmers," he says -- and programmers with skills appropriate to the new machines, not mainframes. With parallel processing as an element in another increase of power, we will find ourselves short of specific skills again.
At this point the discussion moved to education and nurturing of talent, with some recommending inculcating programming skills at school level -- a point web inventor Sir Tim Berners-Lee made on his recent New Zealand visit.
There are talented potential software developers at school level, says Kamp, but no talent scouts comparable with those who spot and nurture sports players at that level.
Speakers from the audience argued for a good general education in school, including old-style study of the Greek philosophers and geometers, as a way of encouraging sound logic and mathematical skill.
"Like it or not," says Mattson, a lot of programming in future will be done in high-level scripting languages. Developers in that medium, he says, are more likely to be retrained experts in the relevant subject matter, be it engineering or biology, than computer specialists.
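Mattson's retrained biologist might look something like this sketch (the GC-content task and function names are hypothetical, chosen only to illustrate the point): the domain expert writes a plain function in a scripting language, and parallelism is bolted on by swapping the built-in `map` for a pool's `map`, with no threads or locks in the domain code.

```python
from multiprocessing import Pool

def gc_content(seq):
    # Domain code a biologist might write: the fraction of G and C
    # bases in a DNA sequence.
    return sum(1 for base in seq if base in "GC") / len(seq)

def gc_all(sequences, workers=4):
    # The serial version is simply: list(map(gc_content, sequences)).
    # Handing the same function to a process pool spreads the
    # sequences across cores without touching the domain logic.
    with Pool(workers) as pool:
        return pool.map(gc_content, sequences)
```

The multicore machinery is confined to one `with` block, which is what makes high-level scripting languages plausible vehicles for non-specialist parallel programming.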