Obstacles to the efficient and cost-effective networking of scientific research in and from New Zealand include a disinclination of the nation’s universities to collaborate on buying international capacity and a “one-size-fits-all” perspective on security, says Steve Cotter, CEO of REANNZ, the company that runs the KAREN science and education network.
A powerful international network is an essential ingredient of an increasingly international education scene, Cotter says. Lack of such a network, he suggests, is acting as a disincentive both to overseas researchers looking to collaborate with New Zealand research teams and to Kiwi scientists who have established a reputation overseas and now want to return home.
Cotter, whose career includes senior roles with research networks in the US and with Google, spoke this month to members of the Institute of IT Professionals at events in Wellington, Hamilton and Auckland.
At the Wellington event Cotter said that while New Zealand students fill the classrooms, the overseas students bring in the money and provide future links between local and overseas research teams.
“If you don’t have the infrastructure and you don’t utilise the infrastructure you have got to the full, you won’t get so many international students,” Cotter says.
REANNZ has been talking with the universities and other research establishments about reducing cost by aggregating their international bandwidth requirements. “Today all universities go out and buy their own fibre capacity. But they’re all buying the same service, from Southern Cross,” says Cotter. He has carried out pricing exercises that indicate a better deal is possible by aggregating demand and he says he has persuaded universities in the US to aggregate in this way, with good results.
The other bottleneck is on-campus, Cotter says, and his staff have been helping NZ’s research establishments tune and tweak the performance of their networks. “Ninety percent of the issues reside on campus, not in the backbone. We spend a lot of time making sure the backbone is working efficiently,” he says.
One problem is that campus firewalls lock down all data equally, whether it’s research data that can and should be shared or personal data in the human resources database, which rightly needs strong privacy protection.
“That’s why at Berkeley [the US Department of Energy’s Lawrence Berkeley National Laboratory, where he headed the Energy Sciences Network] we came up with the concept of the science DMZ,” an enclave of science researchers with their own separate access to communications capacity, free of the latency-sapping effect of elaborate firewalls. The idea has since been widely adopted.
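The routing logic behind a science DMZ can be sketched in a few lines. This is an illustrative toy model only, not REANNZ’s or ESnet’s actual design: the host names, latency figures and rule structure below are all hypothetical, chosen simply to show how designated data-transfer nodes bypass the stateful campus firewall while everything else keeps full inspection.

```python
# Toy model of the science DMZ idea: bulk research transfers take a
# lightweight path with simple access lists, while general campus traffic
# (e.g. the HR database) goes through the full stateful firewall.
# All names and numbers here are hypothetical.

FIREWALL_LATENCY_MS = 40   # assumed per-flow cost of deep inspection
DMZ_LATENCY_MS = 2         # assumed cost of simple ACLs on the DMZ path

# Hypothetical data-transfer nodes granted the lightweight DMZ path
SCIENCE_DMZ_HOSTS = {"dtn1.campus.example", "dtn2.campus.example"}

def route_flow(host: str) -> dict:
    """Pick a path for a flow: science DMZ hosts skip the stateful firewall."""
    if host in SCIENCE_DMZ_HOSTS:
        return {"path": "science-dmz", "latency_ms": DMZ_LATENCY_MS}
    return {"path": "campus-firewall", "latency_ms": FIREWALL_LATENCY_MS}
```

The point of the pattern is that the security trade-off is made per system rather than campus-wide: the DMZ hosts are few, hardened and dedicated to data transfer, so they can safely forgo the inspection that the rest of the campus still gets.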
The challenges of science networking should interest the whole computing community, Cotter told the audience, because with demand growing rapidly, commercial networks will face similar problems within five years. Network traffic in the financial services sector, Cotter says, is increasing even faster than in scientific research.
One major trend he identifies for the future is towards software-defined networks.
Rather than relying on proprietary specialist components, these will be maximally open and flexible, so a piece of commodity hardware can be, for example, a switch today and a load-balancing device tomorrow, depending on need.
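The switch-today, load-balancer-tomorrow idea can be sketched as a controller re-programming one commodity box. This is a hypothetical illustration of the SDN concept, not any real controller’s API: the class name, rule format and addresses below are invented for the example.

```python
# Illustrative sketch of software-defined networking: the same commodity
# box behaves as a switch or a load balancer depending on the rules a
# controller installs. Class and rule format are hypothetical.
from itertools import cycle

class CommodityBox:
    def __init__(self):
        self.mode = None
        self.table = {}        # MAC -> output port, used in switch mode
        self.backends = None   # round-robin pool, used in load-balancer mode

    def program_as_switch(self, mac_table):
        """Controller installs forwarding rules: the box acts as a switch."""
        self.mode, self.table = "switch", dict(mac_table)

    def program_as_load_balancer(self, backends):
        """Controller installs a backend pool: the box balances load."""
        self.mode, self.backends = "lb", cycle(backends)

    def handle(self, packet):
        if self.mode == "switch":
            return self.table.get(packet["dst_mac"], "flood")
        if self.mode == "lb":
            return next(self.backends)
        raise RuntimeError("box not yet programmed by controller")

box = CommodityBox()
box.program_as_switch({"aa:bb": 1})          # a switch today...
box.program_as_load_balancer(["10.0.0.1", "10.0.0.2"])  # ...a balancer tomorrow
```

The hardware never changes; only the rules pushed down from the controller do, which is the flexibility Cotter is pointing to.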