“Computing can be regarded as having succeeded beyond everyone’s wildest dreams,” says Ajit Narayanan, the new head of the School of Computing and Mathematical Sciences at the Auckland University of Technology (AUT).
“From its humble beginnings that involved huge machines that required a team of technicians to maintain, computing has now progressed to the point where computers are everywhere — on our desks, in our pockets, in our home appliances, in our cars, in aeroplanes, in our banks and in almost every aspect of business. There is no aspect of our life that is not affected by computing.”
On Tuesday October 16, AUT celebrates the 40th anniversary of its computing courses.
AUT, then known as the Auckland Technical Institute (ATI), enrolled its first class of about 20 students in 1967, in its one-year Certificate for Computer Personnel. This was before the rise of computer science departments at New Zealand universities in the early and mid-1970s (Wellington Polytechnic had begun a computing course in 1963).
Computers then were plug-board machines. Courses covered Plan (the ICT/ICL assembly language), FORTRAN and computing concepts.
ATI tutors taught non-computing courses such as accounting, with vendor staff on secondment teaching the technical courses. Students were sponsored by local industry.
Today, AUT has 25,000 students, with 600 on IT courses. “Graduates are in all areas of computing and organisational sectors: management, technical leadership, database administration, business analysis, systems architecture, software development, web development, systems administration, network administration, service centres, operations, sales and consultancy, academic education and research roles,” says Narayanan.
“Many graduates are overseas in major cities such as London, Amsterdam, New York, Melbourne, Sydney, Hong Kong, Beijing and Ho Chi Minh City.”
Looking back, Narayanan says the biggest change over the years was the birth of the microcomputer and personal computer during the late 1970s.
Previously, computers could only be accessed through large organisations with the resources to maintain mainframes and mini-computers.
Such computers were also strictly controlled in the programs they could run, so learning was similarly controlled. Personal computers, by contrast, could be switched on and off at will and programmed in the operator’s own time, and users could interact with them directly through a screen and keyboard.
“Users very quickly became used to ‘tools’ that required point and click rather than laborious typing through the keyboard. The task of computing education then became to take students between the layers of tools and to train students in how to construct new tools for future applications, rather than simply write software that performed a series of tasks. New courses had to be introduced in human computer interaction, graphics and interactive software design,” Narayanan recalls.
The next big thing was the arrival of the internet and the ability of PCs to communicate with each other and share information. This required new courses in internet and network computing, and in the standards governing how they operated.
Over time, courses also evolved to be more practical, rather than theoretical and conceptual, with students graduating as fully-fledged professionals. Enterprise-strength software came to be used in student learning, academic staff typically had practitioner backgrounds in the IT industry, and AUT gained industry input and advice from its industry advisory committee.
Today, AUT also has a much stronger emphasis on research, with more than 100 Masters and PhD students investigating topics across the full spectrum of computing and mathematical sciences, from radio astronomy, through data mining, to organisational studies and human-computer interaction.
Such a reputation has helped AUT attract many overseas students, around 20% of the total. Narayanan is looking to counter a recent drop in overseas student numbers due to greater competition for the education dollar.
The plan is to identify new markets in forensic IT and scientific IT, such as biology and the study of the human body and diseases.
Indeed, it is human biology where Narayanan sees the biggest development in computing over the next 50 years. New computing and mathematical techniques will help make sense of the massive amounts of biological data available to help prevent disease and help people live longer.
Miniaturisation will also see the development of ‘biochips’ implanted into our bodies for health monitoring and disease prediction. Computing power may also be developed to create sophisticated models to help the world manage climate change.
“Finally, the presence of computers in almost every aspect of our lives also brings dangers such as information theft from our databases, financial theft from our bank accounts and identity theft that allows criminals to pass themselves off as us,” he warns.
“During the last 40 years, the emphasis has been on designing increasingly sophisticated software that will allow us to do things better and faster. New software engineering techniques will need to be explored to ensure that future software is not only sophisticated but also robust against attack, to some level of quality assurance and standards.”