An engineer would never get away with designing a bridge without considering how it would affect nearby roads, houses and the local community, yet software is often written without taking into account its effect on its various “stakeholders”, says Professor Don Gotterbarn.
Gotterbarn was speaking at a recent Wellington meeting of the Computer Society. The East Tennessee State University academic is currently Visiting Professor of Software Ethics at the Auckland University of Technology.
Gotterbarn used electronic voting as a prime example of such failure. He told horror stories about US counties returning eight times as many votes as there were people, and of voting terminals seemingly designed with no thought as to whether or not they were actually usable.
“I’m sure there’s an army of elderly voters [out there] clamouring to use screens too tiny to see,” commented Gotterbarn.
New Zealand’s chief electoral officer, Robert Peden, heard all this as he sat in the front row, aware that the decision had already been made not to opt for online voting for the 2008 General Election.
US-style electronic voting is further compromised by a “confidential” attitude which greets questions concerning inconsistencies in the system with the stock response, “That’s a trade secret”, says Gotterbarn.
He described how he was assigned to check an electronic voting system being used in Pittsburgh, Pennsylvania. The ballot’s organisers assured him that adequate audit procedures were in place, but could not describe them as they were “confidential”. Nor could they produce conclusive evidence that the voting machines would start with a vote count of zero in their memories.
Some of the risks of e-voting are down to less-than-competent technicians, but much of the fault, Gotterbarn says, lies with “technological determinism” – the belief that because a technology exists to do something, that something should therefore be done using it.
“I’m sure getting everyone to put a microchip in their pet dog will do a lot to stop wild pack-dogs biting people,” commented Gotterbarn wryly, referring to NZ’s recent row over tagging pets.
Conventional analysis of the risks and benefits of a technology tends to focus too narrowly on budget, timing and technical adherence to a specification, says Gotterbarn.
But this usually disregards the needs of many of the stakeholders involved.
To deal with this problem Gotterbarn has devised a “software development impact statement”. This forces developers to identify anyone who could possibly be affected by a technology change, and, for every combination of project sub-task and stakeholder, to ask: “Could this development cause harm to this group of people?”
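The core of that exercise is simply enumerating every (sub-task, stakeholder) pair so that no group is overlooked. A minimal sketch of that enumeration, using hypothetical task and stakeholder names for illustration (Gotterbarn's actual impact-statement process involves far more than this), might look like:

```python
from itertools import product


def impact_questions(sub_tasks, stakeholders):
    """Generate the harm question for every (sub-task, stakeholder) pair,
    so each combination gets explicitly considered rather than skipped."""
    return [
        f"Could '{task}' cause harm to {group}?"
        for task, group in product(sub_tasks, stakeholders)
    ]


# Hypothetical examples drawn loosely from the e-voting discussion above
sub_tasks = ["ballot screen design", "vote tallying", "audit logging"]
stakeholders = ["elderly voters", "election officials", "candidates"]

questions = impact_questions(sub_tasks, stakeholders)
# 3 sub-tasks x 3 stakeholder groups -> 9 questions to work through
```

The point of the cross-product is discipline: a team that only reviews the pairings it thinks of will miss exactly the groups (such as the elderly voters squinting at tiny screens) that a systematic pass would have surfaced.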
Too often developers are “solving their problem, not your problem”, says Gotterbarn.
“You’re professionals,” he told the audience. “When designers say, ‘Do it this way’ and you don’t think that way will work, have the courage and professionalism to tell them you think they’re wrong.”