Developers of software for school administrative tasks and student assessment say they have been blindsided by a new process of accreditation.
In the first round of the new system, introduced by the Ministry of Education last year, nine of the 10 applicants for accreditation failed the test. Allegations swiftly surfaced suggesting that the ministry’s objectives were not made sufficiently clear and that there was no “feedback loop”.
“The software had to be supplied by a deadline date and we had a day to train [ministry] staff how to use it,” says Anne Valance, a director of small-scale provider 3D-Achieve. “There was no provision for them to come back and say ‘We can’t get this part to work’ and for us to advise them.”
Brian Pawson of Musac (Massey University School Administration by Computer), the system with the largest existing customer base, agrees. Musac also failed the accreditation test. The questions concerned the supplier as well as the software; they were clearly aimed at assessing business viability and the effectiveness of support, but were phrased very generally, he says. “And it was difficult to work out what [the ministry] really wanted to know.”
Still, he can appreciate why this was done. “If they’d asked very specific questions, it would have been easy for us to answer yes to every one, whether we had what they wanted or not.”
The answers on software functionality would have been impossible to fake, however, as ministry staff later had hands-on experience with products.
He agrees with Valance that there was no effective feedback loop.
Both admit to a concern that the ministry may be trying to prune the number of approved suppliers. Valance suggests such an exercise may favour the larger companies. Questions were asked, for example, about quality assurance, something that’s done by big companies, she says. “But not when there are two of you sitting in the same room; [formal QA procedures] are not so necessary.”
A good many of the providers are small, she points out.
The one that passed the accreditation test, Renaissance-owned ITAS, sells Integris, which the company says has been chosen by four new secondary schools.
Concerns about interoperability among systems are also evident, with questions on the export and import of data files. This is understandable, say Pawson and Valance, as interoperability makes it easier for the ministry to have an effective interface. But it also makes it easy for schools to transfer from one system to another “and that may not be to our advantage”, Valance says.
Although only one company succeeded in the 2003 accreditation round, the results across all categories tested were published, says Ministry of Education spokeswoman Christine Seymour. This recognised that schools need a broad view of the results to make decisions that take account of their own specific and varying needs and priorities.
“SMS [school management system] accreditation is an ongoing process, as the project’s key aim is to improve the quality of software and business practice for the benefit of schools, by raising the bar on quality … and working with SMS vendors to achieve this.”
A second round of the accreditation process has begun, with a total of 18 SMS vendors expressing an interest in applying. The first meeting between ministry representatives and suppliers for that round was scheduled for February 13, where the suppliers intended to make their case for clearer criteria and more feedback.
The ministry expects the results of the second round will be published in November. Suppliers can still sell their software to schools in the meantime, but the ministry will require accreditation as a condition of sale by 2006 for secondary schools. The target date for primary schools is July 2008.