Intelligence systems intended to support investigations into identity fraud, money laundering and other threats were so degraded in the Department of Internal Affairs that most staff avoided using them.
Attempts to repair the systems failed so badly that data integrity was put at risk.
This is revealed in documents recently released under the Official Information Act (OIA).
One part of the 2013–2020 attempt to build a general intelligence system eventually worked, but another part was abandoned, largely because the department failed to specify what it needed.
The documents also show that officials recovered $351,000 in a settlement with an unnamed company that failed to deliver the complete system.
The $2 million system is another entry in a long history of IT upgrade problems for government agencies, many of them far more costly, ranging from the failed $100 million INCIS police computer system to the massive Novopay payroll headache in education.
An internal DIA review in 2019 found that, in 2013, the department’s units had “poor intelligence tools.”
These units investigate and regulate fraud, anti-money laundering and terrorist financing, gambling, and community and charitable organizations.
“The department’s intelligence systems were no longer fit for purpose, and most teams were not using the systems due to integrity and usability issues,” it said.
The DIA lacked any case management tools for investigations, leading to “a limited ability to rely on reported data.”
Instead, teams made do with manual procedures and spreadsheets.
An upgrade got off to an aborted start, was delayed, and was then restarted in 2016.
But two years later, in 2018, the project that was supposed to offer a “robust and efficient investigation process” was in such a state that it was deemed too high a risk to implement.
The reviews show fundamental flaws.
“The vendor didn’t fully understand the requirements, which means the solution was never going to meet DIA’s needs,” said one.
“These should have been gathered over several stages.”
A review found that the department was too soft on the supplier.
“When providers make mistakes, DIA tends to fill in the gaps for them,” said one review.
“DIA needs to be able to make decisions to stop payments earlier and withhold more when deliverables are not met.”
Part of the outlay went on licenses to use the system, purchased before it was proven to work, and it did not work. DIA recovered $116,000 of those license costs.
The flaws appeared over and over again, right from the start.
“The high frequency of problems being discovered meant testing became constant cycles of retesting,” according to a March 2017 report.
This made the system difficult to use and, worse, data integrity “may not be as expected,” the report warned.
This continued until the end of 2018.
Data integrity was at risk from a fault that duplicated cases and from the system’s failure to automatically report errors.
There were also typographical errors in the software code.
“The files attached to the entities … were not working.
“Required fields … no validation.”
The department pressed the supplier to provide evidence of testing, but “there was a general acceptance that problems would be fixed as they arose rather than being made correct the first time.”
The system “could not be rebuilt from scratch,” they told DIA.
The DIA’s own oversight had been lacking and its legal advice was not good enough.
“The escalation of risk to the board was too slow and the information provided did not give the board a complete enough picture to make effective decisions quickly,” said a review in February 2019.
“There was a limited ability to resolve problems through the contract, as it lacked robustness.
“Better legal advice should have been provided in the creation of the contract and during the ongoing problems.”
DIA had assumed that the main supplier had a partnership with another supplier; instead, this “was later revealed to be a subcontracting relationship that began to break down during testing and implementation.”
Meanwhile, pressure was mounting on specialized personnel whose “time commitments were heavily affected by the project.”
The investigative case management system, which had been pitched as vital for providing a single place to store case information “in an evidentially robust way” and for improving reporting and oversight, was cancelled by the board in early 2019.
The project budget had been exhausted.
There was one consolation: the intelligence part of the system worked. Of the four key benefits expected in 2016, two were met, one was partially met, and the last, automated and standardized processes, was not.
The new system would securely record, capture, analyze, search and share intelligence information, the reviews showed.
But managers have reverted to relying partly on spreadsheets for security and auditing, and staff still have to enter large amounts of data by hand.