The hidden risks of government by artificial intelligence

Editorial

Ever since the computer HAL went rogue in the film 2001: A Space Odyssey, people have worried that one day computers would become so smart they would take over the world.

That is yet to happen, but the rapid development of artificial intelligence is already raising serious questions, particularly about its use in government decision-making.

The NSW Ombudsman has just issued a report, The new machinery of government, which acknowledges the technology has uses and benefits but warns that governments should be far more careful in how they apply it.

AI refers to a disparate range of computer systems that run with little or no human involvement and which replace, or assist, human decision-making.

AI is certainly quicker and cheaper. Some would also argue that it is more objective than human decision-making.

Yet the technology brings with it a range of problems. The Ombudsman decided to look into AI after receiving a number of complaints from financially vulnerable residents who discovered that Revenue NSW had emptied their bank accounts to collect unpaid debts, leaving them without enough money to buy groceries or pay rent.

It turned out that Revenue NSW was using an AI program that had not been programmed to consider people’s financial situation.

Revenue NSW has since resolved the problem by establishing a minimum balance that is protected from these automatic garnishee orders, but the Ombudsman has questioned whether it is even lawful for a computer to make such a decision, which is supposed to be made at the discretion of a human being.

While AI is supposed to be more impartial than human beings, the Ombudsman says it carries a different risk: so-called “algorithm bias”.

For example, courts in the US often use an AI system when making sentencing decisions to assess whether the offender is at risk of reoffending. The investigative news organisation ProPublica found the algorithm was racially biased and inaccurate as a predictor of violent crime.


The best-known AI failure in Australia was Centrelink’s automated income compliance regime, known as robo-debt, which pursued welfare recipients for alleged overpayments. In fact, the computer program that calculated people’s income and welfare entitlements was flawed and unlawful. It demanded that vulnerable people repay money that had rightfully been paid to them. Centrelink had to pay $1.2 billion to 400,000 people in settlement of a class action.


The Ombudsman said the broader lesson of robo-debt was that governments must be much more careful when using AI, because its decisions can affect so many people and can be so hard to correct through traditional mechanisms of redress, such as complaining to the Ombudsman or an administrative tribunal. Often the only way to resolve problems is an expensive overhaul of the software.

Getting it right is all the more important because, as the Ombudsman says, “machine technology is disproportionately used in areas that affect the most vulnerable in society – in areas such as policing, healthcare, welfare eligibility, risk scoring and fraud detection”.

It is, of course, not just in government that algorithms are getting a bad name. Technology companies such as Facebook and Google have been criticised for using machine learning to curate the information sent down people’s feeds in ways that distort democracy, and banks use similar techniques when issuing loans.

But governments have to uphold the highest standards of fairness and legality. As the Ombudsman argues, as a start, people should be told when a decision is being made by a machine rather than a person. The government must use artificial intelligence with extreme care and greater transparency.
