(The author is a Reuters Breakingviews columnist. The opinions expressed are his own.)

NEW YORK  - Government U-turns don’t come much bigger. Popular fury forced the abandonment of hypothetical calculations of likely grades for the UK’s national A-level exams this week. The decision is a reminder that even well-intentioned algorithms make many harmful mistakes.

The exams, which are critical to university admissions, were canceled thanks to the coronavirus lockdown. The government used an algorithm to estimate what would have happened. The furor erupted after almost 40% of students had their scores downgraded from the predictions of their teachers, with pupils from disadvantaged schools disproportionately affected.

It’s an unusually stark and public example of a far bigger problem. For instance, models that spit out credit scores in the United States and elsewhere can make it hard for less affluent folks to borrow at reasonable interest rates, in part because a decent credit score depends on already having a record of servicing debt.

Algorithms are also used in many U.S. states to help decide how likely a criminal is to offend again. Judges adjust sentencing decisions accordingly. These recidivism models are supposed to neutralize the prejudices of the judges. But their inputs, for example whether friends of the convict have been arrested, can introduce other biases, as former hedge-fund quant Cathy O’Neil describes in her book “Weapons of Math Destruction.”

O’Neil talks about so-called proxies, meaning inputs that substitute for knowledge of a person’s actual behavior. In the UK’s exam-grade algorithm, one proxy was the past record of a student’s school. That proved a sure way to discriminate against star students from less wealthy backgrounds.

As so often, the British model’s goal was rational: to adjust forecast grades to bring them, overall, closer to the distribution of results in prior years. The government of Prime Minister Boris Johnson and its exam regulator, Ofqual, had months to consider possible unintended consequences of their algorithm. Like careless teenagers, they wasted the time.
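The moderation step described above — pulling teacher-predicted grades toward a school's historical distribution — can be sketched in a few lines. This is a deliberately simplified toy, not Ofqual's actual model: the function name, the use of numeric grades, and the assumption that the cohort is the same size as last year's are all illustrative inventions.

```python
# Toy illustration (NOT Ofqual's actual model): rank students by teacher
# prediction, then hand out the school's historical grades in rank order.
# A school whose past cohort produced no top grades cannot award one,
# however strong an individual student's prediction -- the proxy bites.

def moderate(teacher_predictions, historical_grades):
    """Assign each student the grade at their rank within the school's
    historical distribution (best prediction gets best historical grade)."""
    # Indices of students, best predicted grade first.
    order = sorted(range(len(teacher_predictions)),
                   key=lambda i: teacher_predictions[i], reverse=True)
    # The school's grades from a prior year, best first.
    pool = sorted(historical_grades, reverse=True)
    moderated = [None] * len(teacher_predictions)
    for rank, student in enumerate(order):
        moderated[student] = pool[rank]
    return moderated

# A star student predicted a 9 at a school that topped out at 6 last
# year is capped at 6, regardless of individual merit.
predictions = [9, 6, 5, 4]   # teacher-predicted grades (numeric for simplicity)
history = [6, 5, 4, 3]       # the school's results in a prior year
print(moderate(predictions, history))  # -> [6, 5, 4, 3]
```

The point of the sketch is that no input describes the star student's own work; only their rank and the school's past record matter, which is exactly how a proxy discriminates.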

Stories of striving kids’ heartbreak make great television, and the protests of parents, teachers and a generation of soon-to-be-voters made predictably short work of the government's initial refusal to budge. Teachers’ estimated grades will now stand. If only it were so easy for, say, prisoners handed unfairly long sentences to make themselves heard.

CONTEXT NEWS

- The British government on Aug. 17 bowed to public pressure over its school exam grading system, ditching an algorithm that downgraded the results awarded to students in England after their tests were canceled due to Covid-19.

- The government had faced days of criticism after the mathematical model used to assess predictions made by teachers lowered grades for almost 40% of students taking A-levels, their main school-leaving exams.

- Adjustments by algorithm have also been abandoned for so-called GCSE exams, typically taken two years before A-levels, for which results are due out on Aug. 20.

- Students will now be awarded the grade that their teachers had predicted for them based on past performance, Conservative Prime Minister Boris Johnson’s government said.

- The decision, which applies to England, mirrors those made by devolved administrations in Wales and Northern Ireland on the same day, and in Scotland last week.

- Analysis of the algorithm showed it had resulted in “manifest injustice,” favoring students in fee-paying private schools, said Paul Johnson, director of the Institute for Fiscal Studies think tank, writing for the Times.


(Editing by Edward Hadas, Leigh Anderson and Oliver Taslic)