I think the line “Sensibly designed, the computer algorithms could have been used to moderate teacher assessments in a constructive way” quite deeply misunderstands what happens when algorithms meet traditionally unfair and high-risk situations. https://on.ft.com/2Q4KIdZ 
A levels based on a final exam are never entirely fair, but they are culturally expected. Some people do much better or worse than expected. It’s a gauntlet that society understands.
Whether it is trust in the system, resignation, or suppression, most people are content to live with that. But this same level of acceptance can’t just be transferred to a newly designed automated system. Especially not one that can allegedly be so finely modelled and refined.
I mean, it is *possible* that an algorithm could be fairer than exams *in some respects*, but that would probably require a different social contract, one more tolerant of constant surveillance and micro-performance assessment, and there would still be bias issues to mitigate.
Anyway, this is why good governance is so important, as well as service design and clear, transparent communication. Had I been managing this, I would have:
- designed an appeals process
- published the process ahead of time and clearly explained the algo and the appeals
- written to every student, and got a trusted figure to do a press conference
- built in a buffer so universities didn’t confirm places on the day but had a grace period.

Algorithms need different trust mechanisms to established systems - not least because they are new systems.
Of course, hindsight is a wonderful thing, but I guess my position is: automated decisions can’t just be dropped into processes and allowed to trigger social explosions at will. In a responsible democracy, mitigations must be put in place to maintain public trust.
Welcome to my TED talk on one sentence on an FT opinion piece. 🤦🏻‍♀️