Regarding the exam results scandal, I’ve seen some Anglican Twitter thought leaders (some have blue ticks, others substantial followings, idek) raise themes of technology vs humanity.

This is fundamentally mistaken and unhelpful.

Technology simply embodies social norms.
Well, to clarify, it embodies *and exacerbates* them, making it simultaneously a site of problem-solving and of reflection.

Technology is the outcome of our humanity, not in opposition to it. It’s up to us to accept what it’s revealing about our values-at-scale.
So it’s a little disheartening to see the above-mentioned folks describe this as something ‘new’ (disheartening because the critical tech conversations have been happening in earnest since the late 80s, if not earlier, so this once again demonstrates the gap between different sectors that are actually concerned with the same thing).

The idea that Statistics tells Truth is not new; it has been central to our common understanding of knowledge-building for the past 400(?) years, part of the fallout from reductionism as the approach par excellence. The dominant belief driving this, key to pretty much every field, is that if only we had every single data point, we could predict the future as perfectly as we can reconstruct the past.

I’m jumping lots of steps here, but much of the work in machine learning is based on the premise that having more data will increase accuracy. The fact that more recent discoveries in mathematics and theoretical physics suggest this… might be more complicated is a whole other discussion**. In the West we have been building ever more elaborate structures on these mostly-OK but occasionally quite shaky foundations, yet regardless of technological progress, many of the same questions arise: what counts as accurate enough? The theorist @/sambarhino gave a great lecture describing how, even at the hardware level, working within the physical limitations of the ‘computer’ requires repeated estimation, which leads to tiny inaccuracies that build up over the time the software runs.
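
The accumulation point can be sketched in a few lines of Python (a generic illustration of rounding drift, not one of the lecture’s own examples):

```python
# Minimal sketch of accumulated rounding error: 0.1 has no exact
# binary floating-point representation, so every addition is a tiny
# estimate — and a million estimates drift measurably.
total = 0.0
for _ in range(1_000_000):
    total += 0.1

print(total)  # close to, but not exactly, 100000.0
```

Each individual error is far below anything a human would notice; it’s the repetition over time that makes it visible.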
That is a totally different problem from a) whether you are trying to solve the ‘right’ problem and b) whether you even have the right data to justify the model you’ve made. Regarding a), lots of decolonial and critical scholars of tech have pointed out that deciding which problem gets solved is itself an exercise of power. As for b), quite frankly, a lot of what gets called ‘objectivity’ is just decision by authority. In the same way that someone has to decide which decimal place counts as accurate, someone has to decide which particular data sets get used (let alone how they’re weighted), someone has to decide what outcome is ‘good enough’, someone has to decide when the model is ‘close enough’, etc. etc.
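
A small Python sketch of that last point, with an entirely invented score and cutoffs: the rounding precision and the pass threshold are both human choices, and together they decide the ‘objective’ outcome.

```python
# Hypothetical score and cutoffs, purely to illustrate: the same raw
# score can pass or fail depending on two decisions a person made —
# how many decimal places 'count', and where the bar sits.
def verdict(score, decimals, threshold):
    return "pass" if round(score, decimals) >= threshold else "fail"

raw = 0.6449
print(verdict(raw, 2, 0.64))  # rounds to 0.64 -> "pass"
print(verdict(raw, 1, 0.64))  # rounds to 0.6  -> "fail"
```

Neither choice is more ‘mathematical’ than the other; the authority lies with whoever picked the parameters.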

What we’re seeing isn’t ‘technology leading policy’ but technology giving policy what it wants. This stuff isn’t magic. It’s the exact same case for racist, sexist, ableist, transphobic, homophobic and, yes, classist algorithms. They’re not leading anything; they are connecting dots in the data they have been given, according to rules they have been commanded to follow. Now, it is impossible to know everything about everything, so this isn’t a ‘you should know this already’ rant.
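
To make the ‘connecting dots’ point concrete, here’s a toy Python sketch with invented data. The ‘model’ is deliberately crude, but systems trained on historical outcomes behave the same way in principle:

```python
# Toy sketch (all data invented): a 'model' that predicts the most
# common historical grade for each school type doesn't lead anything —
# it just reproduces the pattern in the data it was handed.
history = [
    ("private", "A"), ("private", "A"), ("private", "B"),
    ("state",   "C"), ("state",   "C"), ("state",   "B"),
]

def predict(school_type):
    grades = [g for s, g in history if s == school_type]
    return max(set(grades), key=grades.count)  # modal historical grade

print(predict("private"))  # "A" — the historical pattern, not the pupil
print(predict("state"))    # "C"
```

The pupil’s own work never enters the calculation; only the group they were sorted into does.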

I am going to make specific appeals to those who kind of really do need to know about this and help others dismantle this awful entangled behemoth of a system. Some of the areas most badly affected by this right now are health, prison (esp. probation) services and, of course, education. Regardless of sect or religion, if you are a chaplain (or equivalent) in a hospital, a prison or a school, it would be so great if you could (when you have time, ha!) look out for automated decision-making, or sometimes just ask questions and start a discussion about it. When it comes to stuff that’s being used: which company built it? How does the institution know it’s fair? What discrepancies have people experienced with it? Is the code even open source? Raise it within a) your religious organisation, as this is *absolutely* an ethical issue, b) unions who represent IT workers, and c) organisations like @/techworkersco and @/AJLUnited. Talk about it using hashtags like #TechIsNotNeutral and #DecoloniseAI to get more engagement; you don’t need to be an expert in tech on top of what you do, but you can share with other experts out there!

Whilst you don’t need to be an expert, I do think it’s really important that our religious and ethical leaders don’t just fall for easy narratives and simplistic binaries. We don’t have time for yet another 1960s-style Tomorrow’s World debate about the potential dangers of robots. This stuff is already out there in the real world, and right now it is costing people their futures and their lives. #ExamShambles #DecoloniseAI #TechActivism
You can follow @bibblings.