The algorithm scandal in the UK, where students were categorised based on their social class, is a "great" example of how algorithms reproduce the social system.
This is not a mistake. Most algorithms are designed in a way that oppresses and discriminates against the lower classes and serves the privileged ones.
We should have a clear/concrete approach and raise political demands instead of repeating abstract concepts such as fairness.
This is an arena of political struggle.
There can't be an algorithm fair to everyone.
We shall categorise algorithms based on their ideological-political orientation.
This one is a capitalist algorithm, that one is socialist, and another is feminist.
If an algorithm designed for counterterrorism is used to evaluate asylum seekers, then it is a fascist algorithm, and we should destroy it.
If an HR algorithm measures workers' performance, then it is a capitalist algorithm. We shouldn't ask for fairness; we should demand an
explanation, so we can decide what to do: shall we sue the company or go on strike?
What happens if the algorithm correctly measures a worker's performance? Will we accept her dismissal? Is this worker a robot that should perform well all the time? Maybe she has family matters to deal with.
If we define the ideological-political orientation of algorithms, then we can determine whose interests are protected and who is excluded.
So political struggles can be organised accordingly.
We can demand the development of socialist algorithms that support poor students and reduce the grades of elite students.
We can create algorithms that help judges see that rich, elite, white men are more likely to commit crimes.
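As a purely illustrative, hypothetical sketch of what such a deliberately redistributive grading rule could look like (every field name, threshold, and weight below is an assumption for the sake of the example, not any real system's design):

```python
# Hypothetical sketch: a deliberately redistributive grade adjustment.
# All field names, thresholds, and weights are illustrative assumptions,
# not taken from any real grading system.

from dataclasses import dataclass

@dataclass
class Student:
    predicted_grade: float            # teacher-assessed grade, 0-100
    school_funding_per_pupil: float   # proxy for school resources (GBP/year)
    free_school_meals: bool           # proxy for household income

def adjusted_grade(student: Student) -> float:
    """Boost grades for under-resourced students, reduce grades for elite-school students."""
    grade = student.predicted_grade
    if student.free_school_meals or student.school_funding_per_pupil < 6000:
        grade += 5.0   # explicit support for poorer students
    elif student.school_funding_per_pupil > 20000:
        grade -= 5.0   # explicit reduction for elite-school students
    return max(0.0, min(100.0, grade))

# Example: a student on free school meals gets an explicit uplift,
# a student from a heavily funded school gets an explicit reduction.
print(adjusted_grade(Student(70.0, 5500.0, True)))    # 75.0
print(adjusted_grade(Student(70.0, 25000.0, False)))  # 65.0
```

The point of the sketch is that the distributional choice is written openly into the rule, so anyone can read off whose interests it serves, rather than being hidden behind a claim of neutrality.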
We can also question software developers and data scientists.
"We just train algorithms, we don't know the outcome, we don't understand from social aspects" cannot be an excuse, do not develop fascist algorithms, do not work on algorithms oppressing the poor.
That's simple.
Talk of fair algorithms, or algorithms for good/for all, does not address the roots of the system and remains wishful thinking.
The good news is that students directly challenged the myth of the neutrality and objectivity of algorithms, which could be considered a dogma for previous generations.
You can follow @Dr_Eren_Korkmaz.