More tech bsfckry.
In the middle of a pandemic, KLIA decides to roll out facial recognition software linked to immigration databases at travel checkpoints.
Why is this a problem?
(thread) 1/n
https://www.bernama.com/en/general/news.php?id=1923008
#privacy #AI #FacialRecognition #surveillance #airports
Facial recognition software has been increasingly shown to be deeply problematic for enabling rights violations. It invades privacy, can be deployed as part of public space infrastructure without consent, operates at a distance, and is easily repurposed. http://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/
There's also the problem of bias, accuracy, misrecognition and over-surveillance, especially of brown and black bodies. And of gender.
If you haven't seen this yet, do. It sums up the problem pretty clearly:
Here's the research it's based on:
http://proceedings.mlr.press/v81/buolamwini18a.html
And more good research built on this work: https://www.ajl.org/library/home
Tech is never neutral. Our very real, human, biased values are built into it.
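To make the "bias hides in the aggregate" point concrete, here's a toy sketch of the kind of disaggregated audit that research does. The numbers below are invented for illustration, NOT the paper's data; the real Gender Shades study found error rates up to roughly 35% for darker-skinned women versus under 1% for lighter-skinned men on commercial gender classifiers.

```python
# Toy disaggregated accuracy audit, in the spirit of the Gender Shades
# methodology (Buolamwini & Gebru, 2018). All counts are hypothetical.
# Each subgroup maps to (correct predictions, total faces evaluated).
results = {
    "lighter-skinned men":   (99, 100),
    "lighter-skinned women": (93, 100),
    "darker-skinned men":    (94, 100),
    "darker-skinned women":  (65, 100),
}

# Aggregate accuracy across the whole benchmark: looks respectable.
total_correct = sum(correct for correct, _ in results.values())
total_n = sum(n for _, n in results.values())
print(f"aggregate accuracy: {total_correct / total_n:.1%}")

# Disaggregating by subgroup exposes what the single number hides.
for group, (correct, n) in results.items():
    print(f"{group:24s} error rate: {1 - correct / n:.1%}")
```

With these toy counts the headline accuracy is 87.75%, while one subgroup faces a 35% error rate; that gap is exactly what an aggregate benchmark conceals, and why audits must break results down by demographic subgroup.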
When used unchecked, with poor legal safeguards on privacy, facial recognition software can be used to profile, surveil and enact granular discrimination and abuse towards minorities.
Here's how China is using it agst Uighur Muslims: https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html
Read. The. Article.
Here's how it can impact gender and sexuality, and the range of rights violations and discrimination that can stem from that:
https://genderit.org/editorial/are-we-just-ticking-boxes-bringing-and-expanding-notions-gender-internet-policy-and
And another on misrecognition of trans people:
https://www.vice.com/en/article/7xnwed/facial-recognition-software-regularly-misgenders-trans-people
As though we don't make the lives of trans ppl hard enough.
Here's a recent report on how the benign efficiency arguments used for implementing facial recognition software at borders *don't* hold up, and can have huge human rights implications:
https://cippic.ca/en/news/facial_recognition_transforming_our_borders
Some of the problems outlined:
* Inherently sneaky & intrusive
* Operate with deep biases
* Efficiency gains are often overstated
* Highly susceptible to being repurposed for other uses (we're not even hiding it: there are already plans to enable shopping. SHOPPING! Our rights are so cheap)
* MORE invasive than other biometric techniques, and LESS accurate
* Vulnerable to data breach (more below)
* Poor legal safeguards (everywhere)
"Where facial recognition is applied as a gate-keeping technology, travellers are excluded from border control mechanisms on the basis of race, gender and e.g. country" & "be subjected to more intrusive searches, deportation, refoulement & reputation harms."
I don't know what database MAHB (Malaysia Airports Holdings Bhd) intends to use, or worse, build up based on the info it collects, but you can bet the same problems of bias, accuracy, misrecognition and over-surveillance would be present.
Whatever racism and prejudice we (i.e. chances are, the powerful majority who gets to implement these technologies) hold, we feed into facial recognition tech, teaching it to sort ppl according to ideas of race, ethnicity, gender etc.
What is our track record against minorities & discriminated ppl? We step up harassment and arrests of migrants during a pandemic. We launch a nationwide manhunt on the lone migrant worker who spoke up against this. Yesterday, news of the annual witch-hunt against LGBT people was announced.
Now imagine this being empowered with biased, inaccurate biometric data that will be used for deciding our ability to move, travel, maybe purchase things, maybe even exchanged/sold/shared with 3rd parties, other govts, for whatever. Transparency has never been our strong suit.
In case you think you're invincible because you're a happy, law-abiding member of the majority: the market for facial recognition software is global.
Scroll up to see the problems with bias, misrecognition and inaccuracy built into the system.
(also, fuvm)
And scroll down for the great equaliser of poor laws & data breach that we are all subject to here.
We have the massive problem of extremely poor safeguards on data privacy and protection in this country. We rank 5th worst in the world.
#MalaysiaBoleh https://www.thestar.com.my/news/nation/2019/10/16/study-malaysia-the-fifth-worst-country-for-personal-data-protection
There was a massive data breach of 46 million Maxis accounts in 2017. Our population is only 32 million. Wonder why you get targeted spam and scams on your phone? This is most probably why.
https://www.bbc.com/news/technology-41816953 https://www.thestar.com.my/news/nation/2017/10/31/msia-sees-biggest-mobile-data-breach-over-46-million-subscribed-numbers-at-risk-from-scam-attacks-an/
What has been the action taken? Was Maxis held to account for violating privacy? What happened with the investigation? Did the govt use this as an opp to strengthen data privacy in law or policy?
Nope.
We have already raised problems with the adequacy of the Personal Data Protection Act to protect the privacy of individuals with the rollout of #MySejahtera.
Here are other ppl talking abt it: https://www.thestar.com.my/news/nation/2020/08/20/data-breach-is-a-big-concern-say-experts
Glad it's not mandatory, but now we want registration on #MySejahtera to be the way to get vaccinated?
Is #MySejahtera even working as an effective method to contact trace?
"Only 4% of total Covid-19 cases in Malaysia were directly detected through MySejahtera, revealed Health Minister Datuk Seri Dr Adham Baba in a written Parliamentary reply on 12 November." https://www.therakyatpost.com/2020/11/19/health-minister-mysejahtera-directly-traced-only-4-of-covid-19-cases/
Why are we still pushing for techno-solutionism that makes everyone who uses it vulnerable to data violation and potential abuse when there is no study, assessment or even case made for its efficacy?
The potential for facial recognition software to be used for abuse, profiling and rights violations has been so problematic that many are putting the brakes on its application in law enforcement. Even Amazon and IBM.
https://www.aclu.org/blog/privacy-technology/surveillance-technologies/federal-court-sounds-alarm-privacy-harms-face https://privacyinternational.org/news-analysis/3896/why-amazons-temporary-ban-police-use-facial-recognition-not-enough
Malaysia Airports Bhd's idea to roll out facial recognition tech is being done without debate, discussion or even a basic proof of concept that it'll work according to the imagined need.
I mean, we're meant to promote mask wearing, *esp* in airports, to prevent Covid. Why on earth introduce a tech that requires you to take your mask off to work?
Tak faham. (I don't understand.)
Enough already. Moratorium on all big data-driven policies or projects by govt and GLCs until better safeguards can be developed. Get all stakeholders into conversation. Talk to civ soc, human rights tech advocates, feminists. Do an actual impact assessment.
Penat. (Exhausted.)
n/n