We chose this headline as comedic relief to a frankly horrifying story.
@rajiinio & Genevieve Fried looked at 133 face recognition datasets over 43 years & found that researchers gradually abandoned asking for consent to amass ever more data. https://www.technologyreview.com/2021/02/05/1017388/ai-deep-learning-facial-recognition-data-history/
What shocked me about this paper is not the nonconsensual use of personal photos. We already knew that.
It's that it didn't have to be this way.
And the knock-on effects of this sloppy chase for big data go well beyond privacy & consent. These datasets are also poorly documented & verified, which partly explains why facial recognition systems fail so often.
As @rajiinio says, “You just can’t keep track of a million faces. After a certain point, you can’t even pretend that you have control.”