THREAD. This week, the NYPD published draft policies for its existing surveillance tools, as required by the #POSTAct. Read them for yourself and submit comments in the link below.
Here's an initial analysis of the disclosures and how they fall short. https://www1.nyc.gov/site/nypd/about/about-nypd/public-comment.page
In 2019, we published a chart tracking NYPD surveillance based on public data. It was an incomplete snapshot.
The POST Act is an important step forward from a transparency and accountability perspective as we're seeing new disclosures w/out a lawsuit. https://www.brennancenter.org/our-work/research-reports/new-york-city-police-department-surveillance-technology
For example, we're learning about:
- new disclosures around covert recording equipment, internet attribution technology, & cryptocurrency analysis tools
- the multiple tools used for social media monitoring
- how surveillance tech avoids judicial oversight
Unfortunately, the policies themselves are inadequate across the board. They consist largely of boilerplate language that is either overly vague or incorrect, for example claiming that facial recognition or social media monitoring tools like Dataminr don't involve AI.
One of the biggest and ongoing dangers of surveillance tech is the way it is deployed to target Black and brown communities. Reprinting the law and NYPD's stance re discrimination does nothing to address ongoing racial justice concerns. https://www.brennancenter.org/our-work/research-reports/statement-civil-rights-concerns-about-monitoring-social-media-law
For example, gang policing is one of the primary ways that movements and associations in communities of color are monitored. NYPD says it's not as bad as other cities, but its policies do not engage with problems like criminalizing friendship or relying on flawed assumptions about social media activity.
When the City Council asked NYPD whether the gang database tracked groups like the Proud Boys or the Hells Angels, they said those groups are in a separate database for organized crime. But NYPD didn't disclose any such database. https://www.brennancenter.org/our-work/research-reports/coalition-letter-calls-nypd-inspector-general-audit-nypd-gang-database
A separate disparate impact concern is the way surveillance tech routinely ignores biases that harm POC, women, & other marginalized groups.
NYPD's disclosures either ignore these issues or say a human in the loop solves bias. Studies suggest otherwise. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0237855
Turning to data sharing, the NYPD makes boilerplate disclosures that they share information with other agencies as permitted by law, policy, or agreement.
This is not a meaningful inventory of data flows, and does not address the numerous opportunities for abuse.
For example, each policy makes the same disclosure that NYPD doesn't share data "in furtherance of immigration enforcement." But this doesn't address data sharing with DHS as part of fusion centers, joint terrorism task forces, or joint gang operations. https://theintercept.com/2019/04/25/bronx-120-report-mass-gang-prosecution-rico/
Separately, investigations reveal that even where there are laws in place against data sharing with ICE, it can occur through data sharing practices that are either careless or willfully blind to how vendors configure their tools. https://www.aclu.org/blog/immigrants-rights/ice-and-border-patrol-abuses/documents-reveal-ice-using-driver-location-data
Overall, inadequate disclosures plague everything from specific retention policies to naming the vendors that supply each tool. The policies also rarely engage with how these tools are used collectively as part of an integrated *system* of surveillance.
The POST Act is meant to empower the public and lawmakers to engage in informed advocacy. Cops and tech companies are ill-suited to do the analysis needed to evaluate dubious claims and call out openings for abuse and harm. Tell the NYPD that boilerplate disclosures won't cut it.