going to do some tweets about this algorithms panel happening now >> https://twitter.com/donnakiran/status/1333822833793900549
. @Sarah_Brayne: "For centuries, state actors have translated social processes into metrics in order to govern. What's new and important today is that the state is now relying more heavily on private vendors to collect, store, & analyze data about its citizenry."
Brayne adds that the use of data analysis in policing can be read as one tactic to pursue legitimacy and the perception of "mechanical objectivity." Notes "data-driven employee risk management system" at LAPD following federal oversight.
"This new narrative about police data suggests it's disruptive, but I think that data is better understood as a form of capital... a social resource, w/ shifting utility over time and space, differentially leveraged/exchanged/taken away."
"Thinking about data as capital rather than mechanical reflection of the world... [means] accountability does not flow automatically from transparency, if only the police have access. Big data can only be used to police the police if it operates outside of chain of command."
. @AngeleChristin: "When you read media coverage about risk assessment tools, it's like Robocop [objective, just]. That's not the vibe you get when you are in criminal court. It's very dysfunctional, very biased, very racist... How do you reconcile this?"
Fieldwork findings: "Decoupling between the administration of the court and what was actually going on in the day-to-day practices of judges, prosecutors, social workers... Admins say, we are data-driven, on top of tech... [In hearings], I barely ever saw RA tools being used."
Some comments about different risk assessment instruments, incl. Arnold's PSA. "Instead of erasing discretion, risk assessment moves discretion to either before or after the moment when that specific segment is being objectified."
"In crim. court, it was striking to talk with social workers, who fill out instruments for RA. They know how to manipulate tools. They all told me, 'if I wanted to, I could.' One of the values of doing ethnography is, where does discretion go? More unaccountable places?"
. @ProfFerguson describes how in recent history, police are widely trusted [by white people, ostensibly] to use new technologies without much oversight. Many advocates, Ferguson says, cite the racist history of policing to argue that all police tech will be used maliciously.
Ferguson argues for a departure from the technocratic approach (which he says he's been sympathetic to). "Tyrant test" - assume the power you are giving to police is going to potentially destroy liberty, going to be misused. And then, how do you build systems to protect?
<<Jon's editorial break. I think you protect against misuse of tech by not giving them tech in the first place. Policing is a paramilitary structure that works against democratic will to protect its own power at all costs. Many depts are more powerful than their mayor.>>
. @KLdivergence discusses the concept of fairness in models. i.e., if <any number of failure to appear instances> constitutes failure for the model, then someone with 99 appearances and 1 FTA gets marked as a failure. Discusses the conflation of flight risk with practical difficulties.
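A minimal sketch of the point above (function names and numbers are illustrative, not from any panelist's actual model): a binary "any FTA = failure" label throws away the appearance rate, so one missed date out of a hundred gets the same label as never showing up.

```python
def binary_fta_label(appearances: int, ftas: int) -> int:
    """Label style the panel critiques: 1 if there is ANY failure to appear."""
    return 1 if ftas > 0 else 0

def appearance_rate(appearances: int, ftas: int) -> float:
    """What the binary label discards: the fraction of hearings attended."""
    total = appearances + ftas
    return appearances / total if total else 0.0

# Someone with 99 appearances and 1 FTA gets the same label
# as someone who missed every hearing:
print(binary_fta_label(99, 1))  # 1
print(binary_fta_label(0, 5))   # 1
print(appearance_rate(99, 1))   # 0.99
```

The 0.99 appearance rate is exactly the information the binary label erases, which is how practical difficulties (a missed bus, a work shift) get conflated with flight risk.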
Lum proposes new methods to audit risk assessment models, focusing on accountability for inputs. Example: overbooking in the Arnold Ventures PSA, which takes booking charges as input; officers have discretion about what the charges are. "They plug those booking charges into the model without any sort of second look or scrutiny."
. @AngeleChristin questions why we spend so much time and effort trying to fix risk assessment when it's unclear that it would be more efficient than trying to change institutions themselves. "Rearranging chairs on the Titanic... the system as it is, is really bad."
. @Sarah_Brayne discusses the role of abolition in CL system tech. "As social scientists, we can focus more on this investment side of [abolition]. We can acknowledge that there are flaws, but if we build up the same flawed society [post-decarceration], we'll see the same issues."
. @ProfFerguson: "A good argument for surveillance abolition is that most of these surveillance technologies have never really worked in the way they were [sold]." Always leads to increased social control. But Ferguson thinks that tech adoption moves faster than abolition.
You can follow @jbenmenachem.