Tuesday, December 19, 2017

The first bill to examine 'algorithmic bias' in government agencies has just passed in New York City


  • New York City has passed the algorithmic accountability bill, which will assign a task force to examine the way that city government agencies use algorithms.
  • Algorithmic bias is a critical issue in the justice system, which often relies on algorithmic risk assessments to inform criminal sentencing in federal court.
  • The bill is the first of its kind to be passed in the nation, and will attempt to provide transparency in the way that the government uses algorithms. 


New York City has unanimously passed a bill that will attempt to provide transparency to the way that city government agencies use algorithms.

It's the first bill examining algorithmic bias to be passed in the country, and could lead to increased scrutiny of the government's use of algorithms nationwide.

The bill, which was signed by Mayor Bill de Blasio last week, will assign a task force to examine the way that New York City government agencies use algorithms to aid the judicial process. 

According to ProPublica, council member James Vacca sponsored the bill in response to ProPublica's 2016 account of racially biased algorithms in the American criminal justice system. The investigative story revealed systemic digital bias within judicial risk assessment programs, which favored the release of white defendants by predicting future good behavior for them more often than for black defendants.

Algorithmic source code is typically private, but concerns about bias have prompted calls for increased transparency. The ACLU has spoken out in support of the bill, describing access to institutionalized algorithmic source code as a fundamental step toward ensuring fairness in the criminal justice system.

New York Civil Liberties Union legislative counsel Rashida Richardson describes the bill as a watershed moment.

“This bill is the first in the nation to take such a broad view of the problem and recognize that for algorithms to benefit society, they must be subject to public scrutiny...to remedy flaws and biases," Richardson said in a statement to Business Insider. "A flawed algorithm can lead to someone being trapped in jail for no good reason or not receiving a public benefit."

Algorithmic bias is a far-reaching issue in the criminal justice system. Bernard Harcourt, a law professor at Columbia University who has studied risk assessment programs extensively, told Business Insider that "algorithmic bias in government agencies is widespread and growing, especially in areas like policing and criminal adjudication that are getting cannibalized by the facile solution of predictive tools."

Harcourt says the New York City bill is a step forward for the American justice system, but recommends an even greater increase in algorithmic transparency.

"The source codes of all algorithms that directly affect New Yorkers have to be open to the public for us to evaluate their racial and other biases," Harcourt said. "This bill is a critical first step, but it is only a first step and needs to be supplemented with far greater transparency."

source http://www.businessinsider.com/algorithmic-bias-accountability-bill-passes-in-new-york-city-2017-12
