Carla Reyes (Southern Methodist University - Dedman School of Law) has posted Emerging Technology’s Language Wars: AI and Criminal Justice (Journal of Law & Innovation (Forthcoming, 2022)) on SSRN. Here is the abstract:
Work at the intersection of Artificial Intelligence systems (AI systems) and criminal justice suffers from a distinct linguistic disadvantage. As a highly interdisciplinary area of inquiry, researchers, lawmakers, software developers, engineers, judges, and the public all talk past each other, using the same words, but as different terms of art. Evidence of these language wars largely derives from anecdote. To better assess the nature and scope of the problem, this Article uses corpus linguistics to reveal the inherent value conflicts embedded in definitional differences and debates. Doing so offers a tool for reconciling specific linguistic ambiguities before they are embedded in law and ensures more effective communication of the technical prerequisites for AI systems that, by design, seek to achieve their intended purpose while also upholding core democratic values in the criminal justice system.
And from the paper:
As such laws have only just been introduced, the question remains: will designers of AI systems understand the message that the legislatures intend to convey with these new “responsible artificial intelligence standards”? Even if AI system designers do understand the message, can they technically achieve that which is required of them? Let’s consider, for example, interest in ensuring that AI systems used in the criminal justice system are “fair” by design. The legal conception of fairness generally ties to antidiscrimination statutes and due process requirements, and stands as “a core principle in the goal of society-wide equilibrium of rights, opportunities, and resources.”71 Meanwhile, AI system creators have at least 21 different technical meanings of fairness to choose from when designing a system that is “fair by design.”72 If the law instructs the AI system designer to preference one such meaning over another, that requirement may result in a technical trade-off that legislatures neither contemplated nor intended.73 For example, if a legislature requires statistical parity as an anti-bias measure, it may force a trade-off in accuracy.74 The same definitional difficulty exists for each of the core concepts legal scholars hope will help shore up gaps between the use of AI systems in criminal justice and important legal and constitutional norms like due process.75
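The trade-off the excerpt describes is easy to see concretely. Below is a minimal, purely illustrative sketch (the data, group names, and numbers are invented for this post, not drawn from the Article): when two groups have different underlying base rates, even a perfectly accurate predictor violates statistical parity, and a predictor constrained to satisfy parity must make errors.

```python
# Illustrative sketch (all data are synthetic, not from the Article):
# why mandating "statistical parity" -- equal rates of positive
# predictions across groups -- can force an accuracy trade-off
# whenever the groups' underlying base rates differ.

def positive_rate(preds, groups, g):
    """Fraction of group g that receives a positive prediction."""
    in_g = [p for p, grp in zip(preds, groups) if grp == g]
    return sum(in_g) / len(in_g)

def accuracy(preds, labels):
    """Fraction of predictions matching the ground-truth labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Synthetic population: group A's true base rate of the flagged outcome
# is 20%; group B's is 60%.
groups = ["A"] * 10 + ["B"] * 10
labels = [1, 1] + [0] * 8 + [1] * 6 + [0] * 4

# A perfectly accurate predictor reproduces the labels exactly -- but it
# violates statistical parity, because the base rates differ (20% vs 60%).
perfect = list(labels)

# A parity-constrained predictor must flag the same fraction (here 40%)
# of each group: with only 2 true positives in group A it must flag 2
# group-A negatives, and with 6 true positives in group B it must miss 2.
parity = [1] * 4 + [0] * 6 + [1] * 4 + [0] * 6

print(accuracy(perfect, labels))            # 1.0
print(positive_rate(perfect, groups, "A"))  # 0.2
print(positive_rate(perfect, groups, "B"))  # 0.6
print(accuracy(parity, labels))             # 0.8
print(positive_rate(parity, groups, "A"))   # 0.4
print(positive_rate(parity, groups, "B"))   # 0.4
```

The point of the sketch is only that the two desiderata pull apart mechanically: once base rates differ, a legislature choosing "statistical parity" as *the* legal meaning of fairness is implicitly choosing to give up some predictive accuracy, whether or not it contemplated that trade-off.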
Highly recommended. Download it while it's hot!