Jacob Eisler has posted this draft on SSRN (forthcoming, Emory Law Journal). Here is the abstract:
Data analysis has transformed the legal academy, and is now poised to do the same to constitutional law. In the latest round of partisan gerrymandering litigation, courts have used quantitative tests to define rights violations and strike down legislative districting plans across the country. Statistical thresholds have thus been enshrined as constitutional protections, and courts have recast themselves as agents of discretionary policy.
This Article describes how this revolutionary change subverts the judicial role and undermines the rule of law. Constitutional law ensures that government conduct respects principles of neutrality. Government action is unconstitutional when it reflects intentionality that violates these principles. In other words, constitutional rights are ‘input-monitoring’, whereas data analysis can only produce informational outputs. Because quantitative methods cannot identify the inputs that violate these principles, they are inadequate to define constitutional wrongs. The only appropriate role of metrical analysis is to provide evidence of rights-violating intentionality. If quantitative outcomes are used to define rights, courts act as quasi-regulatory entities that compete with democratically elected branches.
The law of partisan gerrymandering needs a new principle, not new metrics. The best principle for identifying partisan gerrymandering is the right to fair representation, which is violated when legislatures seize partisan advantage in the democratic process. Quantitative analysis should have the sole function of proving that alleged partisan gerrymanders seek such advantage.
This Article thus identifies a novel and troubling trend in constitutional law and describes how it dominates a topic of immediate practical importance. It then offers a general framework for conceptualizing rights protection and applies that framework to this pressing doctrinal issue.