How do we know whether judges of different backgrounds are "biased"? We review the substantial political science literature on judicial decision-making, paying close attention to how judges' demographics and ideology can influence or structure their decisions. As the research shows, characteristics such as race, ethnicity, and gender can sometimes predict judicial decision-making in limited kinds of cases; the literature also suggests, however, that these characteristics are far less important in shaping or predicting outcomes than ideology (or partisanship), which in turn correlates closely with gender, race, and ethnicity. This leads us to conclude that it is questionable to assume that judges of different backgrounds are biased merely because they rule differently: given that the application of the law rarely yields a single "correct" answer, it is no surprise that judges' decisions vary with their personal backgrounds and, more importantly, with their ideology.
Though used frequently in machine learning, boosted decision trees are largely unused in political science, despite their many useful properties. We explain how to use one variant of boosted decision trees, AdaBoosted decision trees (ADTs), for social science predictions. We illustrate their use by examining a well-known political prediction problem, predicting U.S. Supreme Court rulings. We find that our ADT approach outperforms existing predictive models. We also provide two additional examples of the approach, one predicting the onset of civil wars and the other predicting county-level vote shares in U.S. presidential elections.
Supreme Court justices employ law clerks to help them perform their duties. We study whether these clerks influence how justices vote in the cases they hear. We exploit the timing of the clerkship hiring process to link variation in clerk ideology to variation in judicial voting. To measure clerk ideology, we match clerks to the universe of disclosed political donations. We find that clerks exert modest influence on judicial voting overall, but substantial influence in cases that are high-profile, legally significant, or close decisions. We interpret these results to suggest that clerk influence occurs through persuasion rather than delegation of decision-making authority.
Do Justices telegraph their preferences during oral arguments? We demonstrate that Justices implicitly reveal their leanings during oral arguments, even before arguments and deliberations have concluded. Specifically, we extract the emotional content of over 3,000 hours of audio recordings spanning 30 years of oral arguments before the Court. Using only the level of emotional arousal in each of the Justices' voices during these arguments, as measured by their vocal pitch, we are able to accurately predict many of their eventual votes, while using none of the text or substantive content. These predictions are statistically and practically significant and robust to including a range of controls. Our findings suggest that mannerisms that may be subconscious, such as vocal pitch, carry information that basic legal, political, and textual variables do not, and can be used to predict the decisions of even elite political actors.
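The abstract does not publish its audio pipeline, but the first step, measuring vocal pitch, can be sketched with a simple autocorrelation-based fundamental-frequency estimator. The synthesized sine wave below stands in for a voice recording, and the function name and frequency bounds are illustrative assumptions, not the authors' actual method:

```python
# Hedged sketch: estimate vocal pitch (fundamental frequency) from an
# audio signal via the autocorrelation peak. A pure tone stands in for
# real oral-argument audio.
import numpy as np

sr = 16000                          # sample rate (Hz)
t = np.arange(sr // 4) / sr         # 0.25 seconds of audio
y = np.sin(2 * np.pi * 220.0 * t)   # stand-in "voice" at 220 Hz

def estimate_pitch(signal, sr, fmin=75.0, fmax=300.0):
    """Return the dominant fundamental frequency in [fmin, fmax] Hz.

    The autocorrelation of a periodic signal peaks at lags equal to
    multiples of its period; we search the lag range corresponding to
    a plausible human pitch range.
    """
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

pitch = estimate_pitch(y, sr)
```

In a study like the one described, per-utterance pitch measurements of this kind would then serve as features in a model predicting each Justice's vote.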
JayUlfelder: One of my first projects for the Nonviolent Action Lab involves making it easier for people to find, explore, and use @crowdcounting's fantastic data on political gatherings in the U.S. We've now got a @github repo up with compiled and cleaned CCC data: t.co/3O1E31ifsg
leedrutman: I'm going to go on the record here and say that there's no reason we need any college sports, ever again. Universities are for teaching and research. Sports leagues are for sports. No reason they need to be linked. Go ahead, ratio me, if I am wrong.