New administrative datasets at ICPSR

33761 Analysis of Current Cold-Case Investigation Practices and Factors Associated with Successful Outcomes, 2008-2009

34681 Case Processing in the New York County District Attorney’s Office, 2010-2011

34903 Delivery and Evaluation of the 2012 International Association of Forensic Nurses (IAFN) National Blended Sexual Assault Forensic Examiner (SAFE) Training [UNITED STATES]

34922 Investigating the Impact of In-car Communication on Law Enforcement Officer Patrol Performance in an Advanced Driving Simulator in Mississippi, 2011

Communicating science more effectively

Here’s the new report from the National Academies of Sciences, Engineering, and Medicine (free PDF download).

Science and technology are embedded in virtually every aspect of modern life. As a result, people face an increasing need to integrate information from science with their personal values and other considerations as they make important life decisions about medical care, the safety of foods, what to do about climate change, and many other issues. Communicating science effectively, however, is a complex task and an acquired skill. Moreover, the approaches to communicating science that will be most effective for specific audiences and circumstances are not obvious. Fortunately, there is an expanding science base from diverse disciplines that can support science communicators in making these determinations.

Communicating Science Effectively offers a research agenda for science communicators and researchers seeking to apply this research and fill gaps in knowledge about how to communicate effectively about science, focusing in particular on issues that are contentious in the public sphere. To inform this research agenda, this publication identifies important influences – psychological, economic, political, social, cultural, and media-related – on how science related to such issues is understood, perceived, and used.

Doubling down on evidence

Academia isn’t like the public sector, private firms, or nonprofits. These days, people in those sectors are trying to read the tea leaves about what’s coming next. In a post-truth world, everything is negotiable, so it’s all about reading the fault lines of debates and figuring out who wants what.

I became an academic because I believe in evidence. It’s easy for critics to wrongly claim that universities are full of informational relativism, but I don’t see it. Instead I see groups of people trying to find the best ways to discover evidence about truth. The most bitter fights are about how we assemble that evidence because it isn’t easy to demonstrate causality.  

Academics are also facing the decision of whether to invest time reading the political fault lines – or to double down on evidence. 

If I were gifted at reading those political tea leaves, I would have run for office. I’m not, so I’m doubling down on evidence. I’m doing so because post-truth, like other movements, is a fad. Assuming we survive it, there will be a great demand for evidence after it fades. Somewhere, sometime, people will want evidence about how to make policy or manage organizations.

In the end, this is the primary responsibility of academics – to double down on evidence, not to translate or write op-eds or whatever. If we don’t discover, who will?

“Developing knowledge states: Technology and the enhancement of national statistical capacity”

This new paper, coauthored with Derrick Anderson of Arizona State University, is forthcoming at the Review of Policy Research. Here’s the abstract:

National statistical systems are enterprises tasked with collecting, validating and reporting societal attributes. These data serve many purposes – they allow governments to improve services, economic actors to traverse markets, and academics to assess social theories. National statistical systems vary in quality, especially in developing countries. This study examines determinants of national statistical capacity in developing countries, focusing on the impact of technological attainment. Just as technological progress helps to explain differences in economic growth, we argue that states with greater technological attainment have greater capacity for gathering and processing quality data. Analysis using panel methods shows a strong, statistically significant positive linear relationship between technological attainment and national statistical capacity.

Please feel free to contact me for a pre-publication version of the paper.
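For readers unfamiliar with the panel methods the abstract mentions, the basic idea can be sketched as a "within" (fixed-effects) estimator: demean each country's series over time to strip out unobserved country-level differences, then regress. Everything below is a synthetic illustration – the data, variable names, and coefficient are invented, not taken from the paper.

```python
import numpy as np

# Synthetic panel: 50 countries observed over 10 years (hypothetical).
rng = np.random.default_rng(0)
n_countries, n_years = 50, 10
true_beta = 0.8  # assumed effect of technological attainment on capacity

# Country-specific intercepts represent unobserved heterogeneity.
alpha = rng.normal(0.0, 2.0, size=n_countries)

tech = rng.normal(0.0, 1.0, size=(n_countries, n_years))
noise = rng.normal(0.0, 0.5, size=(n_countries, n_years))
capacity = alpha[:, None] + true_beta * tech + noise

# Within transformation: demeaning each country's series removes alpha.
tech_w = tech - tech.mean(axis=1, keepdims=True)
cap_w = capacity - capacity.mean(axis=1, keepdims=True)

# OLS slope on the demeaned data recovers the coefficient.
beta_hat = (tech_w * cap_w).sum() / (tech_w ** 2).sum()
print(round(beta_hat, 2))
```

The demeaning step is what makes this a panel method rather than a pooled regression: any country-level trait that is constant over time, observed or not, drops out before the slope is estimated.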

New commentary for Osservatorio AIR

Gary Miller and I have written a short description of our recent Cambridge University Press book Above Politics: Bureaucratic Discretion and Credible Commitment. It is posted on the site of Osservatorio AIR, a center in Rome that specializes in research and studies on impact assessment, simplification, transparency, and participation as ways of improving regulation.

A word of caution about predictive analytics

This week’s events should be interpreted as a word of caution about predictive analytics. Clearly, many models didn’t predict the outcome of the 2016 election; indeed, the vast majority of models weren’t predictive. “Models of models” (averages across models) weren’t predictive either. Even models built on highly granular data (subnational polls, polls taken at regular intervals, polls fielded by different houses using a variety of methods) missed.
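The “models of models” idea is nothing more than averaging forecasts, and the sketch below shows why that offers no protection here: an average inherits whatever error its inputs share. The model names and probabilities are invented for illustration.

```python
# Hypothetical probability forecasts from three election models.
forecasts = {
    "model_a": 0.85,
    "model_b": 0.71,
    "model_c": 0.92,
}

# A "model of models" is just the mean of the individual forecasts.
ensemble = sum(forecasts.values()) / len(forecasts)
print(round(ensemble, 3))
```

Averaging cancels errors only when they are independent; if every model misreads the same polls in the same direction, the ensemble misreads them too, which is what we saw in 2016.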

What’s the upshot? Humility. If models struggled here, how much harder is it to get predictions right when we’re developing policy for novel problems?

Don’t believe those who say that big data will solve everything.