More examples of nudges in the public sector

Here’s an interesting story about the UK’s Behavioural Insights Team, with some useful examples that might make for interesting classroom cases or exercises:

As an example, Moore pointed to a program in Lexington, Ky., where there were 7,000 overdue water bills that totaled about $4 million. A nudge in this situation meant sending a mailer alerting people to their delinquency. Some of them included handwritten notes addressing recipients by name. In less than a month, the campaign resulted in the reconciliation of $139,000 in unpaid bills.

Santiago Garces, chief innovation officer of South Bend, Ind., described BIT’s work in his city as “wizardry.” Using mapping, South Bend found that homeowners in low-income areas were less likely to take advantage of mortgage-related tax exemptions they were entitled to. This is, in many ways, what behavioral influence looks like at its best: a hard-data study upends a presumption (in this case, that lower-income families would be most likely to pursue tax breaks), and city government subsequently works to nudge a behavioral change for the good of its residents.
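If you wanted to turn the South Bend example into a classroom exercise, a minimal sketch of that kind of uptake-by-income analysis might look like the following. The file name and column names here are hypothetical stand-ins, not the actual South Bend parcel data:

```python
# Hypothetical sketch: compare tax-exemption uptake across income brackets.
# The file "parcels.csv" and the columns (median_income, has_exemption)
# are illustrative assumptions, not the real South Bend data.
import pandas as pd

parcels = pd.read_csv("parcels.csv")  # one row per owner-occupied home

# Bucket homes by neighborhood median income, then compute the share
# claiming the exemption within each bucket.
parcels["income_bracket"] = pd.qcut(
    parcels["median_income"], q=4, labels=["lowest", "low", "mid", "high"]
)
uptake = parcels.groupby("income_bracket", observed=True)["has_exemption"].mean()
print(uptake)  # lower uptake in the lowest brackets would match the finding
```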

Nudges aren’t limited to exerting public influence. Andres Lazo, director of citizen-centered design for Gainesville, Fla., said he was working to use the idea internally, a strategy Moore had seen deployed elsewhere to encourage employees to share efficiency ideas by offering incentives like recognition and prizes.

Data for dissertations, October 17, 2017

36371 The Attack on America and Civil Liberties Trade-Offs: A Three-Wave National Panel Survey, 2001-2004 http://doi.org/10.3886/ICPSR36371.v1

36622 Johns Hopkins University Prevention Research Center – Risks for Transitions in Drug Use Among Urban Adults, Baltimore City, 2008-2011 http://doi.org/10.3886/ICPSR36622.v1

36652 Afrobarometer Round 6: The Quality of Democracy and Governance in Burkina Faso, 2015 http://doi.org/10.3886/ICPSR36652.v1

36662 Eurobarometer 82.2: Quality of Transport, Cyber Security, Value Added Tax, and Public Health, October 2014 http://doi.org/10.3886/ICPSR36662.v1

36666 Eurobarometer 83.2: Perception of Security, Civil Protection, and Humanitarian Aid, March 2015 http://doi.org/10.3886/ICPSR36666.v1
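These DOIs resolve to the ICPSR study landing pages (the data downloads themselves sit behind an ICPSR login). If you want to collect the landing-page URLs in a script, a minimal sketch using Python’s requests library:

```python
# Resolve each ICPSR DOI to its study landing page. Downloading the data
# itself requires an ICPSR account; this only follows the DOI redirect.
import requests

dois = [
    "10.3886/ICPSR36371.v1",
    "10.3886/ICPSR36622.v1",
    "10.3886/ICPSR36652.v1",
    "10.3886/ICPSR36662.v1",
    "10.3886/ICPSR36666.v1",
]

for doi in dois:
    resp = requests.get(f"https://doi.org/{doi}", allow_redirects=True)
    print(doi, "->", resp.url)
```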

The problem of ephemeral data

You’ve collected data, analyzed/tortured them, written the paper, chosen a journal, and then submitted the paper for consideration. After a while, you hear back that the reviewers didn’t see enough promise to move forward, so the editor has rejected it. You change some things, rinse, and repeat. Maybe it happens again. It’s expected in a world where many journals have acceptance rates lower than 10 percent.
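That expectation follows from back-of-the-envelope arithmetic. Treating each submission as an independent draw with a 10 percent acceptance chance (an illustrative simplification, not a claim about any particular journal):

```python
# Back-of-the-envelope: if each submission were an independent draw with
# acceptance probability p, the number of submissions until acceptance is
# geometric with mean 1/p.
p = 0.10                                  # "acceptance rates lower than 10 percent"
expected_submissions = 1 / p              # = 10 journals, on average
p_first_three_rejected = (1 - p) ** 3     # chance the first three all bounce
print(expected_submissions)               # 10.0
print(round(p_first_three_rejected, 3))   # 0.729 -- rejection is the norm
```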

Finally, at some point, you receive an “R&R” – an opportunity to revise and resubmit your paper for further consideration. But the reviewers complain that the data are no longer “fresh” and require “updating.” What should you do?

Of course, most of us do whatever it takes and whatever is possible to close the R&R. The goal is to publish, so it’s natural to jump through the hoops.

The problem of ephemeral data, though, is a philosophical one. If the data are truly “stale”, how does freshening them improve inference? 

If the world is changing that quickly, won’t even fresh data be outdated by the time the paper survives review, is accepted, is typeset, is processed, and finally is printed in the journal several years later? 

And won’t the data be stale when people notice the paper several years later when assembling reading lists for their own papers, syllabi, or students?

There’s no natural solution to this problem. Because researchers and practitioners meld together as one studies the other and the other changes behavior based on research, it is inevitable that data are ephemeral. The social “data generating process” is a moving target – and researchers are themselves embedded mechanisms.

Perhaps I have this wrong, though. Comments are closed, but please feel free to correct my thinking on Twitter at @abwhitford. 

3 reasons dissertations should use mixed methods

  1. Researchers who include interviews or other types of fieldwork in their dissertation projects are more likely to discover hidden or novel quantitative datasets. Researchers with quantitative skills are more likely to extract useful information from those sorts of datasets.
  2. Getting hired requires satisfying diverse constituencies. It’s good to have many ways to talk with the people you meet during interviews. Some of those constituencies will probably value fieldwork.
  3. The dissertation is a rehearsal – an opportunity to practice, in a large-scale project, all of those methods you learned in class. Why emphasize only a subset?

Are there other, less-particularistic reasons? Comments are closed but feel free to respond to me on Twitter at @abwhitford.