More examples of nudges in the public sector

Here’s an interesting story about the UK’s Behavioural Insights Team, with some useful examples that might make for good classroom cases or exercises:

As an example, Moore pointed to a program in Lexington, Ky., where there were 7,000 overdue water bills that totaled about $4 million. A nudge in this situation meant sending a mailer alerting people to their delinquency. Some of them included handwritten notes addressing recipients by name. In less than a month, the campaign resulted in the reconciliation of $139,000 in unpaid bills.

Santiago Garces, chief innovation officer of South Bend, Ind., described BIT’s work in his city as “wizardry.” Using mapping, South Bend found that homeowners in low-income areas were less likely to take advantage of mortgage-related tax exemptions they were entitled to. This is, in many ways, what behavioral influence looks like at its best: a hard-data study upends a presumption (in this case, that lower-income families would be the most likely to pursue tax breaks), and city government subsequently works to nudge a behavioral change for the good of its residents.

Nudges aren’t limited to influencing the public. Andres Lazo, director of citizen-centered design for Gainesville, Fla., said he was working to apply the idea internally – a strategy Moore had seen deployed elsewhere to encourage employees to share efficiency ideas by offering incentives like recognition and prizes.

Design thinking in the public sector

I’m currently working on a project that reviews notable examples of the use of design thinking in the public sector. It’s centered on The Lab at OPM, USDS, and 18F, but I’m also learning about great initiatives at CDC, USDA, the VA, and Education, and in cities like Philadelphia and states like Rhode Island. 

This is a quick bleg asking for other hints or directions. Are there people I should know about who are pushing the envelope?

Comments are closed but you can reach me on Twitter at @abwhitford. 

The speed of information and the instability of public affairs

In 2013, it was estimated that 90 percent of the world’s data had been created in the previous two years. I can only imagine how much faster information is being created these days.

Most decision makers, though, process information just as they did 20 or 50 years ago – on paper, in bite-sized pieces. Some might even claim that today’s decision makers, at least in politics, are less capable of processing complex, high-dimensional information than their predecessors were.

What’s the impact of this imbalance? It’s easy to speculate, but I think there’s an argument to be made that one key outcome will be instability. 

As new data arrive at speed, the existing volume of information grows and comparability becomes harder. The data become more multidimensional, and the aggregation (or dimension-reduction) problem gets harder.
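To make the aggregation problem concrete, here’s a minimal sketch (my own toy illustration, using numpy and scikit-learn – nothing here comes from the post itself): as the number of dimensions grows, a fixed low-dimensional summary captures less and less of the total variation.

```python
# Toy illustration: a fixed 2-dimensional summary explains a shrinking
# share of the variance as dimensionality grows. The simulated data and
# the choice of PCA as the "aggregation" step are assumptions for
# illustration only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 1000  # observations

for d in (5, 50, 500):                 # increasing dimensionality
    X = rng.normal(size=(n, d))        # unstructured data in d dimensions
    pca = PCA(n_components=2).fit(X)   # a fixed low-dimensional summary
    share = pca.explained_variance_ratio_.sum()
    print(f"{d:4d} dimensions -> 2 components explain {share:.1%} of the variance")
```

With unstructured data, the two components capture roughly 2/d of the variation, so the same summary that worked at 5 dimensions tells you almost nothing at 500.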

We can hope that machine learning and data-mining technologies, probably fueled by AI, will save us. But I’m skeptical. Instead, I think it’s likely that instability increases. And that the demand for low-dimensional “rules of thumb” increases. And that the probability of failure of those rules also increases – if only because high-speed data means the world is changing quickly.
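Here’s an equally toy sketch of that last claim (again my own construction, with drift values assumed purely for illustration): a rule of thumb calibrated on early data misfires more and more as the underlying process moves.

```python
# Toy illustration: a fixed threshold calibrated in period 0 flags
# roughly half of all observations; as the process drifts, the same
# rule flags nearly everything. The drift values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
threshold = 0.0  # rule of thumb: flag anything above the period-0 mean

for t, drift in enumerate((0.0, 0.5, 1.0, 2.0)):
    x = rng.normal(loc=drift, size=10_000)  # the world keeps moving
    flagged = (x > threshold).mean()
    print(f"period {t}: drift = {drift:.1f}, rule flags {flagged:.1%}")
```

The rule itself never changes – the world underneath it does.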

Maybe I’m wrong. Comments are closed but feel free to correct my thinking on Twitter at @abwhitford. 

The problem of ephemeral data

You’ve collected data, analyzed/tortured them, written the paper, chosen a journal, and then submitted the paper for consideration. After a while, you hear back that the reviewers didn’t see enough promise to move forward so the editor has rejected it. You change some things, rinse, and repeat. Maybe it happens again. It’s expected in a world where many journals have acceptance rates lower than 10 percent. 

Finally, at some point, you receive an “R&R” – an opportunity to revise and resubmit your paper for further consideration. But the reviewers complain that the data are no longer “fresh” and require “updating.” What should you do?

Of course, most of us do whatever it takes, and whatever is possible, to close the R&R. The goal is to publish, so it’s natural to jump through the hoops.

The problem of ephemeral data, though, is a philosophical one. If the data are truly “stale,” how does freshening them improve inference?

If the world is changing that quickly, won’t even fresh data be outdated by the time the paper survives review, is accepted, is typeset, is processed, and finally is printed in the journal several years later? 

And won’t the data be stale when people notice the paper several years later when assembling reading lists for their own papers, syllabi, or students?

There’s no natural solution to this problem. Because researchers and practitioners meld together – one studies the other, and the other changes behavior in response to the research – data are inevitably ephemeral. The social “data generating process” is a moving target, and researchers are themselves embedded in the mechanism.
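One way to see the philosophical point is a toy simulation (my own construction, with an assumed drift rate and timeline – nothing here comes from any real study): if the underlying process keeps moving, then even data “freshened” at the R&R stage describe a world that no longer exists by the time the paper is printed.

```python
# Toy illustration of inference against a moving target. The drift
# rate and the timeline (data at year 0, R&R update at year 3,
# publication at year 5) are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(2)
drift_per_year = 0.3                      # assumed drift in the true mean

def true_mean(year):
    return drift_per_year * year

stale = rng.normal(true_mean(0), 1, 500)  # original data, year 0
fresh = rng.normal(true_mean(3), 1, 500)  # "updated" data at the R&R, year 3
publication_year = 5                      # typeset and printed later still

for label, sample, year in (("stale", stale, 0), ("fresh", fresh, 3)):
    bias = sample.mean() - true_mean(publication_year)
    print(f"{label} data (collected year {year}): bias at publication = {bias:+.2f}")
```

Freshening shrinks the bias but cannot eliminate it, because the target has moved again by publication – which is the point.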

Perhaps I have this wrong, though. Comments are closed, but please feel free to correct my thinking on Twitter at @abwhitford. 

Organization theory symposia in public management?

A quick bleg based on a trend I’m noticing these days. I’m seeing CFPs for organizational behavior (OB) topics in public management on a regular basis. Have any journals hosted symposia for organization theory (OT) papers recently? And when I say OT, I mean true theory-development papers – not statistical models, experiments, etc.

I’m sure they’re out there. Comments are closed but please feel free to educate me via Twitter at @abwhitford.