On a recent sunny afternoon, my friend, Mary, and some neighbors sat on her back deck chatting. Suddenly a remote-controlled drone buzzed no more than six feet above them, circling, before darting off above the trees. Was it a neighborhood kid with a toy? An Amazon Prime Air test flight? A police department surveillance sweep? The range of possibilities was broad and, honestly, a bit disconcerting. Drone technology is nascent. What will afternoons in our backyards look like when it scales?
The experience underscores a fundamental challenge we social innovators face in our tech-driven, rapidly evolving modern world. Technology affords incredibly enticing opportunities to scale for positive social impact. Tech tools can motivate collective action (think Ushahidi or Twitter and political protest) or even disrupt entire systems (think Uber and transportation or Airbnb and hospitality). But do we take into account taxi drivers’ livelihoods or the impact on neighborhood culture as we hail a cab with an iPhone swipe? Are we obliged to? Before we focus on scale, have we considered both the intended and unintended consequences of our efforts?
This question of intentions, both what we intend and what we don’t, is one we ask daily in our tech-development work at HopeLab, the health-focused organization of the Omidyar Group. HopeLab was founded in 2001 by Pam Omidyar, who had an idea that many thought outlandish: a video game to help kids with cancer fight their disease. Her intention was clear and provided a guiding principle for the unconventional team of game developers, clinicians, and young cancer patients assembled to realize her vision. At the same time, it was critical that the game be rigorously tested to address possible unintended consequences as well: Does the gameplay increase nausea or anxiety? Does it distract kids from their most important task, sticking to their meds? At the very least, does it do no harm?
Omidyar’s idea proved to be profound. Research published in the medical journal Pediatrics demonstrated that the cancer-fighting video game the HopeLab team developed, called Re-Mission, effectively boosted a sense of power and control in kids who played it, which led to better treatment adherence and a greater likelihood of survival. The game was distributed to several hundred thousand kids and became a powerful prototype for the work we do today designing technology to support healthy behavior.
So, as mission-minded social innovators looking to tap into the power of technology, how can we set our intentions to help ensure positive impact before we set our sights on scale? The HopeLab team has refined a recipe, borrowing from the disparate domains of behavioral psychology, design thinking, and Silicon Valley product development:
- Identify the observable behavior you want to see in the world.
- Mine existing research and talk to the people you hope to serve to understand the psychology that motivates or prevents that behavior.
- Design technology to create an experience that changes the psychology to support the behavior you seek.
- Test and evaluate to determine the consequences, both intended and unintended, of your work.
It’s relatively straightforward, though it can be tricky to execute. In practice, step three has been a place where the prototype-test-iterate approach common in Silicon Valley has proven valuable. Short cycles of development and evaluation enable alignment between intention and impact by creating opportunities for course correction along the way. Evaluation takes place at every step, including soliciting and incorporating feedback from the people we hope to serve. In fact, engaging target audiences in the design process can make or break a project. Doing so minimizes the likelihood that deleterious unintended consequences will go unrecognized.
Here’s an example we’re living now at HopeLab. In 2014 we joined a coalition of business, government, and education leaders in South Carolina to help create tools to prepare kids for learning. We were eager to participate for two reasons. First, many of the traits that support learning readiness (like responsibility, intention, and self-regulation) are behavioral skills that also support healthy behavior. Second, how often do business, government, and local educators come together to resolve seemingly intractable challenges? We were intrigued. We quickly realized that teachers would be critical stakeholders to engage first in the design of any solutions, because research shows that teachers’ capacities deeply influence student success. In fact, well-intentioned education efforts elsewhere have actually worsened outcomes in students when they fail to focus on enhancing teachers’ capabilities.
Our awareness of this potential unintended consequence led us to spend many rich hours with teachers in their schools (“in context”), assisting them in creating tools that they can use themselves and test with students in their classrooms. We are elbow-deep in our four-step recipe as we refine and improve the offerings. The work is time-intensive, humbling, and essential: without insight from teachers’ lived experiences and their investment in the design process, any idea we parachuted in would have felt out of context and invasive. In other words, we could quickly have ended up in the land of unintended consequences and negative impact.
By taking these steps we can be more confident both in what we’re creating and in its value as a scalable solution to the problem we hope to solve. The road to scale is paved with good intentions, but good intentions alone are not enough. Our work must be tempered with attention to the unintended consequences of our efforts.