Stanford And 69 Scientists Say That Brain-Training Games Make Exaggerated Claims

So what do you do when science won’t back up your product? If you’re Lumosity, you DIY more research.

[Photo: Flickr user John Ragai]

When it comes to brain-training games, scientific research is slow.


“Say you want to run an experiment with 30 people,” Lumosity co-founder Michael Scanlon explained to me back in 2012 (the company did not respond to my more recent request for comment). “And each of those people needs to train for at least 10 or 20 hours over the course of a month or two. You start adding it up, and it suddenly becomes a huge burden to run an experiment in the traditional laboratory way. You bring people in every day and spend a couple of hours with them. There have been cognitive training studies going on in the last 30 years, but they tend to develop slowly, and innovations, especially early on, were pretty slow. Basically, because of that format, it slows down the innovation.”

The idea that playing a game could make you smarter, more alert, and able to learn faster is incredibly appealing–so appealing that Lumosity, the best known company in brain training, has been able to sign up 50 million users; subscriptions run $14.95 per month for the service.

What kind of entrepreneur would wait around for scientific consensus to catch up?

Instead, Scanlon and Lumosity took some of the science into their own hands by establishing a research arm of the company, which looks at data from users in order to solidify some of the games’ claims. The team has been able to publish findings in at least one peer-reviewed scientific journal, and its internal research allows Lumosity to promote games “based on what we’ve learned from analyzing aggregated data from 60+ million members” and that “analysis of our database shows that just 10-15 minutes of Lumosity training per day can lead to improvements in Lumosity over time.”

The company also gives its tools away to students so that it can study the links between cognitive training and learning; supports further research around cognitive training through a grant program; and lends its assessment tools to researchers who are studying cognitive training. Other companies, like NeuroAD and Cogmed, have taken similar approaches.

Encouraging more research–by funding it, lending your assessment tools to it, or even doing it yourself–is a smart move for a business eager to reach (positive) scientific consensus about its products, especially since it can use that research to make the products better.


But a group of 69 researchers from a swath of prestigious universities wants to remind you that what Lumosity and other companies have accomplished is in no way the same thing as scientific consensus.

This week, those researchers signed a statement from the Stanford Center on Longevity and the Berlin Max Planck Institute for Human Development that explained why they think brain-game claims are exaggerated and at times misleading.

Here is what we can actually say at this point about brain-training games: “Cognitive training produces statistically significant improvement in practiced skills that sometimes extends to improvement on other cognitive tasks administered in the lab,” the statement says. “In some studies, such gains endure, while other reports document dissipation over time.”

The statement adds that pretty much any activity–learning a language, practicing a motor skill, playing a game–can change neural systems in ways that make learning easier, and it’s not clear that playing brain games works better than anything else. One study, for instance, showed that people who played the entertainment video game Portal 2 for eight hours improved more on cognitive skills tests than did people who played Lumosity games for the same amount of time.

Lumosity can (and likely will) continue to conduct and promote research around cognitive training. Some of that research has shown positive results, but the academic world is waiting for findings produced by independent researchers, funded by independent sources, and replicated at multiple sites before it gives brain training its approval.

So what should you do?


“Before investing time and money on brain games, consider what economists call opportunity costs,” the scientists’ statement advises. “If an hour spent doing solo software drills is an hour not spent hiking, learning Italian, making a new recipe, or playing with your grandchildren, it may not be worth it.”

About the author

Sarah Kessler is a senior writer at Fast Company, where she writes about the on-demand/gig/sharing "economies" and the future of work.