Are brain games mostly BS?

Some studies say they help, others say just the opposite. Let’s dig in to find the truth.

You’ve probably seen ads for apps promising to make you smarter in just a few minutes a day. Hundreds of so-called “brain training” programs can be purchased for download. These simple games are designed to challenge mental abilities, with the ultimate goal of improving performance on important everyday tasks.


But can just clicking away at animations of swimming fish or flashed street signs on your phone really help you improve the way your brain functions?

Two large groups of scientists and mental health practitioners published consensus statements, months apart in 2014, on the effectiveness of these kinds of brain games. Both included people with years of research experience and expertise in cognition, learning, skill acquisition, neuroscience and dementia. Both groups carefully considered the same body of evidence available at the time.

Yet, they issued exactly opposite statements.

One concluded that “there is little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” The other argued that “a substantial and growing body of evidence shows that certain cognitive training regimens can significantly improve cognitive function, including in ways that generalize to everyday life.”

These two contradictory statements highlight a deep disagreement among experts, and a fundamental dispute over what counts as convincing evidence for something to be true.

Then, in 2016, the U.S. Federal Trade Commission entered the fray with a series of rulings, including a US$50 million judgment (later reduced to $2 million) against one of the most heavily advertised brain training packages on the market. The FTC concluded that Lumos Labs’ advertisements—touting the ability of its Lumosity brain training program to improve consumers’ cognition, boost their performance at school and work, protect them against Alzheimer’s disease and help treat symptoms of ADHD—were not grounded in evidence.

What are we to believe?

In light of conflicting claims and scientific statements, advertisements and government rulings, what are consumers supposed to believe? Is it worth your time and money to invest in brain training? What types of benefits, if any, can you expect? Or would your time be better spent doing something else?

I’m a cognitive scientist and member of Florida State University’s Institute for Successful Longevity. I have studied cognition, human performance and the effects of different types of training for nearly two decades. I’ve conducted laboratory studies that have directly put to the test the ideas that are the foundation of the claims made by brain training companies.

Based on these experiences, my optimistic answer to the question of whether brain training is worth it would be “we just don’t know.” But the actual answer may very well be “no.”

My colleagues and I have argued that most of the pertinent studies fall far short of being able to provide definitive evidence either way.

Some of these problems are statistical in nature.

Brain training studies often examine the effect of training on multiple cognitive tests—of attention, memory, reasoning ability and so on—over time. This strategy makes sense in order to uncover the breadth of potential gains.

But, for every test administered, there’s a chance that scores will improve by chance alone. The more tests administered, the greater the chance that researchers will see at least one false alarm.

Brain training studies that include many tests and then report only one or two significant results cannot be trusted unless they control for the number of tests being administered. Unfortunately, many studies do not, calling their findings into question.
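To see how quickly this problem grows, here is a minimal sketch of my own (not part of the original studies) that assumes each cognitive test is independent and judged at the conventional 0.05 significance threshold. With ten outcome measures, the chance of at least one spurious “significant” result already approaches 40 percent; a simple Bonferroni correction divides the threshold by the number of tests to keep the overall false-alarm rate in check.

# Illustrative sketch: how the chance of at least one false alarm grows with
# the number of cognitive tests, assuming independent tests and a 0.05
# per-test significance threshold (assumptions made for illustration only).

alpha = 0.05  # conventional per-test false-positive rate

for k in (1, 5, 10, 20):
    # Probability that at least one of k independent tests looks
    # "significant" purely by chance: 1 - (1 - alpha)^k
    familywise_rate = 1 - (1 - alpha) ** k
    # Bonferroni correction: divide the threshold by the number of tests,
    # which holds the family-wise false-alarm rate near 0.05.
    corrected_threshold = alpha / k
    print(f"{k:2d} tests: P(at least one false alarm) = {familywise_rate:.2f}; "
          f"Bonferroni-corrected threshold = {corrected_threshold:.4f}")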

The placebo effect?

Another design problem has to do with inadequate control groups. To claim that a treatment had an effect, the group receiving the treatment needs to be compared to a group that does not receive it. It’s possible, for example, that people receiving brain training improve on an assessment test simply because they’ve already taken it once before training and then again after. If the control group also takes the test twice, improvements driven purely by such practice effects can be ruled out.

Many studies that have been used to support the effectiveness of brain training have compared the effect of brain training to a control group that did nothing. The problem is that any difference observed between the training group and the control group in these cases could easily be explained by a placebo effect.

Placebo effects are improvements that are not the direct result of a treatment, but due to participants expecting to feel or perform better as a result of having received a treatment. This is an important concern in any intervention study, whether aimed at understanding the effect of a new drug or a new brain training product.

Researchers now realize that doing something generates a greater expectation of improvement than doing nothing. Recognition of the likelihood of a placebo effect is shifting standards for testing the effectiveness of brain games. Now studies are much more likely to use an active control group made up of participants who perform some alternative non-brain-training activity, rather than doing nothing.

Still, these active controls don’t go far enough to control for expectations. For instance, it’s unlikely that a participant in a control condition featuring computerized crossword puzzles or educational videos will expect as much improvement as a participant assigned to try fast-paced and adaptive commercial brain training products—products specifically touted as being able to improve cognition. Yet, studies with these inadequate designs continue to claim to provide evidence that commercial brain training works. It remains rare for studies to measure expectations in order to help understand and counteract potential placebo effects.

Participants in our studies do develop expectations based on their training condition, and are especially optimistic regarding the effects of brain training. Unmatched expectations between groups are a serious concern, because there is growing evidence suggesting cognitive tests are susceptible to placebo effects, including tests of memory, intelligence, and attention.

Should it work?

There’s another important question that needs to be addressed: Should brain training work? That is, given what scientists know about how people learn and acquire new skills, should we expect training on one task to improve the performance of another, untrained task? This is the fundamental claim being made by brain training companies—that engaging in games on a computer or mobile device will improve your performance on all sorts of tasks that are not the game you’re playing.

As one example, “speed of processing training” has been incorporated into commercial brain training products. The goal here is to improve the detection of objects in the periphery, which can be useful in avoiding an automobile crash. A brain game may take the form of nature scenes with birds presented in the periphery; players must locate specific birds, even though the image is presented only briefly. But can finding birds on a screen help you detect and avoid, for example, a pedestrian stepping off the curb while you’re driving?

This is a crucial question. Few people care much about improving their score on an abstract computerized brain training exercise. What is important is improving their ability to perform everyday tasks that relate to their safety, well-being, independence and success in life. But over a century of research suggests that learning and training gains tend to be extremely specific. Transferring gains from one task to another can be a challenge.

Consider the individual known as SF, who was able, with extended practice, to improve his memory for numbers from seven to 79 digits. After training, he was able to hear a list of 79 randomly generated digits and immediately repeat this list of numbers back, perfectly, without delay. But he could still remember and repeat back only about six letters of the alphabet.

This is just one of many examples in which individuals can vastly improve their performance on a task, but demonstrate no training gains at all when presented with an even slightly different challenge. If the benefits of training on remembering digits do not transfer to remembering letters, why would training on virtual bird-spotting transfer to driving, academic performance or everyday memory?

Staying mentally spry

Brain training programs are an appealing shortcut, a “get smart quick” scheme. But improving or maintaining cognition is likely not going to be quick and easy. Instead, it may require a lifetime—or at least an extended period—of cognitive challenge and learning.

If you’re worried about your cognition, what should you do?

First, if you do engage in brain games, and you enjoy them, please continue to play. But keep your expectations realistic. If you’re playing solely to obtain cognitive benefits, instead consider other activities that might be as cognitively stimulating, or at least more fulfilling—like learning a new language, for instance, or learning to play an instrument.

Some evidence suggests that physical exercise can potentially help maintain cognition. Even if exercise had no effect on cognition at all, it has clear benefits to physical health – so why not move your body a bit?

The most important lesson from the literature on training is this: If you want to improve your performance on a task that’s important to you, practice that task. Playing brain games may only make you better at playing brain games.


Walter Boot is Professor of Cognitive Psychology at Florida State University. This article was originally published at The Conversation and has been republished under Creative Commons. Read the original article.

