When the Mars Rover was in production in the '90s, NASA senior computer scientist Rich Levinson noticed a limitation in its ability to make reactive decisions. The Rover could avoid falling off a cliff, but it didn't have the capability to backtrack or plan other routes of navigation. That's when he learned about a little-known term and much-needed brain process called "executive function."
According to the National Center for Disabilities, executive function is a set of mental processes needed to perform activities such as planning, organizing, strategizing, paying attention to and remembering details, and managing time and space. Impairment of executive function, which ranges from mild to severe, affects more than 16 million people, according to the most recent CDC report. The growing senior population is particularly at risk: seniors are expected to make up 20% of the total U.S. population by 2050, according to the U.S. Census Bureau.
At the time, executive function wasn't talked about much among clinicians, let alone the public, yet Levinson connected the dots.
"I was looking at the brain's operational properties at the same time as we were studying autonomy for robotics, and realized there was a connection between executive function and the robotic systems in which we were trying to combine planning and reaction with artificial intelligence," he recalls.
"If you increase your planning time, you can explore more possibilities and then compile them down into reflexes. Then when you get into a situation where you have to make a very quick reactive response, you actually can do a little more." In 1996, Levinson started the NASA spin-off BrainAid to address those very problems.
Enter the Planning and Execution Assistant and Trainer (PEAT), BrainAid's customizable smart-planning software. Unlike other task-management systems, PEAT automatically reorganizes a person's schedule based on their real-time task approval and customizable app integration. PEAT's cloud-based dashboard allows clinicians to view and share data collected from users' actions. That clinical integration helped teachers at the autism organization PACE log the outbursts and behaviors of students with autism.
PEAT's customizable plug-in strategies help patients in overwhelming situations: selecting an icon walks them through therapist-suggested prompts such as "Wait five seconds before you speak" or "Take a walk."
"We log when the patient presses the reaction icon and selects a coping strategy, so the therapist knows when they are doing the strategies on their own," Levinson says.
Entering its seventh year funded by the U.S. Department of Defense, PEAT is helping ameliorate another four-lettered cognitive killer—PTSD. The Palo Alto Department of Veterans Affairs' clinical neuropsychologist Harriet Katz Zeiner says PEAT is a much-desired invisible aid for the vets. "We don't find PEAT being rejected like other [cognitive aids] because it's great at being unobtrusive in a social setting," she says. "It's not like a cane, and the world doesn't have to know that it truly is an assistant."
PEAT began integrating with wearable assistance in 2007, with an experimental RFID reader bracelet called iBracelet that ultimately failed but set the stage for a new wave of wearables. Today, BrainAid is working with the AFrame Digital smartwatch to monitor heart rates and send the data to PEAT, which will then automatically cue the user with coping strategies. Last summer, the company began integrating with the Pebble smartwatch, which acts as a leash for mobile devices and displays tasks on its simple interface.
By reducing reliance on motor skills, PEAT's voice input is a game changer, especially for physically disabled veterans who can't use touch screens. But while voicing commands is one thing, holding a conversation is another.
"Siri handles a search command, but in a conversation I expect you to reply, and there's an overall structure to it," Levinson explains. "We're trying to build a conversational assistant that understands what having a conversation about adding a task is actually like."
BrainAid is also prepping in-home sensor systems for two veterans, with a case study beginning this month. Strategically placed motion sensors on household items such as medicine boxes, doors, and refrigerator pouches will signal PEAT when those items are moved.
"If by a certain time of day the meds box hasn't been opened, they'll get a cue; otherwise we won't have to cue them to take their meds," Levinson says. Zeiner says these sensors could also help with forgetfulness. "A lot of times people will forget where they are, and PEAT can remind you what room you're in," she says.
Zeiner also says it takes a big load off caregivers, who often experience "caregiver burnout"—a contributing factor in high rates of elder abuse. "As you walk out the door, it can remind you to bring your keys, cane, and other things," she says. "When the schedule starts slipping, the caregiver doesn't have to take care of two schedules." Levinson also hopes to use motion sensors to monitor nighttime wandering.
In-home and wearable sensors are just the kind of sustainable, stigma-free assistance that Aging2.0 cofounder Katy Fike wants to cultivate in her new aging accelerator, GENerator. Fike says independence is key to sustaining the growing senior population. "Right now we [seniors and caretakers] tend to go to things like 24-hour care or move into memory-care units, and we're going to need solutions like PEAT to maintain our independence, because our health care system cannot support that scale at the rate we're going right now," Fike says. She also thinks Levinson's ideas could benefit people outside of the disabled community.
"There's great potential to help a lot of people—not just the cognitively impaired. Since it started with the hardest users first, it can have many uses for planning," she says. "But you couldn't do it the other way around."
With the help of Fike and GENerator, Levinson is looking for more consumer-friendly tweaks and funding to expand his currently high-end business model without losing sight of the bigger, brainier picture.
"The AI community said, 'Let's try to put this [real-time planning and reacting] together for a few years,' then said, 'Ah, that's tough,' and put it down, so now we have these two different camps doing two different things," Levinson says fervently. "The theoretical concept of 'Can we really take the computer science and the neuropsychology and build a better model of the human frontal lobe?' is still a goal, but mostly now we're focused on making it work for people," he says. "But they [the AI community] still have never really, in my mind, resolved this issue."