
Mind-reading technology is closer than you think

Companies and governments are getting closer to mind-reading technology, and right now, no laws prevent the NSA from spying on our brains or from companies collecting brain data and selling the information to third parties.


Try to remember every thought that crossed your mind at work today—from the benign, like what to have for lunch, to the inflammatory, like why that supervisor is such a jerk. Now imagine if your boss had access to all of those thoughts and feelings. Sounds crazy, right? It’s rapidly becoming closer to reality.


Tesla CEO Elon Musk’s company Neuralink just this summer announced that human trials will move forward next year for an implantable device that can read a user’s mind; scientists at UCSF recently released the results of a brain activity study, backed by Facebook, that shows it’s possible to use brain-wave technology to decode speech; in 2018, Nissan unveiled Brain-to-Vehicle technology that would allow vehicles to interpret signals from the driver’s brain; and Nielsen is already using neuroscience to capture nonconscious aspects of consumer decision-making.

There are, of course, both therapeutic and nontherapeutic reasons why people are interested in trying to figure out how to decode the brain. Musk’s clinical trial, for example, will focus on patients with complete paralysis due to an upper spinal cord injury. The hope is that the implanted mechanism will allow the user to control virtually any device, like a smartphone or an electric vehicle, with only their mind, which would be revolutionary for patients with physical limitations.

But Musk’s plans for Neuralink aren’t entirely altruistic. He believes every human being will eventually wear his device as a way to keep up with artificial intelligence. For now, though, a consumer-grade EEG device, similar to a fitness tracker, is the most likely to affect us in the short term. It’s noninvasive, relatively inexpensive, and already being deployed in places like China and Australia.

In April, the South China Morning Post reported that “government-backed surveillance projects are deploying brain-reading technology to detect changes in emotional states in employees on the production line, the military and at the helm of high-speed trains.” According to The Sydney Morning Herald, several Australian mining companies have adopted a SmartCap, a device that looks like a baseball cap but is lined with EEG electrodes on the interior rim, to “reduce the impact of fatigue on the safety and productivity of their staff.”

In some cases, then, these detectors are being used to improve worker safety. The device might warn a miner of carbon monoxide exposure before they suffer brain injury, or send an alert to a truck or train driver to pull over if excessive drowsiness is detected.
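The drowsiness alerts described above typically rest on a simple signal-processing idea: as a person grows fatigued, slower theta-band brain waves tend to gain power relative to the alpha band. The sketch below is a hypothetical, simplified illustration of that approach in Python (the thresholds, band edges, and simulated signals are illustrative assumptions, not any vendor’s actual algorithm):

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def is_drowsy(epoch, fs, threshold=1.5):
    """Flag drowsiness when theta (4-8 Hz) power dominates alpha (8-13 Hz).

    The 1.5 ratio threshold is an illustrative assumption; a real system
    would calibrate it per user and per device.
    """
    theta = band_power(epoch, fs, 4, 8)
    alpha = band_power(epoch, fs, 8, 13)
    return theta / alpha > threshold

# Simulated one-second EEG epochs at 256 Hz: an alert (alpha-dominant)
# signal versus a drowsy (theta-dominant) one.
fs = 256
t = np.arange(fs) / fs
alert_epoch = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 6 * t)
drowsy_epoch = 0.3 * np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 6 * t)

print(is_drowsy(alert_epoch, fs))   # False
print(is_drowsy(drowsy_epoch, fs))  # True
```

In practice, a device like the SmartCap would run a check like this continuously over short sliding windows of scalp-recorded EEG, triggering an alert only after several consecutive drowsy epochs to avoid false alarms.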

But Nita Farahany, a leading scholar on the ethical, legal, and social implications of emerging technologies, who gave a TED Talk on the subject last November, warns that there are currently no safeguards in place to protect against inappropriate use of the data.


“There’s a significant societal interest in being able to listen to the brain activity of, say, a trucker or a pilot,” says Farahany. “But we need space for mental reprieve. It’s fundamental to what it means to be a human. Governments are starting to adopt broad privacy legislation, and some of that may implicate when and if companies can track this information. This is data like any other type of data, but I don’t yet see governments focusing on brain data, in particular. It’s something we need to be thinking about.”

Farahany thinks early adopters in the U.S. will be in industries like gaming, healthcare, aviation, trucking, and neuromarketing. And think about the implications in business: a company could theoretically use the technology to screen candidates in job interviews, track productivity on the job, and even monitor for dissident thoughts.

At Duke, in addition to teaching, Farahany serves as principal investigator of the SLAP Lab, which is designed to bring science to bear on questions of law and policy. In 2018, the Lab ran a study in the U.S. to determine if people appreciated the sensitivity of their brain information. The outcome? Only specific types of thoughts are considered highly sensitive.

“Participants treated their Social Security number or phone conversations as most sensitive,” explains Farahany. “People don’t yet understand both what’s possible with brain technology and then the negative implications if that information was accessible by others.”

There are, however, technological limitations to EEG that make it unlikely we’ll ever reach complex thought decoding with it. The electrodes sit on the scalp, so they pick up only activity near the surface of the skull and can’t peer more deeply into the brain, where most complex thoughts and memories live. A true mind-reading device that can noninvasively decode what you’re literally thinking and feeling is at least 10 to 15 years away. And, says Farahany, we might never get there.

The most likely and most accessible way to have a true brain-machine interface is via implanted electrodes. That said, we can already tell with noninvasive technology if a person is having positive, negative, or neutral emotions, experiencing anxiety, about to have an epileptic seizure, thinking of a particular shape or simple word or simple set of numbers, or is drowsy while driving.


And right now, no laws prevent the NSA from spying on our brains or from companies collecting brain data and selling the information to third parties.

“I worry about humans’ ability to keep up with artificial intelligence,” says Farahany. “It’s fair to think of us as embarking on a new industrial revolution. There’s urgency. You have major players like Facebook and Elon Musk investing heavily in healthy populations. If we’re going to get out ahead of it and create a way to safeguard individuals, the time is now to be able to adopt these protections.”
