
What Colorado’s new law means for brain-wave privacy in the Neuralink era

Under the legislation, the same privacy measures that are extended to fingerprints and facial recognition software now apply to brainwave data.


By Ellie Stevens | 3 minute read

Lawmakers have long grappled with data privacy as it pertains to our devices and vehicles. But a new battleground is emerging in the fight over privacy: our brains.

Colorado Governor Jared Polis signed a bill that protects brain waves as “sensitive information” under the Colorado Privacy Act. Under the law, the same privacy measures that are extended to fingerprints and facial recognition software now apply to brainwave data. The law comes on the heels of the release of neurotechnologies such as Elon Musk’s Neuralink, a brain implant that can translate thought into action and that could, its supporters argue, help develop treatments for mental diseases and improve people’s focus.

The bill states that these technologies “raise particularly pressing privacy concerns given their ability to monitor, decode, and manipulate brain activity,” noting that they can cause the involuntary disclosure of information.

Why Now? 

Neurotechnology remains one of the buzziest technologies around, with the market reaching a $13.47 billion valuation in 2023 and projected to grow to $15.28 billion in 2024, according to a Yahoo Finance report.

Industry growth has brought a shift in how the technology is used, which the law cites as a reason for urgency. While neurotechnology was previously confined to labs and focused on research and rehabilitation, it is creeping toward adoption in sectors ranging from therapy to trucking. The Colorado law notes that when noninvasive neurotechnology is used outside a medical context, it is treated as a consumer product and operates without regulation or data protection standards. As the technology scales, supporters of the law hope to see more states adopt similar protections; California and Minnesota are already working to pass legislation to protect neural data.

Neurotechnologies’ Capabilities and Concerns 

According to the NeuroRights Foundation, a New York-based advocacy group, neurotechnologies put at risk our rights to personal identity, free will, mental privacy, equal access to mental augmentation, and protection from algorithmic bias. More specifically, commercial neurotechnology’s writing and reading capabilities should give us the most cause for concern, says Eran Klein, an associate professor of neurology at Oregon Health & Science University.

Writing technologies, which prompt the brain to perform specific tasks by stimulating it with electrodes, can be used to treat diseases like ALS and Parkinson’s. But they could also be used to change a person’s actions, or to alter a person’s identity by changing their desires or feelings, Klein says.

On the reading front, neurotechnologies can be used to infer what a person is thinking or feeling, regardless of whether the person wants those thoughts made public, Klein says. The technology can also be used for mental augmentation, which some say will deepen issues of inequity.

“There is a fine line between treating diseases and enhancing people’s abilities that already fall within a normal range,” Klein says. 

As this neural data now falls under the Colorado Privacy Act, consumers can access, correct, and delete their data, and companies are subject to disclosure policies and data regulations.


Doesn’t Social Media Track Me Already?

While many researchers are thrilled with the new legislation, some have taken issue with its narrow focus. 

“We forget there is already sensitive personal information at risk,” says Laura Cabrera, the chair in neuroethics at Pennsylvania State University. “The focus on neurodata is a missed opportunity [to combat other privacy risks].” 

Cabrera doesn’t see much of a difference between neurotechnology’s potential capabilities and the very real actions taken today by internet and social media companies. “Mind control is already here,” she says. “Think about how much social media controls our actions.” 

Supporters of the law, on the other hand, assert that the rapid expansion and capabilities of neurotechnology warrant this sense of urgency. “The Colorado bill is historic,” says NeuroRights cofounder and chair Rafael Yuste. “It’s the first time that neural data is legally defined and protected.”



ABOUT THE AUTHOR

Ellie Stevens is an Editorial Resident at Fast Company and an undergraduate at Northwestern University.

