Never before have biology and silicon been so well positioned to interact in a way that will unlock new forms of innovation. Former Google CEO Eric Schmidt has said that biology will undoubtedly fuel computing. Conversely, computing is already powering advances in biology. Now is the time for innovators and investors to look carefully at this collision of science and technology that is creating a new frontier.
One of the foundations of the bio revolution now underway is the knowledge base built over 13 years as scientists mapped the human genome. However, the power of that map to fuel innovation materialized only when advances in computing made DNA sequencing cheaper and faster. Today, the cost of DNA sequencing is falling faster than Moore’s Law would predict. In 2003, mapping the genome cost about $3 billion; by 2016, the cost had dropped below $1,000, and it could fall below $100 within a decade. Scientists sequenced the coronavirus responsible for COVID-19 in weeks, rather than the months it took to sequence the virus behind the original SARS epidemic.
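The "faster than Moore's Law" claim can be sanity-checked with the article's own figures. A minimal sketch, assuming the round numbers above ($3 billion in 2003, $1,000 in 2016) and taking Moore's Law as a cost halving roughly every two years:

```python
import math

# Illustrative figures from the text, not exact market data.
cost_2003, cost_2016, years = 3e9, 1e3, 13

# How many times did the cost halve over those 13 years?
halvings = math.log2(cost_2003 / cost_2016)   # roughly 21.5 halvings
halving_time = years / halvings               # years per halving

moore_halving_time = 2.0  # Moore's Law benchmark: cost halves every ~2 years
print(f"Sequencing cost halved every {halving_time:.2f} years, "
      f"vs ~{moore_halving_time:.1f} years under Moore's Law")
```

On these assumptions the sequencing cost halved roughly every seven months, several times faster than the Moore's Law benchmark.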
Now the bio revolution is increasingly about using this knowledge to engineer biology, whether to cure previously incurable diseases or to modify crops to be more heat- or drought-resistant, traits that matter ever more as the climate changes. Here, data analytics and AI have been a major catalyst. For example, by scanning for correlations among the gene sequences of large numbers of patients, their medical histories, and their responses to various therapies, researchers can discover which genetic variations are associated with particular diseases (so-called genome-wide association studies, or GWAS) and which interventions could be most effective for individual patients. In the lab, biotech companies and research institutes are using robotic automation and sensors that can increase throughput as much as tenfold.
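The core statistical move in a GWAS can be illustrated with a toy sketch: for each genetic variant, compare how often it appears in patients versus healthy controls and flag large imbalances. All names and counts below are hypothetical, and real studies use far stricter genome-wide significance thresholds and correct for confounders:

```python
def chi_square_2x2(case_with, case_without, ctrl_with, ctrl_without):
    """Pearson chi-square statistic for a 2x2 table of allele counts."""
    table = [[case_with, case_without], [ctrl_with, ctrl_without]]
    total = sum(sum(row) for row in table)
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_sums[i] * col_sums[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts: (cases with variant, cases without,
#                       controls with variant, controls without)
variants = {
    "rs_demo_1": (300, 700, 200, 800),   # variant enriched in patients
    "rs_demo_2": (250, 750, 245, 755),   # essentially identical frequencies
}
for name, counts in variants.items():
    stat = chi_square_2x2(*counts)
    flag = "candidate association" if stat > 3.84 else "no signal"  # 5% cutoff, 1 d.o.f.
    print(f"{name}: chi2 = {stat:.2f} -> {flag}")
```

A real GWAS runs this comparison across millions of variants, which is why the threshold must be far more stringent (conventionally p < 5×10⁻⁸) to avoid false positives.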
New research from the McKinsey Global Institute finds that this revolution could affect virtually every sector over time. The institute has identified a pipeline of about 400 applications, most of them scientifically feasible today, that could have a direct economic impact of $2 trillion to $4 trillion a year globally between 2030 and 2040.
But silicon is not only an enabler of biological advances—some applications are founded on unprecedentedly close interaction between the two. Bio-machine interfaces, in which machines can be linked to the brain, have become possible over the past decade. The detection of neural signals is improving and generating higher-quality, more detailed brain data, and advances in analytics and machine learning are enabling better interpretation of those signals.
The most advanced bio-machine interfaces are in healthcare. Neuroprosthetics have been in development for many years and are becoming increasingly sophisticated. Traditional prosthetics are body-powered: a prosthetic arm, for example, can be harnessed to the shoulder so that moving the shoulder opens or closes the hand. The next step was prosthetics with myoelectric sensors that respond to electrical signals from the muscles. The new frontier is neuroprosthetic limbs driven by signals from a chip surgically implanted in the patient’s brain. The Biomechatronics Group at MIT, headed by Hugh Herr, himself a double amputee, is refining a method that replicates natural muscle pairings so that the brain senses the limb as still there, making the bionic limb easier to move.
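The myoelectric step in that progression is simple to sketch. The idea is that a muscle's electrical activity (EMG), once rectified and smoothed, is compared against a threshold to decide whether the hand should open or close. The signal values and threshold below are invented for illustration; real controllers use calibrated multi-channel sensors and pattern recognition:

```python
def envelope(samples, window=4):
    """Rectify the raw EMG signal and take a moving average (a crude envelope)."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1): i + 1]) / min(window, i + 1)
        for i in range(len(rectified))
    ]

def hand_command(samples, threshold=0.5):
    """Close the hand whenever the smoothed muscle activity exceeds the threshold."""
    return ["close" if level > threshold else "open" for level in envelope(samples)]

# Hypothetical trace: weak background activity, then a strong contraction.
raw_emg = [0.05, -0.1, 0.08, 0.9, -1.1, 1.0, -0.95, 0.1, -0.05]
print(hand_command(raw_emg))
```

The neuroprosthetic frontier replaces the muscle signal with decoded brain signals, but the control loop, signal in, intent out, actuation, is the same shape.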
Beyond neuroprosthetics, researchers are exploring brain-to-device communication in which language is translated from patients’ brains directly to computers, using either a surgically implanted neuroprosthetic device or a wearable headband. Such a technique could help paralyzed patients, including those suffering from “locked-in syndrome.”
Deep-brain stimulation, in which a pulse stimulator is connected to electrodes in the brain, is being used to control essential tremor and the tremor associated with Parkinson’s disease; scientists are now researching the technique for possible use in patients with Alzheimer’s, depression, and anxiety.
Versions of these techniques are finding their way into consumer markets, such as gaming devices and wellness monitors. Headbands that read electrical signals from the brain can measure stress, for instance. More futuristic applications could be of interest to the defense and airline industries: experiments have demonstrated that brain-based control of vehicles is feasible. Researchers at a university in Munich have demonstrated an algorithm that reads brain signals from a cap worn by pilots; wearing the cap, even people with no flight experience were able to fly a plane in a simulator.
Further into the future, biology could come to the aid of one of the great challenges of the digital revolution: how to store all the data being generated. DNA is a possible answer. Raw DNA is about a million times denser than conventional hard-disk storage; engineering constraints aside, one kilogram of raw DNA could store all the world’s data today. Unlike conventional storage media, DNA does not degrade over decades and could potentially hold data safely for hundreds of years.
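The one-kilogram claim survives a back-of-the-envelope check. A minimal sketch, assuming roughly two bits per base, a DNA base pair mass of about 650 daltons, and global data volume on the order of tens of zettabytes (all rough figures, for illustration only):

```python
# Rough physical constants and assumptions.
DALTON_KG = 1.66e-27           # kilograms per dalton
BASE_PAIR_MASS_KG = 650 * DALTON_KG  # approximate mass of one DNA base pair
BITS_PER_BASE = 2              # four bases (A/C/G/T) encode 2 bits each

base_pairs_per_kg = 1.0 / BASE_PAIR_MASS_KG
bytes_per_kg = base_pairs_per_kg * BITS_PER_BASE / 8
zettabytes_per_kg = bytes_per_kg / 1e21

print(f"~{zettabytes_per_kg:.0f} zettabytes per kilogram of raw DNA")
```

That works out to a couple of hundred zettabytes per kilogram, comfortably above estimates of the total data created worldwide in a year, which is why the claim is plausible in principle even though read/write engineering remains the hard part.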
In 2011, a Harvard University team encoded a 659-kilobyte book in DNA; a year later, the team encoded more than five megabytes of data, far more accurately. In 2013, the European Bioinformatics Institute reported accuracy of 100 percent. Intel, Micron, and Microsoft are now investing in the technology.
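The basic encoding idea behind these experiments is straightforward: every two bits of data map to one of the four DNA bases. The mapping below is arbitrary and omits what real schemes add, error-correcting codes and rules against long runs of one base, but it shows the round trip:

```python
# Arbitrary 2-bit-to-base mapping (real schemes differ and add error correction).
TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
FROM_BASE = {base: bits for bits, base in TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn each byte into four bases, most-significant bits first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit chunks per byte
            bases.append(TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(strand: str) -> bytes:
    """Reassemble bytes from groups of four bases."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | FROM_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"bio")
assert decode(strand) == b"bio"
print(strand)   # 3 bytes become 12 bases
```

Two bits per base is also where the density arithmetic comes from: the information lives in molecules rather than in micron-scale magnetic domains.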
Humans are getting used to the idea of working alongside machines; the bio revolution makes that relationship many times closer. Impact may be some way off, but players who see the potential and get in on the ground floor can create dynamic new opportunities.
Michael Chui is a partner at the McKinsey Global Institute (MGI), McKinsey’s business and economics research arm. He leads research on the impact of disruptive technologies and innovation on business, the economy, and society.