A Self-Repairing Computer System Debuts After 15 Years Of Research

A computer scientist at University College London is using biomimicry to build a computer that self-corrects its operations the same way the body does with neural networks and DNA. “That’s an entirely different paradigm than the centralized and linear nature of traditional computers,” says Peter Bentley, its creator.


A few years ago, Peter Bentley got stuck on a problem: No matter how hard he tried, he couldn’t get software to heal itself. “We were trying to make them survive damage, show graceful degradation instead of just crashing,” says the University College London computer scientist, “or even reconfigure themselves to recover lost code. Conventional computers just couldn’t do it.”

In fact, the hardware and the software he was using at the time couldn’t even tolerate damage to a single bit; 99.9 percent of the time, the system would crash.

Bentley is a polymath. A popular science author as well as a geek, he views computer science as a lens through which to get to know other disciplines, from history to, especially, biology. This kind of synthetic approach has led him through 15 years of exploration toward a breakthrough working prototype of a computer that never crashes, created with UCL’s Christos Sakellariou and debuting this April.

Bentley and his team looked to the natural world for inspiration. Wetware systems are far more robust and redundant than centralized computer systems, says Bentley: “They use multiple elements which interact in a distributed and somewhat random manner to produce an emergent result. Think of the brain, the immune system, an ant colony, a flock of birds.”

One-upping Von Neumann

The team set out to follow these rules to build a “systemic” computer like one that might appear in nature. That’s an entirely different paradigm than the centralized and linear nature of traditional computers, which are based on the von Neumann architecture.

What makes decentralized computing systems so robust is redundancy. If a traditional computer wants to add numbers together, it uses a program with a single add instruction–if that single instruction throws an error, it brings the whole operation to a halt.

“In a systemic computer, it might have several ‘adds’ floating about, any of which might be used to perform that calculation,” says Bentley. “It’s the combination of redundancy at a low level–multiple copies of instructions and data–and decentralisation, plus that randomness” that enables the systemic computer to be robust against damage.

If part of the computer is damaged or hits a bug, the randomized selector hits on another part with some compatible instructions to keep working. This is similar to what happens in the brain. “In our brain we lose neurons every day but we’re fine–our brains have the redundancy to cope and the ability to reconfigure themselves to make use of what is left. The systemic computer does the same thing. The systemic computer uses a pool of systems where its equivalent of instructions may be duplicated several times.”
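To make that concrete, here is a minimal sketch in Python of the redundancy-plus-random-selection idea. The `System` class, the pool size, and the retry loop are illustrative assumptions, not Bentley and Sakellariou’s actual design; the point is only that duplicated instructions chosen at random let a computation survive corrupted copies.

```python
import random

class System:
    """One entry in the pool: a duplicated 'add' instruction (illustrative)."""
    def __init__(self):
        self.damaged = False

    def execute(self, a, b):
        if self.damaged:
            raise RuntimeError("instruction corrupted")
        return a + b

# The pool holds several redundant copies of the same instruction.
pool = [System() for _ in range(5)]

# Simulate damage: corrupt two copies at random.
for s in random.sample(pool, 2):
    s.damaged = True

def systemic_add(a, b):
    """Try instructions in random order until an undamaged copy succeeds."""
    for s in random.sample(pool, len(pool)):
        try:
            return s.execute(a, b)
        except RuntimeError:
            continue  # try another copy, much as a brain routes around lost neurons
    raise RuntimeError("all copies damaged")

print(systemic_add(2, 3))  # still prints 5 despite two corrupted copies
```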

The systemic computer’s ability to reconfigure its own code is similar to what is found in individual cells as well. DNA’s double strands are complementary like the two halves of a zipper, so one half can be used to replicate and repair the other.

“One way in which DNA repairs itself is to use an undamaged complementary strand of the DNA as a template,” says Bentley. “The systemic computer uses multiple copies of instructions floating about–if one is damaged, the others can be used as templates to fix the damaged one.”
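Along the same lines, here is a hedged sketch of template-based repair, in which a damaged copy is restored from the surviving duplicates. The majority-vote consensus is an assumption for illustration; the article only says that undamaged copies serve as templates.

```python
from collections import Counter

# Four copies of the same instruction; the third has been corrupted.
copies = [b"ADD R1 R2", b"ADD R1 R2", b"ADX R1 R2", b"ADD R1 R2"]

def repair(copies):
    """Overwrite every copy with the consensus of the pool (illustrative)."""
    consensus = Counter(copies).most_common(1)[0][0]
    return [consensus for _ in copies]

print(repair(copies))  # the corrupted third copy is restored from its templates
```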

What the pioneers would have wanted

Bentley says his computer is part of a new wave of non-von Neumann computers with specialized applications in areas like AI, drone control, and, more broadly, the entire so-called Internet of Things: disaster-proofing infrastructure, modeling weather systems, or synthesizing chemicals.

The theory behind Bentley’s organic computer is laid out more fully in his book Digitized, a history of the theory and practice of computation.

“Tracing many original documents for the book, I realized that lots of my ideas in the systemic computer date back to the birth of computers,” he says. “Turing, Shannon, von Neumann–all these pioneers were informed by biology. They all wanted to make a computer that worked more like a biological brain. In the 1940s and ’50s the technology was too primitive to allow it. It turns out that pioneers from then to the present day all have the same ambition: to make a parallel, distributed, adaptive, brain-like computer.”

While Bentley’s computer may be called a non-von Neumann architecture, von Neumann himself actually thought of his 80-ton computer as prototypical of biology. As Bentley reports in his book, von Neumann wrote this while on a train in 1945:

First: Since the device is primarily a computer, it will have to perform the elementary operations of arithmetics most frequently. These are addition, multiplication and division. It is therefore reasonable that it should contain specialized organs for just these operations… a central arithmetic part of the device will probably have to exist and this constitutes the first specific part: CA.

Second: The logical control of the device, that is the proper sequencing of its operations can be most efficiently carried out by a central control organ… this constitutes the second specific part: CC.

Third: Any device which is to carry out long and complicated sequences of operations (specifically of calculations) must have a considerable memory… this constitutes the third specific part: M… The three specific parts CA, CC and M correspond to the associative neurons in the human nervous system. It remains to discuss the equivalents of the sensory or afferent and the motor or efferent neurons. These are the input and the output organs of the device.

About the author

Anya Kamenetz is the author of Generation Debt (Riverhead, 2006) and DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education (Chelsea Green, 2010). Her 2011 ebook The Edupunks’ Guide was funded by the Gates Foundation.
