The Memristor Revolution: Chips Can Work Like Brains Too


Memristors: If you don't know about 'em, you certainly will over the next few years. We're probably not talking about the same kind of revolution as the transistor sparked off, but new research has shown they can mimic brain cells, so you never know...

The memristor is actually an amazing little beast, dubbed the electronic "missing link" between transistors (which act as electronic switches or valves) and resistors (which merely stifle the flow of electrical current through them). The most important effect these tiny slivers of exotic chemicals and silicon can produce is a kind of hysteresis--whereby a previous action performed by the memristor changes its behavior, influencing the next electrical trick it's used for. Though that doesn't sound like much, it really is important--it means each memristor has a kind of "memory," and that a complex network of them strung together can do dynamic, clever processing tricks that a mere transistor-based network could never aspire to.
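To see what that "memory" means in practice, here's a minimal, purely illustrative sketch of a memristor-like device in Python: its resistance depends on how much charge has already flowed through it, so past use changes its next response. The class name, constants, and linear-drift rule are all simplifying assumptions for illustration, not a model of any real device.

```python
class ToyMemristor:
    """Toy device whose resistance depends on its electrical history."""

    def __init__(self, r_on=100.0, r_off=16000.0):
        self.r_on = r_on    # resistance when fully "on" (ohms) -- arbitrary value
        self.r_off = r_off  # resistance when fully "off" (ohms) -- arbitrary value
        self.state = 0.0    # fraction of the device in its low-resistance phase, 0..1

    def resistance(self):
        # Effective resistance interpolates between the two extremes.
        return self.state * self.r_on + (1.0 - self.state) * self.r_off

    def apply_voltage(self, volts, dt=1.0):
        # Current through the device at its present resistance...
        current = volts / self.resistance()
        # ...nudges the internal state, so charge flow today changes
        # the resistance the device presents tomorrow. The drift
        # coefficient (50.0) is arbitrary, chosen to make the effect visible.
        self.state = min(1.0, max(0.0, self.state + 50.0 * current * dt))
        return current


m = ToyMemristor()
before = m.resistance()
for _ in range(200):
    m.apply_voltage(5.0)  # repeated positive pulses gradually lower the resistance
after = m.resistance()
print(after < before)  # the device "remembers" its history
```

The key point is the feedback loop: each pulse changes `state`, and `state` changes how the next pulse is handled, which is exactly the history-dependence a plain transistor or resistor lacks.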

So here's where it gets freaky: A University of Michigan team, headed up by Dr. Wei Lu, has been studying the devices and has just published research showing that memristors can actually be made to react in much the same way synapses do. A synapse, lest we forget, is the little biological gizmo that allows one neuron in your brain to send a signal (electrical or chemical) to another one. In other words, Lu's team has demonstrated that it should be possible to craft semiconductor circuits into patterns that behave like a biological brain.

Their invention works by using a silicon-and-silver mix to connect two metal electrodes--the resulting memristor can strengthen or weaken the signal sent through it to the next "neuron" depending on when the system was last fired up, enabling real synapse-like behavior.
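That timing-dependent strengthening and weakening can be sketched in a few lines of Python. This is a loose, hypothetical caricature of the behavior described above, not the team's actual device physics: the class name, the exponential decay, and every constant are illustrative assumptions.

```python
import math


class ToySynapse:
    """Toy connection whose strength depends on pulse timing."""

    def __init__(self, tau=10.0):
        self.weight = 0.5       # connection strength, 0..1 -- arbitrary starting value
        self.last_fired = None  # time of the previous pulse
        self.tau = tau          # how quickly the "recent activity" boost fades

    def fire(self, t):
        if self.last_fired is not None:
            gap = t - self.last_fired
            # Pulses arriving in quick succession strengthen the link;
            # long idle gaps let it weaken instead.
            boost = math.exp(-gap / self.tau)  # near 1 for short gaps, near 0 for long ones
            self.weight = min(1.0, max(0.0, self.weight + 0.1 * (boost - 0.5)))
        self.last_fired = t
        return self.weight


fast = ToySynapse()
fast.fire(0)
rapid = fast.fire(1)    # short gap: the connection strengthens

lazy = ToySynapse()
lazy.fire(0)
slow = lazy.fire(100)   # long gap: the connection weakens

print(rapid > slow)  # True: timing alone changed the strength
```

String many such timing-sensitive connections together and you get a network that adapts to the patterns flowing through it, which is the synapse-like property the Michigan work points toward.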

Why should you get excited about biologically inspired computing advances, though, when turning millions of memristors into a brain-like chip is years away? Because the advantages for lots of the computing that increasingly touches your day-to-day life are enormous: Neuron-like chips can be astonishingly better at high-speed pattern recognition, meaning ultra-fast and extremely clever machine vision and object recognition. They could also enable genuine chip-level learning computers, with all the attendant androidy excitement that might stir up. DARPA is involved in this research too, if you needed an extra spine-tingling factor.

To keep up with this news and more like it, follow me, Kit Eaton, on Twitter.
