High school physics will tell you that if you pass a signal down a wire, by the time it gets to the end the signal will have degraded. The same is true with computers, which typically use copper wires as a way of connecting together internal components, or even entire computers in data centers. According to researchers at Stanford, roughly 80% of the power used by a computer is lost thanks to the use of these copper wires.
But there’s a solution in sight.
New research by Stanford’s Nanoscale and Quantum Photonics Lab demonstrates how we may soon be able to use optics, rather than electricity, to send data, making computers more efficient, faster, and more reliable by orders of magnitude.
The lab designs, builds, and tests extremely small optical devices–usually just a few microns across, or even smaller–for applications including high-speed telecommunications and quantum computing.
The project described in a new paper details a groundbreaking prism-like device, able to split different wavelengths of light and control them to a degree that has never previously been possible. In doing so, the researchers hope to allow computers to run dramatically faster and more efficiently than they do today, for use in intensive applications like high-bandwidth image processing and video streaming.
“If you were able to change electrical connections to optical ones you would be able to reduce energy consumption dramatically, which would also allow you to increase the operating speed,” says Jelena Vuckovic, professor of electrical engineering, who led the research.
The reason for this is that light can carry more data than wires can, and it takes less energy to transmit photons than it does to transmit electrons. Computer networks running on fiber-optic lines rely on this principle. Some have also proposed using light for superfast optical Wi-Fi, called “LiFi.”
“Energy consumption is really the biggest bottleneck in computing,” Vuckovic continues. “As you increase the speed of processors, they heat up more. As a result there is a limit to just how fast they can operate. You see a related problem in data centers, which are consuming pretty much all the power they are producing from sources like hydroelectric power–and yet they continue to expand. We need to dramatically change the way we approach this problem.”
As wires get smaller and signals are sent at higher frequencies, this problem only gets worse. At some point, there is a barrier which stops data being sent any faster because too much heat is created, ultimately risking damage to the processor.
Optical computing suggests this might not be a problem. While a copper wire tops out at data transfer speeds of around 20 gigabits per second, optical links face no comparable ceiling.
The prism-like device created by the Nanoscale and Quantum Photonics Lab is referred to as an “optical link.” It is an incredibly thin silicon chip just eight microns long (eight millionths of a meter) and intricately patterned with nanoscale etchings that resemble a bar code.
Unlike the simple geometric patterns created by previous nanophotonics researchers, the complex pattern on the optical link isn’t the result of human intuition. Instead, it is created by an algorithm, which reduces a time-consuming design process down to 15 minutes.
The algorithm was originally developed by Jesse Lu, a former student in the Nanoscale and Quantum Photonics Lab. It is based on convex optimization, a mathematical framework for efficiently finding the best solution to certain well-structured problems. The same techniques are used to optimize stock market portfolios in finance, wing profiles in aerospace design, routing for Internet traffic, and the layout of large electrical circuits.
“The basic idea is to have an ‘objective function,’ which describes the ‘fitness’ of a particular system you are trying to optimize with a single number,” explains Alex Piggott, a doctoral candidate in the lab. “We want to minimize the value of this objective function, and once we have done so, we have found an optimal solution.”
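Piggott’s description can be made concrete with a minimal sketch, assuming nothing about the lab’s actual code: the “design” is just a vector of numbers, the objective function is a simple convex bowl, and plain gradient descent drives it to the minimum. Real photonic objective functions are far more elaborate.

```python
# Minimal sketch of the "objective function" idea: score each candidate
# design with a single number, then search for the design that minimizes it.
# Here the "design" is just a vector x and the objective is a convex bowl;
# real photonic objectives are far more complex.

def objective(x, target):
    """Return a single 'fitness' number for a design: lower is better."""
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def gradient(x, target):
    """Analytic gradient of the objective above."""
    return [2 * (xi - ti) for xi, ti in zip(x, target)]

def minimize(x, target, step=0.1, iters=200):
    """Plain gradient descent: repeatedly step downhill on the objective."""
    for _ in range(iters):
        g = gradient(x, target)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

best = minimize([0.0, 0.0, 0.0], target=[1.0, -2.0, 0.5])
```

Because this toy objective is convex, the search cannot get stuck in a bad local minimum, which is exactly the property that makes convex optimization attractive for automated design.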
The optimization algorithm allowed the researchers to design and build the optical link to take advantage of the fact that as light travels through different materials, it is transmitted and reflected in different ways depending on the medium–e.g., air or silicon. The algorithm starts from a simple design consisting of nothing but silicon. The researchers specify the output they want, and the algorithm then makes hundreds of tiny adjustments to the prism’s surface until it produces exactly that output.
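The loop just described can be sketched in miniature. Everything here is illustrative: `simulate` is a toy linear stand-in for a real electromagnetic solver, and the “pattern” is a short vector rather than a nanoscale etching.

```python
# Toy sketch of the inverse-design loop: start from a uniform "all silicon"
# pattern, then repeatedly nudge it until the simulated output matches the
# desired output. `simulate` is a hypothetical stand-in for a Maxwell solver,
# here just a fixed linear response; the real physics is far richer.

RESPONSE = [[1.0, 0.5, 0.2],
            [0.3, 1.0, 0.4],
            [0.1, 0.6, 1.0]]

def simulate(pattern):
    """Stand-in 'solver': maps a device pattern to an output field."""
    return [sum(r * p for r, p in zip(row, pattern)) for row in RESPONSE]

def loss(pattern, desired):
    """Squared mismatch between the simulated and the desired output."""
    return sum((o - d) ** 2 for o, d in zip(simulate(pattern), desired))

def inverse_design(desired, iters=1000, step=0.05, eps=1e-6):
    pattern = [1.0] * len(desired)          # start from uniform "silicon"
    for _ in range(iters):
        # Finite-difference gradient: how does each tiny adjustment
        # to the pattern change the mismatch?
        base = loss(pattern, desired)
        grad = []
        for i in range(len(pattern)):
            bumped = pattern[:]
            bumped[i] += eps
            grad.append((loss(bumped, desired) - base) / eps)
        pattern = [p - step * g for p, g in zip(pattern, grad)]
    return pattern

design = inverse_design(desired=[1.0, 0.5, 0.8])
```

The structure mirrors the article’s account: a fixed starting design, a desired output specified up front, and hundreds of small automated adjustments in between.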
For the current paper, the chips were then fabricated and tested to demonstrate that they would work.
Both 1300-nanometer light and 1550-nanometer light–corresponding to the O-band and C-band wavelengths widely used in fiber-optic networks–were beamed at the chip from above. The bar code-like surface redirected the O-band light one way and the C-band light the other, directly off the chip.
The result is a step forward for optical computing, a field that is also being explored by giants like Intel and IBM.
Some are exploring more sophisticated, multiple wavelength connections. The Phoxtrot project in Europe is fostering work on guiding light waves and embedded micro-mirrors to solve corner-turning issues. Others have imagined combining photonic chips with a material like graphene–a form of carbon that comes in sheets just one atom thick–to make chips even faster and more efficient.
And others, including Google and Microsoft, are experimenting with quantum computing–a budding and contested field that promises even more dramatic improvements, and one in which some designs also use photons inside processors.
The applications for much faster data links within computers–for instance, in the large energy-intensive data centers of companies like Amazon and Facebook–are tantalizing.
“Looking to the future, there are sets of problems where computing can benefit from using fully optical components,” Jelena Vuckovic says. “Image processing is one such area. Another is pattern recognition. Both of these can be carried out more efficiently by building optical computing systems, rather than using standard binary architecture. I believe that one day we will be able to do this fully optically.”
Three building blocks are needed for this to happen.
The first is a method of converting electrical signals into optical ones, such as a laser that can turn a stream of bits into a stream of optical pulses. The second is a way of routing optical signals between different end points. The third is a way of converting the optical signals back into electrical ones once they arrive at their destination, which can be achieved with a high-speed photodetector.
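A rough sketch shows how the three building blocks fit together in sequence. All of the names and the pulse representation here are illustrative, not real hardware interfaces.

```python
# Toy end-to-end model of the three building blocks: a "laser" that turns
# bits into optical pulses, a wavelength router like the lab's prism-like
# splitter, and a "photodetector" that turns pulses back into bits.

O_BAND_NM = 1300   # the two wavelengths from the article's demonstration
C_BAND_NM = 1550

def laser(bits, wavelength_nm):
    """Block 1: electrical -> optical. Encode each bit as an on/off pulse."""
    return [{"wavelength_nm": wavelength_nm, "on": bool(b)} for b in bits]

def route(pulses):
    """Block 2: steer pulses by wavelength, like the bar code-patterned
    prism sending O-band light one way and C-band light the other."""
    ports = {O_BAND_NM: [], C_BAND_NM: []}
    for p in pulses:
        ports[p["wavelength_nm"]].append(p)
    return ports

def photodetector(pulses):
    """Block 3: optical -> electrical. Recover the original bit stream."""
    return [1 if p["on"] else 0 for p in pulses]

# Two bit streams share one link on different wavelengths, then separate.
stream = laser([1, 0, 1, 1], O_BAND_NM) + laser([0, 1, 1], C_BAND_NM)
ports = route(stream)
```

The point of the sketch is the division of labor: once the router exists, two independent data streams can share the same physical link simply by using different colors of light.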
Parts one and three of this work have previously been carried out by collaborators of the Nanoscale and Quantum Photonics Lab. Stanford’s new algorithm is the last piece of the puzzle. With it, the team can design whatever optical connectors, routers, and hubs are necessary to route a complex network of optical signals around a chip.
“Of course, we have to put all of these building blocks together and integrate with the processor platform, which we haven’t done yet,” says Vuckovic. “But I am optimistic that we should be able to do it within five years or so.”
It won’t speed up computers immediately, but true optical computing isn’t light years away, either.