9 Things Computers Can Do Now That They Couldn’t Do A Year Ago

2014 had its fair share of “firsts” in hardware, software, and robotics. Here’s our highly subjective selection.


Software and silicon are sometimes the poor relations of the science world, their advances eclipsed by more glamorous breakthroughs in physics, genetics, and space exploration. Progress in AI and robotics, in particular, is often greeted with as much trepidation as praise. Yet some amazing leaps were made in 2014 alone, from a robotic hand that an amputee can “feel” to a realistic virtual universe.


Here are our nine best new advances:

1. Play “Emotionally Engaging” Music

In April, electronic artist Squarepusher released an EP called Music for Robots, played by actual robots with musical superpowers. Mach, the guitarist of Z-Machines, plays two guitars with the aid of 78 fingers and 12 picks; Cosmos triggers notes on his keyboard with lasers; and drummer Ashura uses his six arms to wield 21 drumsticks. Z-Machines were created at the University of Tokyo by CGI artist Yoichiro Kawaguchi, robotics engineer Naofumi Yonetsuka, and media artist Kenjiro Matsuo.

Squarepusher’s objective was to see if robot musicians could play emotionally engaging music. “Part of what interests me is when we listen to a robot, do we listen to it as if we’re listening to a human?” he said. “I wasn’t trying to make it emulate a human being, but I was trying to make it do something which I wanted to hear. Now the question remains, is the thing which I want to hear a human being?”


2. Use “Right-Brained” Chips

Chips inspired by the billions of neurons in the human brain made a splash this year. Conventional hardware architectures separate computation from the storage of information and operate sequentially, limiting how much data can be processed and synthesized. Neuromorphic chips, by contrast, integrate data storage and processing and can operate in parallel, mimicking the massively parallel way the human brain processes sensory information such as images and sound. Such chips could recognize patterns in large amounts of data far more efficiently than today’s sequential, “left-brained” architectures.
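To make the event-driven idea concrete, here is a toy Python sketch of a spiking neuron, the basic unit neuromorphic designs are built around. This is an illustrative textbook-style model, not TrueNorth’s actual design: the threshold, leak factor, and spike weights are all invented for the example. The point is that the neuron only does work when a spike event arrives, rather than on every clock tick.

```python
# Toy "leaky integrate-and-fire" neuron: an illustration of event-driven,
# neuron-like computation. NOT a model of IBM's TrueNorth internals.
class Neuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # accumulated input charge
        self.threshold = threshold  # fire when potential crosses this
        self.leak = leak            # potential decays between events

    def receive(self, weight):
        """Handle one incoming spike event; return True if the neuron fires."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # emit an output spike downstream
        return False

n = Neuron()
events = [0.4, 0.5, 0.4]            # weights of incoming spikes
print([n.receive(w) for w in events])   # -> [False, False, True]
```

Nothing happens between events, which is why hardware built this way can sit at very low power until sensory input actually arrives.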

IBM announced in August that it had packed the largest number of transistors ever onto one of its chips, the TrueNorth processor. Powered by a million artificial neurons and 256 million synapses (in the brain, a synapse allows electrical charge to pass between neurons), the chip is laid out as a network of 4,096 neurosynaptic cores that integrate memory and computation and operate in parallel in an event-driven fashion. TrueNorth draws a mere 70 milliwatts in operation, giving it a power density (power consumption per square centimeter) 10,000 times lower than that of most microprocessors. This allows it to efficiently perform power-hungry tasks like detecting and classifying objects in a video stream.

3. Beat The Turing Test

In June, a chatbot program called Eugene Goostman persuaded 33% of human interrogators that it was actually a 13-year-old boy, making it the first piece of software to pass the Turing test. Alan Turing predicted in a 1950 paper that by the year 2000 a computer would play the imitation game well enough that “an average interrogator will not have more than 70% chance of making the right identification after five minutes of questioning.” In other words, a program that fools more than 30% of its judges clears Turing’s bar. Developers Vladimir Veselov and Eugene Demchenko gave Eugene the personality of a teenage Ukrainian boy in order to make gaps in his knowledge seem more plausible.


4. Perform Accurate Quantum Calculations

In October Australian researchers claimed a quantum computing breakthrough when they created two new types of quantum bit, or “qubit”. A bit is always in one of two states—0 or 1—while a qubit can be in a superposition, i.e., in both of its possible states at once. Once a qubit is measured, however, it has one known state. A quantum computer maintains a sequence of qubits which can be in every possible combination of 1s and 0s at once, giving it the potential to perform complex calculations exponentially faster than classical computers.
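The exponential advantage is easiest to see with a toy simulation. The sketch below has nothing to do with the Australian team’s silicon hardware; it simply shows that describing n qubits in superposition requires tracking 2^n amplitudes at once, while measurement collapses them to a single ordinary bit string.

```python
import random

# Toy illustration: n classical bits store ONE of 2**n states, but n qubits
# in equal superposition are described by amplitudes over ALL 2**n states.
def uniform_superposition(n):
    """Equal-amplitude state over all 2**n basis states."""
    dim = 2 ** n
    amp = (1 / dim) ** 0.5          # each amplitude is 1/sqrt(2**n)
    return [amp] * dim

def measure(state):
    """Measurement collapses the state: pick one basis state with
    probability |amplitude|**2 and return it as a bit string."""
    n = len(state).bit_length() - 1
    probs = [a * a for a in state]
    outcome = random.choices(range(len(state)), weights=probs)[0]
    return format(outcome, "0{}b".format(n))

state = uniform_superposition(3)
print(len(state))       # 8 basis states tracked simultaneously for 3 qubits
print(measure(state))   # but one measurement yields a single result, e.g. '101'
```

Doubling the number of qubits squares the number of states a classical machine has to track, which is where the hoped-for exponential speedup comes from.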

The first type of qubit created by the researchers is built around a phosphorus atom and achieved 99.99% accuracy in quantum operations, while the second relies on an artificial atom made of conventional silicon transistors. Both qubits were housed in a very thin layer of silicon from which magnetic isotopes had been removed to eliminate noise in the quantum calculations. (Quantum states are very fragile and prone to interference, which has proved to be one of the major obstacles to the development of a practical quantum computer.) The team also set a new world record by preserving a quantum state for a full 35 seconds.

5. Break The Broadband Barrier

In September Akamai announced that the average global Internet connection speed had smashed the 4 megabit-per-second broadband threshold for the first time, hitting 4.6 Mbps during the second quarter of 2014. The global average peak connection speed also increased 20% to 25.4 Mbps between the first and second quarter of 2014.


South Korea had the highest average connection speed at a blistering 24.6 Mbps followed by Hong Kong (15.7 Mbps) and Japan (14.9 Mbps). The average connection speed in the United States was a relatively sluggish 11.4 Mbps.

6. Read Your Emotions

Researchers in Bangladesh used keystroke dynamics and text-pattern analysis to detect the emotions of users. The software searched for seven emotional states: joy, fear, anger, sadness, disgust, shame, and guilt. Joy was the easiest emotion to detect—the software achieved 87% accuracy—followed by anger at 81%.
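Keystroke dynamics boils down to timing: how long keys are held and how long the gaps between them are. The sketch below is a hypothetical illustration of the kind of features such systems extract; the Bangladeshi study’s actual features, classifier, and data format are not described in this article, so everything here (the event format, the feature names, the sample data) is invented for the example.

```python
# Hypothetical keystroke-dynamics feature extraction. Emotion classifiers
# of this kind feed timing features like these into a trained model;
# the actual study's pipeline is not specified here.
def keystroke_features(events):
    """events: list of (key, press_time, release_time) tuples, in seconds.
    Returns two classic timing features:
      - mean dwell time: how long each key is held down
      - mean flight time: gap between releasing one key and pressing the next
    """
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"mean_dwell": mean(dwells), "mean_flight": mean(flights)}

# Invented sample: three keystrokes with their press/release timestamps.
sample = [("h", 0.00, 0.08), ("i", 0.20, 0.27), ("!", 0.55, 0.66)]
print(keystroke_features(sample))
```

Intuitively, an agitated typist might hammer keys quickly with short flight times, while a sad or hesitant one pauses longer between keystrokes; a classifier learns which timing patterns line up with which reported emotions.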

Affective computing aims to recognize, model, and simulate human emotions so that software can adapt its behavior to the emotional state of its user. Potential applications include learning, mood-monitoring, and robots that interact with humans.


7. Create A Realistic Virtual Universe

An international research team this year simulated 13 billion years of cosmic evolution for the first time. The simulation ran on supercomputers using software called Arepo. An average laptop would require nearly 2,000 years to run the same simulation.

Cosmologists use such models to test various theories by comparing the outcome of a simulation based on particular assumptions to the universe as we see it now. This particular simulation is the first to show visible matter clumping around dark matter to form the first galaxies. After simulating 13 billion years of time, the virtual universe which emerges looks very similar to our own, supporting the theory that dark matter was crucial to the origin of our universe.

8. Give A Robot Hand “Feeling”

Dennis Aabo Sørensen can now feel different types of pressure on three fingers of his prosthetic, robotic hand using a device which interacts with the nerves in his arm. The Swiss researchers who built the device translated forces detected on the robotic fingertips into electrical pulses sent to the ulnar (linked to the pinky finger) and median (linked to the index finger and thumb) nerves in his flesh-and-blood arm.


This allows Sørensen to feel the difference between light and forceful pressure and detect the texture and shape of the object gripped by his hand, for example, a cloth versus a wooden object. The researchers claimed that it was the first time that an amputee could “feel” in real time via a sensory-enhanced prosthetic.

9. Start Up Instantly

Most computers today use a volatile form of random access memory (RAM) which requires electrical current in order to encode data. When a computer is shut down and current no longer flows, all data contained in RAM is lost. This also makes true instant startup impossible, since the RAM must be repopulated when the machine boots. Keeping current flowing also consumes a considerable amount of energy, much of which dissipates as heat.

But this is changing. In December researchers at Cornell University announced that they had developed non-volatile magnetoelectric memory technology that uses a low voltage rather than current, hugely reducing power consumption. The device is made from a ferroic material called bismuth ferrite, which has the rare characteristic of being both magnetic and electrically polarized. The polarization can be switched by applying an electric field, flipping the value of a bit from 0 to 1. In contrast with rival technologies, the device works at room temperature and uses an order of magnitude less energy. The researchers claim that this new form of memory could make low-power, instant-on computing a ubiquitous reality.

About the author

Lapsed software developer, tech journalist, wannabe data scientist. Ciara has a B.Sc. in Computer Science and an M.Sc. in Artificial Intelligence.
