The latest advancement was revealed in the company’s most recent Self-Driving Car Project Monthly Report:
Through observing cyclists on the roads and our private test track, we’ve taught our software to recognize some common riding behaviors, helping our car better predict a cyclist’s course. Our sensors can detect a cyclist’s hand signals as an indication of an intention to make a turn or shift over. Cyclists often make hand signals far in advance of a turn, and our software is designed to remember a rider’s previous signals so it can better anticipate that rider’s turn down the road.
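The idea of remembering a signal made well before the turn can be illustrated with a minimal sketch. This is a hypothetical example, not Google's actual software: the `CyclistTracker` class, the signal names, and the ten-second memory window are all assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class CyclistTracker:
    """Hypothetical sketch: remembers the most recent hand signal per
    cyclist so a later prediction can still use it, even after the
    rider has put their hand back on the bars."""
    memory_s: float = 10.0  # assumed window for how long a signal stays relevant
    _last: dict = field(default_factory=dict)  # cyclist_id -> (signal, time seen)

    def observe(self, cyclist_id: str, signal: str, t: float) -> None:
        # Record a detected hand signal ("left", "right", "stop") with its timestamp.
        self._last[cyclist_id] = (signal, t)

    def predicted_intent(self, cyclist_id: str, t: float) -> str:
        # Return the remembered signal if it is recent enough, else assume "straight".
        signal, seen_at = self._last.get(cyclist_id, ("straight", float("-inf")))
        return signal if t - seen_at <= self.memory_s else "straight"
```

For example, a left-hand signal observed at t=2 s would still inform the prediction at t=8 s, but not at t=20 s once the window has lapsed.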
The sensors on Google’s cars give them 360-degree vision, enabling a wide field of image recognition. The software powering these recognition sensors will be critical to the success and safety of self-driving cars, as evidenced by a Tesla crash in which software confused the side of a white truck for gray sky.