Former Apple Engineers Working on New Eyes for Driverless Cars
By CADE METZ
New York Times
SEPT. 20, 2017
Soroush Salehian and Mina Rezk, of Aeva, a Silicon Valley start-up making new guidance systems for driverless vehicles. Credit Jason Henry for The New York Times
PALO ALTO, Calif. — Soroush Salehian raised both arms and spun in circles as if celebrating a touchdown.
Across the room, perched on a tripod, a small black device monitored this little dance and streamed it to a nearby laptop. Mr. Salehian appeared as a collection of tiny colored dots, some red, some blue, some green. Each dot showed the precise distance to a particular point on his body, while the colors showed the speed of his movements. As his right arm spun forward, it turned blue. His left arm, spinning away, turned red.
“See how the arms are different?” said his business partner, Mina Rezk, pointing at the laptop. “It’s measuring different velocities.”
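The display logic behind that demo can be sketched in a few lines of Python. This is an illustrative mock-up, not Aeva's software; the velocity thresholds and field names are invented, and the color convention simply mirrors the demo as described (toward the sensor is blue, away is red, stationary is green):

```python
def color_for_velocity(radial_velocity_mps):
    """Map a point's measured radial velocity to a display color.

    Negative velocity means the point is moving toward the sensor
    (blue), positive means away (red), and anything near zero is
    treated as stationary (green). The 0.1 m/s threshold is
    illustrative only.
    """
    if radial_velocity_mps < -0.1:
        return "blue"
    if radial_velocity_mps > 0.1:
        return "red"
    return "green"

# A few hypothetical points from the dance described above.
points = [
    {"distance_m": 2.4, "radial_velocity_mps": -1.8},  # arm spinning toward the sensor
    {"distance_m": 2.6, "radial_velocity_mps": 1.8},   # arm spinning away
    {"distance_m": 2.5, "radial_velocity_mps": 0.0},   # torso, roughly still
]
for p in points:
    print(p["distance_m"], color_for_velocity(p["radial_velocity_mps"]))
```

Each rendered dot thus carries two measurements at once, which is the point of the demonstration: distance and velocity per point, not distance alone.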
Messrs. Salehian and Rezk are the founders of a new Silicon Valley start-up called Aeva, and their small black device is designed for self-driving cars. The veterans of Apple’s secretive Special Projects Group aim to give these autonomous vehicles a more complete, detailed and reliable view of the world around them — something that is essential to their evolution.
A sign warns that a laser is in use at Aeva. Credit Jason Henry for The New York Times
Today’s driverless cars under development at companies like General Motors, Toyota, Uber and the Google spinoff Waymo track their surroundings using a wide variety of sensors, including cameras, radar, GPS antennas and lidar (short for “light detection and ranging”) devices that measure distances using pulses of light.
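The distance measurement behind a pulsed lidar comes down to timing light's round trip. A simplified sketch of that arithmetic, leaving out everything a real sensor must handle (noise, pulse shape, detector limits):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s


def pulse_range_m(round_trip_time_s):
    # A pulsed lidar times how long one light pulse takes to reach
    # an object and bounce back; halving the round trip gives the
    # one-way distance.
    return C * round_trip_time_s / 2.0


# A pulse that returns after 200 nanoseconds left an object
# roughly 30 meters away.
print(pulse_range_m(200e-9))  # ≈ 29.98 m
```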
But there are gaps in the way these sensors operate, and combining their disparate streams of data is difficult. Aeva’s prototype — a breed of lidar that measures distances more accurately and also captures speed — aims to fill several of these sizable holes.
“I don’t even think of this as a new kind of lidar,” said Tarin Ziyaee, co-founder and chief technology officer at the self-driving taxi start-up Voyage, who has seen the Aeva prototype. “It’s a whole different animal.”
Founded in January and funded by the Silicon Valley venture capital firm Lux Capital, among others, Aeva joins a widespread effort to build more effective sensors for autonomous vehicles, a trend that extends from start-ups like Luminar, Echodyne and Metawave to established hardware makers like the German multinational Robert Bosch.
The company’s name, Aeva, is a play on “Eve,” the name of the robot in the Pixar movie “WALL-E.”
Mr. Rezk in Palo Alto. He and his business partner Mr. Salehian are veterans of the secretive Special Projects Group at Apple, which they left late in 2016. Credit Jason Henry for The New York Times
The market for autonomous vehicles will grow to $42 billion by 2025, according to research by the Boston Consulting Group. But for that to happen, the vehicles will need new and more powerful sensors. Today’s autonomous cars are ill-prepared for high-speed driving, bad weather and other common situations.
The recent improvements in self-driving cars coincided with the improvements offered by new lidar sensors from a Silicon Valley company called Velodyne. These sensors gave cars a way of measuring distances to nearby vehicles, pedestrians and other objects. They also provided Google and other companies with a way of mapping urban roadways in three dimensions, so that cars will know exactly where they are at any given moment — something GPS cannot always provide.
But these lidar sensors have additional shortcomings. They can gather information only about objects that are relatively close to them, which limits how fast the cars can travel. Their measurements aren’t always detailed enough to distinguish one object from another. And when multiple driverless cars are close together, their signals can become garbled.
Other devices can pick up some of the slack. Cameras are a better way of identifying pedestrians and street signs, for example, and radar works over longer distances. That’s why today’s self-driving cars track their surroundings through so many different sensors. But despite this wide array of hardware — which can cost hundreds of thousands of dollars per vehicle — even the best autonomous vehicles still have trouble in many situations that humans navigate with ease.
With their new sensor, Messrs. Salehian and Rezk are working to change that. Mr. Rezk is an engineer who designed optical hardware for Nikon and, presumably, handled optical sensors for Apple’s driverless car project, though he and Mr. Salehian declined to say which “special project” they worked on at the company. They left Apple late last year.
New devices are hidden under a black sheet in a research and development room at Aeva in Palo Alto, Calif. Credit Jason Henry for The New York Times
Where current lidar sensors send out individual pulses, Aeva’s device sends out a continuous wave of light. By reading the way this far more complex signal bounces off surrounding objects, Mr. Rezk said, the device can capture a far more detailed image while also tracking velocity. You can think of it as a cross between lidar, which excels at measuring depth, and radar, which excels at measuring speed.
Mr. Rezk also said the device’s continuous wave would provide greater range and resolution than existing lidar devices, deal better with weather and highly reflective objects like bridge railings, and avoid interference with other optical sensors.
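The mechanism Mr. Rezk describes resembles what engineers call frequency-modulated continuous-wave (FMCW) ranging. In the textbook version — assuming a triangular frequency sweep, which the article does not confirm is Aeva's actual design, and with a wavelength and sweep rate invented here for illustration — the round-trip delay shifts the beat frequencies on the rising and falling halves of the sweep by the same amount, while motion shifts them in opposite directions, so range and velocity fall out of a sum and a difference:

```python
C = 299_792_458.0     # speed of light, m/s
WAVELENGTH = 1550e-9  # assumed laser wavelength, m (telecom band)
CHIRP_SLOPE = 1e15    # assumed frequency sweep rate, Hz per second


def range_and_velocity(f_beat_up, f_beat_down):
    """Recover range and radial velocity from the beat frequencies
    measured on the up-chirp and down-chirp of a triangular sweep.

    The round-trip delay shifts both beats by S * 2R / c, while the
    Doppler effect moves them apart by 2v / wavelength, so the sum
    isolates range and the difference isolates velocity. These are
    standard FMCW relations, not Aeva's disclosed design.
    """
    range_m = C * (f_beat_up + f_beat_down) / (4.0 * CHIRP_SLOPE)
    velocity_mps = WAVELENGTH * (f_beat_down - f_beat_up) / 4.0
    return range_m, velocity_mps


# Simulate a target 30 m away, closing at 2 m/s, then recover both
# numbers from the two beat frequencies it would produce.
R_true, v_true = 30.0, 2.0
f_delay = CHIRP_SLOPE * 2 * R_true / C  # beat from round-trip delay
f_doppler = 2 * v_true / WAVELENGTH     # beat from motion
R, v = range_and_velocity(f_delay - f_doppler, f_delay + f_doppler)
print(round(R, 6), round(v, 6))  # → 30.0 2.0
```

This per-point velocity is what a pulsed lidar cannot provide directly, and it is why the approach behaves like a cross between lidar and radar.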
Cars will continue to use multiple kinds of sensors, in part because redundancy helps ensure that these cars are safe. But Aeva aims to give these cars a better view of the world from a smaller and less expensive set of sensors.
Researchers at the University of California, Berkeley, have built similar hardware, and companies like Velodyne and the start-ups Oryx Vision and Quanergy say they are exploring similar ideas. Like these efforts, the Aeva prototype is still under development, and the company plans to sell devices next year. But it shows how autonomous car sensors need to evolve — and that they are indeed evolving.
Ultimately, new sensors will allow cars to make better decisions. “With autonomous cars, 90 percent of the time, you are trying to infer what is happening,” Mr. Ziyaee said. “But what if you can just measure it?”
Follow Cade Metz on Twitter: @CadeMetz
A version of this article appears in print on September 21, 2017, on Page B2 of the New York edition with the headline: Seeking Keener Vision for Driverless Cars.