Amazon today showed off its newest fully electric delivery drone for the first time, at its first re:Mars conference in Las Vegas. Chances are, it neither looks nor flies like what you’d expect from a drone. It’s an ingenious hexagonal hybrid design with very few moving parts: the shroud that protects its blades doubles as its wings when the drone transitions from vertical, helicopter-like flight at takeoff to its airplane-like mode.
These drones, Amazon says, will start making deliveries in the coming months, though it’s not yet clear where exactly that will happen.
What’s maybe even more important, though, is that the drone is chock-full of sensors and a suite of compute modules that run a variety of machine learning models to keep it safe. Today’s announcement marks the first time Amazon is publicly talking about those visual, thermal and ultrasonic sensors, which it designed in-house, and about how the drone’s autonomous flight systems maneuver it to its landing spot. The focus was on building a drone that is not just as safe as possible but independently safe: even when it’s not connected to a network and encounters a new situation, it can react appropriately and safely.
When you see it fly in airplane mode, it looks a little bit like a TIE fighter, where the core holds all the sensors and navigation technology, as well as the package. The new drone can fly up to 15 miles and carry packages that weigh up to five pounds.
This new design is quite a departure from earlier models. I got a chance to see it ahead of today’s announcement and I admit that I expected a far more conventional design — more like a refined version of the last, almost sled-like, design.
Besides the cool factor of the drone, though, which is probably a bit larger than you may expect, what Amazon is really emphasizing this week is the sensor suite and safety features it developed for the drone.
Ahead of today’s announcement, I sat down with Gur Kimchi, Amazon’s VP for its Prime Air program, to talk about the progress the company has made in recent years and what makes this new drone special.
“Our sense and avoid technology is what makes the drone independently safe,” he told me. “I say independently safe because that’s in contrast to other approaches where some of the safety features are off the aircraft. In our case, they are on the aircraft.”
Kimchi also stressed that Amazon designed virtually all of the drone’s software and hardware stack in-house. “We control the aircraft technologies from the raw materials to the hardware, to software, to the structures, to the factory to the supply chain and eventually to the delivery,” he said. “And finally the aircraft itself has controls and capabilities to react to the world that are unique.”
What’s clear is that the team tried to keep the actual flight surfaces as simple as possible. There are four traditional airplane control surfaces and six rotors. That’s it. The autopilot, which evaluates all of the sensor data and which Amazon also developed in-house, gives the drone six degrees of freedom to maneuver to its destination. The angled box at the center of the drone, which houses most of the drone’s smarts and the package it delivers, doesn’t pivot. It sits rigidly within the aircraft.
It’s unclear how loud the drone will be. Kimchi would only say that it’s well within established safety standards and that the profile of the noise also matters. He likened it to the difference between hearing a dentist’s drill and classical music. Either way, though, the drone is likely loud enough that it’s hard to miss when it approaches your backyard.
To see what’s happening around it, the new drone uses a number of sensors and machine learning models — all running independently — that constantly monitor the drone’s flight envelope (which, thanks to its unique shape and controls, is far more flexible than that of a regular drone) and environment. These include standard and infrared cameras that give it a view of its surroundings. There are multiple sensors on all sides of the aircraft so that it can spot things that are far away, like an oncoming aircraft, as well as objects that are close, when the drone is landing, for example.
The drone also uses various machine learning models to, for example, detect other air traffic around it and react accordingly, to detect people in the landing zone, or to spot a wire strung above it (a genuinely hard problem, since thin lines are difficult for cameras to pick out). To do this, the team uses photogrammetric models, segmentation models and neural networks. “We probably have the state of the art algorithms in all of these domains,” Kimchi argued.
Whenever the drone detects an object or a person in the landing zone, it obviously aborts — or at least delays — the delivery attempt.
“The most important thing the aircraft can do is make the correct safe decision when it’s exposed to an event that isn’t in the planning — that it has never been programmed for,” Kimchi said.
The team also uses a technique known as Visual Simultaneous Localization and Mapping (VSLAM), which helps the drone build a map of its current environment, even when it doesn’t have any other previous information about a location or any GPS information.
“That combination of perception and algorithmic diversity is what we think makes our system uniquely safe,” said Kimchi. As the drone makes its way to the delivery location or back to the warehouse, all of the sensors and algorithms always have to be in agreement. When one fails or detects an issue, the drone will abort the mission. “Every part of the system has to agree that it’s okay to proceed,” Kimchi said.
What Kimchi stressed throughout our conversation is that Amazon’s approach goes beyond redundancy, which is a pretty obvious concept in aviation and involves having multiple instances of the same hardware on board. Kimchi argues that having a diversity of sensors that are completely independent of each other is also important. The drone only has one angle of attack sensor, for example, but it also has a number of other ways to measure the same value.
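Amazon hasn’t published how this cross-checking actually works, but the idea of combining diverse, independent estimates of the same quantity with a unanimous-agreement gate can be sketched roughly like this (the sensor sources, tolerance and abort behavior here are illustrative assumptions, not Amazon’s implementation):

```python
# Illustrative sketch only: diverse, independent estimates of the same
# quantity (here, angle of attack) must all agree before the system
# proceeds. The sensor sources and tolerance are hypothetical.

def agrees(estimates, tolerance_deg=2.0):
    """True only if all estimates fall within the tolerance band."""
    return max(estimates) - min(estimates) <= tolerance_deg

def decide(estimates):
    # Every part of the system has to agree that it's okay to proceed;
    # a missing reading or any disagreement triggers an abort.
    if len(estimates) < 2 or any(e is None for e in estimates):
        return "abort"
    return "proceed" if agrees(estimates) else "abort"

# Three independent ways to estimate angle of attack (hypothetical):
vane_deg = 4.1      # dedicated angle-of-attack vane
inertial_deg = 4.3  # derived from IMU plus airspeed model
vision_deg = 3.9    # derived from camera-based horizon tracking

print(decide([vane_deg, inertial_deg, vision_deg]))  # proceed
print(decide([vane_deg, inertial_deg, 9.5]))         # abort
```

The point of the diversity is that a single failure mode (say, an iced-over vane) can’t silently corrupt all of the estimates at once; the disagreement itself becomes the failure signal.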
Amazon isn’t quite ready to delve into all the details of what the actual on-board hardware looks like. Kimchi did tell me, though, that the system uses more than one operating system and CPU architecture.
It’s the integration of all of those sensors, AI smarts and the actual design of the drone that makes the whole unit work. At some point, though, things will go wrong. The drone can easily handle a rotor that stops working, which is pretty standard these days. In some circumstances, it can even handle two failed units. And unlike most other drones, it can glide if necessary, just like any other airplane. But when it does need to put down unexpectedly, its AI smarts kick in to find a safe landing spot, away from people and objects, without any prior knowledge of its surroundings.
To get to this point, the team actually used an AI system to evaluate more than 50,000 different configurations. Just the computational fluid dynamics simulations took up 30 million hours of AWS compute time (it’s good to own a large cloud when you want to build a novel, highly optimized drone, it seems). The team also ran millions of simulations, of course, with all of the sensors, and looked at all of the possible positions and sensor ranges — and even different lenses for the cameras — to find an optimal solution. “The optimization is what is the right, diverse set of sensors and how they are configured on the aircraft,” Kimchi noted. “You always have both redundancy and diversity, both from the physical domain — sonar versus photons — and the algorithmic domain.”
The team also ran thousands of hardware-in-the-loop simulations in which all the flight surfaces are actuating and all the sensors are perceiving the simulated environment. Here, too, Kimchi wasn’t quite ready to give away the secret sauce the team uses to make that work.
And the team obviously tested the drones in the real world to validate its models. “The analytical models, the computational models are very rich and are very deep, but they are not calibrated against the real world. The real world is the ultimate random event generator,” he said.
It remains to be seen where the new drone will make its first deliveries.