Technology Stocks: Drones, Autonomous Vehicles and Flying Cars


From: Glenn Petersen 9/19/2017 6:47:13 AM
 
Swarm intelligence (SI) is the collective behavior of decentralized, self-organized systems, natural or artificial. The concept is employed in work on artificial intelligence. The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems. [1]

SI systems consist typically of a population of simple agents or boids interacting locally with one another and with their environment. The inspiration often comes from nature, especially biological systems. The agents follow very simple rules, and although there is no centralized control structure dictating how individual agents should behave, local, and to a certain degree random, interactions between such agents lead to the emergence of "intelligent" global behavior, unknown to the individual agents. Examples in natural systems of SI include ant colonies, bird flocking, animal herding, bacterial growth, fish schooling and microbial intelligence.

The application of swarm principles to robots is called swarm robotics, while 'swarm intelligence' refers to the more general set of algorithms. 'Swarm prediction' has been used in the context of forecasting problems.

en.wikipedia.org
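The mechanics described above (simple agents, local rules, no central controller) are easy to make concrete with the classic boids model. Below is a minimal Python sketch of the three standard boids rules: separation, alignment and cohesion. All weights and radii are illustrative values, not taken from any system discussed in this thread.

```python
# Minimal boids-style swarm: each agent applies three local rules
# (separation, alignment, cohesion). No agent sees the whole flock,
# and there is no central controller. All constants are illustrative.
import random

N, RADIUS = 30, 25.0                 # number of agents, neighborhood radius
W_SEP, W_ALI, W_COH = 1.5, 1.0, 1.0  # rule weights (made up)

agents = [{"pos": [random.uniform(0, 100), random.uniform(0, 100)],
           "vel": [random.uniform(-1, 1), random.uniform(-1, 1)]}
          for _ in range(N)]

def step(agents):
    for a in agents:
        near = [b for b in agents if b is not a and
                sum((p - q) ** 2 for p, q in zip(a["pos"], b["pos"])) < RADIUS ** 2]
        if not near:
            continue
        n = len(near)
        for i in range(2):                                            # x and y
            sep = sum(a["pos"][i] - b["pos"][i] for b in near) / n    # steer away
            ali = sum(b["vel"][i] for b in near) / n - a["vel"][i]    # match heading
            coh = sum(b["pos"][i] for b in near) / n - a["pos"][i]    # move to center
            a["vel"][i] += 0.05 * (W_SEP * sep + W_ALI * ali + W_COH * coh)
    for a in agents:
        a["pos"][0] += a["vel"][0]
        a["pos"][1] += a["vel"][1]

for _ in range(100):
    step(agents)
```

Flocking emerges from the local updates alone, which is the sense in which the "intelligent" global behavior is unknown to the individual agents.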

Swarm intelligence key to successful operation of drones, says study

Sam Clark
The Stack
Tue 22 Aug 2017 5.27pm



Using swarm intelligence may be a way to combat many of the challenges faced by drone operators, both in business and by governments, according to a group of academics.

Researchers from British and French universities have written a paper noting how drones can now add value to these organisations thanks to their low cost and rapid mobilisation time – for example, checking the structural integrity of a building far more cheaply than with helicopters or high-rise cranes. They are also cheaper and quicker than helicopters for use by police services.

However, the study argues that drones rarely work effectively as individual units in these scenarios – they are best operated as part of a ‘fleet’, often with humans involved in a centralised control environment. This means ‘fleet participants’ need to cooperate with other drones in the same fleet. Similarly, if there are members of other fleets in the same airspace, they need to be able to communicate in order to operate safely.



The paper proposes different ways of managing drones. All decisions could be taken by a control centre – requiring real-time communication between drones in the air and the control centre on the ground. But there are a number of situations where drones may need to behave autonomously, requiring a level of AI that a single drone is unlikely to have the computing capacity for.

Therefore, the authors suggest that ‘AI algorithms designed for the Swarm Intelligence paradigm can be applied.’ There are many challenges for a swarm of drones, and the paper identifies what it calls mission-critical operations – those that are time-sensitive and vital to the successful operation of the fleet.

When approaching these challenges, the authors note that it would be possible to create a set of pre-defined rules based on expected conditions and outcomes before the drones set off, but this does not account for the fact that drones are likely to encounter unexpected situations while in the air.

Instead, they propose using swarm intelligence so the drones can learn and adapt on the basis of the situation in which they find themselves.


Given that assessing the best way for a fleet of drones to operate presents considerable challenges, the paper also examines why several drones are preferable to a single drone in many business and emergency applications.

Firstly, there is better resiliency against failure – that is, if something fails on one drone, such as a temperature sensor, it would still be possible to get the same data from other drones.

Groups of drones can cover larger geographic areas and carry out multiple specialised tasks concurrently. They also form a network – meaning that if drone B is too far away from the control centre to communicate with it directly, but is close enough to drone A, messages can effectively be passed down the line, as in the sketch below.
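That relaying behavior is plain multi-hop routing over the fleet's connectivity graph. As a toy illustration (positions and radio range are invented, and this is not the paper's protocol), a breadth-first search finds a chain of drones that can carry a message from the control centre to a drone outside direct range:

```python
# Multi-hop relay sketch: find a chain of drones that carries a message
# from the control centre to a drone outside direct radio range.
# Positions and COMM_RANGE are made-up values for illustration.
from collections import deque
from math import dist

COMM_RANGE = 10.0
nodes = {"base": (0, 0), "A": (8, 0), "B": (15, 0)}  # B is out of base's range

def in_range(u, v):
    return dist(nodes[u], nodes[v]) <= COMM_RANGE

def relay_path(src, dst):
    """Breadth-first search over the fleet's connectivity graph."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in nodes:
            if nxt not in seen and in_range(path[-1], nxt):
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of drones reaches dst

print(relay_path("base", "B"))  # -> ['base', 'A', 'B']
```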

Based on the findings of this study, organisations employing UAVs may be best served by ‘self-aware, mission-focussed and independent fleets of drones.’

thestack.com



From: kidl 9/19/2017 9:01:48 AM
 

3 Reasons Why the Drone Industry Is Hiring Thousands of New Workers


A billion dollars in venture capital this year and deregulation are fueling a hot job market.


For the first time in history, investments in drones--those buzzy airborne gadgets that used to be called UAVs--have crossed a billion dollars. $1.2 billion has been invested in unmanned aerial tech in 2017 so far, according to CB Insights. When I invested in a drone startup last year, only about $600 million in venture capital had been deployed, and drone pilots still had to be licensed like fixed-wing aircraft pilots.

"In the past several years, the drone space has crossed the threshold from being driven by hobbyists and experimentalists to being largely enterprise-dominated," says John Frankel, founder at FFVC, a New York venture capital firm and early drone investor. He invested in Top Flight Technologies in 2015, and later, Skycatch, an autonomous aerial mapping startup that helps international construction and mining companies like Komatsu automate the collection, processing, and analysis of aerial data."

The doubling of investment and FAA deregulation are now driving a mini job boom, fast creating a workforce as large as that of private school teachers in the U.S.--about 400,000.

Deregulation is driving the drone market. Seeing the need for more licensed drone operators, about this time last year the FAA created a new commercial drone pilot licensing program that requires no hands-on demonstration and onboards commercially licensed pilots fast. How fast? Plunk down $150, pass a 70-question multiple-choice test, and the license could be yours. In the first 3 months, 300 new drone operators were minted every business day. Of the first 28,000 applicants, some 22,000 passed--a pass rate of roughly 79 percent. Those numbers pale, however, in comparison to the number of commercial drones registered in the same period--2,000 a day.

The FAA's young drone pilot license program is also spawning job-creating sub-industries. For example, the University of South Dakota just added drone operations to its academic curriculum. Instructor Byron Noel shared recently with the Brookings Register, "It's useful in any field where an aerial perspective is useful."

"The drone industry is a great place to find a job," says Aerobo's Brian Streem, an Inc. 30 Under 30 honoree and the founder of that startup I invested in last year. His offices in Los Angeles and Brooklyn are hiring drone operators, production, and sales talent. Streem believes there are lots more changes to come in how users interact with drones. "A lot of people underestimate the complexities of actually pulling off a drone operation because of the 'unfun' stuff--charging batteries, performing flight maintenance, checking airspace. Automation only gets us so far. There is still manual work to do."

Many drone startup CEOs are pushing the FAA to ease back even further on regulation and further open the market. In a recent meeting with President Trump on emerging technology, Michael Chasen, founder of Blackboard and current CEO of PrecisionHawk of Raleigh, NC, lobbied Trump to relax regulations that are "limiting what drone technology can do," as reported by Recode.

Another big driver of the drone age is the promise of easy access to hard-to-reach locations. Whether you're Qualcomm and AT&T, which are piloting drones for cell tower inspection, or Amazon and Wal-mart, which are both investigating airship warehouses, not every business location has a highway. Drones are making a significant business case for extending the eyes, ears and operations of a number of enterprises at a cost that's worth exploring. Amazon is even looking at an "Alexa in the sky."

Corporations are on pace to acquire a drone startup a month this year. Increasingly, drone jobs aren't just at startups--they're at big corporations. Already in 2017, Intel acquired MAVinci, Boeing bought Liquid Robotics, Verizon snapped up Skyward and Snap snipped Ctrl Me Robotics.

The FAA predicts the U.S. registered commercial drone fleet will climb to between 442,000 and 1.6 million in the next few years. Every few new drones create at least one job to maintain, deploy and operationalize them, so we're looking at a new job force of a few hundred thousand created in just a few years.


The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
inc.com






From: Glenn Petersen 9/20/2017 7:19:33 AM
 
    The first autonomous drone delivery network will fly above Switzerland starting next month

    Like a futuristic postal service... that delivers blood

    by Thuy Ong @ThuyOng
    The Verge
    Sep 20, 2017, 6:00am EDT



    Photo: Matternet
    ________________________________

    Logistics company Matternet has announced a permanent autonomous drone network in Switzerland that will now see lab samples like blood tests and other diagnostics flown between hospital facilities, clinics, and labs. The first delivery network will be operational from next month, with several more to be introduced later in the year. Matternet says medical items can be delivered to hospitals within 30 minutes.

    Matternet, based in Menlo Park, California, was granted authorization to operate its drones over densely populated areas in Switzerland in March and says that approval was a world first. Today, the company unveiled the Matternet Station: a kind of white, futuristic-looking postbox with a footprint of about two square meters that can be installed on rooftops or on the ground to send and receive packages by drone.



    The drone network is part of a partnership with Swiss Post, and is significant because it’s the first operational drone network flying in dense urban areas that’s not a pilot run or a test. Last month, Zipline announced plans to operate its blood-delivery service by drone in Tanzania by early next year as well. A pair of hospitals in Lugano in Switzerland had previously tested Matternet drone flights to deliver lab samples. Matternet plans to establish a regular service there starting in early 2018.

    “These types of diagnostics that need to be transported are urgent in nature and they are on demand,” Andreas Raptopoulos, co-founder and CEO of Matternet, told The Verge. “They have to wait for a courier, sometimes they get taxis to do this type of thing — and when you have a system like this, that is autonomous and reliable, it completely transforms operations.”

    Users operate the system via an app to create shipment details. Items are placed into a compartment box in the station before being loaded into a drone for delivery. Currently the drones can hold up to 2kg (4.4 pounds). Packages are then flown to another Matternet station, where receivers can obtain their package by scanning a QR code.

    Photo: Matternet
    _________________________________

    After a drone lands, it also swaps out its battery so the drone remains charged. The maximum distance a drone can travel is 20km (12.4 miles), depending on weather conditions like high winds, and the drones have cruising speeds of 70 kilometers per hour (43.5 miles per hour). Matternet says initially about one to two drones will operate per network. Each station features an “automated aerial deconfliction system” that manages drone traffic over the station.
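    The figures quoted here are easy to sanity-check: at a 70 km/h cruise, the 20 km maximum leg takes about 17 minutes in the air, consistent with the claimed sub-30-minute hospital deliveries. A quick back-of-the-envelope script (range and speed from the article; the ground-handling allowance is an assumption):

```python
# Back-of-the-envelope check of the figures in the article.
range_km = 20.0     # maximum distance per flight (from the article)
speed_kmh = 70.0    # cruising speed (from the article)

flight_min = range_km / speed_kmh * 60
print(f"max-range flight time: {flight_min:.1f} min")   # ~17.1 min

# Assumed, not from the article: minutes for loading, takeoff, landing
# and the QR-code handover at each Matternet Station.
ground_min = 10.0
print(f"door-to-door estimate: {flight_min + ground_min:.1f} min (< 30)")
```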

    Matternet also envisions that in the future the stations could be placed at grocery stores or gas stations for deliveries. The company says the next markets it wants to tackle are Germany and the UK, once it has a solid footing in Switzerland.

    theverge.com



From: kidl 9/21/2017 4:30:16 AM
 
Forget English and Maths, DRONE FLYING will be introduced as an HSC subject for students next year

Students in New South Wales will be able to take drone flying as an HSC subject
As of next year, the new course will allow students to learn the new technology
NSW is the first Australian state to introduce the subject after 12 months of trials

By Sam Duncan For Daily Mail Australia

Published: 12:34 BST, 9 September 2017



Read more: dailymail.co.uk



From: kidl 9/21/2017 4:53:01 AM
 
Commercial drone use to expand tenfold by 2021

  • The FAA expects the U.S. commercial drone fleet to grow from 42K at the end of 2016 to about 442K aircraft by 2021 (but there could also be as many as 1.6M commercial drones in use by then).

  • The key difference in its growth estimates is in “how quickly the regulatory environment will evolve, enabling more widespread routine uses of (drones) for commercial purposes.”

    Since August, the FAA has approved more than 300 waivers exempting certain drone operations from some restrictions, for firms including Union Pacific (NYSE: UNP), BNSF Railway (BRK.A, BRK.B), Intel (NASDAQ: INTC), Walt Disney (NYSE: DIS) and Time Warner (NYSE: TWX).

    dronevibes.com



    From: Glenn Petersen 9/22/2017 5:56:02 AM
     
    China's Baidu launches $1.5 billion autonomous driving fund

    Reuters Staff
    September 21, 2017 / 1:51 AM



    FILE PHOTO: A woman is silhouetted against the Baidu logo at a new product launch from Baidu, in Shanghai, China, November 26, 2015. REUTERS/Aly Song/File Photo
    _________________________________________

    BEIJING (Reuters) - Chinese search engine Baidu Inc (BIDU.O) announced a 10 billion yuan ($1.52 billion) autonomous driving fund on Thursday as part of a wider plan to speed up its technical development and compete with U.S. rivals.

    The “Apollo Fund” will invest in 100 autonomous driving projects over the next three years, Baidu said in a statement.

    The fund’s launch coincides with the release of Apollo 1.5, the second generation of the company’s open-source autonomous vehicle software.

    After years of internal development, Baidu in April decided to open its autonomous driving technology to third parties, a move it hopes will accelerate development and help it compete with U.S. firms Tesla Inc (TSLA.O) and Google project Waymo.

    In the latest update to its platform, Baidu says partners can access new obstacle perception technology and high-definition maps, among other features.



    FILE PHOTO: An employee uses his mobile phone as he walks past the company logo of Baidu at its headquarters in Beijing, August 5, 2010. REUTERS/Barry Huang/File Photo
    __________________________________

    It comes amid a wider reshuffle of Baidu’s corporate strategy as it looks for new profit streams outside its core search business, which lost a large chunk of ad revenue in 2016 following strict new government regulations on medical advertising.

    Baidu’s Apollo project - named after the NASA moon landing - aims to create technology for completely autonomous cars, which it says will be ready for city roads in China by 2020.

    It now has 70 partners across several fields in the auto industry, up from 50 in July, it says. Existing partners include microprocessor firm Nvidia Corp (NVDA.O) and mapping service TomTom NV.

    Despite the rapid growth of its partner ecosystem, Baidu has faced challenges negotiating local Chinese regulations, which have previously stopped the company from testing on highways.

    In July, Beijing police said they were investigating whether the company had broken city traffic rules by testing a driverless car on public roads as part of a demonstration for a press event.

    Reporting by Cate Cadell; Editing by Stephen Coates

    reuters.com



    From: Glenn Petersen 9/25/2017 5:47:13 AM
     
    Finally, a Driverless Car with Some Common Sense

    A startup called iSee thinks a new approach to AI will make self-driving cars better at dealing with unexpected situations.

    by Will Knight
    MIT Technology Review
    September 20, 2017

    Boston’s notoriously unfriendly drivers and chaotic roads may be the perfect testing ground for a fundamentally different kind of self-driving car.

    An MIT spin-off called iSee is developing and testing an autonomous driving system using a novel approach to artificial intelligence. Instead of relying on simple rules or machine-learning algorithms to train cars to drive, the startup is taking inspiration from cognitive science to give machines a kind of common sense and the ability to quickly deal with new situations. It is developing algorithms that try to match the way humans understand and learn about the physical world, including interacting with other people. The approach could lead to self-driving vehicles that are much better equipped to deal with unfamiliar scenes and complex interactions on the road.

    “The human mind is super-sensitive to physics and social cues,” says Yibiao Zhao, cofounder of iSee. “Current AI is relatively limited in those domains, and we think that is actually the missing piece in driving.”

    Zhao’s company doesn’t look like a world beater just yet. A small team of engineers works out of a modest lab space at the Engine, a new investment company created by MIT to fund innovative local tech companies. Located just a short walk from the MIT campus, the Engine overlooks a street on which drivers jostle for parking spots and edge aggressively into traffic.


    The desks inside iSee’s space are covered with sensors and pieces of hardware the team has put together to take control of its first prototype, a Lexus hybrid SUV that originally belonged to one of the company’s cofounders. Several engineers sit behind large computer monitors staring intently at lines of code.

    iSee might seem laughably small compared to the driverless-car efforts at companies like Waymo, Uber, or Ford, but the technology it’s developing could have a big impact on many areas where AI is applied today. By enabling machines to learn from less data, and to build some form of common sense, its technology could make industrial robots smarter, especially about new situations. Spectacular progress has already been made in AI recently thanks to deep learning, a technique that employs vast, data-hungry neural networks (see “10 Breakthrough Technologies 2013: Deep Learning”).

    When fed large amounts of data, very large or deep neural networks can recognize subtle patterns. Give a deep neural network lots of pictures of dogs, for instance, and it will figure out how to spot a dog in just about any image. But there are limits to what deep learning can do, and some radical new ideas may well be needed to bring about the next leap forward. For example, a dog-spotting deep-learning system doesn’t understand that dogs typically have four legs, fur, and a wet nose. And it cannot recognize other types of animals, or a drawing of a dog, without further training.

    Driving involves considerably more than just pattern recognition. Human drivers rely constantly on a commonsense understanding of the world. They know that buses take longer to stop, for example, and can suddenly produce lots of pedestrians. It would be impossible to program a self-driving car with every possible scenario it might encounter. But people are able to use their commonsense understanding of the world, built up through lifelong experience, to act sensibly in all sorts of new situations.

    “Deep learning is great, and you can learn a lot from previous experience, but you can’t have a data set that includes the whole world,” Zhao says. “Current AI, which is mostly data-driven, has difficulties understanding common sense; that’s the key thing that’s missing.” Zhao illustrates the point by opening his laptop to show several real-world road scenes on YouTube, including complex traffic merges and some hairy-looking accidents.

    A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway (see “Fatal Tesla Crash Is a Reminder Autonomous Cars Will Sometimes Screw Up”). A human driver would likely have quickly and safely figured out what was going on.


    Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says.

    iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this. Zhao and other founders of iSee come from the lab of Josh Tenenbaum, a professor in the department of brain and cognitive science at MIT who now serves as an advisor to the company.

    Tenenbaum specializes in exploring how human intelligence works, and using that insight to engineer novel types of AI systems. This includes work on the intuitive sense of physics exhibited even by young children, for instance. Children’s ability to understand how the physical world behaves enables them to predict how unfamiliar situations may unfold. And, Tenenbaum explains, this understanding of the physical world is intimately connected with an intuitive understanding of psychology and the ability to infer what a person is trying to achieve, such as reaching for a cup, by watching his or her actions.

    The ability to transfer learning between situations is also a hallmark of human intelligence, and even the smartest machine-learning systems are still very limited by comparison. Tenenbaum’s lab combines conventional machine learning with novel “probabilistic programming” approaches. This makes it possible for machines to learn to infer things about the physics of the world as well as the intentions of others despite uncertainty.


    Trying to reverse-engineer the ways in which even a young baby is smarter than the cleverest existing AI system could eventually lead to many smarter AI systems, Tenenbaum says. In 2015, together with researchers from New York University and Carnegie Mellon University, Tenenbaum used some of these ideas to develop a landmark computer program capable of learning to recognize handwriting from just a few examples (see “This AI Algorithm Learns Simple Tasks As Fast As We Do”).

    A related approach might eventually give a self-driving car something approaching a rudimentary form of common sense in unfamiliar scenarios. Such a car may be able to determine that a driver who’s edging out into the road probably wants to merge into traffic.
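    That merge example can be made concrete with a toy Bayesian update, the kind of inference that probabilistic programming automates at much larger scale. Every number below (priors and likelihoods) is invented for illustration; iSee has not published its models, and this is not their code.

```python
# Toy Bayesian intent inference: does the car edging into the road
# intend to merge? All probabilities below are invented for illustration.
PRIORS = {"merge": 0.3, "stay": 0.7}

# P(observation | intent), assumed values:
LIKELIHOOD = {
    "merge": {"edges_out": 0.8, "signals": 0.7},
    "stay":  {"edges_out": 0.1, "signals": 0.05},
}

def posterior(observations, beliefs):
    """Apply Bayes' rule one observation at a time (naive independence)."""
    for obs in observations:
        beliefs = {h: p * LIKELIHOOD[h][obs] for h, p in beliefs.items()}
        total = sum(beliefs.values())
        beliefs = {h: p / total for h, p in beliefs.items()}
    return beliefs

print(posterior(["edges_out"], PRIORS))             # merge ~0.77
print(posterior(["edges_out", "signals"], PRIORS))  # merge ~0.98
```

    A probabilistic program generalizes this pattern to structured models of physics and intent, letting a car weigh hypotheses about other drivers under uncertainty rather than merely detecting objects.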

    When it comes to autonomous driving, in fact, Tenenbaum says the ability to infer what another driver is trying to achieve could be especially important. Another of iSee’s cofounders, Chris Baker, developed computational models of human psychology while at MIT. “Taking engineering-style models of how humans understand other humans, and being able to put those into autonomous driving, could really provide a missing piece of the puzzle,” Tenenbaum says.

    Tenenbaum says he was not initially interested in applying ideas from cognitive psychology to autonomous driving, but the founders of iSee convinced him that the impact would be significant, and that they were up to the engineering challenges.

    “This is a very different approach, and I completely applaud it,” says Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, a research institute created by Microsoft cofounder Paul Allen to explore new ideas in AI, including ones inspired by cognitive psychology.

    Etzioni says the field of AI needs to explore ideas beyond deep learning. He says the main issue for iSee will be demonstrating that the techniques employed can perform well in critical situations. “Probabilistic programming is pretty new,” he notes, “so there are questions about the performance and robustness.”

    Those involved with iSee would seem to agree. Besides aiming to shake up the car industry and perhaps reshape transportation in the process, Tenenbaum says, iSee has a chance to explore how a new AI approach works in a particularly unforgiving practical situation.

    “In some sense, self-driving cars are going to be the first autonomous robots that interact with people in the real world,” he says. “The real challenge is, how do you take these models and make them work robustly?”

    technologyreview.com



    From: Glenn Petersen 9/25/2017 7:05:20 AM
     
    Former Apple Engineers Working on New Eyes for Driverless Cars

    By CADE METZ
    New York Times
    SEPT. 20, 2017



    Soroush Salehian and Mina Rezk, of Aeva, a Silicon Valley start-up making new guidance systems for driverless vehicles. Credit Jason Henry for The New York Times
    _________________________________

    PALO ALTO, Calif. — Soroush Salehian raised both arms and spun in circles as if celebrating a touchdown.

    Across the room, perched on a tripod, a small black device monitored this little dance and streamed it to a nearby laptop. Mr. Salehian appeared as a collection of tiny colored dots, some red, some blue, some green. Each dot showed the precise distance to a particular point on his body, while the colors showed the speed of his movements. As his right arm spun forward, it turned blue. His left arm, spinning away, turned red.

    “See how the arms are different?” said his business partner, Mina Rezk, pointing at the laptop. “It’s measuring different velocities.”

    Messrs. Salehian and Rezk are the founders of a new Silicon Valley start-up called Aeva, and their small black device is designed for self-driving cars. The veterans of Apple’s secretive Special Projects Group aim to give these autonomous vehicles a more complete, detailed and reliable view of the world around them — something that is essential to their evolution.



    A sign warns that a laser is in use at Aeva. Credit Jason Henry for The New York Times
    _______________________

    Today’s driverless cars under development at companies like General Motors, Toyota, Uber and the Google spinoff Waymo track their surroundings using a wide variety of sensors, including cameras, radar, GPS antennas and lidar (short for “light detection and ranging”) devices that measure distances using pulses of light.

    But there are gaps in the way these sensors operate, and combining their disparate streams of data is difficult. Aeva’s prototype — a breed of lidar that measures distances more accurately and also captures speed — aims to fill several of these sizable holes.

    “I don’t even think of this as a new kind of lidar,” said Tarin Ziyaee, co-founder and chief technology officer at the self-driving taxi start-up Voyage, who has seen the Aeva prototype. “It’s a whole different animal.”

    Founded in January and funded by the Silicon Valley venture capital firm Lux Capital, among others, Aeva joins a widespread effort to build more effective sensors for autonomous vehicles, a trend that extends from start-ups like Luminar, Echodyne and Metawave to established hardware makers like the German multinational Robert Bosch.

    The company’s name, Aeva, is a play on “Eve,” the name of the robot in the Pixar movie “WALL-E.”



    Mr. Rezk in Palo Alto. He and his business partner Mr. Salehian are veterans of the secretive Special Projects Group at Apple, which they left late in 2016. Credit Jason Henry for The New York Times
    _____________________________________

    The market for autonomous vehicles will grow to $42 billion by 2025, according to research by the Boston Consulting Group. But for that to happen, the vehicles will need new and more powerful sensors. Today’s autonomous cars are ill prepared for high-speed driving, bad weather and other common situations.

    The recent improvements in self-driving cars coincided with the improvements offered by new lidar sensors from a Silicon Valley company called Velodyne. These sensors gave cars a way of measuring distances to nearby vehicles, pedestrians and other objects. They also provided Google and other companies with a way of mapping urban roadways in three dimensions, so that cars will know exactly where they are at any given moment — something GPS cannot always provide.

    But these lidar sensors have additional shortcomings. They can gather information only about objects that are relatively close to them, which limits how fast the cars can travel. Their measurements aren’t always detailed enough to distinguish one object from another. And when multiple driverless cars are close together, their signals can become garbled.

    Other devices can pick up some of the slack. Cameras are a better way of identifying pedestrians and street signs, for example, and radar works over longer distances. That’s why today’s self-driving cars track their surroundings through so many different sensors. But despite this wide array of hardware — which can cost hundreds of thousands of dollars per vehicle — even the best autonomous vehicles still have trouble in many situations that humans navigate with ease.

    With their new sensor, Messrs. Salehian and Rezk are working to change that. Mr. Rezk is an engineer who designed optical hardware for Nikon, and presumably, he was among those who handled optical sensors for Apple’s driverless car project, though he and Mr. Salehian declined to say which “special project” they worked on at the company. They left Apple late last year.



    New devices are hidden under a black sheet in a research and development room at Aeva in Palo Alto, Calif. Credit Jason Henry for The New York Times
    ___________________________

    Where current lidar sensors send out individual pulses, Aeva’s device sends out a continuous wave of light. By reading the way this far more complex signal bounces off surrounding objects, Mr. Rezk said, the device can capture a far more detailed image while also tracking velocity. You can think of it as a cross between lidar, which is so good at measuring depth, and radar, which is so good at measuring speed.
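    In generic terms, the design described here is frequency-modulated continuous-wave (FMCW) sensing: chirp the outgoing light, mix the echo with the transmitted signal, and read range from the beat frequency and radial velocity from the Doppler shift. The sketch below is the textbook triangular-chirp arithmetic, not Aeva's undisclosed design; all parameter values are assumptions.

```python
# Generic FMCW range/velocity recovery with a triangular chirp.
# Textbook math only; not Aeva's design. All parameter values assumed.
C = 3.0e8              # speed of light, m/s
WAVELENGTH = 1.55e-6   # telecom-band laser wavelength, m (assumed)
BANDWIDTH = 1.0e9      # chirp bandwidth, Hz (assumed)
RAMP_TIME = 10e-6      # chirp duration, s (assumed)
SLOPE = BANDWIDTH / RAMP_TIME   # chirp slope, Hz/s

def range_and_velocity(f_beat_up, f_beat_down):
    """Recover range and closing speed from the up/down-chirp beat tones.

    For a target at range R closing at speed v:
        f_beat_up   = 2*R*SLOPE/C - 2*v/WAVELENGTH
        f_beat_down = 2*R*SLOPE/C + 2*v/WAVELENGTH
    """
    f_range = (f_beat_up + f_beat_down) / 2    # range-only component
    f_doppler = (f_beat_down - f_beat_up) / 2  # Doppler component
    return f_range * C / (2 * SLOPE), f_doppler * WAVELENGTH / 2

# Simulated echo from a car 60 m ahead closing at 20 m/s (values assumed):
R, v = 60.0, 20.0
f_r, f_d = 2 * R * SLOPE / C, 2 * v / WAVELENGTH
print(range_and_velocity(f_r - f_d, f_r + f_d))   # -> (60.0, 20.0)
```

    Because every return carries its own Doppler term, each point in the cloud gets a velocity of its own, which is what the red and blue arms in the demo described above were showing.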

    Mr. Rezk also said the device’s continuous wave would provide greater range and resolution than existing lidar devices, deal better with weather and highly reflective objects like bridge railings, and avoid interference with other optical sensors.

    Cars will continue to use multiple kinds of sensors, in part because redundancy helps ensure that these cars are safe. But Aeva aims to give these cars a better view of the world from a smaller and less expensive set of sensors.

    Researchers at the University of California, Berkeley, have built similar hardware, and companies like Velodyne and the start-ups Oryx Vision and Quanergy say they are exploring similar ideas. Like these efforts, the Aeva prototype is still under development, and the company plans to sell devices next year. But it shows how autonomous car sensors need to evolve — and that they are indeed evolving.

    Ultimately, new sensors will allow cars to make better decisions. “With autonomous cars, 90 percent of the time, you are trying to infer what is happening,” Mr. Ziyaee said. “But what if you can just measure it?”

    Follow Cade Metz on Twitter: @CadeMetz

    A version of this article appears in print on September 21, 2017, on Page B2 of the New York edition with the headline: Seeking Keener Vision for Driverless Cars.



    From: Savant 9/26/2017 4:14:14 PM
     
    Drones in archaeology...
    '60s drone photography found Alexander the Great's lost city

    usatoday.com



    To: Savant who wrote (1439) 9/26/2017 7:18:31 PM
    From: Savant
     

    Turns out a more thorough article sez>>

    Spy satellite footage in '60s, and drone use since then and now

    dailymail.co.uk

    a much better article

