
Technology Stocks : Drones, Autonomous Vehicles and Flying Cars


From: supernova23 9/14/2017 2:50:40 PM
   of 1472
 

The No.1 reason Uber's flying car is just a fantasy
cnbc.com



To: kidl who wrote (1427) 9/15/2017 4:56:38 PM
From: Glenn Petersen
   of 1472
 
Lots of money is pouring into the sector, some of it being deployed more intelligently and efficiently than the rest.

Google Has Spent Over $1.1 Billion on Self-Driving Tech

By Mark Harris
IEEE Spectrum
Posted 15 Sep 2017 | 15:00 GMT



Photo: Waymo
__________________________

Google has never publicly shared how much it spends on its self-driving cars. At first, Project Chauffeur was hidden away in Google’s ultra-secret X moonshot program. When that went public, its costs were bundled together in a vague “Other Bets” category that includes the company’s fiber Internet service, home automation, and life science spin-offs.

Now, a court filing in Waymo’s epic and ongoing lawsuit against Uber has accidentally revealed just how big a bet Google placed on autonomous vehicles. Between Project Chauffeur’s inception in 2009 and the end of 2015, Google spent $1.1 billion on developing its self-driving software and hardware, according to a recent deposition of Shawn Bananzadeh, a financial analyst at Waymo.

Bananzadeh was testifying as part of the lawsuit, in which Uber stands accused of misappropriating trade secrets and violating patents belonging to Waymo, Google’s self-driving-car offshoot. Because Waymo has yet to commercialize any of its technology in a meaningful way, the company thinks any damages in the case should be calculated on the basis of how much it spent building the technology in question.

When asked by an Uber lawyer how an estimate for developing one of the trade secrets, number 90, was arrived at, Bananzadeh replied: “My understanding is that it is a cost that captures the entire program spend from inception to the period of time where it stops.” He later clarified that meant from 2009, when Sebastian Thrun got the go-ahead for the project from Larry Page, to the end of 2015.

Throughout Bananzadeh’s deposition, every dollar amount was redacted to protect Waymo’s confidential commercial information. Every time, that is, except in the Uber lawyer’s very next question: “The calculation that was the basis of the $1.1 billion cost estimate for Trade Secret 90 is the same calculation that was done for Trade Secret 2 and Trade Secret 25?”

Waymo had apparently given an identical $1.1 billion cost estimate for each of the trade secrets being discussed. Bananzadeh was unable to provide a clear answer as to why that might be, except to say, “Insofar as it is part of the entirety of this self-driving system…. therefore, all of the costs of the program since inception… are what then informs that number.”

Waymo’s position seems to be that all of its trade secrets are inextricably linked to the whole self-driving car project, and any damages should reflect that fact.

In a filing, Otto Trucking, a co-defendant in the case, called Waymo’s damages theory “entirely speculative” and “over the top,” and asked the court to forbid Waymo from offering any evidence or argument beyond the actual damages it has incurred.

Though $1.1 billion is unquestionably a massive figure, it actually seems quite reasonable compared to the recent over-heated market for self-driving car acquisitions. In March 2016, General Motors paid a billion dollars for San Francisco–based Cruise Automation, a company that sold after-market semi-autonomous vehicle kits. In February of this year, Ford invested the same amount in a joint venture with Argo AI, a two-month-old Pittsburgh start-up headed by a former Google self-driving car engineer. The largest self-driving acquisition to date, however, was Intel’s $15.3 billion purchase of Mobileye in March. The Israeli company had originally provided vision-based semi-autonomous technology for Tesla vehicles.

Uber shelled out a reported $680 million for self-driving truck maker Otto in August 2016, sight almost unseen. But it’s the circumstances surrounding the acquisition of Otto, and in particular its lidar technology, that are at the heart of Waymo’s case against Uber. Otto’s founder, Anthony Levandowski, allegedly had a draft contract for the purchase of the company before he even quit Google.

By spending its money earlier than others and mostly in-house, Google’s billion-dollar investment now looks relatively modest—almost a bargain. Waymo has, by far, the most sophisticated self-driving software. It has simulated over a billion miles of driving, and its cars have had the most self-driving experience on real streets (over 3 million miles in multiple cities).

The court case seems to suggest that Waymo has also built up an enviably solid platform of intellectual property. So, undesirable as this peek into its books might be for Google today, the company should pride itself on demonstrating that in-house R&D can still make a lot of financial sense.

spectrum.ieee.org



From: Glenn Petersen 9/17/2017 3:20:06 PM
   of 1472
 
This drone maker’s stock has doubled in the past year

By Sally French
MarketWatch
Published: Sept 12, 2017 9:28 a.m. ET

Courtesy AeroVironment, Inc.
The Snipe drone from AeroVironment.
_______________________________

One American drone maker’s stock is flying especially high this year.

Shares of Los Angeles-based AeroVironment Inc. (AVAV), which makes drones for military and enterprise customers, are up 81% this year and 102% in the past 12 months. The gains were boosted by a 9% rally after AeroVironment reported better-than-expected first-quarter earnings last month.

While Chinese drone maker DJI has crushed much of the competition for consumer drones, commercial drone makers are finding success, especially in military and enterprise applications.

AeroVironment, which makes drones both for reconnaissance and as lethal weapons, is increasingly turning to small drones, about the size of a water bottle.

“The demand for our small UAS products and solutions internationally is strong and continues to be strong,” AeroVironment Chief Executive Wahid Nawabi said during the company’s quarterly earnings call last week.

AeroVironment has delivered 30 of its Snipe drones to the U.S. Department of Defense. The 5-ounce, cell phone-sized camera drone has motors so quiet that it is difficult to detect, and is used for reconnaissance. AeroVironment sells its drones to 40 other countries, and this quarter signed contracts with the Australian military and an unnamed Middle East customer.

“We are impressed with continued international demand for small UAS,” analysts at Raymond James said in a note last week.

Courtesy AeroVironment
AeroVironment’s Snipe drone
______________________________________

Another drone maker focused on enterprise and defense, Israel-based Airobotics, announced on Thursday that it had nabbed $32.5 million in funding.

Meanwhile, China’s DJI has been the subject of controversy recently after a leaked memo indicated that the U.S. Army would discontinue using its drones due to “cyber vulnerabilities.” Some drone service providers have also said their clients will not allow them to use DJI products because of related security concerns. And while DJI has a massive market share in the consumer market, those security concerns could be an opportunity for other drone makers to leap into the commercial side of the market.

“Our small UAS solutions are designed to be highly secure and reliable,” AeroVironment’s Nawabi said, in reference to the DJI news.

The U.S. Army later clarified that it could continue using DJI drones if the software passes a security check. DJI also followed up on the news by releasing a new flight mode that stops internet traffic to and from the app that operates it, in order to provide enhanced data-privacy assurances for sensitive government and enterprise customers.

Nawabi indicated that recent rulings by the Federal Aviation Administration are holding back aspects of the commercial drone market. The FAA currently does not allow commercial drone pilots to fly at night, over people, or beyond visual line of sight without special permission.

So while a farmer might be able to fly a drone over their farmland, an electric utility company with thousands of miles of transmission lines would not be able to use a drone legally.

“The FAA will play, and is going to play, a critical role as an enabling factor for the adoption of this market,” Nawabi said.

There are more than 60,000 licensed commercial drone operators in the United States, according to the FAA. The FAA said in a statement Wednesday that it “is using a risk-based approach to enable increasingly more complex UAS operations,” and added that current restrictions on commercial drone use would likely be eased.

marketwatch.com



From: Glenn Petersen 9/19/2017 6:47:13 AM
1 Recommendation   of 1472
 
Swarm intelligence (SI) is the collective behavior of decentralized, self-organized systems, natural or artificial. The concept is employed in work on artificial intelligence. The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems. [1]

SI systems consist typically of a population of simple agents or boids interacting locally with one another and with their environment. The inspiration often comes from nature, especially biological systems. The agents follow very simple rules, and although there is no centralized control structure dictating how individual agents should behave, local, and to a certain degree random, interactions between such agents lead to the emergence of "intelligent" global behavior, unknown to the individual agents. Examples in natural systems of SI include ant colonies, bird flocking, animal herding, bacterial growth, fish schooling and microbial intelligence.

The application of swarm principles to robots is called swarm robotics, while 'swarm intelligence' refers to the more general set of algorithms. 'Swarm prediction' has been used in the context of forecasting problems.

en.wikipedia.org
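
To make the definition above concrete, here is a minimal boids-style sketch in Python (illustrative only, not taken from either source; the agent count, neighbour radius and rule weights are arbitrary assumptions). Each agent applies only three local rules, cohesion, separation and alignment with nearby neighbours, yet the group as a whole settles into coherent motion, which is the kind of emergent behaviour the definition describes.

# Minimal boids-style swarm sketch: each agent follows only local rules
# (cohesion, separation, alignment), with no central controller, yet
# coherent group motion emerges. Weights and counts are arbitrary assumptions.
import random

N_AGENTS, NEIGHBOR_RADIUS, STEPS = 30, 5.0, 100

class Agent:
    def __init__(self):
        self.x, self.y = random.uniform(0, 50), random.uniform(0, 50)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(agents):
    for a in agents:
        near = [b for b in agents if b is not a
                and (b.x - a.x) ** 2 + (b.y - a.y) ** 2 < NEIGHBOR_RADIUS ** 2]
        if not near:
            continue
        # Cohesion: steer gently toward the local centre of mass.
        a.vx += 0.01 * (sum(b.x for b in near) / len(near) - a.x)
        a.vy += 0.01 * (sum(b.y for b in near) / len(near) - a.y)
        # Separation: back away from neighbours that are too close.
        for b in near:
            if (b.x - a.x) ** 2 + (b.y - a.y) ** 2 < 1.0:
                a.vx -= 0.05 * (b.x - a.x)
                a.vy -= 0.05 * (b.y - a.y)
        # Alignment: match the average heading of neighbours.
        a.vx += 0.05 * (sum(b.vx for b in near) / len(near) - a.vx)
        a.vy += 0.05 * (sum(b.vy for b in near) / len(near) - a.vy)
    for a in agents:
        a.x += a.vx
        a.y += a.vy

agents = [Agent() for _ in range(N_AGENTS)]
for _ in range(STEPS):
    step(agents)
print("Mean position after %d steps: (%.1f, %.1f)"
      % (STEPS, sum(a.x for a in agents) / N_AGENTS,
         sum(a.y for a in agents) / N_AGENTS))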

Swarm intelligence key to successful operation of drones, says study

Sam Clark
The Stack
Tue 22 Aug 2017 5.27pm



Using swarm intelligence may be a way to combat many of the challenges faced by drone operators, both in business and by governments, according to a group of academics.

Researchers from British and French universities have written a paper noting how drones can now add value to businesses and government agencies thanks to their low cost and short mobilisation time – for example, checking the structural integrity of a building far more cheaply than with helicopters or high-rise cranes. They are also cheaper and quicker than helicopters for use by police services.

However, the study argues that drones rarely work effectively as individual units in these scenarios – they are best operated as part of a ‘fleet’, often with humans involved in a centralised control environment. This means ‘fleet participants’ need to cooperate with other drones in the same fleet. Similarly, if there are members of other fleets in the same airspace, they need to be able to communicate in order to operate safely.



The paper proposes different ways of managing drones. All decisions could be taken by a control centre – requiring real-time communication between drones in the air and the control centre on the ground. But there may be a number of situations where drones may need to behave autonomously, requiring a level of AI that a single drone is unlikely to have the computing capacity for.

Therefore, the authors suggest that ‘AI algorithms designed for the Swarm Intelligence paradigm can be applied.’ There are many challenges for a swarm of drones, and the paper identifies what it calls mission-critical operations – those that are time-sensitive and vital to the successful operation of the fleet.

When approaching these challenges, the authors note that it would be possible to create a set of pre-defined rules based on expected conditions and outcomes before the drones set off, but this does not account for the fact that drones are likely to encounter unexpected situations while in the air.

Instead, they propose using swarm intelligence so the drones can learn and adapt on the basis of the situation in which they find themselves.


Given that deciding the best way for a fleet of drones to operate presents considerable challenges, the paper also examines why a number of drones are preferable to a single drone in many business and emergency applications.

Firstly, there is better resiliency against failure – that is, if something fails on one drone, such as a temperature sensor, it would still be possible to get the same data from other drones.

Groups of drones can cover larger geographic locations, and carry out various, specialised tasks concurrently. They also form a network – meaning that if drone B is too far away from the control centre to communicate with it directly, but is close enough to drone A, it can effectively pass a message down the line.
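
As a hypothetical sketch of that relay behaviour (the coordinates, radio range and breadth-first routing below are illustrative assumptions, not details from the paper), a drone that cannot reach the control centre directly can still get its data back if a chain of in-range drones connects it:

# Illustrative multi-hop relay sketch: a drone outside the control centre's
# radio range can still reach it through a chain of in-range drones.
# Positions, range and routing method are assumptions for illustration.
from collections import deque

RADIO_RANGE = 10.0
positions = {              # (x, y) coordinates in arbitrary units
    "control": (0, 0),
    "drone_A": (8, 0),     # within range of the control centre
    "drone_B": (16, 0),    # out of the centre's range, but near drone_A
}

def in_range(a, b):
    (ax, ay), (bx, by) = positions[a], positions[b]
    return (ax - bx) ** 2 + (ay - by) ** 2 <= RADIO_RANGE ** 2

def relay_path(src, dst):
    """Breadth-first search for a chain of in-range hops from src to dst."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for node in positions:
            if node not in seen and in_range(path[-1], node):
                seen.add(node)
                queue.append(path + [node])
    return None

print(relay_path("drone_B", "control"))  # ['drone_B', 'drone_A', 'control']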

Based on this study’s advice, organisations employing UAVs may be best served by ‘self-aware, mission-focussed and independent fleets of drones.’

thestack.com



From: kidl 9/19/2017 9:01:48 AM
2 Recommendations   of 1472
 

3 Reasons Why the Drone Industry Is Hiring Thousands of New Workers


A billion dollars in venture capital this year and deregulation are fueling a hot job market.


For the first time in history, investments in drones--those buzzy airborne gadgets that used to be called UAVs--have crossed a billion dollars. $1.2 billion has already been invested in unmanned aerial tech so far in 2017, according to CB Insights. When I invested in a drone startup last year, there was only about $600 million in venture capital deployed, and drone pilots still had to be licensed like fixed-wing aircraft pilots.

"In the past several years, the drone space has crossed the threshold from being driven by hobbyists and experimentalists to being largely enterprise-dominated," says John Frankel, founder at FFVC, a New York venture capital firm and early drone investor. He invested in Top Flight Technologies in 2015, and later, Skycatch, an autonomous aerial mapping startup that helps international construction and mining companies like Komatsu automate the collection, processing, and analysis of aerial data."

The doubling of investment and FAA deregulation are now driving a mini job boom, fast creating a workforce as large as that of private school teachers in the U.S.--about 400,000.

Deregulation is driving the drone market. Seeing the need for more licensed drone operators, about this time last year the FAA created a new commercial drone pilot licensing program that requires no hands-on demonstration and onboards commercially licensed pilots fast. How fast? Plunk down $150, pass a 70-question multiple-choice test, and the license could be yours. In the first 3 months, 300 new drone operators were minted every business day. Out of the first 28,000 applicants, some 22,000 passed. Those numbers pale, however, in comparison to the number of commercial drones registered in the same period--2,000 a day.

The FAA's young drone pilot license program is also spawning sub-industries that create jobs. For example, the University of South Dakota just added drone operations to its academic curricula. Instructor Byron Noel recently told the Brookings Register, "It's useful in any field where an aerial perspective is useful."

"The drone industry is a great place to find a job," says Aerobo's Brian Streem, an Inc. 30 Under 30 honoree and the founder of that startup I invested in last year. His offices in Los Angeles and Brooklyn are hiring drone operators, production, and sales talent. Streem believes there are lots more changes to come in how users interact with drones. "A lot of people underestimate the complexities of actually pulling off a drone operation because of the 'unfun' stuff--charging batteries, performing flight maintenance, checking airspace. Automation only gets us so far. There is still manual work to do."

Many drone startup CEOs are pushing the FAA to ease back even further on regulation and open up the market. In a recent meeting with President Trump on emerging technology, Michael Chasen, founder of Blackboard and current CEO of PrecisionHawk of Raleigh, NC, lobbied Trump to relax regulations that are "limiting what drone technology can do," as reported by Recode.

Another big driver of the drone age is the promise of easy access to hard-to-reach locations. Whether you're Qualcomm and AT&T, which are piloting drones for cell tower inspection, or Amazon and Wal-Mart, which have both investigated airship warehouses, not every business location has a highway. Drones are making a significant business case for extending the eyes, ears and operations of a number of enterprises at a cost that's worth exploring. Amazon is even looking at an "Alexa in the sky."

Corporations are on pace to acquire a drone startup a month this year. Increasingly, drone jobs aren't just at startups--they're at big corporations. Already in 2017, Intel acquired MaVinci, Boeing bought Liquid Robotics, Verizon snapped up Skyward and Snap snipped Ctrl Me Robotics.

The FAA predicts the U.S. registered commercial drone fleet will climb to between 442,000 and 1.6 million in the next few years. Every few new drones create at least one job to maintain, deploy and operationalize the asset, so we're looking at a new workforce of a few hundred thousand created in just a few years.
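
As a rough sanity check on that estimate (a sketch only; reading "every few new drones" as one job per three drones is my assumption, not the author's figure):

# Back-of-the-envelope check on the jobs estimate above. The one-job-per-
# three-drones ratio is just one reading of "every few new drones".
fleet_low, fleet_high = 442_000, 1_600_000  # FAA commercial-fleet forecast
drones_per_job = 3                          # assumed ratio
print(f"Implied jobs: {fleet_low // drones_per_job:,} "
      f"to {fleet_high // drones_per_job:,}")
# -> roughly 147,000 to 533,000, i.e. a few hundred thousand new jobs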


The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
inc.com






From: Glenn Petersen 9/20/2017 7:19:33 AM
   of 1472
 
    The first autonomous drone delivery network will fly above Switzerland starting next month

    Like a futuristic postal service... that delivers blood

    by Thuy Ong @ThuyOng
    The Verge
    Sep 20, 2017, 6:00am EDT



    Photo: Matternet
    ________________________________

    Logistics company Matternet has announced a permanent autonomous drone network in Switzerland that will now see lab samples like blood tests and other diagnostics flown between hospital facilities, clinics, and labs. The first delivery network will be operational from next month, with several more to be introduced later in the year. Matternet says medical items can be delivered to hospitals within 30 minutes.

    Matternet, based in Menlo Park, California, was granted authorization to operate its drones over densely populated areas in Switzerland in March and says that approval was a world first. Today, the company unveiled the Matternet Station, a kind of white, futuristic-looking postbox with a footprint of about two square meters that can be installed on rooftops or on the ground to send and receive packages by drone.



    The drone network is part of a partnership with Swiss Post, and is significant because it’s the first operational drone network flying in dense urban areas that’s not a pilot run or in testing. Last month, Zipline announced plans to operate its blood-delivery drone service in Tanzania by early next year as well. A pair of hospitals in Lugano, Switzerland, had previously tested Matternet drone flights to deliver lab samples. Matternet plans to establish a regular service starting in early 2018.

    “These types of diagnostics that need to be transported are urgent in nature and they are on demand,” Andreas Raptopoulos, co-founder and CEO of Matternet, told The Verge. “They have to wait for a courier, sometimes they get taxis to do this type of thing — and when you have a system like this, that is autonomous and reliable, it completely transforms operations.”

    Users operate the system via an app to create shipment details. Items are placed into a compartment box in the station before being loaded into a drone for delivery. Currently the drones can hold up to 2kg (4.4 pounds). Packages are then flown to another Matternet station, where receivers can obtain their package by scanning a QR code.

    Photo: Matternet
    ________________________________

    After a drone lands, it also swaps out its battery so the drone remains charged. The maximum distance a drone can travel is 20 km (12.4 miles), depending on weather conditions like high winds, and the drones have cruising speeds of 70 kilometers per hour (43.5 miles per hour). Matternet says initially about one to two drones will operate per network. Each station features an “automated aerial deconfliction system” that manages drone traffic over the station.
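
    Taking the figures quoted above at face value, a quick back-of-the-envelope check (a sketch that ignores takeoff, landing, wind and battery-swap time) shows why even a maximum-range flight fits comfortably inside the 30-minute delivery window Matternet claims:

    # Rough flight-time check using the figures quoted above (ignores
    # takeoff, landing, wind and battery-swap overhead).
    MAX_RANGE_KM = 20.0      # maximum distance per flight
    CRUISE_SPEED_KMH = 70.0  # cruising speed

    flight_minutes = MAX_RANGE_KM / CRUISE_SPEED_KMH * 60
    print(f"Full-range flight time: ~{flight_minutes:.0f} minutes")  # ~17 minutes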

    Matternet also envisions that in the future the stations could be placed at grocery stores or gas stations for deliveries. Matternet says the next markets it wants to tackle are Germany and the UK, once it has a solid footing in Switzerland.

    theverge.com



From: kidl 9/21/2017 4:30:16 AM
   of 1472
 
Forget English and Maths, DRONE FLYING will be introduced as an HSC subject for students next year

Students in New South Wales will be able to take drone flying as an HSC subject
As of next year, the new course will allow students to learn the new technology
NSW is the first Australian state to introduce the subject after 12 months of trials

By Sam Duncan For Daily Mail Australia

Published: 12:34 BST, 9 September 2017



Read more: dailymail.co.uk



From: kidl 9/21/2017 4:53:01 AM
1 Recommendation   of 1472
 
Commercial drone use to expand tenfold by 2021

  • The FAA expects the U.S. commercial drone fleet to grow from 42K at the end of 2016 to about 442K aircraft by 2021 (but there could also be as many as 1.6M commercial drones in use by then).

  • The key difference in its growth estimates is in “how quickly the regulatory environment will evolve, enabling more widespread routine uses of (drones) for commercial purposes.”

    Since August, the FAA has approved more than 300 waivers exempting firms including Union Pacific (NYSE: UNP), BNSF Railway (BRK.A, BRK.B), Intel (NASDAQ: INTC), Walt Disney (NYSE: DIS) and Time Warner (NYSE: TWX) from some drone-use restrictions.

    dronevibes.com



    From: Glenn Petersen 9/22/2017 5:56:02 AM
       of 1472
     
    China's Baidu launches $1.5 billion autonomous driving fund

    Reuters Staff
    September 21, 2017 / 1:51 AM



    FILE PHOTO: A woman is silhouetted against the Baidu logo at a new product launch from Baidu, in Shanghai, China, November 26, 2015. REUTERS/Aly Song/File Photo
    _________________________________________

    BEIJING (Reuters) - Chinese search engine Baidu Inc ( BIDU.O) announced a 10 billion yuan ($1.52 billion) autonomous driving fund on Thursday as part of a wider plan to speed up its technical development and compete with U.S. rivals.

    The “Apollo Fund” will invest in 100 autonomous driving projects over the next three years, Baidu said in a statement.

    The fund’s launch coincides with the release of Apollo 1.5, the second generation of the company’s open-source autonomous vehicle software.

    After years of internal development, Baidu in April decided to open its autonomous driving technology to third parties, a move it hopes will accelerate development and help it compete with U.S. firms Tesla Inc ( TSLA.O) and Google project Waymo.

    In the latest update to its platform, Baidu says partners can access new obstacle perception technology and high-definition maps, among other features.



    FILE PHOTO: A employee uses his mobile phone as he walks past the company logo of Baidu at its headquarters in Beijing, August 5, 2010. REUTERS/Barry Huang/File Photo
    __________________________________

    It comes amid a wider reshuffle of Baidu’s corporate strategy as it looks for new profit streams outside its core search business, which lost a large chunk of ad revenue in 2016 following strict new government regulations on medical advertising.

    Baidu’s Apollo project - named after the NASA moon landing - aims to create technology for completely autonomous cars, which it says will be ready for city roads in China by 2020.

    It now has 70 partners across several fields in the auto industry, up from 50 in July, it says. Existing partners include microprocessors firm Nvidia Corp ( NVDA.O) and mapping service TomTom NV.

    Despite the rapid growth of its partner ecosystem, Baidu has faced challenges negotiating local Chinese regulations, which have previously stopped the company from testing on highways.

    In July, local Beijing police said they were investigating whether the company had broken city traffic rules by testing a driverless car on public roads as part of a demonstration for a press event.

    Reporting by Cate Cadell; Editing by Stephen Coates

    reuters.com



    From: Glenn Petersen 9/25/2017 5:47:13 AM
       of 1472
     
    Finally, a Driverless Car with Some Common Sense

    A startup called iSee thinks a new approach to AI will make self-driving cars better at dealing with unexpected situations.

    by Will Knight
    MIT Technology Review
    September 20, 2017

    Boston’s notoriously unfriendly drivers and chaotic roads may be the perfect testing ground for a fundamentally different kind of self-driving car.

    An MIT spin-off called iSee is developing and testing an autonomous driving system using a novel approach to artificial intelligence. Instead of relying on simple rules or machine-learning algorithms to train cars to drive, the startup is taking inspiration from cognitive science to give machines a kind of common sense and the ability to quickly deal with new situations. It is developing algorithms that try to match the way humans understand and learn about the physical world, including interacting with other people. The approach could lead to self-driving vehicles that are much better equipped to deal with unfamiliar scenes and complex interactions on the road.

    “The human mind is super-sensitive to physics and social cues,” says Yibiao Zhao, cofounder of iSee. “Current AI is relatively limited in those domains, and we think that is actually the missing piece in driving.”

    Zhao’s company doesn’t look like a world beater just yet. A small team of engineers works out of a modest lab space at the Engine, a new investment company created by MIT to fund innovative local tech companies. Located just a short walk from the MIT campus, the Engine overlooks a street on which drivers jostle for parking spots and edge aggressively into traffic.


    The desks inside iSee’s space are covered with sensors and pieces of hardware the team has put together to take control of its first prototype, a Lexus hybrid SUV that originally belonged to one of the company’s cofounders. Several engineers sit behind large computer monitors staring intently at lines of code.

    iSee might seem laughably small compared to the driverless-car efforts at companies like Waymo, Uber, or Ford, but the technology it’s developing could have a big impact on many areas where AI is applied today. By enabling machines to learn from less data, and to build some form of common sense, its technology could make industrial robots smarter, especially in new situations. Spectacular progress has already been made in AI recently thanks to deep learning, a technique that employs vast data-hungry neural networks (see “10 Breakthrough Technologies 2013: Deep Learning”).

    When fed large amounts of data, very large or deep neural networks can recognize subtle patterns. Give a deep neural network lots of pictures of dogs, for instance, and it will figure out how to spot a dog in just about any image. But there are limits to what deep learning can do, and some radical new ideas may well be needed to bring about the next leap forward. For example, a dog-spotting deep-learning system doesn’t understand that dogs typically have four legs, fur, and a wet nose. And it cannot recognize other types of animals, or a drawing of a dog, without further training.

    Driving involves considerably more than just pattern recognition. Human drivers rely constantly on a commonsense understanding of the world. They know that buses take longer to stop, for example, and can suddenly produce lots of pedestrians. It would be impossible to program a self-driving car with every possible scenario it might encounter. But people are able to use their commonsense understanding of the world, built up through lifelong experience, to act sensibly in all sorts of new situations.

    “Deep learning is great, and you can learn a lot from previous experience, but you can’t have a data set that includes the whole world,” Zhao says. “Current AI, which is mostly data-driven, has difficulties understanding common sense; that’s the key thing that’s missing.” Zhao illustrates the point by opening his laptop to show several real-world road situations on YouTube, including complex traffic-merging situations and some hairy-looking accidents.

    A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway (see “Fatal Tesla Crash Is a Reminder Autonomous Cars Will Sometimes Screw Up”). A human driver would have likely quickly and safely figured out what was going on.


    Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says.

    iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this. Zhao and other founders of iSee come from the lab of Josh Tenenbaum, a professor in the department of brain and cognitive science at MIT who now serves as an advisor to the company.

    Tenenbaum specializes in exploring how human intelligence works, and using that insight to engineer novel types of AI systems. This includes work on the intuitive sense of physics exhibited even by young children, for instance. Children’s ability to understand how the physical world behaves enables them to predict how unfamiliar situations may unfold. And, Tenenbaum explains, this understanding of the physical world is intimately connected with an intuitive understanding of psychology and the ability to infer what a person is trying to achieve, such as reaching for a cup, by watching his or her actions.

    The ability to transfer learning between situations is also a hallmark of human intelligence, and even the smartest machine-learning systems are still very limited by comparison. Tenenbaum’s lab combines conventional machine learning with novel “probabilistic programming” approaches. This makes it possible for machines to learn to infer things about the physics of the world as well as the intentions of others despite uncertainty.
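
    As a toy illustration of that probabilistic flavour (entirely hypothetical; the "merge versus stay" framing, priors and likelihoods below are assumptions, not iSee's or Tenenbaum's actual models), a car could maintain a belief about another driver's intent and update it as evidence, such as the car edging toward the lane boundary, comes in:

    # Toy Bayesian update over another driver's intent ("merge" vs "stay"),
    # given an observation such as the car edging toward the lane boundary.
    # Priors and likelihoods are arbitrary illustrative assumptions.
    prior = {"merge": 0.3, "stay": 0.7}

    # P(observation | intent): edging out is far more likely if the driver
    # actually intends to merge.
    likelihood_edging = {"merge": 0.8, "stay": 0.1}

    def update(prior, likelihood):
        unnormalized = {h: prior[h] * likelihood[h] for h in prior}
        total = sum(unnormalized.values())
        return {h: p / total for h, p in unnormalized.items()}

    posterior = update(prior, likelihood_edging)
    print({h: round(p, 2) for h, p in posterior.items()})
    # -> {'merge': 0.77, 'stay': 0.23}: edging shifts belief toward "merge"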


    Trying to reverse-engineer the ways in which even a young baby is smarter than the cleverest existing AI system could eventually lead to many smarter AI systems, Tenenbaum says. In 2015, together with researchers from New York University and Carnegie Mellon University, Tenenbaum used some of these ideas to develop a landmark computer program capable of learning to recognize handwriting from just a few examples (see “This AI Algorithm Learns Simple Tasks As Fast As We Do”).

    A related approach might eventually give a self-driving car something approaching a rudimentary form of common sense in unfamiliar scenarios. Such a car may be able to determine that a driver who’s edging out into the road probably wants to merge into traffic.

    When it comes to autonomous driving, in fact, Tenenbaum says the ability to infer what another driver is trying to achieve could be especially important. Another of iSee’s cofounders, Chris Baker, developed computational models of human psychology while at MIT. “Taking engineering-style models of how humans understand other humans, and being able to put those into autonomous driving, could really provide a missing piece of the puzzle,” Tenenbaum says.

    Tenenbaum says he was not initially interested in applying ideas from cognitive psychology to autonomous driving, but the founders of iSee convinced him that the impact would be significant, and that they were up to the engineering challenges.

    “This is a very different approach, and I completely applaud it,” says Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, a research institute created by Microsoft cofounder Paul Allen to explore new ideas in AI, including ones inspired by cognitive psychology.

    Etzioni says the field of AI needs to explore ideas beyond deep learning. He says the main issue for iSee will be demonstrating that the techniques employed can perform well in critical situations. “Probabilistic programming is pretty new,” he notes, “so there are questions about the performance and robustness.”

    Those involved with iSee would seem to agree. Besides aiming to shake up the car industry and perhaps reshape transportation in the process, Tenenbaum says, iSee has a chance to explore how a new AI approach works in a particularly unforgiving practical situation.

    “In some sense, self-driving cars are going to be the first autonomous robots that interact with people in the real world,” he says. “The real challenge is, how do you take these models and make them work robustly?”

    technologyreview.com
