Technology Stocks: Drones, Autonomous Vehicles and Flying Cars


From: Glenn Petersen  9/20/2017 7:19:33 AM

    The first autonomous drone delivery network will fly above Switzerland starting next month

    Like a futuristic postal service... that delivers blood

    by Thuy Ong @ThuyOng
    The Verge
    Sep 20, 2017, 6:00am EDT



    Photo: Matternet
    ________________________________

    Logistics company Matternet has announced a permanent autonomous drone network in Switzerland that will now see lab samples like blood tests and other diagnostics flown between hospital facilities, clinics, and labs. The first delivery network will be operational from next month, with several more to be introduced later in the year. Matternet says medical items can be delivered to hospitals within 30 minutes.

    Matternet, based in Menlo Park, California, was granted authorization to operate its drones over densely populated areas in Switzerland in March and says that approval was a world first. Today, the company unveiled the Matternet Station: a kind of white, futuristic-looking postbox with a footprint of about two square meters that can be installed on rooftops or on the ground to send and receive packages by drone.



    The drone network is part of a partnership with Swiss Post, and is significant because it’s the first operational drone network flying in dense urban areas that’s not a pilot run or in testing. Last month, Zipline announced plans to operate its blood-delivery service by drone in Tanzania by early next year as well. A pair of hospitals in Lugano, Switzerland, had previously tested Matternet drone flights to deliver lab samples. Matternet plans to establish a regular service starting in early 2018.

    “These types of diagnostics that need to be transported are urgent in nature and they are on demand,” Andreas Raptopoulos, co-founder and CEO of Matternet, told The Verge. “They have to wait for a courier, sometimes they get taxis to do this type of thing — and when you have a system like this, that is autonomous and reliable, it completely transforms operations.”

    Users operate the system via an app to create shipment details. Items are placed into a compartment box in the station before being loaded into a drone for delivery. Currently the drones can carry up to 2 kg (4.4 pounds). Packages are then flown to another Matternet station, where receivers can obtain their package by scanning a QR code.

    Photo: Matternet
    ________________________________

    After a drone lands, it swaps out its battery so it remains charged. The maximum distance a drone can travel is 20 km (12.4 miles), depending on weather conditions like high winds, and the drones have a cruising speed of 70 kilometers per hour (43.5 miles per hour). Matternet says that initially about one or two drones will operate per network. Each station features an “automated aerial deconfliction system” that manages drone traffic over the station.
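    A quick back-of-the-envelope check on those figures (illustrative only, not Matternet's published math): at a 70 km/h cruise, even the maximum 20 km leg works out to roughly 17 minutes in the air, which is consistent with the 30-minute delivery claim once loading and the battery swap are factored in.

```python
# Back-of-the-envelope flight-time check; figures taken from the article,
# the calculation itself is illustrative, not Matternet's.
MAX_RANGE_KM = 20.0   # maximum leg length
CRUISE_KMH = 70.0     # cruising speed

flight_minutes = MAX_RANGE_KM / CRUISE_KMH * 60.0
print(f"Longest leg takes about {flight_minutes:.0f} minutes in the air")
# -> roughly 17 minutes, leaving margin within the quoted 30-minute delivery window
```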

    Matternet also envisions that in the future the stations could be placed at grocery stores or gas stations for deliveries. Matternet says the next markets it wants to tackle are Germany and the UK, once it has a solid footing in Switzerland.

    theverge.com



From: kidl  9/21/2017 4:30:16 AM

Forget English and Maths, DRONE FLYING will be introduced as an HSC subject for students next year

Students in New South Wales will be able to take drone flying as an HSC subject
As of next year, the new course will allow students to learn the new technology
NSW is the first Australian state to introduce the subject after 12 months of trials

By Sam Duncan For Daily Mail Australia

Published: 12:34 BST, 9 September 2017



Read more: dailymail.co.uk



From: kidl  9/21/2017 4:53:01 AM

Commercial drone use to expand tenfold by 2021

  • The FAA expects the U.S. commercial drone fleet to grow from 42K at the end of 2016 to about 442K aircraft by 2021 (but there could also be as many as 1.6M commercial drones in use by then).

  • The key difference in its growth estimates is in “how quickly the regulatory environment will evolve, enabling more widespread routine uses of (drones) for commercial purposes.”

    Since August, the FAA has approved more than 300 waivers that exempt drone operations from certain restrictions for firms including Union Pacific (NYSE: UNP), BNSF Railway (BRK.A, BRK.B), Intel (NASDAQ: INTC), Walt Disney (NYSE: DIS) and Time Warner (NYSE: TWX).

    dronevibes.com



    From: Glenn Petersen  9/22/2017 5:56:02 AM

    China's Baidu launches $1.5 billion autonomous driving fund

    Reuters Staff
    September 21, 2017 / 1:51 AM



    FILE PHOTO: A woman is silhouetted against the Baidu logo at a new product launch from Baidu, in Shanghai, China, November 26, 2015. REUTERS/Aly Song/File Photo
    _________________________________________

    BEIJING (Reuters) - Chinese search engine Baidu Inc (BIDU.O) announced a 10 billion yuan ($1.52 billion) autonomous driving fund on Thursday as part of a wider plan to speed up its technical development and compete with U.S. rivals.

    The “Apollo Fund” will invest in 100 autonomous driving projects over the next three years, Baidu said in a statement.

    The fund’s launch coincides with the release of Apollo 1.5, the second generation of the company’s open-source autonomous vehicle software.

    After years of internal development, Baidu in April decided to open its autonomous driving technology to third parties, a move it hopes will accelerate development and help it compete with U.S. firms Tesla Inc (TSLA.O) and Google project Waymo.

    In the latest update to its platform, Baidu says partners can access new obstacle perception technology and high-definition maps, among other features.



    FILE PHOTO: A employee uses his mobile phone as he walks past the company logo of Baidu at its headquarters in Beijing, August 5, 2010. REUTERS/Barry Huang/File Photo
    __________________________________

    It comes amid a wider reshuffle of Baidu’s corporate strategy as it looks for new profit streams outside its core search business, which lost a large chunk of ad revenue in 2016 following strict new government regulations on medical advertising.

    Baidu’s Apollo project - named after the NASA moon landing - aims to create technology for completely autonomous cars, which it says will be ready for city roads in China by 2020.

    It now has 70 partners across several fields in the auto industry, up from 50 in July, it says. Existing partners include microprocessor firm Nvidia Corp (NVDA.O) and mapping service TomTom NV.

    Despite the rapid growth of its partner ecosystem, Baidu has faced challenges negotiating local Chinese regulations, which have previously stopped the company from testing on highways.

    In July, local Beijing police said they were investigating whether the company had broken city traffic rules by testing a driverless car on public roads as part of a demonstration for a press event.

    Reporting by Cate Cadell; Editing by Stephen Coates

    reuters.com



    From: Glenn Petersen  9/25/2017 5:47:13 AM

    Finally, a Driverless Car with Some Common Sense

    A startup called iSee thinks a new approach to AI will make self-driving cars better at dealing with unexpected situations.

    by Will Knight
    MIT Technology Review
    September 20, 2017

    Boston’s notoriously unfriendly drivers and chaotic roads may be the perfect testing ground for a fundamentally different kind of self-driving car.

    An MIT spin-off called iSee is developing and testing the autonomous driving system using a novel approach to artificial intelligence. Instead of relying on simple rules or machine-learning algorithms to train cars to drive, the startup is taking inspiration from cognitive science to give machines a kind of common sense and the ability to quickly deal with new situations. It is developing algorithms that try to match the way humans understand and learn about the physical world, including interacting with other people. The approach could lead to self-driving vehicles that are much better equipped to deal with unfamiliar scenes and complex interactions on the road.

    “The human mind is super-sensitive to physics and social cues,” says Yibiao Zhao, cofounder of iSee. “Current AI is relatively limited in those domains, and we think that is actually the missing piece in driving.”

    Zhao’s company doesn’t look like a world beater just yet. A small team of engineers works out of a modest lab space at the Engine, a new investment company created by MIT to fund innovative local tech companies. Located just a short walk from the MIT campus, the Engine overlooks a street on which drivers jostle for parking spots and edge aggressively into traffic.


    The desks inside iSee’s space are covered with sensors and pieces of hardware the team has put together to take control of its first prototype, a Lexus hybrid SUV that originally belonged to one of the company’s cofounders. Several engineers sit behind large computer monitors staring intently at lines of code.

    iSee might seem laughably small compared to the driverless-car efforts at companies like Waymo, Uber, or Ford, but the technology it’s developing could have a big impact on many areas where AI is applied today. By enabling machines to learn from less data and to build some form of common sense, its technology could make industrial robots smarter, especially in new situations. Spectacular progress has already been made in AI recently thanks to deep learning, a technique that employs vast data-hungry neural networks (see “10 Breakthrough Technologies 2013: Deep Learning”).

    When fed large amounts of data, very large or deep neural networks can recognize subtle patterns. Give a deep neural network lots of pictures of dogs, for instance, and it will figure out how to spot a dog in just about any image. But there are limits to what deep learning can do, and some radical new ideas may well be needed to bring about the next leap forward. For example, a dog-spotting deep-learning system doesn’t understand that dogs typically have four legs, fur, and a wet nose. And it cannot recognize other types of animals, or a drawing of a dog, without further training.
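    To make the dog-spotting example concrete, here is a minimal sketch of the kind of purely data-driven classifier the article is describing: a small convolutional network trained on a folder of labeled photos. The folder layout, class names, and training settings are hypothetical; this is not iSee's system or any production model.

```python
# Minimal sketch of a data-driven image classifier of the sort described above.
# Hypothetical dataset layout: photos/dog/*.jpg and photos/not_dog/*.jpg.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor()])
train_set = datasets.ImageFolder("photos", transform=transform)  # hypothetical folder
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = nn.Sequential(                         # small CNN: learns pixel patterns only
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 16 * 16, 2),  # two classes: dog / not dog
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
# The trained model "knows" nothing beyond these pixel statistics: no legs, fur,
# or wet noses, and no notion that a line drawing of a dog is still a dog.
```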

    Driving involves considerably more than just pattern recognition. Human drivers rely constantly on a commonsense understanding of the world. They know that buses take longer to stop, for example, and can suddenly produce lots of pedestrians. It would be impossible to program a self-driving car with every possible scenario it might encounter. But people are able to use their commonsense understanding of the world, built up through lifelong experience, to act sensibly in all sorts of new situations.

    “Deep learning is great, and you can learn a lot from previous experience, but you can’t have a data set that includes the whole world,” Zhao says. “Current AI, which is mostly data-driven, has difficulties understanding common sense; that’s the key thing that’s missing.” Zhao illustrates the point by opening his laptop to show several real-world road situations on YouTube, including complex traffic-merging situations and some hairy-looking accidents.

    A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway (see “ Fatal Tesla Crash Is a Reminder Autonomous Cars Will Sometimes Screw Up”). A human driver would have likely quickly and safely figured out what was going on.


    Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says.

    iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this. Zhao and other founders of iSee come from the lab of Josh Tenenbaum, a professor in the Department of Brain and Cognitive Sciences at MIT who now serves as an advisor to the company.

    Tenenbaum specializes in exploring how human intelligence works, and using that insight to engineer novel types of AI systems. This includes work on the intuitive sense of physics exhibited even by young children, for instance. Children’s ability to understand how the physical world behaves enables them to predict how unfamiliar situations may unfold. And, Tenenbaum explains, this understanding of the physical world is intimately connected with an intuitive understanding of psychology and the ability to infer what a person is trying to achieve, such as reaching for a cup, by watching his or her actions.

    The ability to transfer learning between situations is also a hallmark of human intelligence, and even the smartest machine-learning systems are still very limited by comparison. Tenenbaum’s lab combines conventional machine learning with novel “probabilistic programming” approaches. This makes it possible for machines to learn to infer things about the physics of the world as well as the intentions of others despite uncertainty.
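    As a toy illustration of what “inferring the intentions of others despite uncertainty” can look like, the sketch below runs a single Bayes update over a handful of hypothetical driver goals given one observed maneuver. The goal labels, priors, and likelihoods are invented for illustration; the probabilistic programs built in Tenenbaum's lab and at iSee are far richer than this.

```python
# Toy Bayesian intent inference: which goal best explains an observed maneuver?
# All numbers below are illustrative assumptions, not iSee's model.

priors = {"merge_into_traffic": 0.3, "wait_at_curb": 0.6, "reverse_out": 0.1}

# P(observation | goal): how likely is "edging forward into the lane" under each goal?
likelihood_edging_forward = {
    "merge_into_traffic": 0.8,
    "wait_at_curb": 0.1,
    "reverse_out": 0.05,
}

def posterior(priors, likelihood):
    """Bayes' rule: P(goal | obs) is proportional to P(obs | goal) * P(goal)."""
    unnormalized = {g: priors[g] * likelihood[g] for g in priors}
    total = sum(unnormalized.values())
    return {g: p / total for g, p in unnormalized.items()}

print(posterior(priors, likelihood_edging_forward))
# -> merging dominates (~0.79), so a planner might yield or leave a gap.
```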


    Trying to reverse-engineer the ways in which even a young baby is smarter than the cleverest existing AI system could eventually lead to many smarter AI systems, Tenenbaum says. In 2015, together with researchers from New York University and Carnegie Mellon University, Tenenbaum used some of these ideas to develop a landmark computer program capable of learning to recognize handwriting from just a few examples (see “ This AI Algorithm Learns Simple Tasks As Fast As We Do”).

    A related approach might eventually give a self-driving car something approaching a rudimentary form of common sense in unfamiliar scenarios. Such a car may be able to determine that a driver who’s edging out into the road probably wants to merge into traffic.

    When it comes to autonomous driving, in fact, Tenenbaum says the ability to infer what another driver is trying to achieve could be especially important. Another of iSee’s cofounders, Chris Baker, developed computational models of human psychology while at MIT. “Taking engineering-style models of how humans understand other humans, and being able to put those into autonomous driving, could really provide a missing piece of the puzzle,” Tenenbaum says.

    Tenenbaum says he was not initially interested in applying ideas from cognitive psychology to autonomous driving, but the founders of iSee convinced him that the impact would be significant, and that they were up to the engineering challenges.

    “This is a very different approach, and I completely applaud it,” says Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, a research institute created by Microsoft cofounder Paul Allen to explore new ideas in AI, including ones inspired by cognitive psychology.

    Etzioni says the field of AI needs to explore ideas beyond deep learning. He says the main issue for iSee will be demonstrating that the techniques employed can perform well in critical situations. “Probabilistic programming is pretty new,” he notes, “so there are questions about the performance and robustness.”

    Those involved with iSee would seem to agree. Besides aiming to shake up the car industry and perhaps reshape transportation in the process, Tenenbaum says, iSee has a chance to explore how a new AI approach works in a particularly unforgiving practical situation.

    “In some sense, self-driving cars are going to be the first autonomous robots that interact with people in the real world,” he says. “The real challenge is, how do you take these models and make them work robustly?”

    technologyreview.com



    From: Glenn Petersen  9/25/2017 7:05:20 AM

    Former Apple Engineers Working on New Eyes for Driverless Cars

    By CADE METZ
    New York Times
    SEPT. 20, 2017



    Soroush Salehian and Mina Rezk, of Aeva, a Silicon Valley start-up making new guidance systems for driverless vehicles. Credit Jason Henry for The New York Times
    _________________________________

    PALO ALTO, Calif. — Soroush Salehian raised both arms and spun in circles as if celebrating a touchdown.

    Across the room, perched on a tripod, a small black device monitored this little dance and streamed it to a nearby laptop. Mr. Salehian appeared as a collection of tiny colored dots, some red, some blue, some green. Each dot showed the precise distance to a particular point on his body, while the colors showed the speed of his movements. As his right arm spun forward, it turned blue. His left arm, spinning away, turned red.

    “See how the arms are different?” said his business partner, Mina Rezk, pointing at the laptop. “It’s measuring different velocities.”

    Messrs. Salehian and Rezk are the founders of a new Silicon Valley start-up called Aeva, and their small black device is designed for self-driving cars. The veterans of Apple’s secretive Special Projects Group aim to give these autonomous vehicles a more complete, detailed and reliable view of the world around them — something that is essential to their evolution.



    A sign warns that a laser is in use at Aeva. Credit Jason Henry for The New York Times
    _______________________

    Today’s driverless cars under development at companies like General Motors, Toyota, Uber and the Google spinoff Waymo track their surroundings using a wide variety of sensors, including cameras, radar, GPS antennas and lidar (short for “light detection and ranging”) devices that measure distances using pulses of light.

    But there are gaps in the way these sensors operate, and combining their disparate streams of data is difficult. Aeva’s prototype — a breed of lidar that measures distances more accurately and also captures speed — aims to fill several of these sizable holes.

    “I don’t even think of this as a new kind of lidar,” said Tarin Ziyaee, co-founder and chief technology officer at the self-driving taxi start-up Voyage, who has seen the Aeva prototype. “It’s a whole different animal.”

    Founded in January and funded by the Silicon Valley venture capital firm Lux Capital, among others, Aeva joins a widespread effort to build more effective sensors for autonomous vehicles, a trend that extends from start-ups like Luminar, Echodyne and Metawave to established hardware makers like the German multinational Robert Bosch.

    The company’s name, Aeva, is a play on “Eve,” the name of the robot in the Pixar movie “WALL-E.”



    Mr. Rezk in Palo Alto. He and his business partner Mr. Salehian are veterans of the secretive Special Projects Group at Apple, which they left late in 2016. Credit Jason Henry for The New York Times
    _____________________________________

    The market for autonomous vehicles will grow to $42 billion by 2025, according to research by the Boston Consulting Group. But for that to happen, the vehicles will need new and more powerful sensors. Today’s autonomous cars are ill prepared for high-speed driving, bad weather and other common situations.

    The recent improvements in self-driving cars coincided with the improvements offered by new lidar sensors from a Silicon Valley company called Velodyne. These sensors gave cars a way of measuring distances to nearby vehicles, pedestrians and other objects. They also provided Google and other companies with a way of mapping urban roadways in three dimensions, so that cars will know exactly where they are at any given moment — something GPS cannot always provide.

    But these lidar sensors have additional shortcomings. They can gather information only about objects that are relatively close to them, which limits how fast the cars can travel. Their measurements aren’t always detailed enough to distinguish one object from another. And when multiple driverless cars are close together, their signals can become garbled.

    Other devices can pick up some of the slack. Cameras are a better way of identifying pedestrians and street signs, for example, and radar works over longer distances. That’s why today’s self-driving cars track their surroundings through so many different sensors. But despite this wide array of hardware — which can cost hundreds of thousands of dollars per vehicle — even the best autonomous vehicles still have trouble in so many situations that humans can navigate with ease.

    With their new sensor, Messrs. Salehian and Rezk are working to change that. Mr. Rezk is an engineer who designed optical hardware for Nikon and, presumably, was among those who handled optical sensors for Apple’s driverless car project, though he and Mr. Salehian declined to say which “special project” they worked on at the company. They left Apple late last year.



    New devices are hidden under a black sheet in a research and development room at Aeva in Palo Alto, Calif. Credit Jason Henry for The New York Times
    ___________________________

    Where current lidar sensors send out individual pulses, Aeva’s device sends out a continuous wave of light. By reading the way this far more complex signal bounces off surrounding objects, Mr. Rezk said, the device can capture a far more detailed image while also tracking velocity. You can think of it as a cross between lidar, which is so good at measuring depth, and radar, which is so good at measuring speed.

    Mr. Rezk also said the device’s continuous wave would provide greater range and resolution than existing lidar devices, deal better with weather and highly reflective objects like bridge railings, and avoid interference with other optical sensors.
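    The range-plus-velocity claim is easiest to see with the standard frequency-modulated continuous-wave (FMCW) relations: the average of the up-chirp and down-chirp beat frequencies encodes distance, and their difference encodes the Doppler shift, hence radial speed. The sketch below uses those textbook formulas with made-up parameters; Aeva has not disclosed its actual design.

```python
# Textbook FMCW range/velocity recovery from up- and down-chirp beat frequencies.
# All parameters are invented for illustration; this is not Aeva's design.
C = 3.0e8                 # speed of light, m/s
WAVELENGTH = 1550e-9      # assumed telecom-band laser wavelength, m
BANDWIDTH = 1.0e9         # assumed chirp bandwidth, Hz
CHIRP_TIME = 10e-6        # assumed chirp duration, s

def range_and_velocity(f_beat_up, f_beat_down):
    """Up/down chirps shift the beat frequency in opposite directions for a moving
    target: the average encodes range, half the difference encodes Doppler."""
    f_range = (f_beat_up + f_beat_down) / 2.0
    f_doppler = (f_beat_down - f_beat_up) / 2.0
    rng = C * CHIRP_TIME * f_range / (2.0 * BANDWIDTH)
    vel = WAVELENGTH * f_doppler / 2.0      # positive = target approaching
    return rng, vel

# Example: beat frequencies of 7.42 MHz (up-chirp) and 12.58 MHz (down-chirp)
print(range_and_velocity(7.42e6, 12.58e6))
# -> about 15 m of range and ~2 m/s of closing speed
```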

    Cars will continue to use multiple kinds of sensors, in part because redundancy helps ensure that these cars are safe. But Aeva aims to give these cars a better view of the world from a smaller and less expensive set of sensors.

    Researchers at the University of California, Berkeley, have built similar hardware, and companies like Velodyne and the start-ups Oryx Vision and Quanergy say they are exploring similar ideas. Like these efforts, the Aeva prototype is still under development, and the company plans to sell devices next year. But it shows how autonomous car sensors need to evolve — and that they are indeed evolving.

    Ultimately, new sensors will allow cars to make better decisions. “With autonomous cars, 90 percent of the time, you are trying to infer what is happening,” Mr. Ziyaee said. “But what if you can just measure it?”

    Follow Cade Metz on Twitter: @CadeMetz

    A version of this article appears in print on September 21, 2017, on Page B2 of the New York edition with the headline: Seeking Keener Vision for Driverless Cars.



    From: Savant  9/26/2017 4:14:14 PM

    Drones in archaeology...
    60's drone photography found Alexander the Great's lost city

    usatoday.com



    To: Savant who wrote (1439)  9/26/2017 7:18:31 PM
    From: Savant

    Turns out a more thorough article sez>>

    Spy satellite footage in '60s, and drone use since then and now

    dailymail.co.uk

    a much better article



    From: Glenn Petersen  9/28/2017 11:07:39 PM

    A Field Farmed Only by Drones

    By Nicola Twilley
    The New Yorker
    September 28, 2017



    The experience of the Hands Free Hectare team suggests that drone agriculture offers some substantial benefits.
    Photograph by Debbie Heeks / Courtesy Harper Adams University
    _____________________

    Across the United Kingdom, the last of the spring barley has been brought in from the fields, the culmination of an agricultural calendar whose rhythm has remained unchanged for millennia. But when the nineteenth-century poet John Clare wrote, in his month-by-month description of the rural year, that in September “harvest’s busy hum declines,” it seems unlikely that he was imagining the particular buzz—akin to an amplified mosquito—of a drone.

    “The drone barley snatch was actually the thing that made it for me,” Jonathan Gill, a robotics engineer at Harper Adams University, told me recently. Gill is one of three self-described “lads” behind a small, underfunded initiative called Hands Free Hectare. Earlier this month, he and his associates became the first people in the world to grow, tend, and harvest a crop without direct human intervention. The “snatch” occurred on a blustery Tuesday, when Gill piloted his heavy-duty octocopter out over the middle of a field, and, as the barley whipped from side to side in the propellers’ downdraft, used a clamshell dangling from the drone to take a grain sample, which would determine whether the crop was ready for harvesting. (It was.) “Essentially, it’s the grab-the-teddy-with-the-claw game on steroids,” Gill’s colleague, the agricultural engineer Kit Franklin, said. “But it had never been done before. And we did it.”

    The idea for the project came about over a glass of barley’s best self: beer. Gill and Franklin were down the pub, lamenting the fact that, although big equipment manufacturers such as John Deere likely have all the technology they need to farm completely autonomously, none of them seem to actually be doing it. Gill knew that drones could be programmed, using open-source code, to move over a field on autopilot, changing altitude as needed. What if you could take the same software, he and Franklin wondered, and make it control off-the-shelf agricultural machinery? Together Gill, Franklin, and Martin Abell, a recent Harper Adams graduate, rustled up just over a quarter million dollars in grant money. Then they got hold of some basic equipment—a small Japanese tractor designed for use in rice paddies, a similarly undersized twenty-five-year-old combine harvester, a sprayer boom, and a seed drill—and connected the drone software to a series of motors, which, with a little tinkering, made it capable of turning the tractor’s steering wheel, switching the spray nozzles on and off, raising and lowering the drill, and choreographing the complex mechanized ballet of the combine.

    “There were lots of people who thought the project wasn’t going to work,” Gill said. “Lots.” Hands Free Hectare’s budget was so small that the team had no test field or backup machinery; indeed, they didn’t secure the tractor until last December, just a few months before the barley was due to be sown. This left little time for the necessary trial and error; often, Gill and his colleagues would be midway through configuring their setup to perform one task, only to have to take it apart again to accomplish another. When they finally managed to get the crop in the ground, their rows looked wobbly. “It turns out that the autopilots in these drone systems weren’t designed to travel in a very straight line,” Gill said. “There’s no need for it—it’s just designed to get from point A to point B as efficiently as possible.” When the software hit a rock, it would navigate around the obstruction, following the path of least resistance rather than plowing through. Gill adjusted the code for straighter steering, regardless of the terrain, but not in time for the initial planting, which meant that subsequent tractor runs crushed hundreds of precious barley seedlings.
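    Gill doesn't publish the patch, but "adjusting the code for straighter steering" generally means adding cross-track-error control: steering back toward the line between the last and next waypoints rather than simply aiming at the next one. Below is a generic, hedged sketch of that idea using a simple proportional correction; it is not the Hands Free Hectare code, and the gain and clamp values are assumptions.

```python
# Generic cross-track correction toward the segment between two waypoints.
# Illustrative only; not the Hands Free Hectare autopilot code.
import math

def steering_correction(pos, wp_from, wp_to, k=0.8):
    """Return a heading adjustment (radians, positive = counterclockwise) that
    pulls the vehicle back onto the straight line wp_from -> wp_to."""
    px, py = pos
    ax, ay = wp_from
    bx, by = wp_to
    path_dx, path_dy = bx - ax, by - ay
    path_len = math.hypot(path_dx, path_dy)
    # Signed cross-track error: positive when the vehicle sits left of the path.
    cross_track = (path_dx * (py - ay) - path_dy * (px - ax)) / path_len
    # Proportional pull back toward the line, clamped to +/- 45 degrees.
    return max(-math.pi / 4, min(math.pi / 4, -k * cross_track))

# Vehicle 0.5 m left of a due-east row: correction steers it back to the right.
print(steering_correction((5.0, 0.5), (0.0, 0.0), (20.0, 0.0)))
# -> -0.4 rad, i.e. turn clockwise, back toward the row
```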

    In order to live up to the name Hands Free Hectare, the team had decided that no one would set foot in the field until the harvest was brought in. This posed a problem for Kieran Walsh, its crop-health adviser, who was accustomed to collecting soil and plant samples manually and scrutinizing them for signs of infestation and illness. Walsh was aware that robotic weed detectors were already commercially available, but, to the best of his knowledge, there was nothing that could provide all the information he needed. “My initial thought was, Gosh, this is exciting,” he told me. “And then I thought, Right, actually, this is going to be quite tricky.” In the end, Gill flew drones over the field on a weekly basis, gathering spectral data that Walsh could use to measure the barley’s photosynthetic activity and assess soil moisture. With Abell and Franklin, he also built a robotic sample collector. “Ninety-five per cent of the information I wanted, they got for me,” Walsh said. “But there was that five per cent where I had to make an educated guess.”
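    The article doesn't name the index Walsh relied on, but the usual way to turn drone spectral data into a per-pixel measure of photosynthetic activity is something like NDVI, computed from the near-infrared and red bands. A minimal numpy sketch, assuming the imagery has already been split into those two bands (the reflectance values below are made up):

```python
# NDVI from multispectral drone imagery: (NIR - Red) / (NIR + Red), per pixel.
# The index choice and the band arrays are assumptions for illustration.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, in [-1, 1]; higher values mean
    a more photosynthetically active canopy."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    denom[denom == 0] = np.nan          # avoid divide-by-zero on empty pixels
    return (nir - red) / denom

# Tiny synthetic 2x2 tile: three healthy pixels and one bare/stressed pixel.
nir_band = np.array([[0.60, 0.55], [0.20, 0.58]])
red_band = np.array([[0.10, 0.12], [0.18, 0.11]])
print(ndvi(nir_band, red_band))         # the low value flags the stressed pixel
```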

    Hands Free Hectare’s final yield was a couple of metric tons lower than the average from conventionally farmed land—and the costs in both time and money were, unsurprisingly for a pilot project, stratospherically higher. Nevertheless, the team’s experience suggests that drone agriculture offers some substantial benefits. “For starters,” Abell said, “the opportunity for doing the right thing at the right time is much higher with automated machines.” Many of a farmer’s duties are weather-dependent; an autonomous tractor could, for instance, tap into live forecast data and choose to go out and apply fungicide when conditions are ideal, even if it’s four o’clock in the morning.

    More important, once the machinery no longer requires a person to sit on top of it, a farmer could deploy a fleet of small tractors to do the same work that he currently does riding one of today’s state-of-the-art, two-story-tall tractors. “That has massive, massive implications,” Walsh said. For one thing, Abell explained, it would make the application of fertilizers and herbicides far more precise. “All the boom sprayers in this country these days are at least twenty-four metres wide,” he said. “So you’ve essentially got a twenty-four-metre paintbrush to apply chemicals to a field that you’ve surveyed at two- or three-centimetre resolution.” Smaller tractors can spray crops over a smaller area, coming closer to matching the pixel-by-pixel picture of crop health provided by drone imagery. Abell, who comes from a farming family, added that transitioning away from big, heavy machinery would be better for the land, too. “All that weight is knocking the life out of the soil,” he said. “Our yields have plateaued, and that’s a big part of why.”

    The Hands Free Hectare team envisions a future in which farmers are fleet managers, programming their vehicles from a central mission control and using the time saved to focus on areas that need extra attention. “The actual driving of a tractor—I didn’t miss that at all,” Abell said. “And, by not spending all your time going in a straight line on auto-steer, it gives you more time to learn about your crop and hopefully manage it better.” So far, he said, the response among farmers has been generally enthusiastic; most of the reticence appears to come from the younger, early-career crowd. “The older guys that have been sat on a tractor for the last thirty years, seeing that it’s not the best use of their time—they appreciate it more,” Abell said.

    Self-driving tractors face many of the same safety issues as self-driving cars, in terms of cybersecurity and liability for accidents, so a good deal more work remains to be done before they will enter widespread use. Gill predicted that the first adopters will be in Japan, where the average farmer is seventy years old. Abell expects that commercial farmers in the U.K. will be automating at least some aspects of their operations within the next five years. The team’s focus, however, is on the even shorter term: first, a much needed vacation; then a new crop (winter wheat) in the ground by the end of October; and, finally, a special beer brewed from their hands-free harvest. “I’m hoping for a festive pint,” Gill said. “We’ll probably sell the rest to fund the project.”

    newyorker.com



    From: Savant  9/29/2017 5:53:10 PM

    RT...Outlaw Drones no match for Athena..
    foxnews.com

    also, Spider-Man Drone

    foxnews.com


    *Hmm, no doubt they'll employ it against people, too...

