|From: Frank Sully||8/3/2021 7:46:39 AM|
|I have become moderator of the Baidu (BIDU) board. Read the Introduction header as I have updated it. Also, I posted a representative sample of the important news in the last six months, starting with message 1814. If you have any concerns, suggestions or questions please PM me.|
|From: Glenn Petersen||8/4/2021 6:25:28 AM|
|Why Nvidia’s $40 billion bid for Arm could be in jeopardy|
Published Wed, Aug 4 2021 5:29 AM EDT
Sam Shead @SAM_L_SHEAD
-- The deal, one of the biggest semiconductor takeovers ever seen, was announced last September to much fanfare, although competition regulators around the world soon announced plans to investigate the acquisition.
-- Probes were launched in the U.S., the U.K., China and Europe after companies like Qualcomm, Microsoft, Google and Huawei complained that the deal was bad for the semiconductor industry.
-- The U.K. is reportedly considering blocking the deal on national security grounds, while China and Europe’s probes are reportedly subject to delays.
LONDON — Nvidia’s $40 billion bid to buy U.K.-based chip designer Arm from Japan’s SoftBank has started to look increasingly uncertain in recent weeks.
The deal, one of the biggest semiconductor takeovers ever seen, was announced last September to much fanfare, although competition regulators around the world soon announced plans to investigate the acquisition. Probes were launched in the U.S., the U.K., China and Europe after companies like Qualcomm, Microsoft, Google and Huawei complained that the deal was bad for the semiconductor industry.
The U.K. investigation, being led by the Competition and Markets Authority, is also taking national security concerns into account. The CMA submitted its initial report to U.K. Culture Secretary Oliver Dowden on July 20.
The assessment contains worrying implications for national security and the U.K. is currently inclined to reject the takeover, according to a report from Bloomberg on Tuesday, citing an unnamed source familiar with the matter. A separate unnamed source said the U.K. was likely to conduct a deeper review into the merger as a result of national security concerns, Bloomberg reported. CNBC was unable to independently verify the report.
It’s unclear how U.K. national security would be affected if Arm goes from being Japanese-owned to U.S.-owned, but governments have come to view semiconductor technology as a vital asset amid the global chip shortage.
An Nvidia spokesperson told CNBC: “We continue to work through the regulatory process with the U.K. government. We look forward to their questions and expect to resolve any issues they may have.” Arm and the U.K. government did not immediately respond to CNBC’s request for comment.
The deal, which was initially expected to close by March 2022, also risks being held up elsewhere. In June, Chinese antitrust lawyers reportedly told The Financial Times that China’s investigation could take the deal beyond the 18-month window given by Nvidia in Sept. 2020.
Meanwhile, European regulators are thought to be reluctant to consider the case until after the summer holidays, according to a Reuters report published in June that cites people familiar with the matter, who say this could make it difficult for Nvidia to close the deal by March next year.
The purchase agreement gives the two companies the option to extend the deadline to September 2022. But, at that point, either company can walk away if the deal does not receive government approval.
What is Arm?
Cambridge-based Arm sells its chip blueprints and licenses to chip manufacturers around the world; it is viewed as a “neutral player” and is sometimes referred to as the “Switzerland of the chip industry.”
Some of these manufacturers, which compete with Nvidia, are concerned that the Santa Clara-headquartered chip giant could make it harder for them to access Arm’s technology.
Nvidia has repeatedly insisted that it won’t change Arm’s business model and that it will invest heavily in the company to help it meet increasing demand.
Nvidia’s share price does not seem to have been affected following the Bloomberg report. It closed at $198.15 on Tuesday, up almost 1% for the day.
Elsewhere, another semiconductor acquisition is also being scrutinized. U.K. Prime Minister Boris Johnson has ordered the national security adviser, Stephen Lovegrove, to review the takeover of Newport Wafer Fab, the U.K.’s largest semiconductor wafer manufacturing facility. The company is being acquired by Chinese-owned Nexperia for £63 million ($88 million).
Nvidia’s $40 billion bid for Arm could be in jeopardy (cnbc.com)
|From: Frank Sully||8/4/2021 11:12:43 PM|
|Nvidia stock gains after Rosenblatt price target boost|
Aug. 04, 2021 2:32 PM ET
NVIDIA Corporation (NVDA)
by Brandy Betz, SA News Editor
- Calling the company a best-in-class artificial intelligence play, Rosenblatt Securities reiterates a Buy rating on Nvidia (NASDAQ: NVDA) and raises the price target from $200 to $250.
- Analyst Hans Mosesmann makes the move the day after a fireside chat with company management.
- The analyst notes that Nvidia (NVDA) also has "growth vectors into next generation networking/DPU adoption and early-days of autonomous driving."
- Mosesmann thinks the $40B acquisition of SoftBank's Arm chip unit is unlikely to happen, which the Street is slowly realizing, but says the stock "will work nonetheless."
- NVIDIA (NVDA) shares are up 2.2% to $202.41.
|From: Frank Sully||8/5/2021 1:46:22 AM|
|AutoX Robotaxis Now Using NVIDIA DRIVE, NVIDIA Acquiring DeepMap, & DiDi Booming On NVIDIA DRIVE’s Back|
Five years ago, in a blogging competition about “what will be the most important technological development over the next 10 years that will have the greatest impact in reducing climate change risks,” I concluded that the answer was robotaxis. If true robotaxis, broadly available and deployed in cities around the world, come to fruition, the potential reduction in emissions is immense. This is assuming they are electrically powered, but that seems most sensible for several reasons — especially by the middle of the decade.
Naturally, many of us think that Tesla is quite far ahead in the development of broadly applicable, cost-competitive robotaxi hardware and firmware. However, it certainly isn’t the only name in town, and there are also many who think that Tesla’s approach cannot lead to true robotaxis. One other tech company you have to keep on the table of possibilities is NVIDIA. Aside from being a tech giant in other realms, one of the advantages NVIDIA has is that it supplies hardware — and increasingly software services — for a bunch of automakers. Also, as the industry has evolved, NVIDIA has looked more seriously at providing integrated, robust technology partnerships with these automakers — not just as a supplier, but as a team working with automakers’ driver-assist or self-driving teams.
With all of that in mind, NVIDIA has rolled out a series of 6 news stories in the past two months related to autonomous driving. In this article, I’m going through 3 of those that relate to the tech giant’s NVIDIA DRIVE solutions. Let’s catch up and check those out.
AutoX Robotaxis in Service Now

Probably the biggest story of the batch is that AutoX, a self-driving vehicle startup out of China, has launched its 5th-generation robotaxi platform, and the platform uses NVIDIA DRIVE. The system uses automotive-grade GPUs to reach up to 2,200 trillion operations per second (TOPS) of AI compute performance.
We did cover the rollout of AutoX robotaxis in January, when they launched to the public in Shenzhen, the 5th largest city in China (population over 12 million). It’s a solid testament to NVIDIA that a company with robotaxis on the road just upgraded to the new NVIDIA DRIVE platform. “Safety is key,” said Jianxiong Xiao, founder and CEO of AutoX. “We need higher processing performance for safe and scalable robotaxi operations. With NVIDIA DRIVE, we now have power for more redundancy in a form factor that is automotive grade and more compact.”
Even more impressive is that this service is in place in the high-traffic, highly complex streets of Shenzhen. NVIDIA notes, “Safely navigating such chaotic streets requires sensors that can detect obstacles and other road users with the highest levels of accuracy. The Gen5 system relies on 28 automotive-grade camera sensors generating more than 200 million pixels per frame 360-degrees around the car. (For comparison, a single high-definition video frame contains about 2 million pixels.)” Mind blowing. “In addition to cameras, the robotaxi system includes six high-resolution lidar sensors that produce 15 million points per second and surround 4D radar.”
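As a rough sanity check on those camera numbers (assuming roughly 8-megapixel automotive sensors, which is my assumption rather than an NVIDIA spec), the arithmetic holds up:

```python
# Back-of-the-envelope check on the Gen5 camera figures.
# Assumption: each automotive camera is ~8 megapixels (not stated by NVIDIA).
NUM_CAMERAS = 28
PIXELS_PER_CAMERA = 8_000_000  # hypothetical 8 MP sensor

total_pixels_per_frame = NUM_CAMERAS * PIXELS_PER_CAMERA
hd_frame_pixels = 1920 * 1080  # a single 1080p frame, about 2 million pixels

print(total_pixels_per_frame)  # 224000000, consistent with "more than 200 million"
print(total_pixels_per_frame // hd_frame_pixels)  # 108, i.e. over 100 HD frames' worth
```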
Now, Tesla fans will quickly point out that Tesla recently ditched radar because it basically just got in the way, and that Tesla is working to solve broad, general AI challenges. Nonetheless, let’s not miss the fact that NVIDIA DRIVE is being used in robotaxis that are in service right this moment in one of the largest and most traffic-heavy cities on Earth.
“At the center of the Gen5 system are two NVIDIA Ampere architecture GPUs that deliver 900 TOPS each for a truly level 4 autonomous, production platform. With this unprecedented level of AI compute at the core, Gen5 has enough performance to power ultra complex self-driving DNNs while maintaining the compute headroom for more advanced upgrades.
“This capability makes it possible for the vehicles to react to high-traffic situations — like dozens of motorcycles and scooters cutting in or riding the opposite way at the same time — in real time, and continually improving, learning how to manage new scenarios as they arise.”
See — other systems can learn, too.
AutoX is just getting started, with plans to roll out robotaxis in cities around the world and with large automotive partners like Honda and Stellantis. And NVIDIA is just getting started, as well.
NVIDIA Acquires DeepMap
To further improve its mapping solutions for the aforementioned autonomous driving systems, NVIDIA announced in June that it was acquiring DeepMap. Clearly, there’s an implication of deep learning in that name — it’s all AI all the time these days. The summary highlights from that announcement: “DeepMap expected to extend NVIDIA mapping products, scale worldwide map operations and expand NVIDIA’s full self-driving expertise.”
“NVIDIA is an amazing, world-changing company that shares our vision to accelerate safe autonomy,” said James Wu, co-founder and CEO of DeepMap. “Joining forces with NVIDIA will allow our technology to scale more quickly and benefit more people sooner. We look forward to continuing our journey as part of the NVIDIA team.” DeepMap cofounders James Wu and Mark Wheeler previously worked at Google, Apple, and Baidu, so after getting DeepMap off the ground and seeing it acquired by NVIDIA, going back into a tech giant must feel a little bit like going home.
What’s so special about DeepMap? Well, we don’t have insight into the code (and seeing it wouldn’t help me much anyway), but the key appears to be crowdsourced data collection from a broad fleet of vehicles, which “lets DeepMap build a high-definition map that’s continuously updated as the car drives.” Naturally, the code must be good, too. Once integrated into NVIDIA DRIVE, it will certainly collect a lot more data and benefit from fast-growing deployment.
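That “continuously updated” map can be pictured as a tile store that keeps, per road segment, the freshest observation reported by any vehicle in the fleet. The sketch below is my own toy model of the idea, not DeepMap’s actual design:

```python
# Illustrative sketch of a crowdsourced HD-map update loop (not DeepMap's real design).
# Each vehicle reports (tile_id, timestamp, features); the map keeps the newest report per tile.

class CrowdsourcedMap:
    def __init__(self):
        self.tiles = {}  # tile_id -> (timestamp, features)

    def report(self, tile_id, timestamp, features):
        """Merge a vehicle observation, keeping only the most recent one per tile."""
        current = self.tiles.get(tile_id)
        if current is None or timestamp > current[0]:
            self.tiles[tile_id] = (timestamp, features)

    def lookup(self, tile_id):
        entry = self.tiles.get(tile_id)
        return entry[1] if entry else None

hd_map = CrowdsourcedMap()
hd_map.report("seg-42", 100, {"lanes": 2})
hd_map.report("seg-42", 90, {"lanes": 3})                  # stale report, ignored
hd_map.report("seg-42", 120, {"lanes": 2, "cone": True})   # fresh construction update wins
print(hd_map.lookup("seg-42"))  # {'lanes': 2, 'cone': True}
```

The more vehicles reporting, the faster stale tiles get refreshed, which is presumably why plugging DeepMap into the larger NVIDIA DRIVE fleet matters.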
The acquisition hasn’t closed yet — going through all of the paperwork and lawyers necessary, it’s expected to close this quarter.
DiDi Goes Public, Also Benefiting From NVIDIA DRIVE
DiDi robotaxis, courtesy of DiDi & NVIDIA.
Gigantic Chinese ride-hailing company DiDi just went public about a month ago, raising a ginormous $4.4 billion. Not too shabby, but note that DiDi has nearly 500 million active users across 71 countries and 10,000 cities. NVIDIA took the moment to note that DiDi “is developing its upcoming robotaxi platform on NVIDIA DRIVE AGX Pegasus.”
The question is, who isn’t using NVIDIA DRIVE?
|From: Frank Sully||8/5/2021 11:04:57 AM|
|Interview With Murali Gopalakrishna, GM, Robotics @ NVIDIA |
NVIDIA created the Isaac robotics platform, including the Isaac Sim application on the NVIDIA Omniverse platform for simulation and training of robots in virtual environments
For this week’s practitioners series, Analytics India Magazine (AIM) got in touch with Murali Gopalakrishna, Head of Product Management, Autonomous Machines and General Manager for Robotics. He also leads the business development team focusing on robots, drones, industrial IoT and enterprise collaboration products at NVIDIA. In this interview, we discuss in detail the robotics solutions developed by NVIDIA and their significance.
AIM: Can you tell us about how NVIDIA is building robotics solutions to be used at scale?

Murali: Robotics algorithms can be mainly classified into (1) sensing/perception, (2) mobility (motion/path planning), and (3) robot control. All these fields have seen significant innovation in the recent past, with AI/deep learning playing an important role. With NVIDIA GPU-accelerated AI-at-the-edge computing platforms, manufacturers will be able to develop complex algorithms and deploy robotics at scale.
Robots have to sense, plan and act. To develop robots that are autonomous and efficient, developers have to accelerate algorithms for the complete stack. Algorithms such as object detection, pose estimation and depth estimation are used to perceive the environment, create a map of the environment and localise the robot in the environment. Algorithms such as free space segmentation are used for planning the efficient path for the robot, while control algorithms determine the commands for the robot to go on the planned path. Advances in AI and GPU-accelerated computing are making all these algorithms more accurate and faster, creating robots that are more capable and safe.
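The sense-plan-act stack described above can be boiled down to a toy loop (purely illustrative Python, not NVIDIA code): perception reports whether the path ahead is free, planning picks a command, and control executes it.

```python
# Toy sense -> plan -> act loop on a 1-D corridor, illustrating the stack described above.
# Entirely illustrative; real robots run accelerated perception/planning/control instead.

def sense(world, position):
    """Perception: report whether the next cell ahead is free."""
    ahead = position + 1
    return ahead < len(world) and world[ahead] == 0  # 0 = free space

def plan(path_clear):
    """Planning: move forward if the path is clear, otherwise wait."""
    return "forward" if path_clear else "wait"

def act(position, command):
    """Control: execute the planned command."""
    return position + 1 if command == "forward" else position

world = [0, 0, 1, 0]  # 1 = obstacle at cell 2
position = 0
log = []
for _ in range(3):
    command = plan(sense(world, position))
    position = act(position, command)
    log.append(command)

print(log, position)  # ['forward', 'wait', 'wait'] 1 -- advances once, then waits at the obstacle
```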
Ease of use and deployment have made the NVIDIA Jetson platform a logical choice for over half a million developers, researchers, and manufacturers building and deploying robots worldwide. We provide a full suite of tools and SDKs for developers and companies scaling robotics and automation applications:
- Open source packages for ROS/ROS2 (Human Pose Estimation, Accelerated AprilTags), Docker containers, CUDA library support and more.
- For training: NVIDIA Transfer Learning Toolkit (TLT) helps reduce costs associated with large-scale data collection and labeling, and eliminates the burden of training AI/ML models from the ground up. This enables developers to build and scale production-quality pre-trained models faster with no code. Auto Mixed Precision allows developers to train with half precision while maintaining the network accuracy achieved with single precision, enabling significantly faster training times.
- For real-time inference: NVIDIA TensorRT is a high-performance SDK for deep learning, including a DL inference optimizer and runtime that delivers low latency and high throughput for inference applications. NVIDIA Triton Inference Server simplifies the deployment of AI models at scale in production. It is an open source inference serving software that lets teams deploy trained AI models from any framework on any GPU or CPU-based infrastructure (cloud, data center, or edge).
- For perception: NVIDIA DeepStream SDK helps developers build and scale AI-powered Intelligent Video Analytics apps and services. DeepStream offers a multi-platform scalable framework with TLS security to deploy on the edge and connect to any cloud.
- NVIDIA Fleet Command is a hybrid-cloud platform for managing and scaling AI at the edge. From one control plane, anyone with a browser and internet connection can deploy applications, update software over the air, and monitor location health.
- NVIDIA Jarvis is an application framework for multimodal conversational AI services that delivers real-time performance on GPUs.
AIM: What is the scope of these solutions?
Murali: Powerful GPU-based AI-at-the-edge computing, along with a full spectrum of sensors, is widely implemented in the field today. Fueled by AI and DL, sensor technologies that power perception for real-time decision making have revolutionised several areas of robotics, including navigation, visual recognition and object manipulation.
Today’s AI-enabled robots perform myriad tasks and functions, allowing them to work as “cobots” in close collaboration with humans in complex environments including warehouses, retail stores, hospitals and industrial settings, as well as in our homes. AI and DL continue to play a significant role in the programming of robots, speeding development time for roboticists and helping advance these systems from single functionality to multifunctionality.
And there’s no arguing the pandemic accelerated the need and urgency for robotics deployment, especially in healthcare, logistics, manufacturing and retail.
- Healthcare: To minimise contact and support shortage of staff and resources, robots have found invaluable use in the delivery of medicine/supplies, patient monitoring, medical procedures, temperature detection, and UV disinfectant applications in public and private spaces.
- Logistics: From pick-n-place to last mile delivery, robots have clearly become indispensable with the ever-increasing need for efficiencies across the supply chain and e-commerce.
- Manufacturing: Using AI/DL to create the factory-of-the future, leveraging robots and cobots for no-touch manufacturing as well as enabling zero downtime to increase productivity and efficiency.
- Retail: From cleaning, inventory and safety (temperature detection, mask detection, social distancing) to shelf-scanning and self-checkout, robots are transforming the shopping experience.

We have a large customer base in a diverse set of industries like agriculture, manufacturing, healthcare and logistics (e.g., John Deere in agriculture and Komatsu in construction). Most of the last-mile delivery robots are using NVIDIA technology (Postmates, JD-X, Cainiao, etc.).
AIM: Tell us about NVIDIA Isaac Sim.

Murali: NVIDIA created the Isaac robotics platform, including the Isaac Sim application on the NVIDIA Omniverse platform, for simulation and training of robots in virtual environments before deploying them in the real world. NVIDIA Omniverse is the underlying foundation for all our simulators, including the Isaac platform. We’ve added many features in our latest Isaac Sim open beta release, including ROS/ROS2 compatibility and multi-camera support, as well as enhanced synthetic data generation and domain randomization capabilities, which are important for generating datasets to train perception models for AI-based robots.
Simulation technology like Isaac Sim on Omniverse can be used at every stage: from design and development of the mechanical robot, through training the robot in navigation and behavior, to testing in a “digital twin,” in which the robot is simulated in an accurate and photorealistic environment before being deployed in the real world.
AIM: What are the current challenges and what does the future hold for robotics?
Murali: One of the most interesting areas of development is cobots, which can be deployed in areas where robots have not been used thus far. Traditionally, the use of robots on factory floors posed safety risks; they were deemed too dangerous to work alongside humans, and therefore these machines were typically placed in isolated environments or caged. Enter cobots. Though designed to work in close proximity with humans, cobots faced several challenges, like limited capabilities and an inability to think, putting a damper on their widespread adoption.
But now, thanks to advancements in AI, which brings intelligence to cobots, we’re seeing these systems make real-time decisions that ensure safety in the factory of the future while maintaining and optimizing productivity. This includes training a cobot to perceive the environment around it and adapt accordingly — allowing it to reduce its speed, adjust its force/strength, detect changing working conditions, or even shut down safely before it interferes with a human in its proximity. By leveraging the power of AI, coupled with changes in cobot design (softer materials, new types of joints, removal of sharp edges, etc.), we’re seeing the emergence of applications and use cases that were not previously feasible (e.g., robots in commercial kitchens).
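The adapt-to-proximity behavior described here can be illustrated with a simple speed-scaling rule. The thresholds below are invented for illustration; real cobot safety logic is far more involved and governed by standards such as ISO/TS 15066:

```python
# Toy cobot speed policy keyed to human proximity, illustrating the behavior described above.
# Thresholds are invented for illustration; real systems follow collaborative-robot safety standards.

def cobot_speed(distance_m, max_speed=1.0):
    """Return a commanded speed fraction given distance to the nearest human (meters)."""
    if distance_m < 0.5:
        return 0.0                                    # too close: safe stop
    if distance_m < 2.0:
        return max_speed * (distance_m - 0.5) / 1.5   # scale down as the human approaches
    return max_speed                                  # full speed when the workspace is clear

print(cobot_speed(3.0))   # 1.0 -- clear workspace
print(cobot_speed(1.25))  # 0.5 -- human nearby, half speed
print(cobot_speed(0.3))   # 0.0 -- safe stop
```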
Robots are being taught what to do, and how to improve upon complex tasks, as quickly as within a few hours or overnight (versus what used to take weeks or even months)! AI techniques such as one-shot learning, transfer learning, imitation learning, reinforcement learning, etc. are no longer confined to research papers; many of these methods are in practical use today for real-world robotics deployments.
AIM: How do you see the Robotics landscape evolving in India?
Murali: Manufacturing is increasingly reliant on robotic production. Take the automotive industry: our collaboration with BMW, for example, begins with creating a digital twin of a future factory in Omniverse and laying out the entire robot-managed production line digitally, before committing to physical construction. Other industries benefiting from robotics are the industrial and nuclear power sectors: warehouse and inventory management, materials transportation, quality inspection and predictive maintenance in the former, and internal reactor inspection and emergency response in the latter.
|From: Frank Sully||8/6/2021 5:51:06 PM|
|Nvidia has absolutely trounced AMD in this vital chip area|
By Sead Fadilpašic
AI is the future and Nvidia knows it
Despite the competition coming in droves, Nvidia has managed to maintain its leading position in the global market for artificial intelligence (AI) chips used in cloud and data centers.
Nvidia has also managed to maintain a large gap between itself and the rest of the field, according to a new report from technology research firm Omdia, which says the company took 80.6% of global revenue in this market in 2020.
Last year, the company generated $3.2 billion in revenue, up from $1.8 billion the year before. The bulk of its earnings came from GPU-derived chips, which Omdia says are the leading type of AI processor used in cloud and data center equipment.
Whether or not Nvidia keeps its dominant position in the future remains to be seen, as Omdia expects the market for AI processors to quickly grow and attract many new suppliers. Global market revenue for cloud and data center AI processors rose 79% last year, hitting $4 billion.
By 2026, Omdia expects that revenue to increase ninefold, to $37.6 billion.
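For context, that ninefold jump from $4 billion in 2020 to $37.6 billion in 2026 implies a compound annual growth rate of roughly 45%:

```python
# Implied compound annual growth rate for Omdia's 2020 -> 2026 projection.
start_revenue = 4.0    # $ billions, cloud/data center AI processor revenue in 2020
end_revenue = 37.6     # $ billions, projected for 2026
years = 6

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 45% per year
```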
For Jonathan Cassell, principal analyst for advanced computing at Omdia, one advantage Nvidia has over the competition is its familiarity among customers.
“NVIDIA’s Compute Unified Device Architecture (CUDA) Toolkit is used nearly universally by the AI software development community, giving the company’s GPU-derived chips a huge advantage in the market," he noted.
"However, Omdia predicts that other chip suppliers will gain significant market share in the coming years as market acceptance increases for alternative GPU-based chips and other types of AI processors.”
Omdia sees Xilinx, Google, Intel and AMD as the biggest contenders for at least a larger piece of the AI pie. Xilinx offers field-programmable gate array (FPGA) products; Google’s Tensor Processing Unit (TPU) AI ASIC is employed extensively in its own hyperscale cloud operations; and Intel’s racehorse comes in the form of its Habana proprietary-core AI ASSPs and its FPGA products for AI cloud and data center servers.
AMD, currently ranked fifth, offers GPU-derived AI ASSPs for cloud and data center servers.
|From: Frank Sully||8/7/2021 1:28:32 PM|
|NVIDIA, Qualcomm Will Use TSMC’s 5nm And 3nm Nodes For 2022 – Pokde.Net|
By Financial News On Aug 7, 2021
TSMC is reportedly looking at another great year in 2022, with both NVIDIA and Qualcomm returning to TSMC’s advanced processes. We recently saw both NVIDIA and Qualcomm make the jump to Samsung, with NVIDIA’s Ampere GPUs and also Qualcomm’s flagship Snapdragon 888 manufactured by Samsung.
Business is apparently going so well at TSMC that not only are its existing 7nm and 5nm nodes fully booked, but production capacity for the upcoming 3nm node is also completely exhausted, even after TSMC recently increased the target capacity for its 5nm and 3nm nodes.
According to the report, TSMC is looking quite optimistic, with large orders from Apple, Qualcomm, NVIDIA and even Intel. The company is also expanding its capacity, with fabs being set up in several countries outside of Taiwan. Its main competition remains Samsung, but the Korean foundry is apparently viewed less favorably.
Reports have suggested that Samsung’s advanced process nodes are rather lackluster in terms of yield and stability. But as we’ve always emphasized, if the price is right, you’d definitely see chipmakers using Samsung’s foundries. Samsung has traditionally offered a cost advantage, but the recent price hike on their part could mitigate this somewhat.
|From: Frank Sully||8/7/2021 2:55:41 PM|
|Cattle-ist for the Future: Plainsight Revolutionizes Livestock Management with AI |
Vision AI leader helps cattle industry improve livestock health and enhance operations — from farming through protein production.
August 6, 2021 by Clarissa Garza
Computer vision and edge AI are looking beyond the pasture.
Plainsight, a San Francisco-based startup and NVIDIA Metropolis partner, is helping the meat processing industry improve its operations — from farms to forks. By pairing Plainsight’s vision AI platform and NVIDIA GPUs to develop video analytics applications, the company’s system performs precision livestock counting and health monitoring.
With animals such as cows that look so similar, frequently shoulder-to-shoulder and moving quickly, inaccuracies in livestock counts are common in the cattle industry, and often costly.
On average, a cow in the U.S. costs between $980 and $1,200, and facilities process anywhere between 1,000 and 5,000 cows per day. At this scale, even a small percentage of inaccurate counts equates to hundreds of millions of dollars in financial losses, nationally.
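To see how a small miscount compounds, here is a back-of-the-envelope estimate. The per-cow cost range comes from the article; the throughput, error rate, facility count and operating days are my own illustrative assumptions, not reported figures:

```python
# Back-of-the-envelope loss from miscounting, using the article's per-cow cost range.
# Throughput, error rate, facility count and operating days are illustrative assumptions.
avg_cow_cost = (980 + 1200) / 2   # $1,090, midpoint of the article's range
cows_per_day = 3_000              # assumed mid-size facility
error_rate = 0.005                # assumed 0.5% miscount
facilities = 50                   # assumed national facility count
working_days = 260                # assumed operating days per year

annual_loss = avg_cow_cost * cows_per_day * error_rate * facilities * working_days
print(f"${annual_loss / 1e6:.0f}M per year")  # on the order of hundreds of millions
```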
“By applying computer vision powered by edge AI and NVIDIA Metropolis, we’re able to automate what has traditionally been a very manual process and remove the uncertainty that comes with human counting,” said Carlos Anchia, CEO of Plainsight. “Organizations begin to optimize existing revenue streams when accuracy can be operationally efficient.”
Plainsight is working with JBS USA, one of the world’s largest food companies, to integrate vision AI into its operational processes. Vision AI-powered cattle counting was among the first solutions to be implemented.
At JBS, cows are counted by fixed-position cameras, connected via a secured private network to Plainsight’s vision AI edge application, which detects, tracks and counts the cows as they move past.
Plainsight’s models count livestock with over 99.5 percent accuracy — about two percentage points better than manual livestock counting by humans in the same conditions, according to the company.
For a vision AI solution to be widely adopted by an organization, the accuracy must be higher than humans performing the same activity. By monitoring and tracking each individual animal, the models simplify an otherwise complex process.
Highly robust and accurate computer vision models are only a portion of the cattle-counting solution. Through continued collaboration with JBS’s operations and innovation teams, Plainsight provided a path to the digital transformation required to account for livestock more accurately at scale, ensuring that payment for livestock received is as accurate as possible.
Higher Accuracy with GPMoos
For JBS, the initial proof of value involved building models and deploying on an NVIDIA Jetson AGX Xavier Developer Kit.
After quickly achieving nearly 100 percent accuracy levels with their models, the teams moved into a full pilot phase. To augment the model to handle new and often challenging environmental conditions, Plainsight’s AI platform was used to quickly and easily annotate, build and deploy model improvements in preparation for a nationwide rollout.
As a member partner of NVIDIA Metropolis, an application framework that brings visual data and AI together, Plainsight continues to develop and improve models and AI pipelines to enable a national rollout with the U.S. division of JBS.
There, Plainsight uses a technology stack built on the NVIDIA EGX platform, incorporating NVIDIA-Certified Systems with NVIDIA T4 GPUs. Plainsight’s application processes multiple video streams per GPU in real time to count and monitor livestock as they are received.
“Innovation is fundamental to the JBS culture, and the application of AI technology to improve efficiencies for daily work routines is important,” said Frederico Scarin do Amaral, Senior Manager Business Solutions of JBS USA. “Our partnership with Plainsight enhances the work of our team members and ensures greater accuracy of livestock count, improving our operations and efficiency, as well as allowing for continual improvements of animal welfare.”
Inaccurate counting is only part of the problem the industry faces, however. Visual inspection of livestock is arduous and error-prone, causing late detection of diseases and increasing health risks to other animals.
The same symptoms humans can identify by looking at an animal, such as gait and abnormal behavior, can be approximated by computer vision models built, trained and managed through Plainsight’s vision AI platform.
The models identify symptoms of particular diseases, based on the gait and anomalies in how livestock behave when exiting transport vehicles, in a pen or in feeding areas.
“The cameras are an unblinking source of truth that can be very useful in identifying and alerting to problems otherwise gone unnoticed,” said Anchia. “The combination of vision AI, cameras and Plainsight’s AI platform can help enhance the vigilance of all participants in the cattle supply chain so they can focus more on their business operations and animal welfare improvements as opposed to error-prone manual counting.”
In addition to a variety of other smart agriculture applications, Plainsight is using its vision AI platform to monitor and track cattle on the blockchain as digital assets.
The company is engaged in a first-of-its-kind co-innovation project with CattlePass, a platform that generates a secure and unique digital record of individual livestock, also known as a non-fungible token, or NFT.
Plainsight is applying its advanced vision AI models and platform for monitoring cattle health. The suite of advanced technologies, including genomics, health and proof-of-life records, will provide 100 percent verifiable proof of ownership and transparency into a complete living record of the animal throughout its lifecycle.
Cattle ranchers will be able to store the NFTs in a private digital wallet while collecting and adding metadata: feed, heartbeat, health, etc. This data can then be shared with permissioned viewers such as inspectors, buyers, vets and processors.
The data will remain with each animal throughout its life through harvest, and data will be provided with a unique QR code printed on the beef packaging. This will allow for visibility into the proof of life and quality of each cow, giving consumers unprecedented knowledge about its origin.
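A “secure and unique digital record” of the kind CattlePass describes can be sketched as a hash-chained log per animal, where each entry commits to the one before it. This is a generic illustration of the idea, not CattlePass’s actual implementation:

```python
import hashlib
import json

# Generic hash-chained record per animal (illustrative only; not CattlePass's design).
# Each entry commits to the previous one, so tampering with any entry breaks every later hash.

def add_entry(chain, data):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "data": data}, sort_keys=True)
    chain.append({"data": data, "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

record = []
add_entry(record, {"event": "birth", "tag": "US-0001"})
add_entry(record, {"event": "health_check", "result": "clear"})
add_entry(record, {"event": "feed", "type": "grass"})

# Verify the chain end-to-end, as a permissioned viewer (inspector, buyer, vet) could.
prev = "0" * 64
for entry in record:
    body = json.dumps({"prev": prev, "data": entry["data"]}, sort_keys=True)
    assert hashlib.sha256(body.encode()).hexdigest() == entry["hash"]
    prev = entry["hash"]
print("chain verified,", len(record), "entries")
```

The final hash is what an NFT or QR code could anchor: anyone scanning the package could re-verify the whole history against it.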