|From: Glenn Petersen||5/1/2021 5:53:47 AM|
|Global Cloud Services Market Reaches US$42 Billion In Q1 2021|
April 29, 2021
Cloud infrastructure services spending grew 35% to US$41.8 billion in the first quarter of 2021. The trend of using cloud services for data analytics and machine learning, data center consolidation, application migration, cloud native development and service delivery continued at pace. Overall, customer spending exceeded US$40 billion a quarter for the first time in Q1, with total expenditure nearly US$11 billion higher than in Q1 2020 and nearly US$2 billion more than in Q4 2020, according to Canalys data. The acceleration of digital transformation over the last 12 months, with organizations adapting to new working practices, customer engagement, and business process and supply chain dynamics, has elevated demand for these services. This, combined with the rebound in some economies, in line with government stimuli, the roll-out of mass COVID-19 vaccination programs and subsequent easing of restrictions, has increased customer confidence in committing to multi-year contracts.
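The headline figures above are internally consistent; a quick back-of-the-envelope check, using only the Q1 2021 total and growth rate as reported (the derived Q1 2020 base is computed, not quoted from Canalys):

```python
# Canalys Q1 2021 cloud infrastructure figures, as reported above.
q1_2021 = 41.8   # US$ billion, Q1 2021 spend
growth = 0.35    # 35% year-over-year growth

# Implied Q1 2020 base and the year-over-year dollar increase.
q1_2020 = q1_2021 / (1 + growth)
print(round(q1_2020, 1))            # implied Q1 2020 spend, US$ billion
print(round(q1_2021 - q1_2020, 1))  # increase, consistent with "nearly US$11 billion"
```

The implied base of about US$31 billion squares with the claim that Q1 2021 spend was nearly US$11 billion higher than a year earlier.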
Amazon Web Services (AWS) was the leading cloud service provider in Q1 2021, growing 32% on an annual basis to account for 32% of total spend. In the last quarter it announced new CloudFront edge locations in Croatia and Indonesia and extended its Wavelength Zones for 5G networks to Japan and across the United States. It launched its new EC2 X2gd instances based on the AWS-designed Graviton2 CPU for memory-intensive workloads and improved price-performance.
Microsoft Azure grew 50% for the third consecutive quarter, taking 19% market share in Q1 2021. Growth was boosted by cloud consumption and longer-term customer commitments enabled by investments in Azure Arc for hybrid-IT control plane management, Azure Synapse for data analytics, and AI as a platform. It introduced new industry cloud propositions for financial services, manufacturing and non-profit organizations, adding to healthcare and retail.
Google Cloud maintained its momentum, benefiting from its Google One approach driving cross-sell and integration opportunities across its portfolio. Overall, it grew 56% in the latest quarter to account for a 7% market share. Cloud-native development and accelerated cloud migration among its customers has been boosted by its focus on industry-specific solutions, machine learning, analytics and data management. It announced a new cloud region in Israel.
“Cloud emerged as a winner across all sectors over the last year, basically since the start of the COVID-19 pandemic and the implementation of lockdowns. Organizations depended on digital services and being online to maintain operations and adapt to the unfolding situation,” said Canalys Research Analyst Blake Murray. “Though 2020 saw large-scale cloud infrastructure spending, most enterprise workloads have not yet transitioned to the cloud. Migration and cloud spend will continue as customer confidence rises during 2021. Large projects that were postponed last year will resurface, while new use cases will expand the addressable market.” Investment at the edge, including 5G, is a key area, especially for the development of ultra-low latency applications and use cases, such as autonomous vehicles, industrial robotics and augmented or virtual reality.
Competition among the leading cloud service providers to capitalize on these opportunities will continue to intensify. “Geographic expansion for data sovereignty and to improve latency, either via full-region deployment or a local city point of presence, is one area of focus for the cloud service providers,” said Canalys Chief Analyst Matthew Ball. “But differentiation through custom hardware development for optimized compute instances, industry-specific clouds, hybrid-IT management, analytics, databases and AI-driven services is increasing. But it is not just a contest between the cloud service providers, but also a race with the on-premises infrastructure vendors, such as Dell Technologies, HPE and Lenovo, which have established competitive as-a-service offerings. The challenge will be demonstrating a differentiated value proposition for each.”
Canalys defines cloud infrastructure services as services that provide infrastructure as a service and platform as a service, either on dedicated hosted private infrastructure or shared infrastructure. This excludes software as a service expenditure directly, but includes revenue generated from the infrastructure services being consumed to host and operate them.
|From: Glenn Petersen||5/9/2021 5:41:58 AM|
|Companies Extend Cloud to the Edge|
Firms world-wide on pace to spend $240.6 billion on edge computing through 2024
By Angus Loten
Wall Street Journal
May 3, 2021 6:19 pm ET
Alstom SA, a France-based rail transportation manufacturer that provides signaling for major North American railway routes, recently installed edge tools in wayside devices alongside tracks, including railway grade crossings. PHOTO: KRISZTIAN BOCSI/BLOOMBERG NEWS
Companies are expected to raise spending over the next few years on technology designed to power data-processing tools installed directly on factory-floor machinery, cellphone towers, train-track signals or store-checkout hardware—instead of in a data center.
The goal of edge computing is to improve the performance of production and service applications by processing data where it is being generated and applying it at lightning speed. The rapid growth of edge computing across industries comes as digitized functions and processes spew massive amounts of data, which is increasingly stored and processed in the cloud, analysts say.
Software company Red Hat Inc., an affiliate of International Business Machines Corp. , last week launched an edge-computing platform designed to operate across multiple cloud services.
“With so much data, we have to get it closer to where it’s needed,” Red Hat Chief Executive Paul Cormier said. “You can’t always take the time on a factory floor to push all that data down the line; you need it right on the production line to make very quick decisions,” he said.
Companies world-wide are on pace to spend $240.6 billion on edge computing through 2024, investing in hardware, software and services, at a compound annual growth rate of more than 15%, according to the latest forecast by tech industry research firm International Data Corp.
Beyond big data, the race to install edge capabilities is also accelerating alongside the growth of the Internet of Things, a network of connected equipment and other devices used in manufacturing or retail sales, among other sectors.
Industrial giants such as General Electric Co. , Siemens AG and Robert Bosch GmbH are using edge-computing technology to optimize production-line machines in real time. Typically powered by artificial intelligence, edge-computing systems parse production data at the source and make tiny near-instantaneous tweaks, such as adjusting the electrical current load in welding robots or the force applied by a machine press.
Red Hat Chief Executive Paul Cormier. PHOTO: RED HAT INC.
Likewise, energy companies have used edge tools to improve analysis at oil-and-gas wells, and retailers have applied it to enhance remote checkout services.
“Like the cloud before it, edge is fueling the next wave of innovation, enabling organizations to automate operations, create rich customer experiences, and bring new products and services to market,” said Dave McCarthy, IDC’s vice president of cloud and edge infrastructure services.
Mr. McCarthy said the spending trend offers a fertile area of growth for cloud-service providers, such as Amazon.com Inc.’s Amazon Web Services, Microsoft Corp.’s Azure and Alphabet Inc.’s Google Cloud—all of which have unveiled edge-computing tools in recent years: “By extending their cloud platforms to distributed edge locations, cloud service providers can address a wider set of use cases than previously possible,” he said.
Alstom SA, a France-based rail transportation manufacturer that provides signaling for major North American railway routes, says it plans to boost spending on edge computing in the year ahead.
The company recently installed edge tools developed by Red Hat in wayside devices alongside tracks, including railway grade crossings.
Among other functions, the devices exchange real-time data to route trains via trackside signals, while predicting and flagging track maintenance needs, without bouncing data back to a core server, the company says.
“The information synthesized is used to automatically alert operations and maintenance teams to conditions that may require attention,” said Emilio Barcelos, product manager at Alstom who covers wayside intelligence and analytics. He added that the aim of its edge strategy is to improve safety and efficiency.
“About half of our clients doing edge are doing it for the bottom line—automate, reduce costs, improve efficiency,” said Thomas Bittman, distinguished research vice president at information-technology research and consulting firm Gartner Inc. The other half, he added, are looking to leverage edge technology for new revenue opportunities. The second half is growing the fastest, he said.
Mr. Bittman said most companies are applying edge computing to areas in which a real-time response is required, where a lot of data is being generated and cannot easily or cheaply be transferred back and forth, or for semi-automated functions. Many are approaching edge computing as an extension of their cloud strategies, he added.
Gartner expects companies world-wide to spend roughly $332 billion on cloud services by the end of the year, up about 23% from $270 billion in 2020.
Companies Extend Cloud to the Edge - WSJ
|From: Glenn Petersen||5/9/2021 6:40:55 AM|
|Amazon and Apple have introduced wireless networks that would seem to function as low-end edge computing networks:|
Amazon and Apple Built Vast Wireless Networks Using Your Devices. Here’s How They Work.
Apple’s iPhone-powered Find My network and Amazon’s Sidewalk network—coming soon to all newer Echo devices—are platforms in their own right, capable of supporting billions of connected devices
By Christopher Mims
Wall Street Journal
May 7, 2021 11:00 am ET
What to do if you’re a globe-spanning tech titan that wants to connect millions or even billions of devices, but you don’t want the hassle or cost of dealing with telcos, satellite operators or cable companies for connectivity? You use the devices your customers have already purchased—and brought into homes, businesses and public spaces—to make an end-run around traditional wireless networks.
Apple and Amazon AMZN -0.45% are transforming the devices we own into the equivalent of little cell towers or portable Wi-Fi hot spots that can connect other gadgets and sensors to the internet. They have already switched on hundreds of millions—with many more on the way. Instead of serving as wireless hubs solely for your own smartwatches, lights and sensors, your iPhones and Echo speakers can help other people’s gadgets stay connected as well—whether you know it or not.
On Friday, Amazon announced it’s expanding its Sidewalk network, which already includes certain Ring Floodlight Cam and Spotlight models, to include Echo devices released in 2018 and after. This includes Echo speakers and Echo Dots, as well as all Echo Show, Echo Plus and Echo Spot devices. It will also use recent Ring Video Doorbell Pro models to communicate on the Sidewalk network via Bluetooth. Sidewalk was designed to allow smart devices to send very small bits of data securely from any available wireless connection, to supplement Wi-Fi networks and reduce wireless communication breakdowns.
This announcement comes on the heels of Apple’s AirTag introduction. These coin-size trackers can help locate lost items almost anywhere, because they use the company’s Find My network. Each AirTag sends out a low-powered wireless signal, which can be received by the iPhones, iPads and Macs in a given area.
Yes, perfect strangers are using slivers of our bandwidth, as our devices send out and listen to little chirrups of radio chatter that don’t pertain to us. And you’re now able to leverage the radios and internet connection of countless devices owned by other people, too.
Users can opt out of these systems, but the tech giants are betting that for the most part we won’t, because of the benefits that these new networks will provide—from finding our lost possessions, pets and loved ones to remotely controlling our smart locks, security systems and lights.
“What we’re seeing now is the battle of the mesh networks,” says Ben Wood, chief analyst at CCS Insight, a tech industry consultancy. “The use cases of these networks are limited only by customers’ imaginations.”
Apple’s new AirTags can be located through the Find My network, which is powered by nearby iPhones, iPads and Macs. PHOTO: JAE C. HONG/ASSOCIATED PRESS
How Apple’s Find My Works
If you could see an AirTag communicating with the radio waves that enable the Find My network, it would look like millisecond-bursts of light shooting out every few minutes, pinging every possible iPhone, iPad and Mac in range—about 30 feet. This regular burst of chatter is key: If you lose your AirTag-attached keys in a park while on vacation in Sydney, for example, many strangers’ iPhones would exchange an incredibly tiny amount of encrypted data with that AirTag as they passed by. When you open the Find My app to look for the keys, that data will have made its way to you, even if you’re already back in the U.S.
That tiny amount of data is why the Find My network, which began helping locate iPhones back in 2011, doesn’t kill our batteries or bloat our cellphone bills, says Fadel Adib, an associate professor of electrical engineering and computer science at MIT.
When a phone is transmitting and receiving data over a cellular network, it might use around a watt of power, while transmitting via Bluetooth Low Energy to an AirTag might require 1/100th of that power. “So communicating with these tags is just a tiny, tiny fraction of what the phone is normally drawing,” he adds.
When your own iPhone (iPhone 11 or newer) is in proximity to an AirTag, the two can also communicate via ultra-wideband radio frequency using Apple’s custom-designed U1 chip. This allows you to pinpoint the AirTag’s location—but isn’t used to find other people’s stuff.
Apple has said its Find My network is secure, and uses end-to-end encryption. The company has also said the Find My network is judicious with personal data: The company has taken steps to make it difficult to, say, use an AirTag dropped in a stranger’s bag to track that person as they go about their day, by for example alerting them on their iPhone that an unfamiliar AirTag is nearby.
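The flow described above amounts to a crowdsourced relay: the tag broadcasts a rotating identifier, any passing iPhone reports a sighting under that identifier, and only the owner can re-derive the identifiers to look the sightings up. The sketch below is a heavily simplified stdlib-only model; Apple's real protocol uses rotating elliptic-curve public keys and encrypts the reported location end to end, not the hash-based stand-in used here:

```python
import hashlib

# --- Tag side: broadcast a rotating identifier derived from a secret shared
# only between the tag and its owner (stand-in for Apple's rotating keys).
owner_secret = b"owner-device-secret"   # illustrative value

def beacon_id(epoch: int) -> str:
    return hashlib.sha256(owner_secret + epoch.to_bytes(4, "big")).hexdigest()

# --- A stranger's iPhone: hears the beacon and uploads (id, location).
# It learns nothing about who owns the tag from the opaque identifier.
reports = {}  # server-side store, keyed by beacon id

def finder_report(heard_id: str, location: str) -> None:
    reports[heard_id] = location  # the real protocol encrypts the location too

# --- Owner side: re-derive the same identifiers and query the store.
finder_report(beacon_id(42), "Sydney park")
found = reports.get(beacon_id(42))
print(found)  # the sighting reaches the owner: "Sydney park"
```

Because the identifier rotates with the epoch, a passerby who logs beacons over time cannot link successive broadcasts back to one tag, which is the property that makes crowdsourced finding tolerable privacy-wise.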
A diagram provided by Amazon shows how Ring and Echo products talk to smart devices on the Sidewalk network. PHOTO: AMAZON
How Amazon’s Sidewalk Works
Amazon’s Sidewalk network is in many ways different from Apple’s Find My network, though it also uses encryption for security, says Manolo Arana, general manager of the Sidewalk project at Amazon. For one thing, the devices that power it stay put, so the network is not constantly in flux. And rather than just tracking the location and identity of things, Sidewalk can be used for nearly any kind of short two-way communication, he says.
Cities blanketed by the Sidewalk network could allow devices to function even when their main connection to the internet goes down, or is unavailable. Say your Ring smart security light is too far from your home’s Wi-Fi router, or maybe you just lose internet connectivity. If a neighbor’s Echo or Ring device is in range, your security light could still function by routing its tiny bits of traffic through that other connection.
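The failover behavior described here boils down to choosing the first reachable uplink from an ordered preference list. A minimal sketch, with entirely hypothetical device names and reachability flags (Amazon has not published Sidewalk's actual routing logic):

```python
# Sketch of Sidewalk-style uplink fallback: prefer the device's own Wi-Fi,
# then any in-range neighbor gateway. Names and flags are illustrative.
def pick_uplink(uplinks):
    """Return the name of the first reachable uplink, or None if offline."""
    for name, reachable in uplinks:
        if reachable:
            return name
    return None

candidates = [
    ("home-wifi", False),        # security light is out of router range
    ("neighbor-echo", True),     # a Sidewalk gateway next door
    ("neighbor-ring-cam", True), # a second gateway, further down the list
]
print(pick_uplink(candidates))  # falls back to "neighbor-echo"
```

The device stays functional as long as any one gateway in the neighborhood mesh is up, which is the resilience property the article describes.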
The Ring Video Doorbell Pro will use Bluetooth to connect Level smart locks to Amazon’s Sidewalk network. PHOTO: RING
Tile, which makes Bluetooth device-tracking tags that have been popular for years, is adding the ability to track them through Amazon’s Sidewalk network, which allows them to connect directly to Echo devices. Tile’s previous attempts to create its own network similar to Apple’s Find My have been hampered by, among other things, the way iPhones frequently ask Tile users for permission to track their location, says Tile chief executive CJ Prober.
Apple’s Find My Network uses short-range wireless Bluetooth signals to communicate with nearby Apple devices—as well as a recently announced handful of other products, such as Chipolo trackers and VanMoof bicycles. Amazon’s Sidewalk also uses Bluetooth, but is adding long-range wireless technology known as LoRa to the Sidewalk network via certain Echo and Ring devices.
LoRa systems have generally been used for enterprise applications, such as sensors and actuators in manufacturing and the energy industry. They can communicate small amounts of data across many miles when placed atop towers or even on satellites, using relatively little power. When placed inside a device like an Echo speaker, inside a home, the range can still be about a mile, says Marc Pegulu, vice president of LoRa at semiconductor-maker Semtech, one of Amazon’s technology partners.
Sidewalk’s long range will enable a new way to track people with dementia who may wander, says Adam Sobol, CEO of CareBand. CareBand’s smartwatch-style wrist monitor with built-in GPS can communicate via LoRa to Sidewalk-enabled Echo or Ring devices, allowing family members and caregivers to remotely monitor the whereabouts of a loved one. Mr. Sobol says that 90% of seniors with dementia who wander stay within a mile. Another advantage of Sidewalk is that users wouldn’t have to pay any cellular fees to use devices like the CareBand tracker.
Mr. Wood of CCS Insight says that embedding wireless networking into devices that are popular in their own right, such as Ring security cameras, is a good strategy. “If you look at where there’s already a high density of Ring devices, in Los Angeles especially, where Ring started, you could imagine a scenario in which this invisible Amazon Sidewalk network becomes this incredible asset to Amazon and a value-add to Prime subscribers,” he says.
How You Can Opt Out
People who own participating Echo and Ring devices can elect not to be part of the Sidewalk network, but the toggle to opt out is buried four menus deep in Amazon’s Alexa and Ring apps. And while you can opt out of Apple’s Find My network, you do it by turning off Find My. In other words, if you don’t help others find their misplaced devices, you risk losing yours.
Apple doesn’t say how much data Find My uses, though it’s likely to be minuscule. Amazon doesn’t say either, but there’s a cap to the amount of data it will send via your home’s internet connection: 80 kilobits per second, and 500 megabytes a month per household Amazon account. “That’s 10 minutes of YouTube video,” says Mr. Arana of Amazon.
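Those two caps together imply how long a household's Sidewalk link could run flat-out before exhausting the monthly allowance; a quick check using only the 80 kbps and 500 MB figures Amazon reported above (decimal units assumed):

```python
rate_kbps = 80         # Sidewalk's per-connection rate cap, kilobits/second
monthly_cap_mb = 500   # per-household monthly data cap, megabytes

cap_kilobits = monthly_cap_mb * 1000 * 8        # MB -> kilobits
seconds_at_full_rate = cap_kilobits / rate_kbps
print(round(seconds_at_full_rate / 3600, 1))    # hours of continuous max-rate use
```

At the full 80 kbps, the 500 MB cap would be reached in just under 14 hours of continuous use, underlining that Sidewalk is sized for tiny intermittent messages, not bulk traffic.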
Nearly ubiquitous Sidewalk connectivity in cities could provide even more functions for Amazon, says Mr. Wood. The company’s logistics arm could use it to track high-value deliveries using small Tile-style trackers, for instance. And while other companies have attempted widespread, device-based networks such as this, the sales volume and customer-approval ratings enjoyed by both Apple and Amazon provide a strategic advantage when building something that could arguably seem unworkable and a little creepy.
“People love Amazon, and it’s become an intrinsic part of many people’s lives, and particularly during the pandemic,” says Mr. Wood. In the same way, Apple has built a high level of trust with its customers, allowing it to roll out its Find My network over more than a decade with little pushback from users, he adds.
Aside from these vast networks becoming platforms in their own right for enabling new applications, they also serve as yet another way to keep customers locked into their creators’ ecosystems.
AirTags, in particular, are classic Apple: Who would switch to Android when it might mean suddenly not being able to find their keys? By the same token, Amazon’s Sidewalk network gives people one more reason to own an Echo—and to buy one for their aging loved ones.
Amazon and Apple Built Vast Wireless Networks Using Your Devices. Here’s How They Work. - WSJ
|From: Glenn Petersen||5/10/2021 8:58:23 AM|
|Pentagon Weighs Ending JEDI Cloud Project Amid Amazon Court Fight|
With $10 billion contract mired in litigation from Amazon, defense officials are reviewing other options to move forward
By John D. McKinnon
Wall Street Journal
May 10, 2021 5:30 am ET
WASHINGTON—Pentagon officials are considering pulling the plug on the star-crossed JEDI cloud-computing project, which has been mired in litigation from Amazon.com Inc. and faces continuing criticism from lawmakers.
The Joint Enterprise Defense Infrastructure contract was awarded to Microsoft Corp. in 2019 over Amazon, which has contested the award in court ever since.
A federal judge last month refused the Pentagon’s motion to dismiss much of Amazon’s case. A few days later, Deputy Defense Secretary Kathleen Hicks said the department would review the project.
“We’re going to have to assess where we are with regard to the ongoing litigation around JEDI and determine what the best path forward is for the department,” Ms. Hicks said at an April 30 security conference organized by the nonprofit Aspen Institute.
Her comments followed a Pentagon report to Congress, released before the latest court ruling, that said another Amazon win in court could significantly draw out the timeline for the program’s implementation.
“The prospect of such a lengthy litigation process might bring the future of the JEDI Cloud procurement into question,” the Jan. 28 report said.
Ms. Hicks and other Pentagon officials say there is a pressing need to implement a cloud program that serves most of its branches and departments. The JEDI contract, valued at up to $10 billion over 10 years, aims to allow the Pentagon to consolidate its current patchwork of data systems, give defense personnel better access to real-time information and put the Defense Department on a stronger footing to develop artificial-intelligence capabilities that are seen as vital in the future.
Some lawmakers and government-contracting experts say JEDI should be scuttled because its single-vendor, winner-take-all approach is inappropriate and outmoded for mammoth enterprises like the Department of Defense.
These people say the Pentagon should move to an increasingly popular approach to enterprise cloud-computing that includes multiple companies as participants. Spreading out the work also reduces the risk of legal challenges from excluded companies, they say.
Rep. Steve Womack (R., Ark.) called on the Pentagon last week to start fresh with a new contract-bidding process that would “enable best-in-class capability by prioritizing the ongoing competition that a cloud environment can promote.”
Should the Pentagon scuttle JEDI, the government could seek to patch together a new cloud program by expanding several existing Defense Department information-technology contracts, said John Weiler, a longtime JEDI critic who is executive director of the IT Acquisition Advisory Council, a public-private consortium that advises government and industry on tech-procurement best practices.
Microsoft has acknowledged the problems created by the delays but said it is ready to continue the project.
“We agree with the U.S. [government] that prolonged litigation is harmful and has delayed getting this technology to our military service members who need it,” the company said. “We stand ready to support the Defense Department to deliver on JEDI and other mission critical DoD projects.”
Amazon declined to comment for this article. The company has contended in court that then-President Donald Trump exerted improper pressure on the Pentagon to keep the contract from going to Amazon because it is led by Jeff Bezos.
Mr. Trump has blamed Mr. Bezos for what he viewed as unfavorable coverage of his administration in the Washington Post, which Mr. Bezos bought in 2013 for $250 million. The Post says its editorial decisions are independent.
At the time, the Trump White House referred questions to the Pentagon, which denied that Mr. Trump or administration officials had any impact on the selection process.
Before the latest court fight, Oracle Corp. —one of the original bidders—had sued to halt the contract awarding process. Its 2019 lawsuit claimed that an Amazon employee who worked for the Pentagon in 2016 and 2017 helped steer the procurement process to favor Amazon, which then hired him back.
A judge subsequently rejected those allegations, allowing the bidding process to move forward.
Amazon has maintained that it got no favorable treatment from the Pentagon at any point, but the issue resurfaced last week, with Sen. Mike Lee (R., Utah) and Rep. Ken Buck (R., Colo.) sending a letter requesting a Justice Department investigation into alleged conflicts by that employee and others.
Last month, Sen. Chuck Grassley (R., Iowa) wrote a letter to Pentagon officials raising concerns about the agency’s oversight of the project and seeking more details about alleged conflicts of interest and possible improprieties, which some critics and rival companies say might have skewed the initial procurement steps in Amazon’s favor.
Several of the concerns raised in both letters had been reviewed previously. A federal judge in 2019 concluded that the former Amazon employee “did not taint” the program.
A Pentagon inspector general report last year determined that the Pentagon adviser didn’t violate any ethical obligations or give preferential treatment to Amazon.
Steven Schooner, a George Washington University law professor who specializes in government contracting, said early questions about the Pentagon’s underlying procurement strategy for JEDI have grown over time.
“And all of that is before this case became one of the most jaw-dropping, head-scratching collections of conflicts of interest imaginable,” he said.
Pentagon Weighs Ending JEDI Cloud Project Amid Amazon Court Fight - WSJ
|From: Glenn Petersen||5/13/2021 12:33:26 PM|
|Google wins cloud deal from Elon Musk’s SpaceX for Starlink internet connectivity|
PUBLISHED THU, MAY 13 2021 8:30 AM EDT
Jordan Novet @JORDANNOVET
-- Google announced that its cloud unit has won a deal to supply computing and networking resources to Elon Musk’s SpaceX to help deliver internet service through the latter’s Starlink satellites.
-- The Starlink satellite internet will rely on Google’s private fiber-optic network to quickly make connections to cloud services as part of a deal that could last seven years.
Google announced on Thursday its cloud unit has won a deal to supply computing and networking resources to SpaceX, Elon Musk’s privately held space-development company, to help deliver internet service through its Starlink satellites.
SpaceX will install ground stations at Google data centers that connect to SpaceX’s Starlink satellites, with an eye toward providing fast internet service to enterprises in the second half of this year.
The deal represents a victory for Google as it works to take share from Amazon and Microsoft in the fast-growing cloud computing market.
Investors are counting on Google’s nascent cloud business to boost growth in the event that its advertising business slows down. While Google’s cloud business delivered only 7% of parent company Alphabet’s total revenue in the first quarter, it grew almost 46% year over year, compared with growth of 32% for Google’s advertising services.
It’s also an unusual type of deal for Google -- or any other cloud provider -- as it relies heavily on Google’s internal network that connects data centers, rather than simply outsourcing functions like computing power or data storage to these data centers.
“This is one of a kind. I don’t believe something like this has been done before,” said Bikash Koley, Google’s head of global networking. “The real potential of this technology became very obvious. The power of combining cloud with universal secure connectivity, it’s a very powerful combination.”
“They chose us because of the quality of our network and the distribution and reach of our network,” said Thomas Kurian, CEO of Google’s cloud group.
Amazon popularized the public cloud business with the launch in 2006 of general-purpose computing and storage tools from its Amazon Web Services division. Google introduced its own computing service in 2012. But over the last two decades, Google has also spent money assembling a private fiber-optic network to connect its data centers, Koley said. While much of Google’s cloud growth has come from taking care of computing and storage needs for clients such as Goldman Sachs and Snap, the SpaceX deal will draw heavily on Google’s networking capabilities.
Cloud providers have increasingly focused on the telecommunications industry, particularly with the ascent of 5G connectivity. Last month, for example, Amazon said Dish would use AWS infrastructure to deliver 5G service to consumers.
In SpaceX’s case, there is no need for cell towers. Instead, customers’ devices will communicate to satellites, and then the satellites will link up to Google data centers. Inside those data centers, customers can run applications quickly using Google’s cloud services, or they can send the information on to other companies’ services that are geographically nearby, enabling low latency so there’s minimal lag. Data then comes right back through the Google data centers to satellites, and then down to end users.
The deal could last seven years, according to a person who declined to be named discussing confidential terms.
Starlink’s service might be valuable for consumers living in places with limited internet access, as well as businesses and government organizations running projects in remote areas, Kurian said. He anticipates that having Starlink draw on Google’s cloud network will lead organizations to deploy applications inside Google’s cloud to take advantage of high speeds.
Google is not the only cloud provider to be working with Starlink. In October, Microsoft said it was working with SpaceX to bring Starlink internet connectivity to modular Azure cloud data centers that customers can deploy anywhere. SpaceX would still rely on Google data centers in that scenario, a person familiar with the matter said. (Data would travel from the customer’s Azure modular data center through the Starlink satellite to Google’s data center and then out to other cloud services — and return in the opposite direction.)
Initially SpaceX will deploy the ground stations at Google data centers in the U.S., but the company wants to expand internationally, the person said.
SpaceX is one of the world’s most valuable privately held start-ups, having raised money at a $74 billion valuation in February, CNBC reported. Google invested $900 million in SpaceX in 2015. SpaceX has launched over 1,500 Starlink satellites into orbit, and last week the company said more than 500,000 people have ordered or made a deposit for the internet service.
Google Cloud wins SpaceX deal for Starlink internet connectivity (cnbc.com)
|From: Glenn Petersen||6/9/2021 1:15:28 PM|
|The Cost of Cloud, a Trillion Dollar Paradox|
by Sarah Wang and Martin Casado
There is no doubt that the cloud is one of the most significant platform shifts in the history of computing. Not only has cloud already impacted hundreds of billions of dollars of IT spend, it’s still in early innings and growing rapidly on a base of over $100B of annual public cloud spend. This shift is driven by an incredibly powerful value proposition — infrastructure available immediately, at exactly the scale needed by the business — driving efficiencies both in operations and economics. The cloud also helps cultivate innovation as company resources are freed up to focus on new products and growth.
source: Synergy Research Group
However, as industry experience with the cloud matures — and we see a more complete picture of the cloud lifecycle’s impact on a company’s economics — it’s becoming evident that while cloud clearly delivers on its promise early in a company’s journey, the pressure it puts on margins can start to outweigh the benefits as the company scales and growth slows. Because this shift happens later in a company’s life, it is difficult to reverse: it’s the result of years of development focused on new features rather than infrastructure optimization. Hence the rewrite or significant restructuring needed to dramatically improve efficiency can take years, and is often considered a non-starter.
Now, there is a growing awareness of the long-term cost implications of cloud. As the cost of cloud starts to contribute significantly to the total cost of revenue (COR) or cost of goods sold (COGS), some companies have taken the dramatic step of “repatriating” the majority of workloads (as in the example of Dropbox) or in other cases adopting a hybrid approach (as with CrowdStrike and Zscaler). Those who have done this have reported significant cost savings: In 2017, Dropbox detailed in its S-1 a whopping $75M in cumulative savings over the two years prior to IPO due to their infrastructure optimization overhaul, the majority of which entailed repatriating workloads from public cloud.
Yet most companies find it hard to justify moving workloads off the cloud, given the sheer magnitude of such efforts and, quite frankly, the dominant, somewhat singular industry narrative that “cloud is great”. (It is, but we need to consider the broader impact, too.) But when evaluated relative to the scale of potentially lost market capitalization — which we present in this post — the calculus changes. As growth (often) slows with scale, near-term efficiency becomes an increasingly key determinant of value in public markets. The excess cost of cloud weighs heavily on market cap by driving lower profit margins.
The point of this post isn’t to argue for repatriation, though; that’s an incredibly complex decision with broad implications that vary company by company. Rather, we take an initial step in understanding just how much market cap is being suppressed by the cloud, so we can help inform the decision-making framework on managing infrastructure as companies scale.
To frame the discussion: We estimate the recaptured savings in the extreme case of full repatriation, and use public data to pencil out the impact on share price. We show (using relatively conservative assumptions!) that across 50 of the top public software companies currently utilizing cloud infrastructure, an estimated $100B of market value is being lost among them due to cloud impact on margins — relative to running the infrastructure themselves. And while we focus on software companies in our analysis, the impact of the cloud is by no means limited to software. Extending this analysis to the broader universe of scale public companies that stands to benefit from related savings, we estimate that the total impact is potentially greater than $500B.
Our analysis highlights how much value can be gained through cloud optimization — whether through system design and implementation, re-architecture, third-party cloud efficiency solutions, or moving workloads to special purpose hardware. This is a very counterintuitive assumption in the industry given prevailing narratives around cloud vs. on-prem. However, it’s clear that when you factor in the impact to market cap in addition to near term savings, scaling companies can justify nearly any level of work that will help keep cloud costs low.
Unit economics of cloud repatriation: The case of Dropbox, and beyond
To dimensionalize the cost of cloud, and understand the magnitude of potential savings from optimization, let’s start with a more extreme case of large scale cloud repatriation: Dropbox. When the company embarked on its infrastructure optimization initiative in 2016, they saved nearly $75M over two years by shifting the majority of their workloads from public cloud to “lower cost, custom-built infrastructure in co-location facilities” directly leased and operated by Dropbox. Dropbox gross margins increased from 33% to 67% from 2015 to 2017, which they noted was “primarily due to our Infrastructure Optimization and an… increase in our revenue during the period.”
source: Dropbox S-1 filed February 2018
But that’s just Dropbox. So to help generalize the potential savings from cloud repatriation to a broader set of companies, Thomas Dullien, former Google engineer and co-founder of cloud computing optimization company Optimyze, estimates that repatriating $100M of annual public cloud spend can translate to roughly half that amount in all-in annual total cost of ownership (TCO) — from server racks, real estate, and cooling to network and engineering costs.
The exact savings obviously vary by company, but several experts we spoke to converged on this “formula”: Repatriation results in one-third to one-half the cost of running equivalent workloads in the cloud. Furthermore, a director of engineering at a large consumer internet company found that public cloud list prices can be 10 to 12x the cost of running one’s own data centers. Discounts driven by use-commitments and volume are common in the industry, and can bring this multiple down to single digits, since cloud compute typically drops by ~30-50% with committed use. But AWS still operates at a roughly 30% blended operating margin net of these discounts and an aggressive R&D budget — implying that potential company savings due to repatriation are larger. The performance lift from managing one’s own hardware may drive even further gains.
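To make the rule of thumb above concrete, here is a minimal sketch that pencils out the practitioners’ “formula”. Every input is an illustrative assumption taken from the figures quoted in this article, not measured data:

```python
# Illustrative only: the practitioners' rule of thumb quoted above,
# with all inputs assumed from the article rather than measured.

def repatriation_savings(annual_cloud_spend, tco_factor=0.5):
    """Annual savings after repatriation, where tco_factor is the all-in
    on-prem TCO (servers, real estate, cooling, network, engineering) as
    a fraction of prior cloud spend. Experts quoted above put this
    between one-third and one-half."""
    on_prem_tco = annual_cloud_spend * tco_factor
    return annual_cloud_spend - on_prem_tco

# Repatriating $100M/year of public cloud spend at the midpoint factor:
savings = repatriation_savings(100e6)
print(f"~${savings / 1e6:.0f}M recovered per year")  # ~$50M
```

At the more aggressive one-third factor, the same $100M of cloud spend would recover roughly $67M per year.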
Across all our conversations with diverse practitioners, the pattern has been remarkably consistent: If you’re operating at scale, the cost of cloud can at least double your infrastructure bill.
The true cost of cloud
When you consider the sheer magnitude of cloud spend as a percentage of the total cost of revenue (COR), 50% savings from cloud repatriation is particularly meaningful. Based on benchmarking public software companies (those that disclose their committed cloud infrastructure spend), we found that contractually committed spend averaged 50% of COR.
Actual spend as a percentage of COR is typically even higher than committed spend: A billion dollar private software company told us that their public cloud spend amounted to 81% of COR, and that “cloud spend ranging from 75 to 80% of cost of revenue was common among software companies”. Dullien observed (from his time at both industry leader Google and now Optimyze) that companies are often conservative when sizing cloud commitments, due to fears of being overcommitted on spend, so they commit to only their baseline loads. So, as a rule of thumb, committed spend is typically ~20% lower than actual spend… elasticity cuts both ways. Some companies we spoke with reported that they exceeded their committed cloud spend forecast by at least 2X.
If we extrapolate these benchmarks across the broader universe of software companies that utilize some public cloud for infrastructure, our back-of-the-envelope estimate is that the cloud bill reaches $8B in aggregate for 50 of the top publicly traded software companies (that reveal some degree of cloud spend in their annual filings). While some of these companies take a hybrid approach — public cloud and on-premise (which means cloud spend may be a lower percentage of COR relative to our benchmarks) — our analysis balances this, by assuming that committed spend equals actual spend across the board. Drawing from our conversations with experts, we assume that cloud repatriation drives a 50% reduction in cloud spend, resulting in total savings of $4B in recovered profit. For the broader universe of scale public software and consumer internet companies utilizing cloud infrastructure, this number is likely much higher.
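The aggregate back-of-the-envelope arithmetic above can be penciled out directly. This sketch uses only the article’s own stated assumptions (committed spend treated as actual spend, and a 50% spend reduction from repatriation):

```python
# Back-of-the-envelope aggregate for the 50 top public software
# companies analyzed above. Both inputs are the article's estimates.

top50_cloud_bill = 8e9        # estimated aggregate annual cloud bill
repatriation_reduction = 0.5  # experts' consensus: spend roughly halves

recovered_profit = top50_cloud_bill * repatriation_reduction
print(f"${recovered_profit / 1e9:.0f}B in recovered annual profit")  # $4B
```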
source: company S-1 and 10K filings; a16z analysis
While $4B of estimated net savings is staggering on its own, this number becomes even more eye-opening when translated to unlocked market capitalization. Since all companies are conceptually valued as the present value of their future cash flows, realizing these aggregate annual net savings results in market capitalization creation well over that $4B.
How much more? One rough proxy is to look at how the public markets value additional gross profit dollars: High-growth software companies that are still burning cash are often valued on gross profit multiples, which reflect assumptions about the company’s long term growth and profit margin structure. (Commonly referenced revenue multiples also reflect a company’s long term profit margin, which is why they tend to increase for higher gross margin businesses even on a growth rate-adjusted basis). Both capitalization multiples, however, serve as a heuristic for estimating the market discounting of a company’s future cash flows.
Among the set of 50 public software companies we analyzed, the average total enterprise value to 2021E gross profit multiple (based on CapIQ at time of publishing) is 24-25X. In other words: For every dollar of gross profit saved, market caps rise on average by 24-25X the net cost savings from cloud repatriation. (Assumes savings are expressed net of depreciation costs incurred from incremental CapEx if relevant).
This means an additional $4B of gross profit can be estimated to yield an additional $100B of market capitalization among these 50 companies alone. Moreover, since using a gross profit multiple (vs. a free cash flow multiple) assumes that incremental gross profit dollars are also associated with certain incremental operating expenditures, this approach may underestimate the impact to market capitalization from the $4B of annual net savings.
For a given company, the impact may be even higher depending on its specific valuation. To illustrate this phenomenon [please note this is not investment advice, see full disclosures below and at a16z.com], take the example of infrastructure monitoring as a service company Datadog. The company traded at close to 40X 2021 estimated gross profit at time of publishing, and disclosed an aggregate $225M 3-year commitment to AWS in their S-1. If we annualize committed spend to $75M of annual AWS costs — and assume 50% or $37.5M of this may be recovered via cloud repatriation — this translates to roughly $1.5B of market capitalization for the company on committed spend reductions alone!
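The Datadog illustration above can be reproduced step by step. (Again, this is not investment advice; every input here is an assumption stated in the article.)

```python
# Reproduces the article's back-of-the-envelope Datadog example.
committed_aws_spend = 225e6              # 3-year commitment from the S-1
annual_spend = committed_aws_spend / 3   # ~$75M per year
recoverable = annual_spend * 0.5         # assume 50% recovered via repatriation
gross_profit_multiple = 40               # ~40x 2021E gross profit at publication

market_cap_unlock = recoverable * gross_profit_multiple
print(f"~${market_cap_unlock / 1e9:.1f}B of market capitalization")  # ~$1.5B
```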
While back-of-the-envelope analyses like these are never perfect, the directional findings are clear: market capitalizations of scale public software companies are weighed down by cloud costs, and by hundreds of billions of dollars. If we expand to the broader universe of enterprise software and consumer internet companies, this number is likely over $500B — assuming 50% of overall cloud spend is consumed by scale technology companies that stand to benefit from cloud repatriation.
For business leaders, industry analysts, and builders, it’s simply too expensive to ignore the impact on market cap when making both long-term and even near-term infrastructure decisions.
source: CapIQ as of May 2021; note: charts herein are for informational purposes only and should not be relied upon when making any investment decision
The paradox of cloud
Where do we go from here? On one hand, it is a major decision to start moving workloads off of the cloud. For those who have not planned in advance, the necessary rewriting seems so impractical as to be impossible; any such undertaking requires a strong infrastructure team that may not be in place. And all of this requires building expertise beyond one’s core, which is not only distracting, but can itself detract from growth. Even at scale, the cloud retains many of its benefits — such as on-demand capacity, and hordes of existing services to support new projects and new geographies.
But on the other hand, we have the phenomenon we’ve outlined in this post, where the cost of cloud “takes over” at some point, locking up hundreds of billions of market cap that are now stuck in this paradox: You’re crazy if you don’t start in the cloud; you’re crazy if you stay on it.
So what can companies do to free themselves from this paradox? As mentioned, we’re not making a case for repatriation one way or the other; rather, we’re pointing out that infrastructure spend should be a first-class metric. What do we mean by this? That companies need to optimize early, often, and, sometimes, also outside the cloud. When you’re building a company at scale, there’s little room for religious dogma.
While there’s much more to say on the mindset shifts and best practices here — especially as the full picture has only more recently emerged — here are a few considerations that may help companies grapple with the ballooning cost of cloud.
Cloud spend as a KPI. Part of making infrastructure a first-class metric is making sure it is a key performance indicator for the business. Take for example Spotify’s Cost Insights, a homegrown tool that tracks cloud spend and enables engineers, not just finance teams, to take ownership of it. Ben Schaechter, formerly at DigitalOcean and now co-founder and CEO of Vantage, observed that companies across the industry are looking at cloud cost metrics alongside core performance and reliability metrics earlier in the lifecycle of their business, and that “Developers who have been burned by surprise cloud bills are becoming more savvy and expect more rigor with their team’s approach to cloud spend.”
Incentivize the right behaviors. Empowering engineers with data from first-class KPIs for infrastructure takes care of awareness, but doesn’t take care of incentives to change the way things are done. A prominent industry CTO told us that at one of his companies, they put in short-term incentives like those used in sales (SPIFFs), so that any engineer who saved a certain amount of cloud spend by optimizing or shutting down workloads received a spot bonus (which still had a high company ROI since the savings were recurring). He added that this approach — basically, “tie the pain directly to the folks who can fix the problem” — actually cost them less, because it paid off 10% of the entire organization, and brought down overall spend by $3M in just six months. Notably, the company CFO was key to endorsing this non-traditional model.
Optimization, optimization, optimization. When evaluating the value of any business, one of the most important factors is the cost of goods sold or COGS — and for every dollar that a business makes, how many dollars does it cost to deliver? Customer data platform company Segment recently shared how they reduced infrastructure costs by 30% (while simultaneously increasing traffic volume by 25% over the same period) through incremental optimization of their infrastructure decisions. There are a number of third-party optimization tools that can provide quick gains to existing systems, ranging anywhere from 10-40% in our experience observing this space.
Think about repatriation up front. Just because the cloud paradox exists — where cloud is cheaper and better early on and more costly later in a company’s evolution — doesn’t mean a company has to passively accept it without planning for it. Make sure your system architects are aware of the potential for repatriation early on, because by the time cloud costs start to catch up to or even outpace revenue growth, it’s too late. Even modest, more modular architectural investment early on — including architecting to be able to move workloads to the optimal location and not get locked in — reduces the work needed to repatriate workloads in the future. The popularity of Kubernetes and the containerization of software, which makes workloads more portable, was in part a reaction to companies not wanting to be locked into a specific cloud.
Incrementally repatriate. There’s also no reason that repatriation (if that’s indeed the right move for your business), can’t be done incrementally, and in a hybrid fashion. We need more nuance here beyond either/or discussions: for example, repatriation likely only makes sense for a subset of the most resource-intensive workloads. It doesn’t have to be all or nothing! In fact, of the many companies we spoke with, even the most aggressive take-back-their-workloads ones still retained 10 to 30% or more in the cloud.
While these recommendations are focused on SaaS companies, there are also other things one can do; for instance, if you’re an infrastructure vendor, you may want to consider options for passing through costs — like using the customer’s cloud credits — so that the cost stays off your books. The entire ecosystem needs to be thinking about the cost of cloud.
* * *
How the industry got here is easy to understand: The cloud is the perfect platform to optimize for innovation, agility, and growth. And in an industry fueled by private capital, margins are often a secondary concern. That’s why new projects tend to start in the cloud, as companies prioritize velocity of feature development over efficiency.
But now we know better: the long-term implications were less well understood — ironic, given that over 60% of companies cite cost savings as the very reason to move to the cloud in the first place! For a new startup or a new project, the cloud is the obvious choice. And it is certainly worth paying even a moderate “flexibility tax” for the nimbleness the cloud provides.
The problem is, for large companies — including startups as they reach scale — that tax equates to hundreds of billions of dollars of equity value in many cases… and is levied well after the companies have already deeply committed themselves to the cloud (and are often too entrenched to extricate themselves). Interestingly, one of the most commonly cited reasons to move to the cloud early on — a large up-front capital outlay (CapEx) — is no longer required for repatriation. Over the last few years, alternatives to public cloud infrastructure have evolved significantly and can be built, deployed, and managed entirely via operating expenses (OpEx) instead of capital expenditures.
Note too that as large as some of the numbers we shared here seem, we were actually conservative in our assumptions. Actual spend is often higher than committed, and we didn’t account for overages-based elastic pricing. The actual drag on industry-wide market caps is likely far higher than penciled.
Will the 30% margins currently enjoyed by cloud providers eventually narrow through competition and change the magnitude of the problem? Unlikely, given that the majority of cloud spend is currently directed toward an oligopoly of three companies. And here’s a bit of dramatic irony: Part of the reason Amazon, Google, and Microsoft — representing a combined ~$5 trillion market cap — are all buffered from competition is that they have high profit margins driven in part by running their own infrastructure, enabling ever greater reinvestment into product and talent while buoying their own share prices.
And so, with hundreds of billions of dollars in the balance, this paradox will likely resolve one way or the other: either the public clouds will start to give up margin, or, they’ll start to give up workloads. Whatever the scenario, perhaps the largest opportunity in infrastructure right now is sitting somewhere between cloud hardware and the unoptimized code running on it.
Acknowledgements: We’d like to thank everyone who spoke with us for this article (including those named above), sharing their insights from the frontlines.
Companies selected disclosed some degree of public cloud infrastructure utilization in their 10-Ks.
The views expressed here are those of the individual AH Capital Management, L.L.C. (“a16z”) personnel quoted and are not the views of a16z or its affiliates. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by a16z. While taken from sources believed to be reliable, a16z has not independently verified such information and makes no representations about the enduring accuracy of the information or its appropriateness for a given situation. In addition, this content may include third-party advertisements; a16z has not reviewed such advertisements and does not endorse any advertising content contained therein.
This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. You should consult your own advisers as to those matters. References to any securities or digital assets are for illustrative purposes only, and do not constitute an investment recommendation or offer to provide investment advisory services. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors, and may not under any circumstances be relied upon when making a decision to invest in any fund managed by a16z. (An offering to invest in an a16z fund will be made only by the private placement memorandum, subscription agreement, and other relevant documentation of any such fund and should be read in their entirety.) Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by a16z, and there can be no assurance that the investments will be profitable or that other investments made in the future will have similar characteristics or results. A list of investments made by funds managed by Andreessen Horowitz (excluding investments for which the issuer has not provided permission for a16z to disclose publicly as well as unannounced investments in publicly traded digital assets) is available at a16z.com.
Charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision. Past performance is not indicative of future results. The content speaks only as of the date indicated. Any projections, estimates, forecasts, targets, prospects, and/or opinions expressed in these materials are subject to change without notice and may differ or be contrary to opinions expressed by others. Please see a16z.com for additional important information.
The Cost of Cloud, a Trillion Dollar Paradox - Andreessen Horowitz (a16z.com)
|From: Glenn Petersen||6/11/2021 10:38:18 AM|
|Kill the console|
June 11, 2021
Few business models are more lucrative than recurring monthly subscriptions, and yet the video game industry has resisted them for decades. Since 2017, Microsoft has harbored ambitions to change that with Xbox Game Pass, an all-you-can-play software library launched that year that costs between $10 and $15 a month and features more than 300 games.
Game Pass wasn't the first subscription service in gaming; we've had individual game subscriptions since the '90s, and newer models like seasonal battle passes have popularized recurring in-app purchases. But Game Pass has become the most ambitious model to date and the closest to a true "Netflix for gaming." And it's about to become a whole lot more accessible.
Microsoft announced major expansion plans for Xbox Game Pass yesterday that promise to bring that all-you-can-play game library to many more screens.
Microsoft says it's working with device manufacturers to build Game Pass directly into smart TVs, so the subscription service can be accessible with just a controller, no extra hardware required.
Even more ambitious are the company's plans to build its own streaming devices, perhaps similar to Google's Chromecast or the Apple TV, that could also enable access to Game Pass with just a controller.
The cloud is the center of this vision, in particular Microsoft's existing yet still-in-beta cloud gaming platform previously known as xCloud. Less powerful hardware, like streaming set-top boxes and smart TVs, would require the cloud to stream console-quality games.
Now called simply Xbox Cloud Gaming, Microsoft's cloud gaming platform is available on Android phones, the iPhone and iPad, and web browsers to select subscribers of Microsoft's $15 Game Pass Ultimate plan.
Microsoft is planning to expand access to the web browser version of Game Pass to all Ultimate plan customers in the coming weeks. Cloud gaming is also launching in new countries later this year, including Australia, Brazil, Mexico and Japan.
Microsoft says it's updating the data centers that power the cloud gaming service with new Xbox Series X hardware to improve performance. It's also launching a version of the service within the Xbox app on PC and on its consoles to make it easier to try games instantly before a player decides either to download or buy them.
Microsoft could fundamentally change the gaming market with an Xbox business oriented around Game Pass, ushering in the kind of paradigm shift that subscription streaming services like Netflix and Spotify brought to the TV and music industries. But only if Microsoft succeeds in convincing gamers to come aboard.
Xbox games could conceivably become accessible from almost any screen, anywhere, at any time. The approach, and a shift in its business model, could also reduce Microsoft's need to compete directly against Sony's PlayStation or the Nintendo Switch.
Microsoft hopes its position in the cloud computing, gaming and desktop operating system sectors makes it uniquely capable of delivering a streaming subscription platform and product experience of this scale where others, like Google Stadia, have struggled.
A key selling point of Game Pass will be access to Microsoft's first-party games, including Halo: Infinite and new games from its Bethesda subsidiary, the day they release, for no extra cost. Microsoft is expected to reveal more about its upcoming releases Sunday during its E3 2021 showcase.
Microsoft doesn't have an easy road ahead of it. It's not clear what the long-term effects of its subscription push will mean for funding blockbuster games, and whether the economics of the industry can sustain such a model without widespread adoption from other publishers. It's also not a given that most consumers even want a game industry dominated by yet more subscription services. But Microsoft is gambling the future of Xbox on this vision of the future, and on ensuring it gets there first.
— Nick Statt
Microsoft's cloud vision might just change gaming forever - Protocol — The people, power and politics of tech
|From: Glenn Petersen||6/17/2021 3:28:58 PM|
|Six predictions for the future of the edge|
The edge is going to change the way we interact with each other and the world.
June 17, 2021
Thanks to edge computing, the world is about to look very different.
Within a decade, the edge will boast more computing power — and produce far more data — than the cloud does today, said Lin Nease, HPE fellow and chief technologist for IoT.
The edge is where IoT, artificial intelligence and ultra-fast 5G networks are converging.
And that will change our lives dramatically over the next few years. Here are six major trends we can expect to see over the coming decade.
1. Every large facility will have its own edge data center
By the year 2025, the number of connected devices in the wild is expected to exceed 56 billion, according to IDC. And as IoT sensors proliferate through public and private spaces, the volume of data they produce will grow exponentially.
Organizations will need to process that data locally so they can act on it in real time, Nease said. Instead of shuttling that data to the cloud, they'll be operating their own mini data centers on site. Modular, self-contained units like the HPE Edge Center, which can be placed wherever needed, will become common across a wide range of industries.
"Within three years, all operations facilities, including retail stores, hospitals, warehouses — any place the physical operations of the company occur — will have data room capabilities to ingest data from cameras, microphones and environmental sensors," said Nease. "These little edge clouds will resemble the private clouds companies already have in their big data centers, but with less than a dozen servers."
And instead of deploying increasingly intelligent end devices, many organizations will opt for inexpensive sensors that simply collect data, which is then processed locally in their small, private clouds.
"The low-cost approach is to buy cheap sensors and have software running somewhere in the facility that can do pattern recognition or infrared sensing," Nease said. "If you're a large retailer, for example, you will do this in your store in about three years. You will have to if you want to compete."
2. Smarter spaces will enable frictionless transactions
Data captured by edge devices in public spaces will help provide context around who people are and what they came there to do. That will change both how these spaces are designed and how people interact with them, said Partha Narasimhan, CTO at Aruba, a Hewlett Packard Enterprise company.
"Today, some of the context is split between the digital world and the physical world," he said. "Those two will absolutely come together."
For example, when you walk into your bank, smart cameras can identify you as a regular customer and make an intelligent guess as to why you came in, said Narasimhan. If you were researching interest rates on the bank's website the night before, a loan officer could be expecting you, with your financial records already called up on screen.
Similarly, when you step into your doctor's waiting room, edge systems can automatically check you in, pull up your electronic medical record, take your co-payment and alert the nurse you've arrived.
"As you journey through your course of care, hospitals will know who you are, where you are and what assets, like wheelchairs or X-ray machines, you are using," noted Christian Renaud, research director for IoT at 451 Research. "They'll have lower operational costs, you'll have a faster diagnostic experience and clinicians will get to spend more time with patients and less time entering data into electronic health records."
And, as we gradually return to the workplace, edge systems in the building will know that we're there primarily to collaborate with colleagues. We may be assigned a different workspace each day, depending on the people we need to collaborate with that day. Offices will be redesigned with more meeting rooms and fewer cubicles, better teleconferencing gear, smarter whiteboards and more ways to intelligently capture the content of conversations.
"To some people, it may feel a little creepy that someone already knows who you are and why you're there," said Narasimhan. "But so long as you manage that data in a secure way that respects people's privacy, most of us will overcome that feeling because it's so convenient."
3. The robots will be watching us — and learning
The killer app for AI-powered cameras at the edge won't be security surveillance or autonomous vehicles, Nease says. Instead, it will be helping many of us become better at our jobs.
"People will be analyzing the physical environment of their operations using computer vision," he said. "But instead of employing it as a management tool, they'll be using it to redesign processes to figure out how to make them more efficient."
Companies like Drishti are already doing this on manual assembly lines for companies like Ford and Honeywell, using smart cameras to quantify processes, identify quality control issues and train employees more effectively.
A major luxury carmaker is studying how to use computer vision and robots to mimic the work of its craftspeople, noted Dr. Eng Lim Goh, CTO for high-performance computing and artificial intelligence at HPE. The goal is not to replace human workers but to enable highly customized, bespoke vehicles built to the exact requirements of each customer.
"These robots will not only learn from humans but also from each other," he said. "And if you have factories in different countries, they'll be able to share their learnings across borders."
4. Edge devices will begin to learn on their own
Today, edge devices infer decisions based on machine-learning models that have been trained in the cloud and then pushed down to the edge. But as the battery life of IoT devices improves and their computing power increases, these devices will begin to learn on their own, Goh says.
"The beauty of this is that the devices collect the data and then immediately learn from it," he added. "Imagine the latency reduction in your decision-making."
To avoid bias that can be introduced by relying on limited sets of training data, edge devices will be able to pool what they learn from many devices and sensors without pooling the data itself, a concept known as swarm learning.
Because devices share only the insights gleaned from the data, the data itself remains private and secure, Goh added.
For example, connected X-ray machines at a hospital that treats a lot of patients with tuberculosis can share insights with another location that sees more cases of pneumonia. The ability to analyze lung X-rays at both facilities improves, while no patient data is shared.
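The X-ray example can be sketched in code. This is a toy, federated-style illustration of the swarm learning idea under stated assumptions — the logistic-regression model, the two synthetic "hospital" datasets and the weight-averaging `merge` step are all mine, not HPE's actual protocol: each site trains only on its own records and contributes only model weights to the shared model, so no patient data ever leaves a site.

```python
# Toy sketch of swarm/federated-style learning (not HPE's actual
# implementation): sites share learned weights, never raw records.
import numpy as np

def local_update(w, X, y, lr=0.5, epochs=200):
    """One site's private training pass: logistic regression by
    gradient descent on its own local data only."""
    w = w.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))     # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)     # log-loss gradient step
    return w

def merge(weight_list):
    """Combine the sites' learnings by averaging their weights --
    the only thing that crosses site boundaries."""
    return np.mean(weight_list, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])          # shared underlying pattern

def make_site(n):
    # Synthetic stand-in for one hospital's local scans and labels.
    X = rng.normal(size=(n, 3))
    y = (X @ true_w > 0).astype(float)
    return X, y

sites = [make_site(200), make_site(200)]     # two hospitals
w = np.zeros(3)
for _ in range(5):                           # swarm rounds
    w = merge([local_update(w, X, y) for X, y in sites])

# Evaluate the shared model on fresh, held-out data.
Xt, yt = make_site(500)
acc = float(np.mean(((Xt @ w) > 0) == yt))
```

Both sites end up with a model better than either could train alone, while each site's raw data stays where it was collected.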
"We'll continue to see more robust compute and analytics get further and further out in the network until they approach the point of origination, whether that's a camera, a sensor, a drill press or an MRI machine," Renaud said.
5. Augmented and virtual reality will become actual realities
Today's AR and VR apps have proved how quickly nascent consumer technologies can deliver value in enterprise settings. As edge compute and connectivity continue to advance, experiences will become more immersive and the equipment less intrusive. That will push AR and VR into use cases that today are considered impractical.
With greater compute and storage capability at the edge, images and video can be cached locally and served up instantly via ultra-low latency 5G networks.
"The edge will enable new services like AR and VR that require content to be closer to where users are," said Narasimhan. "Virtual experiences will be enhanced to the point where they're easy and intuitive to consume. And as they get more refined, they will make it easier for people to meet virtually."
Beyond reducing the need for business travel, AR and VR will be deployed in a wide range of industrial and commercial settings to improve workflows, bridge expertise supply and demand imbalances, and transform customer experiences.
Factory workers can use AR glasses to view 3D schematics as they assemble parts. AR mirrors inside clothing stores will let customers digitally try on different outfits. We will see more widespread use of the technology by doctors to perform remote surgery a thousand miles away from the operating theater. Visitors to theme parks will use it to interact with life-size holograms of their favorite characters.
"AR will allow employees to play with different datasets and explore them from different perspectives," noted Ross Rubin, principal analyst at Reticle Research. "It will be useful for any application that frees people from having to look at a screen at a particular time and overlay information in a way that sparks insights that might not otherwise be available in that environment."
6. The big privacy issues will eventually be solved
The ability to deliver digital services to anyone in any location carries with it the ability to track everyone's behavior everywhere. Organizations that hope to realize the full benefits of edge computing will need to solve the privacy problem.
"We've done more than 50 consumer and enterprise surveys over the past five years, and privacy always comes up as one of the top concerns," Renaud said. "Hopefully, enlightened regulation will moderate the ambitious potential of the technology to get to a point where our privacy is safe."
In the European Union, for example, data sovereignty rules limit the physical locations where information can be stored, and some countries require connected cameras to automatically blur faces or license plate numbers.
Technologies like swarm learning, which allows insights to be shared without needing the underlying data, can help alleviate data sovereignty concerns. HPE and others are currently working on new data standards that would allow people greater control over how their data is used, Nease said.
"I think these standards will emerge and people will give permission for their data to be used, just as they do with Google every time they perform a search," he says. "But there is no stopping the privacy problem. It is now upon us."
Source: "Six predictions for the future of the edge," Protocol