
Technology Stocks: Investing in Exponential Growth


From: Paul H. Christiansen 12/15/2020 12:28:00 PM
1 Recommendation   of 1048
 
Select here for what investors can learn from C3.ai’s Go-to-Market Strategy



To: Paul H. Christiansen who wrote (966) 12/16/2020 12:14:31 PM
From: Frank Sully
   of 1048
 
I bought C3.ai at about $106. Currently trading at $114. To the Moon, Alice! Cheers, Sully

This month’s hottest IPO isn’t DoorDash or Airbnb — it’s artificial-intelligence company C3.ai
Tom Siebel’s company has an enormous market for democratizing artificial intelligence.

The most recent flurry of IPOs included two of this year’s most anticipated: DoorDash (DASH) and Airbnb (ABNB).

The companies certainly didn’t disappoint coming out the gate, especially if you were an early investor, as DoorDash and Airbnb soared 85% and 112%, respectively, on their opening day of trading. Pundits and analysts were left befuddled, and the prices of each have slipped in the meantime.

An initial public offering that was overlooked during that time was C3.ai (AI). Shares of the Redwood City, Calif., company sit at well over double the offering price.

C3.ai is the most interesting company that debuted last week. Its work over the past decade to democratize artificial intelligence (AI) for the enterprise has real promise, and there is evidence through its early partnerships and customer successes that it could lead to significant and stable growth. The company is led by CEO Tom Siebel, who had the same position at Siebel Systems, which was purchased by Oracle (ORCL) in 2006. The 68-year-old billionaire founded the company in 2009.

B2B applications

Artificial intelligence is a popular buzzword that has infiltrated many of our lives through everything from Siri on our Apple (AAPL) iPhones to powerful recommender engines that help us find products and services on Amazon (AMZN). The consumer applications have created greater awareness of AI for many of us, but there is a bigger AI opportunity brewing in business-to-business (B2B) enterprise applications: AI to help banks better understand customer churn, identify fraud and deploy predictive revenue models. AI to help oil and gas companies predict maintenance requirements and proactively identify failures before they happen. And AI to help health-care providers improve health outcomes, reduce care costs and improve patient experience.

C3.ai’s offerings are designed to democratize at scale all of those scenarios and others in aerospace and defense, telecommunications, retail, utilities and more. The C3.ai AI Suite, which is the company’s core technology, is designed to sharply reduce the time to value in using AI in the enterprise. It functions as a software as a service (SaaS) application, and while it has deep partnership integration with Microsoft (MSFT) and Adobe (ADBE), it can be flexibly deployed on Amazon’s AWS, IBM (IBM) Cloud, Google (GOOG) Cloud and/or on-premise.

50 million businesses

The outcome of its significant R&D investment is a powerful enterprise AI footprint that delivers more than 1.1 billion predictions a day using more than 4.8 million machine-learning models that the company has in production. Moreover, according to C3.ai, these predictions and models touch more than 50 million businesses on a daily basis.

Beyond technology partners, C3.ai has also been able to apply its model-driven architecture to win a diverse group of marquee customers across a vast set of industries, with an average deal size in 2020 of over $12.1 million. These include Royal Dutch Shell (RDS.A), AstraZeneca (AZN), Baker Hughes (BKR), Raytheon Technologies (RTX) and the U.S. Air Force. Customer expansion yielded a healthy 71% year-over-year revenue growth rate for C3.ai in its fiscal 2020, with revenue totaling $157 million, and an average growth rate over the past three fiscal years of 69%.

Perhaps what is more exciting is the market potential for C3.ai as the proliferation of AI continues to accelerate. According to the company’s S-1 filing, it estimates its total addressable market (TAM) at $174 billion this year, growing to $271 billion by 2024. Specifically, the company sees itself participating in the $44 billion enterprise AI software market, the $63 billion enterprise infrastructure software market and the $93 billion enterprise application market.

Streamlining data

Those markets are converging rapidly, with AI capabilities acting as a critical connector. Many companies will be seeking tools and technologies that can shorten the difficult process of managing vast data repositories, software tools and infrastructure complexities. C3.ai’s architecture is designed to streamline this process and considerably ease the enterprise challenge of applying AI to solve complex business problems.

Of course, the road for C3.ai will have its share of challenges. Large enterprise software and infrastructure providers like SAP (SAP), Salesforce (CRM) and Oracle (ORCL), to name a few, are all working diligently to apply greater AI capabilities to exponentially growing data to deliver next-generation insights for enterprise customers. This massive market opportunity isn’t a secret, by any means. However, given the flexible architecture of C3.ai, there is also an argument that many enterprise software platforms could see C3.ai as complementary and as a vehicle to speed customer adoption of industry-specific AI capabilities, much as Adobe and Microsoft already have.

What perhaps was most evident to me, after watching last week’s IPO frenzy, is that it didn’t take a sophisticated AI model to see the potential of C3.ai.

Daniel Newman is the principal analyst at Futurum Research, which provides or has provided research, analysis, advising, and/or consulting to Microsoft, Amazon, IBM, SAP, Oracle, and dozens of companies in the tech and digital industries. Neither he nor his firm holds any equity positions with any companies cited. Follow him on Twitter @danielnewmanUV.



From: Frank Sully 12/16/2020 4:11:23 PM
1 Recommendation   of 1048
 
Barron's interview with C3.ai founder Tom Siebel


Tom Siebel Is Back: An Interview With the CEO and Founder of C3.ai



By Eric J. Savitz
Dec. 9, 2020 7:52 pm ET

C3.ai had a spectacular debut in the public market on Wednesday. The artificial-intelligence software company priced an offering of 15.5 million shares at $42 a share, above the expected range of $36 to $38 a share, before opening at $100. The stock closed its first day at $92.49, a 120% gain from its IPO price.

Among other things, the IPO marks the return to public view of Tom Siebel, the founder and CEO of C3.ai (ticker: AI). The legendary software entrepreneur was an early executive at Oracle (ORCL) and the founder of the customer relationship management software pioneer Siebel Systems, which he sold to Oracle for $5.86 billion in 2006.

Barron’s caught up with Siebel on listing day for an insightful chat about C3.ai, the outlook for artificial intelligence software, and assorted other topics. An edited transcript of our conversation can be found below:

Barron’s: Hey, Tom. Looks like C3.ai is off to a spectacular start as a public company.

Tom Siebel: I am not competent to comment on the behavior of equity markets. It’s not my field. To the extent I have any expertise, it’s in building and operating software companies. That said, the big picture is we have a huge addressable market, and the investment community recognizes there is a huge market in commercial and industrial AI applications. We’re looking at a $250 billion addressable software market—that’s bigger than a bread box.

And how are you going after it?

We spent the last decade building out a really remarkable software platform, called the C3.ai suite, that represents 1,000 man-years of software engineering work. It is a cohesive set of software services that allows our customers to rapidly and successfully design, develop, provision and operate enterprise and commercial AI applications, at small, medium and large scale.

So far, you have a relatively small number of very large customers.

The Phase 1 strategy, yes, did involve customer concentration. We wanted to focus on “lighthouse” customers— Shell, Enel, ENGIE, Koch, the United States Air Force, Philips Medical—in multiple industries, multiple geographies and multiple use cases, and demonstrate that the product could be applied successfully to solve complex AI problems, that delivered substantial economic value in a short period of time. And we did that.

What comes next?

The next phase is about scaling the business, not only selling to global juggernauts but also to middle-sized companies, selling to divisions and departments of large companies; in banking, in telecom, in financial services, and in manufacturing and aerospace; in Asia, Europe, North America. That’s the phase we’re entering now. I would argue we have clear technology leadership in this space. I’m unaware of anyone who has built a successful AI platform like we have. If we succeed at that objective, establishing a market position in enterprise AI, this will be a large and hugely successful enterprise application software company. And we’ll build a company that is structurally profitable and cash flow positive.



How should people think about what you do? Are you more an application developer or a platform for clients to build their own applications, or are you building custom applications?

Unfortunately, the answer to that question is yes. About 86% of our revenue is software-as-a-service recurring revenue. Today, 65% of that is from applications. We have a family of applications for banking, like anti-money-laundering, cash management, credit approval, broker rule compliance. Or applications for utilities, like distributed energy resource management, AI-based predictive maintenance, smart grid analytics. We have a family of applications for oil and gas, and for aerospace. So today 65% of our software revenue comes from those applications, and 35% comes from the platform. We sell both. Shell has 200 projects they’re building on top of our platform. Enel has 150. I suspect in a steady state, license revenue will be around 60% for applications, and 40% platform.

And what about services?

Services is about 14% of overall business and will stay there. We’ll prevent it from getting larger by partnering with IBM Global Services and others. If we let it get larger, we’ll get valued as a services business, which as you know carry lower valuations.

Microsoft took a $50 million stake in C3.ai at the IPO price. What’s the story there?

That investment had nothing to do with them making money. We have a huge partnership with these guys. Our technology is entirely complementary to Azure. I’m working on hundreds of millions of dollars of sales opportunities with the Microsoft sales organization.

And you announced a deal with them recently in a very familiar area for you.

We announced a partnership with Microsoft and Adobe, to take the Microsoft CRM stack, the Adobe marketing automation stack, and the C3.ai stack, and bring to market an entire family of applications—believe it or not—in AI-enabled CRM [customer relationship management] software.

CRM of course was what Siebel Systems did.

So what’s old is new again. It’s not unusual when I’m working with these large customers—who often were huge Siebel Systems customers—that as we’re deploying our 12th AI application, they say, hey Tom, when are you going to do AI-enabled CRM. We’ve developed those solutions in combination with Microsoft and Adobe, and all three organizations will be selling those worldwide. So the symbolism of that Microsoft investment was not about making money—it was about sending a signal to the market and to their own employee base that, hey, this is an important market, pay attention.

Who are your competitors?

When we were doing database software at Oracle in the ‘80s, the competition was companies building their own relational database systems. Who succeeded at that? No one. When we brought ERP systems and CRM systems to market in the 90s, the competition was, the customer was going to build their own. Who succeeded at building their own ERP system, name the company? Nobody did. They all would end up buying from Oracle, or SAP, or Siebel or somebody else. So it is not unusual—this is standard in the business—that when you get into a new market, the knee-jerk reaction of the CIO is to build it himself. Or to pay Accenture $500 million to help them build it.

So the competition is from homegrown systems.

Virtually every one of our customers has tried one, two, three times to build it themselves. What they’ll do is use componentry from Snowflake [SNOW], Databricks, Datastax, H2O.ai, and DataRobot, and they’ll attempt to assemble all of these things together into a cohesive whole that does something useful. Unfortunately, it is an impossible problem, and to my knowledge no one has ever succeeded at doing it. General Electric [GE] spent, like, $6 billion over a number of years trying to do this before they folded their tent.

Databricks, Datastax.... That’s a very hot set of companies you just named.

I’m not saying that these products from companies like Databricks or Snowflake have no value. They have very high value. You can think of what we’ve done—and I know this is hard to believe—is to take the functionality of every software company that’s involved in AI, aside from the cloud, take Palantir, Databricks, DataRobot, H2O, Snowflake, and built all of it into one cohesive architecture. What Palantir does as a company is a feature for us. What H2O and DataRobot do is something called auto ML [machine learning]. That’s a feature. What Alteryx does is a feature within our product, we call it Ex Machina.

So wait, don’t you compete with all of them?

If one of our customers wants to use one of these things—and every company does—because they’ve standardized on it, or somebody thinks it is technically superior to our solution, it doesn’t matter, they can use it. If Shell wants to use DataRobot instead of our auto ML capability, God bless them, it’s fine. If they want to use Databricks, they do—and by the way, they do use Databricks, instead of our data virtualization technology. They don’t have to lose for me to win. We really don’t compete with those guys.

A few years ago, you changed the company’s name, from C3.iot to C3.ai. Tell me about that.

There was a time period, in 2016 and 2017, when the Internet of Things was all the rage, and all anyone wanted to talk about was connecting devices. And we do that. So that was probably a mistake to name the company that. Now, for instance, we read data from 57 million sensors and 42 million smart meters. Really what we’re doing with the data is predictive analytics. We’re doing AI. I got to a point where 100% of our applications were predictive analytics and AI and 30% of that was IoT. So the first 20 minutes of every presentation was explaining that we were not really an IoT company. It wasn’t a change in the business; we were confusing the market with the name. It was a mistake. AI happens to be really what we do.

Tom, you guys were growing 70% in the April 2020 fiscal year, and dropped to 10% growth in the last six months. What’s the story there?

In the February, March, April, May time period, we hit a speed bump the size of the Empire State Building. It was not a business cycle issue. This [the pandemic] was an act of God. It was apocalyptic. Business came to a screeching halt. Our revenue continued to grow, because we have a backlog, but it grew at a slower rate. But once you got to July, August, September, with what’s happening in digital transformation and AI, it’s blowing and going. Our pipeline is growing at a greater rate than it ever has grown. Coming out of this, you will see a company growing not at 70% or 80%, ain’t no way no how, but we’ll be growing in the top decile of software companies.

Tom, this was fun, thanks very much.




From: Frank Sully 12/18/2020 2:01:47 AM
   of 1048
 
NVIDIA: A Case Study in Exponential Growth and AI

AI is a huge market and these companies are already benefiting
Chris Neiger (TMFNewsie)
Dec 15, 2020 at 9:22PM

Artificial intelligence (or AI) gets a lot of attention these days because the technology is being implemented into so many parts of our lives, from smart speakers to apps. And as AI expands, the opportunity for investors is enormous. Over the next four years, AI spending is estimated to double and reach $110 billion by 2024, according to research firm IDC.

For investors looking to tap into this enthusiasm for AI, there are a handful of companies pushing artificial intelligence forward. Here's why NVIDIA (NASDAQ:NVDA), Amazon.com (NASDAQ:AMZN), and Appian (NASDAQ:APPN) should be on your AI stock buy list.




1. NVIDIA

For many years NVIDIA's core business came from selling graphics processors for gaming. But as data centers have become more complex, many companies have looked to NVIDIA's GPUs for their artificial intelligence data centers.

This shift has helped NVIDIA's data center revenue grow quickly and back in August the segment outpaced gaming revenue for the first time. In NVIDIA's most recent quarter, data center sales spiked 162% to $1.9 billion. Not all of the data center sales came from AI, but as more companies aim to boost their AI data center capabilities, many of them will look to NVIDIA's GPUs to help them do so.

NVIDIA is also pursuing new AI opportunities through its pending $40 billion acquisition of Arm Holdings. NVIDIA CEO Jensen Huang says the deal "will create a company fabulously positioned for the age of AI," as NVIDIA combines its artificial intelligence platform with Arm's CPU designs.

NVIDIA is already a leader in AI, and when the company closes its acquisition of Arm Holdings, expected in the first quarter of 2022, its AI prospects will look even brighter.

Exponential Growth Of NVIDIA GAAP Operating Income




From: Frank Sully 12/18/2020 10:58:43 AM
   of 1048
 
NVIDIA: Supercomputer Win

blogs.nvidia.com



From: Frank Sully 12/18/2020 11:13:56 AM
   of 1048
 
NVIDIA: FWIW, I am planning a modest investment of 40 shares in NVIDIA next week. I like its exponential growth and AI potential, particularly with the purchase of ARM. Any comments? Here is an SA article on valuing NVIDIA.

Nvidia: Deep Dive And Cash Flow Analysis
Dec. 14, 2020 5:56 AM ET
About: NVIDIA Corporation (NVDA)
By Trevor Jennewine (Long Only, Value, Growth, long-term horizon)



Summary
Nvidia is a leading provider of GPUs and AI solutions, with products that address a wide range of use cases.

Management estimates Nvidia's addressable market will reach $250 billion by 2023 - nearly 17 times Nvidia's trailing 12-month revenue.

Based on my DCF model, I estimate Nvidia's fair value at $517 per share.

Nvidia is a buy for long-term investors.

Investment Thesis

In the age of big data and AI, Nvidia's (NVDA) GPU-accelerated computing platforms and AI solutions address critical needs for developers, researchers, and scientists across various markets, from high performance computing and data analytics to autonomous vehicles and robotics.

My investment thesis is summarized in the following points:

1. Nvidia's intellectual property gives it an advantage over competitors in several quickly growing markets.

2. Nvidia has an enormous market opportunity, at $250 billion by 2023.

3. Nvidia's financial performance has been stellar in recent years. Since 2015, Nvidia has grown revenue at 26% per year and profits at 44% per year.

Nvidia's Market Opportunity

Nvidia is a leading provider of GPU-accelerated computing platforms and AI solutions. Nvidia's products have applications in a variety of industries and markets, including gaming, professional visualization, data centers, and edge computing.

The GPU is the foundation of Nvidia's computing platforms. While GPUs were originally designed to carry out the complex calculations necessary for graphics processing, Nvidia's parallel computing platform, CUDA, allows GPUs to be used as general purpose processors, too.

Gaming: For PC gamers, Nvidia's hardware solutions include several generations of GeForce Graphics Cards, including the most recent RTX 30 series, which are powered by Nvidia's latest GPU architecture: Ampere. The RTX series graphics cards combine ray-tracing and AI to deliver more realistic visual effects.

Professional Visualization: For animators and design professionals, Nvidia's hardware solutions include several generations of Quadro GPUs. However, Nvidia is dropping the name Quadro from the upcoming RTX A6000. The RTX A6000 will incorporate Ampere architecture GPUs, ray-tracing, and AI tools to deliver enhanced graphics to a wide range of users, from animators using Pixar RenderMan, to design engineers using Autodesk (ADSK) AutoCAD, to marketers using Adobe (ADBE) Premiere Pro.

Data Centers: Nvidia's data center products are used by data scientists, researchers, and developers to accelerate workloads in high performance computing (HPC), data science, artificial intelligence, machine learning, and deep learning.

Nvidia's data center hardware products include the HGX platform, built with Nvidia GPUs, NVLink-powered NVSwitches, and Mellanox InfiniBand networking. HGX servers are used to accelerate AI and HPC workloads. The HGX platform is also the building block for another Nvidia product: the DGX platform. This server includes the previously mentioned HGX components, but also incorporates CPUs. The DGX server is targeted at AI applications, such as training deep neural networks for deployment in edge servers, autonomous robots, or autonomous vehicles.

Source: Nvidia Investor Presentation (May 2020)

Complementing its hardware, Nvidia's GPU Cloud (NGC) provides access to GPU-optimized software designed to accelerate deep learning, machine learning, and high performance computing applications. This provides access to deep learning frameworks, like TensorFlow and PyTorch, which enable the designing, training, and validation of deep neural networks. It also includes TensorRT, an inference engine that runs trained neural networks on GPUs to generate real-time AI inferencing.

Edge Computing: In 2019, Nvidia launched the EGX platform, a family of GPU accelerated computing systems designed to handle AI workloads at the edge - it includes both EGX servers for edge data centers and embedded Jetson systems for edge devices.

Using trained AI models, the EGX servers generate inferences based on data streamed from edge devices (such as an IoT sensor). The EGX server then makes a decision, and sends the data back to the edge device. This technology, known as edge AI, has applications across a range of industries: retail, manufacturing, telecom, smart cities, healthcare, etc. For instance, by analyzing traffic patterns, EGX servers could coordinate traffic lights throughout a city to minimize roadway congestion.

In the event of a low confidence decision, the EGX server can send the information back to a central data center, where the AI model can be retrained with new data, then redeployed at the edge. This dynamic is shown in the graphic below.

Source: Nvidia EGX Platform.

Depending on the edge use case, Nvidia offers various software solutions that complement its hardware and help developers build the applications they need: Metropolis for smart cities and retail, Isaac for robotics and manufacturing, Clara for healthcare, Aerial for 5G, Merlin for recommendation systems, and Jarvis for conversational AI.

Market Opportunity

If the acquisition of ARM is approved, Nvidia's management estimates the company's market opportunity will reach $250 billion by 2023:

  • Devices: $95 billion
  • Data Center: $80 billion
  • Auto/Edge AI: $75 billion
The total figure, $250 billion, represents nearly 17 times Nvidia's revenue over the last 12 months - that's a big opportunity. And by combining Mellanox's high performance interconnect solutions, ARM's energy-efficient processors, and Nvidia's world class GPU-accelerated computing platforms, the company would be well positioned to capture growth across all three markets listed above.

Financial Analysis

Nvidia has posted strong revenue growth in recent years, increasing sales from $5.0 billion in fiscal 2016 to $14.8 billion over the trailing 12 months, which clocks in at 25.6% per year. The chart below shows Nvidia's revenue growth in each of its four business segments: Gaming, Professional Visualization, Data Center, and Autonomous Vehicles.

Source: Nvidia Investor Presentation (November 2020)
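As a quick sanity check on the 25.6% figure above, here is a minimal CAGR sketch. The roughly 4.75-year span between the end of fiscal 2016 and the trailing-12-month period is my own assumption for illustration, not a figure stated in the article.

# Quick CAGR check on the revenue figures quoted above.
# The ~4.75-year elapsed time (end of fiscal 2016 to the trailing 12 months
# ending with fiscal Q3 2021) is an assumed denominator, not from the article.

start_revenue = 5.0    # $B, fiscal 2016
end_revenue = 14.8     # $B, trailing 12 months
years = 4.75           # assumed elapsed time in years

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Revenue CAGR: {cagr:.1%}")  # ~25.7%, in line with the article's 25.6%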

    So far in fiscal 2021, Nvidia's revenue growth has accelerated substantially, up nearly 57% in the most recent quarter. Despite some margin compression, growth in revenue has accelerated earnings growth as well, with profits up 46% in Q3 2021.

    Like the income statement, Nvidia's balance sheet looks strong, with $10.1 billion in cash and marketable securities compared to $7.6 billion in debt and lease liabilities.

Valuation

In my DCF model, I have made the following assumptions:

  • Growth Rate: 20%
  • Terminal Rate: 2.9%
  • Discount Rate: 8.0%
Since 2015, Nvidia's FCF-per-share has grown at an annualized 30%. I have selected a much more conservative 20% to introduce a margin of safety. I selected 2.9% as the terminal growth rate, as this figure corresponds with worldwide GDP growth over the last decade. Finally, I have selected 8.0% as my discount rate, as this figure corresponds with the average return of the S&P 500 since 1957.

    Source: Created by the author.

    Based on my DCF model, I estimate Nvidia's fair value at $517. This number is roughly equivalent to the stock price at the time this article was written.
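For readers who want to see the mechanics, here is a minimal sketch of a single-stage free-cash-flow DCF using the three assumptions above. The starting FCF per share and the five-year explicit forecast horizon are illustrative assumptions, not inputs disclosed in the article, so the output lands in the same ballpark as $517 without reproducing it exactly.

# Minimal single-stage DCF sketch using the article's stated assumptions.
# The starting FCF per share (FCF0) and the 5-year horizon are illustrative
# assumptions, not the author's inputs, so the result will differ from $517.

GROWTH = 0.20      # FCF-per-share growth during the explicit forecast
TERMINAL = 0.029   # terminal (perpetuity) growth rate
DISCOUNT = 0.08    # discount rate
YEARS = 5          # explicit forecast horizon (assumed)
FCF0 = 12.00       # starting FCF per share in dollars (hypothetical)

def dcf_per_share(fcf0, growth, terminal, discount, years):
    """Present value per share: explicit growth stage plus a Gordon terminal value."""
    pv = 0.0
    fcf = fcf0
    for year in range(1, years + 1):
        fcf *= 1 + growth
        pv += fcf / (1 + discount) ** year
    terminal_value = fcf * (1 + terminal) / (discount - terminal)  # value at end of forecast
    return pv + terminal_value / (1 + discount) ** years

print(f"Estimated fair value per share: ${dcf_per_share(FCF0, GROWTH, TERMINAL, DISCOUNT, YEARS):,.2f}")
# ~ $493 with these illustrative inputs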

    I have also used FCF-per-share to model potential returns over the next five years. Again, I assume 20% annualized growth in FCF per share. But because Nvidia's current price-to-FCF multiple is near its historical high, I have selected three more conservative multiples: 70, 50, and 30.

    In the most bullish scenario, Nvidia returns roughly 18% per year over the next five years. And in the most bearish scenario, Nvidia returns -0.5% per year over that time period.
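A rough sketch of that scenario math follows. The article does not disclose the current FCF per share or the exact starting multiple, so the 76x starting price-to-FCF multiple below is a hypothetical input chosen so the output roughly reproduces the ~18% and ~-0.5% annualized figures quoted above.

# Sketch of the five-year price-to-FCF scenario analysis described above.
# START_MULTIPLE is an assumption (the article only says the current multiple
# is near its historical high); the exit multiples and 20% growth are the article's.

START_PRICE = 517.0        # starting share price (the author's fair-value estimate)
START_MULTIPLE = 76.0      # assumed current price-to-FCF multiple (hypothetical)
FCF_GROWTH = 0.20          # annualized FCF-per-share growth
YEARS = 5
EXIT_MULTIPLES = [70, 50, 30]

fcf_per_share = START_PRICE / START_MULTIPLE
for multiple in EXIT_MULTIPLES:
    future_price = fcf_per_share * (1 + FCF_GROWTH) ** YEARS * multiple
    annualized = (future_price / START_PRICE) ** (1 / YEARS) - 1
    print(f"Exit multiple {multiple}x -> ~{annualized:.1%} per year")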

Conclusion

Nvidia is a market-leading provider of the GPU-accelerated computing platforms and AI solutions needed for high performance computing, machine learning, and deep learning. Given the company's strong competitive presence in these high growth markets, Nvidia looks well positioned to capture value in the years ahead.

    For long-term investors, I rate Nvidia a buy at $517.

    Disclosure: I am/we are long ADBE, NVDA. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.



From: Frank Sully 12/18/2020 12:18:15 PM
       of 1048
     
    C3.ai Continues To Soar!

    I bought 180 shares of C3.ai (AI) on Monday for $105.79/share. Now it's trading for $134.78/share, up 27.4% in a week. Whoopie! I'm in AI for the long-term.

    Quote

    AS OF 12/18/2020 12:13PM ET

    $134.78 +$17.541 (+14.962%)

    Cheers,
    Sully



From: Frank Sully 12/18/2020 12:33:18 PM
       of 1048
     
    Nvidia And Arm: All About Control
Sep. 21, 2020 5:05 AM ET
About: NVIDIA Corporation (NVDA)
By Trading Places Research (Long/Short Equity, Value, long-term horizon, Growth)



    Summary
    There is great enthusiasm for this merger both in the Arm developer community and amongst investors.

But questions persist. Foremost among them is the regulatory issue: can this even happen? There is a long road ahead with regulators.

    But also, what does Nvidia get out of this merger that they could not have gotten otherwise, much more cheaply, with less regulatory oversight, and without the opportunity costs? Control.

    My best guess is that this deal never happens, and Chinese regulators are the reason.

Questions Answered

I had a lot of questions about what the NVIDIA (NVDA)-Arm deal means before details were announced, and we have at least preliminary answers now.

  • Can this even happen? The regulatory road here is going to be long and hard and involve the UK, US and China. Given the structure of the deal, I believe Chinese regulators will wind up being the ones to squash it.
  • What does NVIDIA get from the acquisition that they couldn’t have also gotten much more cheaply? Only one thing: control of the CPU roadmap.
  • How will the rest of the tech industry respond? The answer to the last question will make them fearful, and many Arm customers will try and block this deal. Some may look for the exit.
Before we dig in, first a very quick Arm primer for those unfamiliar.

What is Arm?

Arm is a UK-based company that has been making low power systems-on-a-chip (SoCs) since the first Apple Newton. They do not make their own chips, but rather license the instruction set, core designs, and reference designs to other companies. A partial list of customers includes Apple (AAPL), Qualcomm (QCOM), Samsung (OTC:SSNLF), AMD (AMD), Amazon (AMZN), Huawei, ZTE and even Intel (INTC). NVIDIA uses Arm IP in their Nintendo Switch SoC and a range of other dedicated controllers.

    Arm-based chips power almost every single smartphone, soon every single new Mac, and there is a huge push to get them into the data center, starting with Amazon’s Graviton instances.

    This is an extremely slow revolution where Arm-based chips replace the long-dominant x86 platform.

  • Phase 1: Intel and AMD miss out on a huge opportunity in smartphones, and allow Arm-based chips to dominate. This is water under the bridge.
  • Phase 2: Arm-based chips replace x86 in PCs. This is already happening. Apple will have all Macs running on their own Arm-based silicon within a couple of years. They may announce the first ones on Tuesday.
  • Phase 3: Arm-based chips replace x86 in the data center. Again, this is already happening with Amazon’s Graviton2 instances.
Over 20 billion chips a year now ship with Arm IP. The total number out there is 180 billion, and 70% of the world’s population own something with Arm IP in it. SoftBank (OTCPK:SFTBY) purchased them in 2016 for $32 billion. So that is the very condensed history.

Details of the Deal

Let’s start with what NVIDIA is getting, what they paid, and how they are paying.

  • They are not getting the IoT division. Rumor had been that SoftBank wanted to hold on to this, and they have. This brings up more questions.

  • Most importantly, NVIDIA gets the IP to the instruction sets, core designs and reference designs for the smartphone, PC, AI, and data center markets.
  • They also get the best tech company in the UK, one of the most important in the world, and all their talent.
  • They are paying a reported $40 billion, but that’s not exactly true.

  • Only $2 billion in cash is due at signing, which is mostly a breakup penalty. They had $11 billion in cash and investments at the end of July, offset by $2.4 billion in current liabilities.
  • Another $10 billion is also due at closing, estimated in 18 months. That is likely an optimistic timetable. But in any event, they have $5.6 billion in operational cash flows in the TTM, so they have plenty to pay that.
  • The bulk of the deal will be paid in 44.3 million new NVIDIA shares priced around $485, valued at $21.5 billion, or 89 times TTM earnings.
  • In addition to the 44.3 million shares, Arm employees will get another 3.1 million shares valued at about $1.5 billion. In total, that was a 7.1% dilution for current shareholders.
  • There is another $5 billion in cash due SoftBank if Arm hits performance targets. We don’t know what these targets are.
  • So adding it up, “$40 billion” really means:

  • $12 billion in cash.
  • $23 billion in shares priced at a 16-year high earnings multiple.
  • Maybe/maybe not another $5 billion in cash.
This may turn out to look like an NFL contract where they announce a sky-high total value, only part of which the player will ever see.
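A quick back-of-the-envelope tally of those components, using only the figures quoted above, shows how the pieces add up to the $40 billion headline:

# Back-of-the-envelope tally of the deal components described above.
# All figures come from the text; the likely/contingent split follows the author's framing.

cash_at_signing = 2.0                          # $B, mostly a breakup penalty
cash_at_closing = 10.0                         # $B, due at close (estimated ~18 months out)
shares_to_softbank = 44.3e6 * 485 / 1e9        # ~$21.5B in new NVIDIA stock
shares_to_arm_employees = 3.1e6 * 485 / 1e9    # ~$1.5B in new NVIDIA stock
earnout = 5.0                                  # $B, contingent on undisclosed performance targets

likely = cash_at_signing + cash_at_closing + shares_to_softbank + shares_to_arm_employees
print(f"Cash plus stock, ex-earnout: ~${likely:.1f}B")            # ~$35B
print(f"Headline value with earnout: ~${likely + earnout:.1f}B")  # ~$40B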

Can This Even Happen?

This is going to be one of the most scrutinized deals of all time, because there are so many stakeholders here. The deal cuts a wide swath across the entire industry that will create issues for anyone making chips, smartphones, PCs or servers. Moreover, there are national issues at stake here for, at a minimum, the US, UK and China.

    Vivek Arya from Bank of America (NYSE: BAC) put the question plainly in the conference call:

    My question is on the regulatory and the ecosystem hurdles. So what kind of pushback do you see from regulators, especially in China, given all the trade tensions and then separately, any pushback from Arm’s customers, several of whom are your competitors today? And you know, they have unfettered access to Arm’s technology. So what kind of hurdles do you foresee, right, from both that regulatory and ecosystem perspective?

    NVIDIA CEO Jensen Huang almost completely punted on the regulatory issue and focused his answer on Arm’s customers, who include Apple, Qualcomm, Broadcom, Samsung, Intel, AMD, Huawei, ZTE and pretty much anyone else in the business of designing or manufacturing chips, phones, tablets, PCs or servers. There is a lot of the tech world with a lot riding on this. Apple just bet the future of the Mac on the Arm instruction set. This fact has not escaped the leadership of either company. Will there be briefs filed by everyone I mentioned and more? You can bet on it.

    The bottom line is that Arm has never competed with its own customers before, and now it will be.

    Here was Huang’s answer:

    The customers after they heard the rumors have reached out to Simon [Segars, Arm CEO] and he’s spoken to a lot of customers. And the fact that we're announcing the deal tells you a little something that we believe that the customers are going to be satisfied with our genuine intention to keep the platform open and neutral. We bought the company and we paid as you know, a very significant price, because of the vast network and the vast ecosystem of Arm, we love the business model…

    Open and fair and in fact offer all of those, all of Arm’s customers even more IP, I think is going to be very exciting. Regulatory, they're focused on pro competition, a condition that can ensure pro-competition and pro-customers, pro- customer choice. We are going to bring customers more choice. We are going to, for the very first time have credible plan to turn the Arm CPU core into a full-out data center platform…

    This combination really allows us to focus that energy and capture the value and deliver it to the market, an alternative platform that is going to be very powerful, very capable. And so the regulator's love to see pro-comp, you know, competitive dynamics in the marketplace. And so, we're pro- customer, we're pro-competition, the regulator should be very supportive of it.

    So he’s trying to frame that as x86 vs. Arm, and that would provide more competition, not less. But the combination does not really do that, it just changes the address of corporate headquarters. Nothing he said precludes some nightmare scenarios for competitors. For example, I did not hear him promise:

  • To keep licensing terms unchanged for a period of time.
  • To not force licensees to license an entire suite of IP, much of which is not needed, like Qualcomm does.
  • To license future development under open, fair and nondiscriminatory terms.
They talked a lot about AI. Are they going to be willing to license those developments under reasonable terms, not tied to other IP? How will they deal with their main GPU competitor, AMD? If they start developing data center chips of their own, as they have indicated may be in the cards, how will they deal with Amazon, or the much smaller companies like Ampere and Nuvia that are working on competing chips? The phrase “open and neutral” is nice, but there is a lot of wiggle room inside those quotes. Just look at what Facebook (FB) does with seemingly benign words.

    Stacey Rasgon of Bernstein was also a little skeptical:

    I hear what you're saying about the potential value that customers will have. But at the same time, how does this really work in practice? I mean, it's a practical consideration that you're going to get first look at everything. You're going to control the direction of innovation of the company. You're going to know all of your customers’ roadmaps before they do. So, how are they really going to get comfortable with this, especially in some of these like high-performance computing and servers? And I sort of ask this within the envelope of maybe some of your prior efforts to drive an IP licensing business, which, to my recollection, did not go all that well.

    Ouch, Stacey Rasgon. Huang’s response was basically, “trust us.” He first described their current licensing approach of allowing customers to choose which layers of the stack they want. Then this:

    In the case of Arm, this will just be one layer, one technology offering, meaning that in the case that a customer would like to have our technology already fixed and hardened in a chip, that's fine. But if they would like to have it be soft and malleable, that's fine too. I think that the business model, the modern business model, this is just not as odd as it seems. You know, we go to the cloud and we have infrastructure as a service, platform as a service, software as a service. And the platforms make themselves available to you however you would like to be – like you to engage it.

    I think fundamentally, NVIDIA is, of course, going to continue to protect our most important thing, which is our enterprise reputation that we have to be a company that customers and our partners can trust… And so, we decided that this business model was something that we want to bring to make part of an extension of our company that we think it's great for business, it's great for economics, that is great for our strategy, and that it's going to be a pillar for extending our computing platform. If we believe that as I do, and we make a commitment to keep it open, we will.

    “Trust us,” is cold comfort for Arm customers and NVIDIA competitors, who are one and the same. The difference between NVIDIA now, and NVIDIA plus Arm is that they would have market-making powers, where they didn’t before. With this comes a whole new set of incentives. Even if they could trust Huang and his team, what about his successor, and his successor’s successor?

    And again, Huang really punted on the regulatory issue. They have promised to keep Arm headquarters in Cambridge, UK, and invest more into it, so this may take care of UK regulators. But that doesn’t mean there won’t be opposition. One of Arm’s co-founders, Hermann Hauser, has very publicly opposed the merger. His three arguments:

  • NVIDIA will reduce UK employment. NVIDIA seems to want to do the opposite.
  • "The sale of Arm to NVIDIA will destroy the very basis of Arm’s business model which is to be the Switzerland of the semiconductor industry dealing in an even-handed way with its over 500 licensees. Most of them are NVIDIA’s competitors. Among them are many UK companies. Assurances to the contrary should be legally binding.”
  • His main argument is that the crown jewel of UK tech will be in the hands of a US company, and subject to the control of the Departments of Treasury and Commerce, particularly the Office of Foreign Assets Control at Treasury.
The US will be trickier, as there will be an array of US corporations who are Arm customers, and who will game out a range of bad outcomes for regulators to mull over. Hauser’s second point is even more relevant in the US. If approved, NVIDIA will likely have to sign on to a consent decree or similar that severely limits what they can do. Unless their lawyers are as incompetent as GM’s were in the Nikola deal, I have to imagine they already understand this. But even with a consent decree, Arm customers would still be subject to NVIDIA’s willingness to abide by it, and the Justice Department’s willingness to enforce it.

    Huang tried not to mention China at all in his response, and there is good reason for that. He was pressed again on China by C.J. Muse from Evercore, and gave this non-answer:

    With respect to China, the important thing to realize is that the ownership, the company ownership of the IP is not the relevant issue. The origin of the IP is the relevant factor in export control. The IP of ARM was originated, created, developed over three decades in Cambridge. And so, the amount of soft – the amount of core, the amount of innovation is measured in thousands of human years. And so, the IP will essentially stay in the UK. The headquarters of Arm will be in the UK. We're going to continue to advance the technology in the UK.

    So he answered the question in terms of US technology export controls, rather than address the elephant in the room, which is how the Chinese Communist Party is going to react. This is the most delicate situation of them all. Like so many things in China, explaining it is an entire article unto itself, but I will try and keep it brief.

In 2018, SoftBank sold off its Chinese licensing business (not the IP) to a joint venture with Hopu Investments, a Chinese private equity firm run by Fang Fenglei. Fang has deep ties to both the Chinese Communist Party and Goldman Sachs (GS). There is a joke in there somewhere. SoftBank retained 49% of the JV, Arm China.

    At the time, China represented a fifth of Arm’s revenue. So essentially, in exchange for $845 million, and a partnership with a connected Chinese big shot to make life easier on the mainland, SoftBank sold off half their Chinese cash flows, or 10% of all their revenue.

    They put Allen Wu in charge of the operations, and he turned out to be very problematic. Wu started his own fund in the image of SoftBank’s VisionFund. He began hitting up Chinese Arm customers for investments, offered up reduced Arm licensing fees as a sweetener, and made licensing more difficult for those who refused. When this all became apparent over the winter and spring of 2020, SoftBank and Fang tried to oust Wu, and he refused to step down. He has retained the company seal, which is a centuries-old legal token of control. Just the other day, a front company for Wu sued the Arm China board over his attempted ouster.

    NVIDIA is acquiring SoftBank’s minority interest in Arm China, and they are also acquiring a bit of a mess there, even if SoftBank were to sort it out first. But the bigger issue is how the Communist Party will view this deal. Given the last 4 years, and the recent US actions with TikTok and WeChat, they may not be eager to let a US company control a key piece of everyone’s tech stack. So Hauser’s third point is especially relevant for China.

    There is a long road ahead here, and the deal, in the end, may wind up getting squashed, or the regulatory terms become too onerous for NVIDIA to take. If I had to bet on it, I would say that is what happens, and China winds up being the fly in the ointment.

What Does NVIDIA Get That They Couldn’t Have Gotten A Lot Cheaper?

Here is the big question for me. Let’s start with their explanation, from CEO Huang on the conference call:

    The age of AI is driving a tremendous acceleration in the demand for computing, precisely at a time that Moore's Law has slowed. This requires a new approach in computing, as legacy architectures are not keeping up. Accelerators – NVIDIA’s accelerated computing platform has risen to the challenge, and is leading the path forward. And it's the backdrop to why this deal is so compelling and complementary for both businesses. There are several exciting growth drivers in our combination. First, Arm’s vast ecosystem. We can bring NVIDIA's GPU and AI Technology to large end markets, including mobile and PCs.

    Second, in the data center, NVIDIA will turbocharge Arm’s R&D and meet cloud computing customers demand for a higher velocity Arm’s CPU – Arm’s server CPU roadmap. NVIDIA AI would be an excellent support for all data center CPUs, x86, power and Arm. And third, we can accelerate our Edge AI and IoT roadmap and growth trajectory for the next wave of computing.

    The combination will boost our ecosystem of developers who are the cornerstone of our computing platform. Arm expands NVIDIA’s developer reach from 2 million to over 15 million.

And then later in the call, he tied in the Mellanox acquisition and their new BlueField data processing units (DPUs).

    The CPU is fantastic and low latency, single-threaded, really, really, really predictable latency type of processing. The GPU is really fantastic at high throughput data processing. And the DPU is really good at network and serial data movements, security processing. These three type of processors will define the future of computing. It is exactly the reason why we're so excited about this acquisition that with Arm and NVIDIA and Mellanox we have three computing platforms, one for networking, one for high throughput computing, accelerated computing, and AI computing, and one for CPUs and single-threaded computing.

    There’s a lot there. But as we look at it, ask yourself: could NVIDIA have gotten this for less than $40 billion or whatever it actually winds up costing?

The first point he made is bringing NVIDIA to Arm customers. This set everyone’s hair on fire in more than a few places, including AMD and Qualcomm. He leaves open the question of how forceful they will be. Could they try and force Apple to license GPU technology they do not need along with the instruction set they use? Qualcomm is doing something similar to Apple, and Apple is, well, displeased by this, and is spending billions of dollars to not be a Qualcomm customer anymore.

    The second point, focus on the data center, will be met with mixed reactions. Companies like Amazon, Ampere and Nuvia that are working on Arm data center CPUs will be excited on the one hand that the data center cores will be getting a lot of attention, but wary that a competitor, NVIDIA, will at least be getting first crack at it, if not more.

    Then there is Qualcomm, Huawei, Samsung, and every handset maker not named Apple, who screamed out in unison, “But what about us?” The entire smartphone industry, aside from Apple who only licenses the instruction set, is highly dependent on Arm pushing those cores forward. NVIDIA has talked very little about smartphone SoCs through all of this.

    His next point on IoT brings up even more questions, so many that it gets its own section below, so hold on to that for a moment.

    Then he brings up Arm developers.

    But the nut of the argument centers around Moore’s Law, which states that CPU transistor count doubles every two years. It no longer works its magic for Intel and everyone else. A silicon atom is 0.2 nm across. The smallest transistor in a commercial product was just announced at Apple’s September event at 5 nm, meaning 25 atoms across. This can only go so far. For a long time, this problem has been solved with multiple cores, but single-core performance has not grown nearly as fast as it once did.

    Core packing will continue. Ampere will have a dual-socket server design early next year with 256 Arm cores. That is not a typo.

    But the bigger project is moving beyond the CPU in the data center, though it remains the base of the stack. On top of that, you put GPU accelerators for high-throughput tasks, and add to that DPUs that push data around the center at very high speeds. Binding that all together and making the whole thing work is AI, already an NVIDIA core competency.

    But back to that question: did they need to spend $40 billion or whatever it winds up being? For almost all of it, the answer is “no.”

    Imagine that instead of this, they announced a program to move hard into Arm development.

  • Go on a hiring spree and poach top talent.
  • Announce an annual conference just for Arm developers modeled around Apple’s annual Worldwide Developers Conference.
  • Produce a roadmap of where they see the CPU-GPU-DPU-AI stack going.
  • Make it an emphasis in corporate communications.
This would cost significantly less than $40 billion, or even $12 billion, the cash price. NVIDIA could fly the developers into the conference, and put them up in nice hotels, and it would still be cheaper. Matt Ramsay of Cowen asked this question very directly:

    You guys have license for essentially all of Arm’s core technology plus and architectural license for V8 and I'm sure V9 as it comes out, so I guess Jensen the question that I've been getting from folks is, why buy the whole company, right? What things are you looking forward to in close collaboration of CPU and GPU in the datacenter or putting your IP on clients and Edge devices? Could you’ve not have accomplished by just being an influential member of the Arm partnership and doing your own architectural license implementation of chips?

    Huang gave the same non-answer I already quoted, because he can't say the real reason. What does NVIDIA get from an acquisition that they cannot get from this much cheaper strategy? Control of the roadmap, and first access to it. At the end of the day, that is why it is worth it to NVIDIA, and why the prospect is so scary to so many stakeholders.

IoT, and the RISC-V Risk

NVIDIA did not get the Arm IoT unit, which SoftBank held on to. One of the very interesting tidbits that came out of the deal is that Arm’s EBITDA margin rose from 15% to 35% with the exclusion of the IoT unit, so right now it is a big drag on the consolidated company.

    But the split opens up even more questions about how this is going to work and what it means for the platform. For that, we need to discuss RISC-V.

    RISC-V is an open-source instruction set that is very robust and is becoming an important competitor with Arm in the IoT space. Because IoT SoCs are dedicated controllers that do not require third-party buy-in like smartphones, Arm’s ubiquity advantage vanishes. Companies save millions in licensing fees by choosing RISC-V over Arm. Like in the Linux space 25 years ago, companies like SiFive are filling the support gap in the open source ecosystem.

    Arm’s response was, in the first place, to allow smaller companies to start playing around with lower levels of IP for free, with payments coming only at production. In addition, they have made the lower level instruction sets open and editable by customers, one of RISC-V’s advantages.

    So there are three issues brought up here.

  • NVIDIA bought the instruction sets, which the IoT division is reliant on. What are the terms here? Does SoftBank get a free license? Will they fork the instruction set, or is the plan to fork it at the deal’s completion? Doing so would hurt one of Arm’s advantages, which is the common instruction sets across a wide range of device types.
  • If the deal goes through, NVIDIA and SoftBank will be competitors in the IoT space. How that works out in conjunction with point number 1 is anyone’s guess.
  • But there is a low-probability risk here that Arm customers will be so turned off by this deal, and afraid of being reliant on NVIDIA’s goodwill, that they will turn to RISC-V. Again, the probability is low, and nil in the short and medium terms, but it still exists.
This portion of the picture has been largely overlooked, but also has the potential to cause the most future problems for everyone involved should IoT really take off.

Opportunity Costs

Opportunity costs are one of those things that people talk about a lot, but don’t always factor into their thinking. These costs are the things you can’t do, because you did something else. The cost of my vacation was $5,000, but because of it, I couldn’t pay my mortgage and lost my house. That’s an opportunity cost in the extreme.

    Any deal this big has huge opportunity costs. It’s not just the money NVIDIA is spending here, which at the end of the day is not all that much, but the attention and focus it will suck from other things they want to do. This likely precludes any more big moves by NVIDIA for a while.

But the bigger opportunity risk here is that they spend 18 months trying to get this deal approved by regulators, only to be thwarted in the end. The $2 billion is mostly a breakup fee, and they will have wasted 18 months, and a huge portion of leadership’s focus, on a deal that never happened.

    So regardless of how this plays out, NVIDIA has set their course for the next year-plus at a minimum, and the opportunity costs to that will be large.

Arm Financials

SoftBank only reports limited financials on its divisions, so we know only a little bit about what it is NVIDIA is buying on the financial side. This is further confounded by SoftBank changing how they report their segments almost every year, so it’s hard to put a time-series together.

    Furthermore, the IoT division is a huge drag on the rest, and SoftBank is keeping that.

NVIDIA reported that Arm would be immediately accretive to gross margin and EPS. The first part is obvious, but the second part is not entirely obvious, because of the dilutive effects of the new shares. What they are telling us is that NVIDIA net income will rise by 8% or more with the consolidation. This is very good news for current shareholders.

    SoftBank changed how they reported out segment net sales and “income” (EBT) for fiscal 2018, so we only have the last two years to look at:

    Image from SoftBank FY 2019 annual report

That 2018 segment income is actually a loss of ¥42.3 billion once you strip out the ¥176.3 billion in cash they got from setting up the Arm China JV. So how does that square with the acquisition being accretive to EPS? Again, the IoT division proves to be a huge drag on profit measurements, and SoftBank is keeping that. SoftBank seems to tire quickly of profit-making companies.

    The other interesting metric they provide is royalty units, which is the number of chips that shipped with Arm IP on board, so many devices will have multiple royalty units. Royalties were 57% of Arm net sales in FY 2019, ended in March 2020, with 31% from licensing, rounded out by 12% from their fast-growing software and services division. But beyond the income, it’s a pretty good measurement of what is happening on the ground with Arm technology.

    What’s being reflected here is the flattening of global smartphone sales. But two big caveats:

  • They saw a marked uptick in all their YoY metrics in Q4, ending last March. Customers have a quarter to report, so that reflects the December quarter and the surge in 5G phone sales beginning then. That should continue.
  • Many people, including me, think they are on the cusp of a new wave of growth centered around PCs and the data center.
So if the deal goes through, the Arm NVIDIA gets may already look quite different than the Arm of today.

What is Goodwill?

Data by YCharts

    When you see a chart like that it is a bit of a red flag that an acquiring company likely overpaid on a deal. What you are seeing there is the effect on NVIDIA’s balance sheet from their acquisition of Mellanox. The total tab there was $7.1 billion, but 48% of that, $3.4 billion, went into goodwill. What is goodwill? Here’s how NVIDIA explains it in their most recent reporting:

    We allocate the fair value of the purchase price of an acquisition to the tangible assets acquired, liabilities assumed, and intangible assets acquired, including in-process research and development, or IPR&D, based on their estimated fair values. The excess of the fair value of the purchase price over the fair values of these net tangible and intangible assets acquired is recorded as goodwill. [emphasis added]

    I will translate for you: “This is about how much we overpaid by, but we don’t want to take a write-down, not for a few years.”

    So this is NVIDIA admitting they overpaid for Mellanox by about 92%. It was an all-cash deal. You can bet that goodwill will be the first thing I look at when NVIDIA reports if the deal is consummated. Most of Arm should wind up in intangibles, because this is where IP and brands go. But if a huge chunk shows up in goodwill like the Mellanox deal, we will know they overpaid, by their own admission. Though again, $40 billion was for the headline, not the real number.
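As a quick illustration of that definition, here is the arithmetic on the Mellanox figures cited above; only the $7.1 billion price and the $3.4 billion of goodwill come from the text, and the net-assets line is simply the residual they imply.

# Goodwill as defined above: purchase price minus the fair value of the net
# tangible and intangible assets acquired. Mellanox figures are from the text;
# the net-assets number is just the residual implied by them.

purchase_price = 7.1   # $B, total Mellanox consideration
goodwill = 3.4         # $B, booked as goodwill

net_assets_acquired = purchase_price - goodwill          # ~$3.7B tangible + intangible, net
goodwill_share_of_price = goodwill / purchase_price      # ~48%, as stated above
goodwill_vs_net_assets = goodwill / net_assets_acquired  # ~92%, the author's "overpaid by" figure

print(f"Net assets acquired:     ${net_assets_acquired:.1f}B")
print(f"Goodwill as % of price:  {goodwill_share_of_price:.0%}")
print(f"Goodwill vs. net assets: {goodwill_vs_net_assets:.0%}")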

    For what it’s worth, SoftBank reported 79% of their Arm acquisition as goodwill, booking only 21% as tangible and intangible assets. Their description of goodwill is much more colorful and SoftBank-y:

    Goodwill reflects the excess earning power expected from future business development, congregative human resources related to research and development, and the synergy between the Company and the acquiree.

    Extra points for “synergy.”

What Does This All Mean For Arm Tech?

People with whom I speak regularly have been subjected to articles of this length in rant form since about 2008, after Apple bought PA Semi and started their own chip design. I started writing at Seeking Alpha in October 2018, and two months later wrote my first Arm Is Going To Take Over The World screed for publication here. It was my most-read article at that point, and it was because Intel and AMD fans lined up from far and wide to tell me what a moron I was to suggest that Arm would replace x86 in the data center.

    How much has changed in less than two years. Whether this deal goes through or not, it is a bellwether event for Arm. There is just a ton of momentum here with Amazon, Apple and now NVIDIA having huge Arm announcements in a year’s time. Soon, I won’t have to write the “What is Arm?” section, because everyone will already know.

    That this slow revolution is continuing is almost certain now. In fact, it is picking up steam.

Is This a Good Deal for NVIDIA?

If they were paying with $40 billion in debt-financed cash, I would say this is a terrible idea. But they are not. They are mostly paying with stock priced at 89 times TTM earnings, and so the dilution is pretty minor in the end. An eighth of the reported price is dependent on performance targets, unknown to us.

    So if this gets past regulators, which frankly I do not think will happen, it looks like a great deal for NVIDIA to me. That may surprise you if you’ve read the 5000 words that preceded this paragraph, but I am mostly playing devil’s advocate here, and trying to poke holes in their story. There are many, many holes.

    But in the end, NVIDIA is trying to do what Apple does regularly, and is so very successful with: control the core technologies behind its products. NVIDIA wants to build a data center behemoth built around the CPU-GPU-DPU-AI stack. They control three of the four layers, and the Arm acquisition would give them the fourth, and most important layer in the stack. They can build this Goliath licensing Arm technology, but they would be dependent on someone else’s roadmap. That someone else is mostly in the business of selling smartphone SoC IP, and is most interested in the future of IoT.

    But again, this is precisely what is going to scare Arm’s customers so much. The entire smartphone ecosystem is dependent on the continued development of those cores, and if the company is shifting its attention to the data center, the rest may languish down the road. Those customers may even start looking at RISC-V. They have at least 18 months to mull it over, which is a lifetime in tech.

    To be clear, the neutral rating on NVIDIA reflects the regulatory uncertainty surrounding this deal, and the large opportunity costs should it fail.

    Disclosure: I am/we are long AAPL. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.



From: Frank Sully 12/18/2020 1:00:07 PM
       of 1048
     
    Nvidia price target raised to $625 from $605 at Wells Fargo
    Wells Fargo analyst Aaron Rakers raised the firm's price target on Nvidia to $625 from $605 and keeps an Overweight rating on the shares. Rakers views Nvidia as remaining the strongest secular growth and platform story in semis driven by accelerating GPU server attach rates, the company's positioning for DPUs and an anticipated NVIDIA + Arm server CPU strategy. The analyst expects Nvidia to see continued positive GeForce RTX 3xxx-series product cycle through 1H21.



From: Frank Sully 12/20/2020 7:09:11 AM
       of 1048
     
    Nvidia’s Chips Have Powered Nearly Every Major AI Breakthrough
    etftrends.com
