
Technology Stocks : NVIDIA Corporation (NVDA)

From: Glenn Petersen, 9/1/2017 4:03:28 PM

Why a 24-Year-Old Chipmaker Is One of Tech’s Hot Prospects

Nvidia, a maker of graphics processing units, is riding an artificial intelligence boom to put its chips in drones, robots and self-driving cars.

By DON CLARK
New York Times
SEPT. 1, 2017



Nvidia’s new Volta computer chip, which, according to the company, cost an estimated $3 billion to develop. Credit Christie Hemm Klok for The New York Times
___________________________________

SANTA CLARA, Calif. — Engineers at CTA.ai, an imaging-technology start-up in Poland, are trying to popularize a more comfortable alternative to the colonoscopy. To do so, they are using computer chips that are best known to video game fans.

The chips are made by the Silicon Valley company Nvidia. Its technology can help sift speedily through images taken by pill-size sensors that patients swallow, allowing doctors to detect intestinal disorders 70 percent faster than if they pored over videos. As a result, procedures cost less and diagnoses are more accurate, said Mateusz Marmolowski, CTA’s chief executive.

Health care applications like the one CTA is pioneering are among Nvidia’s many new targets. The company’s chips — known as graphics processing units, or GPUs — are finding homes in drones, robots, self-driving cars, servers, supercomputers and virtual-reality gear. A key reason for their spread is how rapidly the chips can handle complex artificial-intelligence tasks like image, facial and speech recognition.

Excitement about A.I. applications has turned 24-year-old Nvidia into one of the technology sector’s hottest companies. Its stock-market value has swelled more than sevenfold in the past two years, topping $100 billion, and its revenue jumped 56 percent in the most recent quarter.

Nvidia’s success makes it stand out in a chip industry that has experienced a steady decline in sales of personal computers and a slowing in demand for smartphones. Intel, the world’s largest chip producer and a maker of the semiconductors that have long been the brains of machines like PCs, had revenue growth of just 9 percent in the most recent quarter.



A demonstration room on the Nvidia campus in Santa Clara, Calif. Excitement about the use of its chips in artificial intelligence applications has made Nvidia one of the tech sector’s hottest companies. Credit Christie Hemm Klok for The New York Times
_______________________________________

“They are just cruising,” Hans Mosesmann, an analyst at Rosenblatt Securities, said of Nvidia, which he has tracked since it went public in 1999.

Driving the surge is Jen-Hsun Huang, an Nvidia founder and the company’s chief executive, whose strategic instincts, demanding personality and dark clothes prompt comparisons to Steve Jobs.

Mr. Huang — who, like Mr. Jobs at Apple, pushed for a striking headquarters building, which Nvidia will soon occupy — made a pivotal gamble more than 10 years ago on a series of modifications and software developments so that GPUs could handle chores beyond drawing images on a computer screen.

“The cost to the company was incredible,” said Mr. Huang, 54, who estimated that Nvidia had spent $500 million a year on the effort, known broadly as CUDA (for compute unified device architecture), when the company’s total revenue was around $3 billion. Nvidia puts its total spending on turning GPUs into more general-purpose computing tools at nearly $10 billion since CUDA was introduced.

Mr. Huang bet on CUDA as the computing landscape was undergoing broad changes. Intel rose to dominance in large part because of improvements in computing speed that accompanied what is known as Moore’s Law: the observation that, through most of the industry’s history, manufacturers packed twice as many transistors onto chips roughly every two years. Those improvements in speed have now slowed.



Nvidia’s chief executive, Jen-Hsun Huang, made a pivotal bet more than 10 years ago on a series of modifications and software developments to the company’s graphics processing units, or GPUs. Credit Ethan Miller/Getty Images
_______________________________

The slowdown led designers to start dreaming up more specialized chips that could work alongside Intel processors and wring more benefits from the miniaturization of chip circuitry. Nvidia, which repurposed existing chips instead of starting from scratch, had a big head start. Using its chips and software it developed as part of the CUDA effort, the company gradually created a technology platform that became popular with many programmers and companies.

“They really were well led,” said John L. Hennessy, a computer scientist who stepped down as Stanford University’s president last year.

Now, Nvidia chips are pushing into new corporate applications. The German business software giant SAP, for example, is promoting an artificial-intelligence technique called deep learning and using Nvidia GPUs for tasks like accelerating accounts-payable processes and matching resumes to job openings.

SAP has also demonstrated Nvidia-powered software to spot company logos in broadcasts of sports like basketball or soccer, so advertisers can learn about their brands’ exposure during games and take steps to try to improve it.

“That could not be done before,” said Juergen Mueller, SAP’s chief innovation officer.

Such applications go far beyond the original ambitions of Mr. Huang, who was born in Taiwan and studied electrical engineering at Oregon State University and Stanford before taking jobs at Silicon Valley chipmakers. He started Nvidia with Chris Malachowsky and Curtis Priem in 1993, setting out initially to help PCs offer visual effects to rival those of dedicated video game consoles.

The company’s original product was a dud, Mr. Malachowsky said, and the graphics market attracted a mob of rivals.

But Nvidia retooled its products and strategy and gradually separated itself from the competition to become the clear leader in the GPU-accelerator cards used in gaming PCs.

GPUs generate triangles to form framelike structures, simulating objects and applying colors to pixels on a display screen. To do that, many simple instructions must be executed in parallel, which is why graphics chips evolved with many tiny processors. A new GPU announced by Nvidia in May, called Volta, has more than 5,000 such processors; a new, high-end Intel server chip, by contrast, has just 28 larger, general-purpose processor cores.
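
To make that concrete, here is a minimal CUDA kernel sketch, not Nvidia's production code; the kernel name toGrayscale and the standard luminance weights are illustrative. Each of thousands of threads runs the same few instructions on its own pixel, which is exactly the many-tiny-processors model described above. (The host-side launch is sketched after the next paragraph.)

```cuda
// Illustrative sketch: one thread per pixel, each executing a few simple
// instructions in parallel, the workload GPUs evolved to handle.
__global__ void toGrayscale(const uchar4 *rgba, unsigned char *gray, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's pixel index
    if (i < n) {
        uchar4 p = rgba[i];
        // Standard Rec. 601 luminance weighting of the red/green/blue channels.
        gray[i] = (unsigned char)(0.299f * p.x + 0.587f * p.y + 0.114f * p.z);
    }
}
```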

Nvidia began its CUDA push in 2004 after hiring Ian Buck, a Stanford doctoral student and company intern who had worked on a programming challenge that involved making it easier to harness a GPU’s many calculating engines. Nvidia soon made changes to its chips and developed software aids, including support for a standard programming language rather than the arcane tools used to issue commands to graphics chips.
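
The host side of that sketch shows the "standard programming language" point: ordinary C calls into the CUDA runtime API allocate device memory and launch the kernel, with none of the arcane graphics tooling involved. This is a hedged illustration, not code from the article; pasted after the kernel above, it forms one .cu file compilable with nvcc.

```cuda
#include <cuda_runtime.h>

// Kernel defined in the sketch above.
__global__ void toGrayscale(const uchar4 *rgba, unsigned char *gray, int n);

int main(void)
{
    const int n = 1920 * 1080;          // one HD frame's worth of pixels
    uchar4 *d_rgba;
    unsigned char *d_gray;

    // Ordinary C function calls; a real program would also cudaMemcpy an
    // image into d_rgba before launching.
    cudaMalloc(&d_rgba, n * sizeof(uchar4));
    cudaMalloc(&d_gray, n);

    // Launch enough 256-thread blocks to give every pixel its own thread.
    int blocks = (n + 255) / 256;
    toGrayscale<<<blocks, 256>>>(d_rgba, d_gray, n);
    cudaDeviceSynchronize();

    cudaFree(d_rgba);
    cudaFree(d_gray);
    return 0;
}
```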

The company built CUDA into consumer GPUs and high-end products. That decision was critical, Mr. Buck said, because it meant researchers and students who owned laptops or desktop PCs for gaming could tinker on software in campus labs and dorm rooms. Nvidia also convinced many universities to offer courses in its new programming techniques.



Nvidia’s new headquarters in Santa Clara during construction last year. The company’s stock-market value has swelled more than sevenfold in the past two years. Credit Ramin Rahimian for The New York Times
__________________________

Programmers gradually adopted GPUs for applications used in, among other things, climate modeling and oil and gas discovery. A new phase began in 2012 after Canadian researchers began to apply CUDA and GPUs to unusually large neural networks, the many-layered software required for deep learning.

Those systems are trained to perform tricks like spotting a face by exposure to millions of images instead of through definitions established by programmers. Before the emergence of GPUs, Mr. Buck said, training such a system might take an entire semester.

Aided by the new technology, researchers can now complete the process in weeks, days or even hours.
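
One way to see why training gains so much from GPUs: the core weight update in stochastic gradient descent touches every parameter independently, so each parameter can get its own thread. The sketch below is illustrative only; the names sgdStep and lr are hypothetical, not from the article, and real frameworks built on CUDA use far more elaborate fused kernels in this same spirit.

```cuda
// Illustrative sketch of a parallel gradient-descent step: every thread
// updates one weight, so millions of parameters advance simultaneously.
__global__ void sgdStep(float *weights, const float *grads, float lr, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's parameter
    if (i < n)
        weights[i] -= lr * grads[i];   // w <- w - learning_rate * gradient
}
```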

“I can’t imagine how we’d do it without using GPUs,” said Silvio Savarese, an associate professor at Stanford who directs the SAIL-Toyota Center for A.I. Research at the university.

Competitors argue that the A.I. battle among chipmakers has barely begun.

Intel, whose standard chips are widely used for A.I. tasks, has also spent heavily to buy Altera, a maker of programmable chips; start-ups specializing in deep learning and machine vision; and the Israeli car technology supplier Mobileye.

Google recently unveiled the second version of an internally developed A.I. chip that helped beat the world’s best player of the game Go. The search giant claims the chip has significant advantages over GPUs in some applications. Start-ups like Wave Computing make similar claims.

But Nvidia will not be easy to dislodge. For one thing, the company can afford to spend more than most of its A.I. rivals on chips — Mr. Huang estimated Nvidia had plowed an industry record $3 billion into Volta — because of the steady flow of revenue from the still-growing gaming market.

Nvidia said more than 500,000 developers are now using GPUs. And the company expects other chipmakers to help expand its fan base once it freely distributes an open-source chip design they can use for low-end deep learning applications — light bulbs or cameras, for instance — that it does not plan to target itself.

A.I., Mr. Huang said, “will affect every company in the world. We won’t address all of it.”

nytimes.com
