
Technology Stocks > NVIDIA Corporation (NVDA)

From: Frank Sully, 4/8/2021 1:41:32 AM
1 Recommendation
Photonic Supercomputer For AI: 10X Faster, 90% Less Energy, Plus Runway For 100X Speed Boost

John Koetsier
Senior Contributor

The Lightmatter photonic computer is 10 times faster than the fastest NVIDIA artificial intelligence GPU while using far less energy. And it has a runway for boosting that massive advantage by a factor of 100, according to CEO Nicholas Harris. In the process, it may just restart a moribund Moore’s Law.

Or completely blow it up.

“On typical workloads we’re up to 10 times faster than existing technologies like NVIDIA’s A100 chip,” Harris told me on a recent episode of the TechFirst podcast. “If you look at ResNet-50, which is a neural network that a lot of people operate; or BERT, which is a natural language processing neural network; or DLRM, which is a network that people use to recommend products to you ... we’re typically more than 10 times faster.”

10X faster than an NVIDIA A100 is a big deal.

The Lightmatter photonic computer core

NVIDIA markets the A100 as a component of “the most powerful accelerated server platform for AI and high performance computing,” saying it’s the “world’s first 5 petaFLOPS AI system.”

A petaflop is one thousand trillion, or one quadrillion, floating point operations per second. In comparison — and at serious risk of comparing apples to oranges — Apple’s new M1 chip reportedly delivers 2.6 teraflops. One petaflop is 1,000 teraflops (1,024 using the binary convention), so the A100 is screaming fast.
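As a sanity check, the comparison above is simple arithmetic. A quick sketch (keeping in mind that the 5-petaFLOPS number is NVIDIA's marketing figure for a full multi-GPU system, not a single chip, so this remains an apples-to-oranges comparison):

```python
# Back-of-envelope conversion of the throughput figures quoted above.
# Uses the decimal convention: 1 petaflop = 1,000 teraflops.
A100_SYSTEM_PFLOPS = 5.0   # NVIDIA's "5 petaFLOPS AI system" claim
M1_TFLOPS = 2.6            # Apple M1's reported throughput

a100_tflops = A100_SYSTEM_PFLOPS * 1_000
print(f"A100 system: {a100_tflops:.0f} teraflops")
print(f"Ratio vs. M1: ~{a100_tflops / M1_TFLOPS:.0f}x")  # ~1923x
```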

But Harris says the Lightmatter photonic computer is 10 times faster.

That’s impressive, to say the least, because it suggests a compute capacity of 50 petaflops or more per chip. In comparison, supercomputers run at performance levels of hundreds of petaflops, often using hundreds or thousands of chips to do so. Again, use a grain of salt: comparing vastly different computing infrastructures’ speeds using a single number is probably not a super-apt comparison. Lightmatter is not a general-purpose computer, and one-to-one speed comparisons may not completely make sense.

The point, however, is that it’s fast. Blazing fast.

But computing isn’t just about speed. It’s also about energy use — and heat. As everyone in technology knows, heat is a major problem impacting server farms all over the globe and limiting the speed CPUs can run at.

“Every time we shrink transistors they’re supposed to decrease how much energy they use, and that hasn’t been the case for the past 15 years,” Harris says. “And it’s turned into a really big energy problem and a challenge in cooling computer chips.”

Moore’s Law, named for Fairchild Semiconductor co-founder Gordon Moore, says that the number of transistors in chips doubles about every two years. The problem is that Moore’s Law petered out: as we’ve shrunk transistors, they’ve gotten on the scale of the electron, Harris says ... and now they’re getting leaky and less reliable. We’re no longer fitting more and more on a chip; instead we’re adding additional cores to chips.

A photonic computer, as the name implies, uses photons, not electrons. They’re not magic, and they’re not as good as electrons for some kinds of computing, like logic operations, control flows, and if/then statements.

As Harris says, a photonic computer is not going to run Windows.

But there are some things photonic computers like Lightmatter’s are really good at. And they turn out to be the sorts of things that are growing at exponential rates in today’s server farms and cloud computing centers:

AI. Machine learning. Neural nets.
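Concretely, the workloads listed above all reduce to dense linear algebra. A minimal, pure-Python sketch of the operation at the heart of a neural network — a matrix-vector multiply followed by a nonlinearity — which is the kind of computation a photonic matrix engine performs in the optical domain (illustrative only; real networks use optimized libraries, not loops like these):

```python
# One dense neural-network layer: y = relu(W @ x).
# This matrix-vector product is the operation photonic accelerators
# like Lightmatter's compute with light instead of electrons.

def layer(weights, x):
    """Apply one dense layer: multiply by the weight matrix, then ReLU."""
    y = [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in weights]
    return [max(0.0, v) for v in y]  # ReLU zeroes out negative activations

W = [[0.5, -1.0],
     [2.0, 0.25]]
print(layer(W, [1.0, 2.0]))  # -> [0.0, 2.5]
```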

Lightmatter will be shipping its photonic computers in a product it calls Envise by the end of the year, Harris says. Envise packages photonic processing cores with traditional transistor-based systems to offer the best of both worlds. An Envise blade includes 16 Envise chips in a 4U server configuration that uses a miserly three kilowatts of power.

“Envise is really the first photonic computer, period, that you can buy, and it addresses any kind of neural net,” says Harris. “So if you want to run algorithms behind Alexa or Siri or any of the voice assistants, Envise can run those. If you want to do translation, Envise can run that. If you want to identify things in images for your self-driving car, Envise can do that too.”

According to Lightmatter, engineers can use PyTorch, TensorFlow, or ONNX — the frameworks and formats they’re already used to — to build neural networks for Envise. Lightmatter offers a compiler, Idiom, to compile programs to native code for the photonic processor.

Perhaps the most exciting part of photonic computing, however, is a quality of photons that is totally impossible for electrons to duplicate: color.

Because light comes in different colors, each occupying its own place on the electromagnetic spectrum, you can run photonic computers on multiple colors. Simultaneously. Using the same hardware.

And that’s where Lightmatter’s photonic computers get scary fast.

“For every color we add, we increase the throughput by that number,” Harris says. “So two colors is twice as fast. Three colors is three times as fast, and the efficiency scales about the same way. So we think you can probably do 64 colors in the future. We’re not there yet, but we think that’s possible. Imagine having 64 virtual processors on a chip, and it’s just the area of one.”

Normal processors do one job at a time, even if they appear to human senses and timeframes to be multitasking. Photonic processors would run multiple jobs in multiple colors at the same time.
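The wavelength-multiplexing scaling Harris describes is linear, and can be sketched in a few lines (illustrative only: the base throughput figure is an assumption, and real hardware would face crosstalk and other losses rather than scaling perfectly):

```python
# Idealized sketch of the scaling Harris describes: each added color
# (wavelength) adds one full unit of throughput on the same silicon area,
# behaving like n_colors virtual processors on one chip.

def effective_throughput(base_tflops, n_colors):
    """Linear wavelength-multiplexing model: throughput scales with colors."""
    return base_tflops * n_colors

BASE = 50.0  # assumed single-color per-chip figure, in teraflops
for n in (1, 2, 3, 64):
    print(f"{n:>2} colors -> {effective_throughput(BASE, n):>7.1f} TFLOPS")
```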

Now you’re getting scary fast.

“I think that we have a roadmap that extends beyond 100X the current speed of accelerators,” Harris told me.

That essentially means you’d have the power of a room-sized supercomputer in a package the size of a large piece of carry-on luggage, running at up to 20 GHz or more.

Eventually, you might get small photonic systems in a laptop or even a smartphone. Much sooner, they’ll turn up in cloud-based systems.

“It’d be a dream of mine to eventually power a Google search,” says Harris. “A lot of that is run on neural networks.”

Lightmatter is pretty confident about shipping product in 2021. That said, everything is experimental until it’s not, and there are likely manufacturing and scaling challenges in the company’s way.

Assuming it all works out, however, we need photonic computers sooner rather than later. Already, data centers consume significant portions of the world’s total electricity supply: easily 1% but perhaps as much as 5% of all the electricity we generate. By 2025, some estimates say global computing could suck as much as 20% of all the world’s power ... with all the environmental damage that entails.

Photonic computing is low-power, and doesn’t need the cooling that existing CPUs and GPUs do. And, at far greater throughput for exactly the kinds of computing that are growing fast, it could just be the technology that reverses that power trend.

And, of course, enables continued growth in our use of AI and machine learning.

Get a full transcript of my conversation with Harris, or subscribe to TechFirst.


To: Frank Sully who wrote (1814), 4/8/2021 1:59:30 PM
From: engineer
What’s up with a 2008 date code on the chipset shown? Have they been working on the board and software since 2008?

Are they for real?


To: engineer who wrote (1815), 4/8/2021 6:51:50 PM
From: Frank Sully

They seem to be for real. They have a web site here.

Here is their history:

The story behind Lightmatter’s tech


From: Frank Sully, 4/8/2021 8:41:17 PM
GTC 2021 Keynote Watch Party ~Nvidia's CEO Jensen Huang ~ April 12th (Mon) 11:30 AM (EDT)


From: Frank Sully, 4/10/2021 12:20:30 AM
What To Expect From Nvidia's Analyst Day on April 12

Shanthi Rexaline, Benzinga Staff Writer

April 09, 2021 4:33pm

NVIDIA Corporation (NVDA) shares have recovered from the market-wide tech sell-off. The company has an immediate catalyst in the form of its Analyst Day, scheduled for Monday, April 12, 1 p.m. to 3 p.m. EDT.

The Nvidia Analyst: Credit Suisse analyst John Pitzer maintained an Outperform rating and $620 price target for Nvidia shares.

The Nvidia Thesis: The key near-term issues the company has to address are Gaming over-earning, the timeline for reaccelerating year-over-year growth in the core Data Center Group, and the regulatory process around the Arm Holdings acquisition, analyst Pitzer said in a note.

The event, the analyst said, is likely to underscore key long-term EPS drivers, which continue to increase.

The company will likely highlight growing proof-points of a $100 billion+ total addressable market for the DCG, including $45 billion for Cloud, $30 billion for Enterprise and $15 billion for Edge, Pitzer said. Nvidia could shed light on growing software monetization, with AI Application Frameworks, the analyst added.

The company is also likely to emphasize the still-robust Gaming market, with or without crypto, and the growing momentum in autonomous driving, the analyst said.

The opportunity, according to the analyst, clearly supports a long-term gross margin of 70%, an operating margin of 50% and a free cash flow margin of 30%. For the calendar year 2020, these metrics are expected at 66%, 41% and 24%, respectively, supporting the calendar year 2021 EPS of $13.35, roughly in line with the consensus, the analyst said.

NVDA Price Action: Nvidia shares were down 0.58% at $576 at market close Friday.


From: Frank Sully, 4/10/2021 10:07:09 AM
This Genius Move Has All but Guaranteed NVIDIA's Role in the Future of Tech


From: Frank Sully, 4/11/2021 11:21:53 PM
1 Recommendation
Why Nvidia Is My “Slam Dunk” Investment for the Decade


From: Frank Sully, 4/12/2021 1:45:23 PM
NVIDIA Unveils Grace: A High-Performance Arm Server CPU For Use In Big AI Systems
by Ryan Smith on April 12, 2021, 12:20 PM EDT


From: Frank Sully, 4/12/2021 2:54:25 PM
GTC 2021 Keynote with NVIDIA CEO Jensen Huang


From: Frank Sully, 4/12/2021 2:59:33 PM
Nvidia Entangled in Quantum Simulators
