
Technology Stocks: NVIDIA Corporation (NVDA)

From: Frank Sully, 6/23/2021 3:22:23 PM
Nvidia price target raised to $900 from $750 at Raymond James

Raymond James analyst Chris Caso raised the firm's price target on Nvidia to $900 from $750 and keeps a Strong Buy rating on the shares. Near-term trends are following the path the firm anticipated when it upgraded the stock from Outperform in April, and Caso still considers Nvidia the semiconductor company best positioned for long-term growth, the analyst tells investors in a research note. Following checks over the past week, Caso is convinced that datacenter has begun to reaccelerate as hyperscale digestion winds down, enterprise activity resumes, and virtualization rises.


From: Frank Sully, 6/24/2021 11:20:33 AM
AI Chipset Market Valuation is Estimated to Touch USD 38.46 Billion by 2025 at 39.18% CAGR - Report by Market Research Future (MRFR)

June 24, 2021 10:00 ET | Source: Market Research Future

New York, US, June 24, 2021 (GLOBE NEWSWIRE) -- Market Overview:

According to a comprehensive research report by Market Research Future (MRFR), “Global AI Chipset Market information by Components, by Technology and Application – forecast to 2027,” the market is projected to reach USD 38.46 billion, growing at a 39.18% CAGR between 2019 and 2025.
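As a quick sanity check on those figures, compounding at 39.18% per year over the six years from 2019 to 2025 implies a 2019 base of roughly USD 5.3 billion. This is my own arithmetic under that compounding assumption, not a figure from the report:

```python
# Consistency check of the press-release figures (illustrative arithmetic,
# assuming the CAGR compounds annually over the six years 2019-2025).
target_2025 = 38.46  # USD billion, per the press release
cagr = 0.3918        # 39.18% compound annual growth rate

implied_2019_base = target_2025 / (1 + cagr) ** 6
print(f"implied 2019 market size: USD {implied_2019_base:.2f} billion")
```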

Market Scope:

The global artificial intelligence chipset market is growing rapidly, witnessing advances in AI chipset technologies. Advancements in AI, 5G, and IoT technologies contribute to the AI chipset market growth. Moreover, increasing government initiatives to drive the development of IoT-based modules create significant market opportunities.

As the world becomes more digitized and connected, advanced technologies like AI gain prominence across the industrial spectrum. As a result, AI chipsets are expected to emerge as the workhorses of large-scale data processing, and the AI chipset market should see significant gains.

The market is majorly driven by an increase in the number of smart connected devices, the need to improve company efficiency, and an increase in reliance on the Internet. The increased need for real-time networking across industry sectors such as manufacturing, automobile, transportation, energy & utility, aerospace, and oil & gas is also aiding the market growth.

Dominant Key Players on AI Chipset Market Covered Are:

  • Micron Technology (US)
  • IBM Corporation (US)
  • Samsung Electronics Co. Ltd (South Korea)
  • Huawei Technologies Co. Ltd (China)
  • Advanced Micro Devices (US)
  • Intel Corporation (US)
  • NVIDIA Corporation (US)
  • Qualcomm Technologies Inc. (US)

Market USP exclusively encompassed:

Market Drivers

AI chipsets, also known as AI accelerators, play a crucial role in achieving the high speeds and efficiencies necessary for large-scale AI-specific calculations. Artificial intelligence is driving rapid transformation across the industrial landscape. As companies become more data-driven, the demand for AI is growing too.

Speech recognition, security systems, recommendation systems, medical imaging, and supply-chain management are areas where AI tools, algorithms, and computing power have enabled organizations to work more efficiently. AI chipsets are expected to be the most important component of future smart devices.

Rising demand for AI chipsets in industries like finance, education, logistics, transportation, and healthcare boosts the artificial intelligence chipset market growth. Besides, increasing AI-backed applications have significantly increased the demand for AI chipsets. Solution providers are making substantial investments to foster R&D activities to develop and improve AI chips.


From: Frank Sully, 6/24/2021 6:37:00 PM
Nvidia AI could let you make video calls in your PJs without anyone knowing

The model could make videoconferencing marginally more bearable

Story by Thomas Macaulay, writer at Neural by TNW. Thomas covers AI in all its iterations; likes Werner Herzog films and Arsenal FC.

Nvidia has unveiled an AI model that converts a single 2D image of a person into a “talking head” video.

Known as Vid2Vid Cameo, the deep learning model is designed to improve the experience of videoconferencing.

If you’re running late for a call, you could roll out of bed with your pajamas on and your hair disheveled, upload a photo of yourself dressed to impress, and the AI will map your facial movements to the reference image, leaving the other attendees unaware of the chaos behind the camera. That could be a boon for the chronically unkempt, but you should probably test the technique before you turn up in your birthday suit.

The system can also adjust your talking head’s viewpoint to show you looking straight at the screen, when secretly your eyes are fixed on a TV in the background.

They sound like nifty features for those of us who dread video calls, but the most useful aspect of the model may be bandwidth reduction. Nvidia says the technique can cut the bandwidth needed for video conferences by up to 10x.
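A back-of-the-envelope calculation hints at where the saving comes from: transmitting a handful of keypoint coordinates per frame is far cheaper than transmitting pixels. The byte sizes and bitrates below are my own assumptions for illustration, not NVIDIA's published numbers:

```python
# Illustrative arithmetic only; the per-coordinate size and the video
# bitrate are assumptions, not NVIDIA's figures.
KEYPOINTS = 20        # points per frame, per the article
BYTES_PER_COORD = 4   # assume float32 for each of x and y
FPS = 30

keypoint_bps = KEYPOINTS * 2 * BYTES_PER_COORD * FPS * 8  # bits/second
video_bps = 1_000_000  # assume ~1 Mbps for a compressed video stream

print(f"keypoint stream: {keypoint_bps / 1000:.1f} kbps")
print(f"ratio vs ~1 Mbps video: {video_bps / keypoint_bps:.0f}x")
```

Even this crude estimate lands above the "up to 10x" the article cites; a real system also sends a reference image, headers, and error correction, which narrows the gap.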

GANimated video

Vid2Vid Cameo is powered by generative adversarial networks (GANs), which produce the videos by pitting two neural networks against each other: a generator that tries to create realistic-looking samples, and a discriminator that attempts to work out whether they’re real or fake.

This enables the two networks to synthesize realistic videos from a single image of the user, which could be a real photo or a cartoon avatar. During the video call, the model will capture their real-time motion and apply it to the uploaded image.
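The adversarial setup described above can be illustrated with a deliberately tiny numerical sketch: a 1-D toy in which the generator is an affine map and the discriminator a logistic regression. Everything here is invented for illustration; it is not Nvidia's model or code:

```python
import numpy as np

# Toy 1-D GAN: "real" data is N(4, 1); the generator maps noise
# z ~ N(0, 1) through an affine map, and the discriminator is a
# logistic regression scoring real vs. fake. All names and settings
# are illustrative assumptions, not part of Vid2Vid Cameo.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
g_w, g_b = 1.0, 0.0   # generator: fake = g_w * z + g_b
d_w, d_b = 0.1, 0.0   # discriminator: p(real) = sigmoid(d_w * x + d_b)
lr, n = 0.05, 64

for _ in range(2000):
    real = rng.normal(4.0, 1.0, n)
    z = rng.normal(0.0, 1.0, n)
    fake = g_w * z + g_b

    # Discriminator step: binary cross-entropy, real -> 1, fake -> 0.
    pr = sigmoid(d_w * real + d_b)
    pf = sigmoid(d_w * fake + d_b)
    d_w -= lr * (np.mean((pr - 1.0) * real) + np.mean(pf * fake))
    d_b -= lr * (np.mean(pr - 1.0) + np.mean(pf))

    # Generator step (non-saturating loss): fool the discriminator,
    # i.e. push p(fake) toward 1, chained through fake = g_w * z + g_b.
    pf = sigmoid(d_w * fake + d_b)
    dfake = (pf - 1.0) * d_w
    g_w -= lr * np.mean(dfake * z)
    g_b -= lr * np.mean(dfake)

# After training, generated samples should center near the real mean (4).
```

The same tug-of-war, scaled up from scalars to image tensors, is what lets the full model produce frames the discriminator (and, ideally, the viewer) cannot tell from real footage.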

The model was trained on a dataset of 180,000 talking-head videos, which taught the network to identify 20 key points that encode the location of features including the mouth, eyes, and nose.

These points are then extracted from the image uploaded by the user to generate a video that mimics their appearance.
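As a rough sketch of that keypoint-transfer idea, the source photo contributes the appearance keypoints, and each driving frame contributes motion as a displacement from a driving reference frame. The function name, array layout, and plain-displacement rule below are my own assumptions for illustration, not the Maxine API:

```python
import numpy as np

# Illustrative sketch of keypoint-driven reenactment; the names and the
# plain-displacement transfer rule are assumptions, not Nvidia's method.
N_KEYPOINTS = 20  # per the article: 20 points covering mouth, eyes, nose

def transfer_motion(src_kp, drv_ref_kp, drv_frame_kp):
    """Apply the driving frame's displacement to the source keypoints.

    Each array has shape (N_KEYPOINTS, 2) holding (x, y) positions.
    """
    return src_kp + (drv_frame_kp - drv_ref_kp)

rng = np.random.default_rng(1)
src = rng.uniform(0, 1, (N_KEYPOINTS, 2))      # from the uploaded photo
drv_ref = rng.uniform(0, 1, (N_KEYPOINTS, 2))  # driving reference frame
drv_frame = drv_ref.copy()
drv_frame[:5, 1] += 0.03                       # e.g. the mouth points move

animated = transfer_motion(src, drv_ref, drv_frame)
```

In the real model, the moved keypoints then condition the generator, which renders a full photorealistic frame around them.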

The model will be available soon in the Nvidia Maxine SDK and the Nvidia Video Codec SDK, but you can already try a demo of it here. I gave it a go, and was pretty impressed by the system’s efforts, although I wouldn’t let it cut my hair.


From: Frank Sully, 6/25/2021 10:54:01 PM
NVIDIA CEO: Buckle Up for the Industrial HPC Revolution

AI paired with accelerated and high-performance computing has forged a digital flywheel that’s propelling super-exponential advances.

June 25, 2021 by PARESH KHARYA


From: Frank Sully, 6/26/2021 9:43:32 PM
This Chinese 7nm chip will take on next-gen Nvidia and AMD GPUs


From: Frank Sully, 6/28/2021 11:16:22 AM
NVIDIA and Google Cloud to Create Industry’s First AI-on-5G Lab to Speed Development of AI Everywhere


From: Frank Sully, 6/28/2021 11:22:20 AM
Nvidia eyes ARM roadmap for AI, 5G integration from server to card to chip


From: Frank Sully, 6/29/2021 8:17:08 PM
The names pushing AI toward a $1T chip market, according to Mizuho


From: Frank Sully, 6/30/2021 5:42:47 PM
From Genomes to Proteins to Cells, Digital Biology Revolution Marches on with HPC and AI

Global scientists get a read on genomics data with high performance computing systems and NVIDIA Clara Parabricks.

June 28, 2021 by RORY KELLEHER


From: Frank Sully, 6/30/2021 6:06:01 PM
Graphcore brings new competition to Nvidia in latest MLPerf AI benchmarks

Graphcore’s score in BERT natural-language training was the best among the two-socket AMD systems submitted.

By Tiernan Ray | June 30, 2021 -- 17:00 GMT (10:00 PDT)
