There are likely more good times ahead, since China's Internet market is growing at a fast clip: according to iResearch, the number of Internet search users in China is projected to grow at a compound annual growth rate of 27.5% from 2005 to 2007.
"This company has a growth opportunity that's going to be seen by investors as enormous," says Paul Bard, an analyst with Renaissance Capital LLC in Greenwich, Conn. "And I'm sure what people are going to say is, wow, this could be what Google was three years ago."
Even Google appears to think the company is a good investment: The U.S. search engine owns 2.6% of Baidu, according to Securities and Exchange Commission filings, and isn't listed among the entities that plan to sell their stakes in the IPO.
The IPO is to be underwritten by Goldman Sachs Group Inc.'s Goldman Sachs (Asia) LLC, Credit Suisse Group's Credit Suisse First Boston and Piper Jaffray Cos., and will trade in the form of American depositary shares on the Nasdaq Stock Market under the symbol BIDU.
Baidu is selling 3.7 million shares to the public, a 12% sliver of the 31.7 million shares that will be outstanding after the offering. The company's executives, directors, and venture-capital firms will continue to own substantial stakes in Baidu after it goes public.
(from The Wall St Journal)
Goldman Sachs is lead underwriter and it is expected to begin trading at $19-$20. Here is their website.
Update: August 1, 2021
In recent years Baidu has invested heavily in artificial intelligence (AI) and has become China's leading AI company. This is described in CEO Robin Li's excellent 2020 book, which I highly recommend. For the current state of affairs, see the seven-minute video below.
Book: “Artificial Intelligence Revolution: How AI Will Change Our Society, Economy, And Culture” by Baidu CEO Robin Li
Video: “Baidu: How China Became The World Champion In AI”
Baidu Is No Longer A Search Engine Play, But An AI Bet
Baidu has invested heavily in AI search engines (obviously), autonomous vehicles, and its Baidu Brain platform, and has a $2 billion AI chip subsidiary.
Self-driving vehicles: Message 33420466 Message 33420467
Baidu Brain: Message 33420592
Risk 1: China regulation: From tech to education investors-corner.bnpparibas-am.com
Risk 2: Chinese tech stocks slump as U.S. SEC begins rollout of law aimed at delisting mobile.reuters.com
Risk 3: Invest in China, but Without Illusions nytimes.com
AI in general:
For a comprehensive discussion of AI and AI companies in general, see the Artificial Intelligence, Robotics and Automation board moderated by my friend Glenn Petersen. Subject 59856
Modern AI is based on Deep Learning algorithms. Deep learning is a subset of machine learning built on neural networks with three or more layers. These neural networks attempt to simulate the behavior of the human brain (albeit far from matching its ability), allowing them to "learn" from large amounts of data.
Deep Learning algorithms for AI. The first, one-hour video explains how this works. Amazingly, it is just least-squares minimization of the neural network's loss function using multi-dimensional gradient descent (not Newton-Raphson, which requires second derivatives; gradient descent needs only first derivatives). See the second, half-hour video. Who thought Calculus would come in handy?
1. MIT Introduction to Deep Learning
2. Gradient Descent, Step-by-Step
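The idea in the videos above can be sketched in a few lines of Python. This is a toy illustration, not how the videos implement it: we fit y = w*x to made-up data by repeatedly stepping the weight w opposite the gradient of the least-squares loss (all function names and numbers here are my own, for illustration).

```python
# Toy gradient descent: fit y = w*x to data by minimizing mean squared error.

def loss(w, data):
    # Mean squared error of the prediction w*x over the data set.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def gradient(w, data):
    # Derivative of the mean squared error with respect to w.
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

def gradient_descent(data, w=0.0, lr=0.01, steps=1000):
    # Step opposite the gradient; each step reduces the loss a little.
    for _ in range(steps):
        w -= lr * gradient(w, data)
    return w

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # noiseless data: y = 3x
w = gradient_descent(data)
print(round(w, 3))  # converges toward 3.0
```

Since the loss is a simple bowl (quadratic) in w, the steps settle into its single bottom; deep networks do the same thing, just with millions of weights instead of one.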
Math Issues: Optimizing With Multiple Peaks Or Valleys
A problem with gradient descent optimization is that it only follows the local slope, so when a function has multiple peaks and valleys it converges to the bottom of whichever valley the starting point happens to lie in. More properly, gradient descent finds a local minimum, while in machine learning one is interested in the global minimum. This makes the problem considerably more difficult, particularly since loss functions for deep learning neural networks can have millions or even billions of parameters.
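A one-dimensional example makes the local-minimum trap concrete. The function below (my own made-up example) has two valleys, one deeper than the other; the same gradient descent loop started at x = 1 gets stuck in the shallow valley, while started at x = -1 it finds the deep one.

```python
# f has two valleys: a shallow one near x = +1 and a deeper one near x = -1.
def f(x):
    return (x * x - 1) ** 2 + 0.3 * x

def df(x):
    # Derivative of f, used by gradient descent.
    return 4 * x * (x * x - 1) + 0.3

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: slides downhill from the starting point x.
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_shallow = descend(1.0)    # trapped in the shallow valley near +1
x_deep    = descend(-1.0)   # reaches the deeper valley near -1
print(f(x_shallow), f(x_deep))  # the first value is higher: a worse minimum
```

Both runs stop where the slope is zero, but only one of them is the global minimum; which one you get depends entirely on where you start. That is exactly the difficulty described above, in one dimension instead of billions.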
Another problem has to do with the size of the data sets used to train deep learning neural networks, which can be huge. Since gradient descent is an iterative process, it becomes prohibitively time-consuming to evaluate the loss function at each and every data point, even with high-performance AI chips. This leads to stochastic gradient descent: the loss function is evaluated at a relatively small random sample of the data at each iterative step.
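The stochastic variant just described can be sketched by changing one line of the toy example: instead of computing the gradient over the whole data set, each step uses a small random sample (mini-batch). Again, a made-up illustration with names of my own choosing.

```python
import random

def sgd(data, w=0.0, lr=0.01, steps=2000, batch=2):
    # Stochastic gradient descent: each step estimates the gradient
    # from a small random sample of the data instead of all of it.
    random.seed(0)  # fixed seed so the run is reproducible
    for _ in range(steps):
        sample = random.sample(data, batch)
        g = sum(2 * (w * x - y) * x for x, y in sample) / batch
        w -= lr * g
    return w

data = [(x, 3.0 * x) for x in (1.0, 2.0, 3.0, 4.0)]  # noiseless: y = 3x
w = sgd(data)
print(round(w, 3))  # still converges toward 3.0
```

Each individual step uses a noisy estimate of the true gradient, but on average the steps still point downhill, and evaluating 2 points per step instead of the full data set is what makes training on huge data sets feasible.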
Stochastic Gradient Descent:
Recent Earnings History:
Further updates will be provided as deemed necessary.