Technology Stocks | ASML Holding NV

From: BeenRetired  10/5/2017 8:32:54 AM
   of 6595

Google "building JavaScript AI in web browsers"..................................................

Here's a gentle guide to building JavaScript AI in web browsers. Totally not a scary thing

Google unwraps toy image rec neural net

By Katyanna Quach 5 Oct 2017 at 01:14

Google today popped online something called Teachable Machine, a simple demo for programmers interested in deep learning.

The point is, it works directly in your web browser so you can get going tinkering with an educational neural network right away without having to spin up a full machine-learning development stack and toolchain.

The demo isn’t particularly useful, but it does teach you the basics of how AI-powered image recognition works. The input data used to train the model is a series of photos from your computer’s webcam. You then give it more data, such as more snaps from the webcam, and you're rewarded with GIFs, sounds, or speech depending on the matches.

So for example, you can train the model to output a GIF of a cat if you hold up one finger, and a GIF of a dog if you hold up your hand. Then in inference mode, depending on whether you hold up a finger or a hand, you'll be shown the matching GIF. There’s a bar that shows you how confident the image recognition system is at pulling up the correct corresponding output.

It shows users the importance of training data and how image recognition models can be tricked. If the training examples are very similar to each other or you didn't feed it enough stills from the webcam, it can confuse the model and the confidence bar is low.
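Under the hood, a demo like this amounts to a nearest-neighbour classifier over image embeddings: each webcam snap becomes a feature vector, and confidence is just the share of nearby training examples in each class. A minimal Python sketch of the idea (the real demo runs on deeplearn.js in the browser; the toy "embeddings" and class names below are made up for illustration):

```python
import numpy as np

def knn_confidences(example, train_embeddings, train_labels, k=5):
    """Classify one embedding by its k nearest neighbours; per-class
    confidence is the fraction of those k neighbours in each class."""
    dists = np.linalg.norm(train_embeddings - example, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return {label: votes.count(label) / k for label in set(train_labels)}

# Toy "webcam" data: one class clusters near 0.0, the other near 1.0.
rng = np.random.default_rng(0)
train = np.vstack([rng.normal(0.0, 0.1, (20, 4)),
                   rng.normal(1.0, 0.1, (20, 4))])
labels = ["one_finger"] * 20 + ["open_hand"] * 20

# An example close to the "open_hand" cluster gets high confidence there;
# similar classes or too few training snaps would flatten these numbers.
conf = knn_confidences(np.full(4, 0.95), train, labels)
print(conf)
```

This also makes the article's point concrete: if the two training sets overlap (clusters close together) or are tiny, the vote splits and the confidence bar drops.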

Teachable Machine is just an experiment, and its source code is here. What’s underneath it, powering the system, is much, much more interesting: it's deeplearn.js. Google launched deeplearn.js – an open source software library written in JavaScript that runs machine learning models in web browsers – in August.

It’s a toolkit that allows novice and pro programmers to build and run simple neural networks, potentially without any coding: functions selected from drop-down menus can be used to change the dataset, model type, and some hyper-parameters such as the number of layers, during development. “It won't rival state-of-the-art setups for training and running deep neural nets, [but it] can do real deep learning,” a Google spokesperson told The Register.

There is also an option to play with ImageNet, a large dataset of pictures useful for object recognition, using your webcam. You can additionally tinker with NNArt to create artwork. This uses compositional pattern-producing networks to generate the moving images. And there's a tool called Benchmarks to test how fast various standard machine-learning tasks run in browsers. Have fun.


From: BeenRetired  10/6/2017 7:41:11 AM
   of 6595
Google Clips camera onboard AI.......................................................................

From what I've read about Google's new stuff, Google wants Smart things to have AI, too. Not just their Cloud.
Classic bit inflation. My bright thingy is brighter than yours at play. Classic need for competitive differentiation.

Shrink has enabled all this.

Bits bonanza.


Google Clips is a tiny camera that uses AI to automatically photograph moments

In addition to its Pixel phones and Home devices, Google announced a surprise today during its Pixel event. The company has come out with a camera that uses artificial intelligence to capture intimate moments that you aren’t able to get on your own. If your dog or baby is camera-shy, you can plant the Google Clips camera somewhere nearby to automatically take photos for you. The camera is trained to capture soundless video of faces and pets that it recognizes.

As shown in the demo onstage, Google Clips looks to be targeting parents, allowing them to focus more on interacting with their kids and pets than holding a camera in their hand. By only capturing soundless video, Google Clips dodges any laws against wiretaps. When the camera is on, an LED light blinks to let those in the room know they are being photographed.
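The capture behaviour described above can be sketched as a thresholding loop over per-frame "interestingness" scores from an on-device model. This is a hypothetical illustration, not Google's actual algorithm; the scores, threshold, and clip length are all assumptions:

```python
def capture_clips(frame_scores, threshold=0.8, clip_len=3):
    """Scan a stream of per-frame interest scores (as a Clips-style
    on-device model might emit) and record a short clip of frame
    indices whenever the score crosses the threshold."""
    clips, i = [], 0
    while i < len(frame_scores):
        if frame_scores[i] >= threshold:
            clips.append(list(range(i, min(i + clip_len, len(frame_scores)))))
            i += clip_len  # skip ahead so we don't re-trigger mid-clip
        else:
            i += 1
    return clips

scores = [0.1, 0.2, 0.9, 0.85, 0.3, 0.95, 0.2]
print(capture_clips(scores))  # -> [[2, 3, 4], [5, 6]]
```

The point of doing this on the camera itself is that only the handful of frames that clear the threshold ever need to be stored or transferred.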

You can export the recordings as video, photos, or GIFs. It comes in a two-tone white-and-teal color pairing, with a battery life of three hours of continuous use. It has a 12-megapixel sensor and a 130-degree field-of-view lens, and takes photos at 15 fps. Lastly, it has 8GB of internal memory.

It’s currently only compatible with Pixels, iPhone 8 and 8 Plus, and the Samsung Galaxy S7 and S8, and the photos are transferred over from the camera to devices through Wi-Fi. The Google Clips camera will be available for $249.


From: BeenRetired  10/7/2017 8:47:56 AM
   of 6595
Bit intense "ARRIS...802.11ac Wave 2-capable Modem".............................

"With security such a paramount concern these days, both devices utilize ARRIS Secure Home Internet by McAfee to help keep all the connected items on your network safe from malware and phishing scams."

Intel Inside.

Even cable modems have become very bright. I remember the super-duper Hayes dial stuff.

This is just the start.


ARRIS Announces 802.11ac Wave 2-capable SBG6950-AC2 & SBG7400-AC2 Cable Modem Gateways

by Joe Shields on September 27, 2017 6:00 PM EST

ARRIS introduced two new Wave 2 Wi-Fi gateways to its Secure Home Gateway portfolio with its ARRIS SURFboard SBG6950-AC2 and SBG7400-AC2 gateways. The Secure Home Gateway series of devices merge the functionality of a DOCSIS 3.0 cable modem, 4-Port Gigabit Ethernet router, Ethernet hub, and dedicated security protection for any connected devices. The two devices promise multi-Gigabit Wi-Fi capability while the addition of the latest Wave 2 Wi-Fi technology in the Secure Home Gateway lineup allows more devices to share the increased bandwidth capabilities.

For those unfamiliar, Wave 2 technology was certified by the Wi-Fi Alliance in late June 2016, with enterprise-grade products coming to market before that time. Wave 2's marquee feature is MU-MIMO beamforming, to enable improved performance and network utilization with multiple connected devices. While Gigabit speeds over Wi-Fi were achieved with the Wave 1 standard, Wave 2 has an increased PHY (physical) rate of 2.34 Gbps (Wave 1 maxed out at 1.3 Gbps). Other features include evolutionary speed improvements such as 160 MHz channels (Wave 1 was 20, 40, and 80 MHz), four spatial streams (up from 3), and extended 5 GHz channel support. Assuming the channels are set for Wi-Fi use, it will support more users and devices overall. While Wave 2 isn't brand new, devices supporting the new standard are still coming out now.
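The Wave 1 ceiling quoted above is easy to reproduce from the standard 802.11ac PHY parameters. A back-of-the-envelope sketch (the parameter values are the usual VHT numbers from the standard, not figures taken from this article):

```python
def vht_phy_rate_mbps(data_subcarriers, bits_per_subcarrier, coding_rate,
                      streams, symbol_us=3.6):
    """Rough 802.11ac PHY rate in Mbps: data subcarriers x bits per
    subcarrier x coding rate x spatial streams, divided by the OFDM
    symbol time (3.6 microseconds with the short guard interval)."""
    return data_subcarriers * bits_per_subcarrier * coding_rate * streams / symbol_us

# Wave 1 maximum: 80 MHz channel (234 data subcarriers), 256-QAM
# (8 bits/subcarrier), rate-5/6 coding, 3 spatial streams -> 1300 Mbps.
wave1 = vht_phy_rate_mbps(234, 8, 5/6, 3)
print(round(wave1))  # 1300

# A single Wave 2 stream on a 160 MHz channel (468 data subcarriers)
# already delivers ~867 Mbps; adding streams scales from there.
per_stream_160 = vht_phy_rate_mbps(468, 8, 5/6, 1)
print(round(per_stream_160))  # 867
```

Doubling the channel width doubles the per-stream rate, which is why the jump from Wave 1's 80 MHz ceiling to Wave 2's 160 MHz channels matters as much as the extra spatial stream.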

The two ARRIS gateways offer Wave 2 Next-Gen Wi-Fi to deliver maximum bandwidth and performance across connected devices. Download speeds of up to 1 Gbps and Wi-Fi speeds up to 2350 Mbps are possible with the SBG7400-AC2 and its 24x8 channel setup, while the SBG6950-AC2 supports 686 Mbps and Wi-Fi speeds up to 1900 Mbps through a 16x4 channel setup. Both gateways allow for up to 4 wired devices through the Ethernet ports on the back.
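The 686 Mbps and "up to 1 Gbps" figures fall straight out of DOCSIS 3.0 channel-bonding arithmetic: each 6 MHz 256-QAM downstream channel carries roughly 42.88 Mbps raw. A quick check (the per-channel rate is the standard DOCSIS value, assumed here rather than stated in the article):

```python
def docsis3_downstream_mbps(channels, per_channel_mbps=42.88):
    """DOCSIS 3.0 downstream capacity from channel bonding: each 6 MHz
    256-QAM downstream channel carries about 42.88 Mbps raw."""
    return channels * per_channel_mbps

sbg6950 = docsis3_downstream_mbps(16)  # 16x4 setup
sbg7400 = docsis3_downstream_mbps(24)  # 24x8 setup
print(round(sbg6950), round(sbg7400))  # 686 1029
```

The 16-channel gateway's 686 Mbps matches exactly, and 24 channels yields roughly 1029 Mbps, which marketing rounds to "up to 1 Gbps".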

With security such a paramount concern these days, both devices utilize ARRIS Secure Home Internet by McAfee to help keep all the connected items on your network safe from malware and phishing scams.


From: BeenRetired  10/7/2017 10:28:47 AM
   of 6595

Pricey, bit intense Watch 3 a runaway success...........................................................................

From what I've read, selling like hotcakes.

Others will have to pick up the pace in this bit intense world.
EUV/ArF bonanza.


Apple Watch Series 3 review: The wearable leader runs out to an insurmountable lead

There's none better, with or without LTE

If you want to pick nits, it’s about a millimeter thicker than the Series 1 model Apple is still selling. But that’s with more storage (16GB versus 8GB), a bigger battery, GPS, 50-meter water resistance, a barometric altimeter, and, of course, cellular. I’ve tested several LTE-enabled Android Wear watches that make the 42mm Apple Watch look small, so putting such capabilities in the 38mm model is nothing less than a remarkable feat of engineering.

Speed and other internal improvements

While it might look the same as models that came before, Apple Watch Series 3 couldn’t be more different on the inside. Along with LTE, there’s also a new S3 processor and W2 wireless chip, which together give the Series 3 Apple Watch a tremendous speed boost.

[Photo: Doug Duvall/IDG] It only takes a second or two for apps to refresh with Apple Watch Series 3.

Navigation and animations are much smoother now, but most importantly, apps open much quicker. The speed of third-party apps was a pretty major pain point with previous generations of Apple Watch (particularly the original model, which most people will be upgrading from), and the new internals make a huge difference. I didn’t experience any lag when launching stock apps, and third-party ones rarely showed the spinning loading ring while updating. Even raise-to-wake seems quicker (though the lack of an always-on display is still a bummer).

That makes Apple Watch Series 3 much more of a standalone device, even without LTE. Where I mostly relied on my old Apple Watch for quick notifications, by the end of my testing I was instinctively using my Series 3 to respond to messages, check sports scores, even read headlines. Siri’s responsiveness is particularly impressive, but everything from stocks to sports to weather now load within a second or two. By the time the S4 chip comes around, watch apps will be just as fast as the ones on our iPhones, if not faster.


From: BeenRetired  10/8/2017 7:05:56 AM
   of 6595

"Google’s Clips camera is powered by a tailor-made AI chip"..................................

"More and more companies are turning to tech like this to improve on-device AI."

The AI bonanza has just started. EUV/ArF AI.


Google’s Clips camera is powered by a tailor-made AI chip

Google’s Clips camera may be a little creepy, but it also seems pretty useful for the right user, deploying machine learning to automatically snap the best pictures of your kids and pets. But the key to that functionality isn’t just Google’s AI prowess; it also requires a specialized processor built by the Intel-owned chipmaker Movidius.

The chip in question is the Myriad 2, which Movidius describes as a “visual processing unit” or VPU. (That’s as opposed to a graphics processing unit, GPU; or central processing unit, CPU.) The Myriad 2 is a processor tailor-made to handle machine vision tasks like object recognition, and Movidius claims it’s the “industry’s first always-on vision processor.” It’s previously shown up in Google’s Project Tango devices as well as DJI’s autonomous drones, and helps to make their on-board vision processing more efficient.

Google has long been interested in Movidius’ chips. As well as using their VPUs to power Project Tango, the search giant embarked on a partnership with Movidius last year to improve how image recognition works on devices like smartphones. With the launch of the Clips camera, we have a perfect example of the sorts of benefits these collaborations bring.

Clips does all its AI processing on-device rather than relying on a connection to the cloud to scan images for familiar faces. That’s good for user privacy (there’s no chance of data being snaffled in transit), and it also increases the device’s battery life (because it doesn’t have to maintain an internet connection at all times). These benefits are the direct result of using a specialized chip like Movidius’ VPU.

More and more companies are turning to tech like this to improve on-device AI. Only last month Apple unveiled new iPhones complete with dedicated AI “neural engine” processors, and Huawei showed off similar capability with its recent Kirin 970 chipset. On-device AI is the future and specialized silicon is helping deliver it.


From: BeenRetired  10/8/2017 7:26:38 AM
   of 6595

"Harman Kardon Invoke AI (Cortana) Smart Speaker"..........................................................

by Brandon Hill — Friday, October 06, 2017
Windows 10 Cortana Smart Home Controls Arrive For Harman Kardon Invoke AI Smart Speaker
While companies like Amazon and Google are angling to put more devices into your living spaces to create a smart home revolution with the Amazon Echo and Google Home Mini, Microsoft is looking to counterpunch with its own Cortana service. A new report indicates that Microsoft has added a new "Connected Home" section to the Cortana Notebook.

This new integration was first reported by Windows Central, which says that Connected Home allows users to tap into five services at the moment: Wink, Insteon, Nest, SmartThings and Hue. Of course, you will need to sign in to each respective service to enable voice control functionality with your various smart devices.

Connected Home is coming online just ahead of the October 17th official rollout of the Windows 10 Fall Creators Update. The Fall Creators Update will implement Microsoft's new Fluent Design language and will serve as the launching point for Windows Mixed Reality devices like the Samsung HMD Odyssey. You can also expect feature improvements (like OneDrive On-Demand) and beefed up security across the board.

We're fast approaching the launch of the “Invoke” Cortana-powered Smart AI speaker, which is produced by Harman Kardon. "At the heart of Invoke is Cortana, your personal digital assistant, helping you stay on top of what’s important. Enhance every moment with captivating sound, voice control your music and smart home, make and receive hands-free calls with Skype, get answers to your questions, and more," said Harman Kardon when announcing the device back in May.


From: BeenRetired  10/8/2017 7:51:14 AM
   of 6595
LTE Watch 3: SiP…on steroids…………………………………

“dozens of discretes.”

On some ancient node. Wasting space and sucking power. BTW, where does the HDD go?

Apple Watch Packs Q’comm LTE

SiP packs more components in same space

Rick Merritt

10/5/2017 06:01 PM EDT

SAN JOSE, Calif. — Qualcomm supplied the LTE modem in the Apple Watch Series 3 as well as a handful of other wireless chips, according to a teardown from TechInsights. The latest watch appears to continue to push the boundaries of system-in-package design, packing a dozen major chips and dozens of discretes.

The new watch uses the same size SiP as the existing device. However, the Series 3 clearly packs more components, TechInsights said.

TechInsights found the Qualcomm MDM9635M, a Snapdragon X7 LTE modem in the 42mm sport band model A1861 with GPS + cellular it opened up. The same LTE chip appeared in the iPhone 6S/6S Plus, the Samsung Galaxy S6 Edge and other handsets. The modem was mated in a package-on-package with a Samsung K4P1G324EH DRAM in the watch.

Among other wireless chips, TechInsights said the watch contains a Qualcomm PMD9645 PMIC and a WTR3925 RF transceiver. Several other chip vendors also won wireless sockets.

TechInsights' preliminary report identified an Apple/Dialog PMIC, an Avago AFEM-8069 front-end module, and a Skyworks SKY 78198 power amplifier. At least one other power amp is believed to be in the design.

Toshiba scored a win supplying 16 GBytes of NAND flash in the watch with four die marked FPV7_32G. SK Hynix supplied a DRAM believed to be packaged with Apple’s latest application processor, a dual-core device.

The Apple-designed application processor in the new watch is slightly larger than the one in the existing device at 7.74mm x 6.25mm, compared to 7.29mm x 6.25mm. What TechInsights believes is the new W2 custom Bluetooth chip, however, measures 2.61mm x 2.50mm, significantly smaller than the W1 in the Series 2 at 3.23mm x 4.42mm.

TechInsights found a 32-bit STMicro ST33G1M2 MCU on the backside of the SiP near RF components. Analog Devices continued to supply two capacitive touch chips — a touch screen controller and an AD7149 sensor controller also used in the Series 2 watch.

Broadcom supplied a wireless charging chip, the same one found in a teardown of the iPhone 8. NXP continued to provide NFC support with the same PN80V NFC module used in the iPhone 8.


From: BeenRetired  10/8/2017 8:05:38 AM
   of 6595
NXP 5G Edge Layerscape LX2160A…………………………………………………….....................................................................................
10nm can enable so much more 21st century stuff.

EUV/ArF 7nm and below way, way more than 10.

That’s why it’s just the start.


NXP Seeks 'Edge' vs. Intel, Cavium

Junko Yoshida

10/4/2017 12:00 PM EDT

TOKYO — As the lines begin to blur between cloud and edge computing, NXP Semiconductors is racing to offer the highest performance SoC of the company’s Layerscape family.

The new chip, LX2160A, can offload heavy-duty computing done at data centers in the cloud, enabling the middle of the network — typically, service operators — to execute network virtualization and run high-performance network applications on network equipment such as base stations.

Toby Foster, senior product manager for NXP, told us that his team developed the new high-performance chip with three goals in mind. They sought first to enable new types of virtualization in the network, second to achieve new heights of integration and performance at low power featuring next-generation I/Os, and third, to double the scale of virtual network functions and crypto, compared to NXP’s previous Layerscape SoC (LS2088A), while maintaining low power consumption.

Specifically, the LX2160A features 16 high-performance ARM Cortex-A72 cores running at over 2 GHz within a 20- to 30-watt power envelope. It supports both the 100 Gbit/s Ethernet and PCIe Gen4 interconnect standards.
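For scale, PCIe Gen4 support means roughly double Gen3's per-lane bandwidth. A quick sketch of the arithmetic (the article doesn't give the LX2160A's lane configuration, so the x8 link below is purely an assumption for illustration):

```python
def pcie_lane_gbps(transfer_rate_gt=16.0, encoding=128/130):
    """Usable PCIe bandwidth per lane in Gb/s: the raw transfer rate
    (16 GT/s for Gen4) times the 128b/130b encoding efficiency used
    from Gen3 onward."""
    return transfer_rate_gt * encoding

per_lane = pcie_lane_gbps()        # Gen4: ~15.75 Gb/s per lane
x8_gbytes = per_lane * 8 / 8       # hypothetical x8 link -> ~15.75 GB/s
print(round(per_lane, 2), round(x8_gbytes, 2))
```

Even a modest x8 Gen4 link, in other words, can move data at a rate comparable to the chip's 100 Gbit/s Ethernet port, which is the kind of I/O balance an edge-processing SoC needs.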

Why edge computing?

The industry, including NXP, tends to view edge processing as the driver for the next phase of networking, computing and IoT infrastructure growth.

By moving workloads from the cloud to the edge, operators will suffer less latency while gaining resiliency and bandwidth reliability, explained Foster.

Bob Wheeler, principal analyst responsible for networking at the Linley Group, told us, “In some cases, such as content delivery networks, the transition from the cloud to the edge is already happening." He predicted, “Mobile edge computing will primarily happen in conjunction with 5G rollouts starting in 2019.”


From: BeenRetired  10/9/2017 7:32:18 AM
   of 6595

Mad dash to AI supremacy speeds up. Bit intense AI................................................................................

We know AI, to be done right, requires its own specialized circuits.
Just like AR/VR/MR must be pixel/bit intense at faster frame rates to work right.
Just like 5G IoE requires smaller nodes to reduce power.
And on and on and on.

We're in a Kurzweil inflection point. Innovation explodes. With it, bit intensity.

All enabled by Cymer EUV/ArF 7nm and below.


Researchers find that Google’s AI has a higher IQ than Siri

When Siri first hit the scene in 2011 — as the flagship feature on the iPhone 4s, no less — it was a novel piece of software to say the least. The early version of Siri wasn’t terribly powerful, but it was able to answer relatively basic queries and handle rather simple tasks such as setting reminders. In subsequent updates, Siri became much more powerful, even adding contextual awareness along the way. Equally as important, Siri’s ability to parse and understand language improved by leaps and bounds as well.

Though Siri’s capabilities today are far more advanced than the version that shipped six years ago, it’s not as if Apple has the market cornered on AI-powered assistants. On the contrary, Siri today faces stiff competition from a number of tech behemoths, including Amazon, Microsoft, and of course, Google. Pitting competing intelligent assistants against one another is nothing new, but a team of researchers at Cornell recently implemented a rather unique approach.

According to HotHardware, engineers at Cornell decided to stack a number of competing AI assistants against each other to determine which boasted the highest IQ. When the dust settled, researchers found that Google’s AI has an IQ of about 47.28, putting it slightly below the IQ of an average six-year-old. Siri, in contrast, didn’t fare all that well, with researchers finding that Apple’s AI checked in with an IQ of 23.9.

“Although this work is still in progress, the results so far indicate that the artificial-intelligence systems produced by Google, Baidu, and others have significantly improved over the past two years but still have certain gaps as compared with even a six-year-old child,” the researchers wrote in a paper published on ArXiv.

Of course, if you’re curious about real-world performance as opposed to an academic evaluation, you may want to check out this video from a few months back which saw Siri in iOS 11 go head to head with Google Assistant. At the very least, it’s overwhelmingly clear that Siri today is light years more advanced than it was back in the day.


From: BeenRetired  10/9/2017 8:16:39 AM
   of 6595

'credible' "expert": 100G demand to 2-3X '17 in '18...........................................................

"and recently raised its 2017 forecast by 20%."
It is easy to infer that the writer knows almost all "expert" "research" is not.
The 2 shill outlet channels used to refer to Gene Munster as the top-rated analyst on Apple.
No other shill got that appellation. Because everyone knows they're highly paid contort-n-distort artists, pay dependent on moving stock prices. See "investor" "concerns".

400G bonanza just starting.


Applied Optoelectronics: Extremely Compelling 2018 100G Data Center Optical Transceiver Market Demand Forecast
Oct. 9, 2017 8:00 AM ET
| About: Applied Optoelectronics, Inc. (AAOI), Includes: AMZN, GOOG, GOOGL
by: Jay Deahna

Jay Deahna

Long/short equity, value, Growth, tech

Get email alerts

Credible market research firm LightCounting expects 2018 100G data center optical transceiver demand to double or triple and recently raised its 2017 forecast by 20%.

The upside for 2017 suggests a robust outlook for Applied Opto when it delivers all important 4Q guidance in early November and justifies the current capex surge for 2018 capacity.

Consensus forecasts 17% revenue growth in 2018, a massive disconnect from the market forecast; the stock is discounting a margin collapse, which seems overly pessimistic. Buy AAOI.

On Thursday, October 5th, LIGHTWAVE Online published a very interesting (and relevant to Applied Opto (AAOI)) story on a recent market research forecast by LightCounting, a credible optical industry market research firm. Feel free to look up both entities and check out the bios on the principals/analysts at LightCounting. You might also want to bookmark the sites for future news and reference.


Copyright © 1995-2018 Knight Sac Media. All rights reserved. Stock quotes are delayed at least 15 minutes. See Terms of Use.