
Technology Stocks > ASML Holding NV


From: BeenRetired, 11/14/2016 7:06:21 AM
VR: "Google plans to start selling a $79 headset in the coming months"....................................

VR is going to be YUUUGE. Bit intense VR.
Bit intense 21st.

Auschwitz, sex assault and police shootings — where virtual reality is going next

Empathy for the colorblind

That reaction can affect people’s behavior, studies show. In 2013, researchers at Stanford University and the University of Georgia studied the willingness of two groups to help a visually impaired individual after a virtual reality experience. One group was asked to perform a color-matching exercise in a virtual reality environment while imagining themselves to be colorblind. The second was asked to do the same exercise, but with a filter that forced them to experience what it was actually like to be colorblind.

The second group spent twice as much time helping colorblind people in the 24 hours after the study, the researchers found. And the effects of virtual reality appeared to be even more pronounced among those who, before the study, had been rated by the researchers as less predisposed to feel empathy for the colorblind.

In a separate study, a Stanford doctoral student in 2011 had a group of test subjects read a description of what happens when a lumberjack cuts down a redwood tree, while a separate group was asked to cut down the tree in virtual reality. In an exit interview, both groups reported being more aware that their personal actions could have an impact on the environment. But the second group used an average of 20 percent fewer paper napkins to mop up a spill of water.

Of course, most Americans don’t have the opportunity to use the technology. The least expensive, fully featured virtual reality headset, Sony’s recently released PlayStation VR, costs $400 and is largely being developed for gaming applications.

That may soon change. Google plans to start selling a $79 headset in the coming months. Smartphone manufacturers are working to incorporate the ability to shoot a virtual reality video into smartphones. Museums such as the Smithsonian and the Victoria and Albert Museum in London have virtual reality exhibitions, and VR movies are fixtures on the film festival circuit.

Sally Smith, executive director of the Nexus Fund, who organized the Cocktails and Virtual Reality fundraiser in Washington, said she decided to produce her own film after watching a five-minute clip of “One Dark Night,” an immersive re-creation by journalist Nonny de la Peña of the 2012 shooting of Trayvon Martin in Florida.

“It was so moving and so overwhelming,” Smith said. “I realized this helps people emotionally connect in a way that I have never seen before.”

Her organization is focused on the prevention of genocide and mass violence. A constant challenge is attracting and sustaining attention among Westerners for simmering crises across the world. In recent years, she has focused efforts on supporting the Rohingya, more than 1 million Muslim people who have been stripped of their citizenship and driven from their homes in a protracted conflict with the neighboring Buddhist majority.


From: BeenRetired, 11/14/2016 7:21:54 AM
"Modified drones are keeping an eye on the world’s wildlife "...............................

Desktop gene machines! Who wooda thunk!

Shills would have you believe the 1960s will repeat over and over and over.
Innovation repeats over and over and over. Ever faster.

A bit intense bonanza.


Modified drones are keeping an eye on the world’s wildlife

Provided by Quartz

A growing group of scientists are using drones for wildlife conservation and research. For more than a year, Michael Moore has been trying to capture the breath of whales.

It’s an audacious idea, but Moore has help. A marine biologist at Woods Hole Oceanographic Institution in Massachusetts, he’s rigged a fleet of small unmanned aerial vehicles—UAVs, or drones—with samplers to catch whales’ exhalations from above. The aim is to get a good enough sample to analyze exhaled microbes and gain a better understanding of the cetaceans’ health.

Of course, stores don’t typically sell “whale breath-catching drones” off the shelf. Moore’s team has to make a number of adjustments to adapt his UAVs for the task at hand: setting them to calibrate on flat land so their gyros aren’t affected by the rocking boat, and moving the sample-catching petri dish from the bottom of the drone to the top.

“You’re using a down-drafting device to sample an updraft,” he explains. “Inherently, you’re asking it to do something it wasn’t designed to do.”

Moore, who hopes to get his first microbial analysis later this year, is one of many scientists now harnessing UAVs for wildlife conservation and research. Using drones, one team recently discovered that there were twice as many orangutans in Sumatra as previously thought. UAVs have also been employed to map Arctic shrubs, monitor wildlife in a Dubai national park, and even track down poachers in India. All it takes is a little engineering and ingenuity.

DIY droning

The availability of reliable, off-the-shelf UAVs has made standard aerial surveillance far easier than it used to be, but drones don’t come out of the box ready for the kinds of projects conservationists throw at them. That means researchers like Moore are increasingly using do-it-yourself modification techniques to prepare small drones for the unique aspects of tracking wildlife in sometimes harsh environments. Multi-spectral cameras, for example, allow drones to see across a wider range of the electromagnetic spectrum, making it possible to analyze vegetation types in addition to animal life.
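Multi-spectral imagery is useful because vegetation indices can be computed directly from the extra bands. Below is a minimal sketch of the standard NDVI calculation; the formula is standard remote-sensing practice, but the band values are made-up toy data, not from any of the surveys mentioned here.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in near-infrared, so values near +1
    indicate dense plant cover; bare soil sits near zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Toy 2x2 scene: top row vegetation, bottom row bare ground
nir_band = np.array([[0.50, 0.55], [0.30, 0.28]])
red_band = np.array([[0.08, 0.07], [0.25, 0.26]])
print(np.round(ndvi(nir_band, red_band), 2))
```

Thresholding the resulting index map is one cheap way to separate vegetation pixels from everything else before any animal detection runs.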

Serge Wich, a professor in the school of natural sciences and psychology at Liverpool John Moores University, and part of the Sumatran Orangutan Conservation Program, co-founded a group called Conservation Drones along with collaborator Lian Pin Koh at the University of Adelaide in Australia. The organization works exclusively on finding ingenious ways to modify UAVs for wildlife uses.

“We started off five years ago with a foam airplane, home-built on the kitchen table, with a camera that we sort of glued and taped together,” Wich says of the Sumatran orangutan survey. One of the project’s many adjustments addressed blurry images resulting from the vibration of the drones’ motors.

“We ended up with a very simple and cheap solution,” explains Koh, “which was just to have the camera sit on a piece of kitchen sponge. [That] reduced the vibration.”

There are other examples of scrappy drone customization. Jeff Kerby at Conservation Drones and Andy Cunliffe at the University of Edinburgh strapped point-and-shoot cameras to UAVs for sub-centimetre 3D mapping of Arctic shrubs—an indicator of how climate change is affecting the terrain. In August, state authorities in Uttarakhand in India announced a project to monitor wildlife and suspected poachers with as many as 12 UAVs.

Drones have also been launched to gather data from remote sensors. In 2013, after watching a TED talk by Koh, a team from New York University Abu Dhabi set up a drone that could wirelessly collect footage from camera traps in Dubai’s 31,000-acre Wadi Wurayah National Park. Historically, rangers had to fly to each of the traps—which stay in position for months and are activated by movement—in a Blackhawk helicopter to download images by hand. The “Wadi Drone” is able to do this from the air, digitally sucking up hundreds of photos on each trip.

Drone no harm

Benefits notwithstanding, sending noisy, heavy, fast-moving devices into the aerial habitats of endangered fauna has downsides—wildlife and drones don’t always mix. One project that attempted to survey an elephant herd with drones spooked its subjects so much that some of the pachyderms actually made a run for it.

We’re only beginning to understand the effect that drones can have on wild animals, says Jarrod Hodgson at the University of Adelaide. He cites a study from last year that found the heart rate of black bears in Minnesota rose significantly when a UAV flew overhead. The findings suggest that animals might be experiencing drone stress even when UAVs haven’t provoked an obvious behavioural response.

To address these concerns, Hodgson and Koh recently authored a draft “code of conduct” for drones used in conservation and research. “The key element in there is that we’re advising people to adopt a precautionary principle because we don’t fully understand the impact,” Hodgson says.

As personal drones become more common, national parks and conservation areas have also set rules around their use. Many, like Elk Island National Park in Canada—home to beavers, bison, pelicans and, of course, elk—are opting for outright bans.

But if conservationists can carry out their work responsibly, drones still have the capacity to revolutionize research in the wildest of areas. Imagine the potential of a system that could automatically identify things it takes hours for people to find in footage—not just an eye in the sky, but a brain too.

“It would be so exciting if we could automatically identify animals and humans in the data,” Wich remembers telling a colleague in the astrophysics department at his university. His colleague replied, “That’s something we might be able to help with.”
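The “brain in the sky” Wich imagines typically starts with the cheapest possible filter: flagging frames where something moved at all, so humans or heavier detectors only see the interesting footage. A hedged sketch of simple frame differencing follows; this is a generic first-pass technique, not the method Wich’s astrophysics colleagues actually use.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Flag pixels that changed between consecutive grayscale frames.
    Real wildlife pipelines use trained detectors; this shows only the
    cheapest first-pass signal for 'something moved here'."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

def has_motion(prev_frame, curr_frame, min_pixels=10):
    """True if enough pixels changed to be worth a closer look."""
    return int(motion_mask(prev_frame, curr_frame).sum()) >= min_pixels

# Toy frames: a bright 'animal' patch appears in the second frame
prev = np.zeros((32, 32), dtype=np.uint8)
curr = prev.copy()
curr[10:15, 10:15] = 200  # 25 changed pixels
print(has_motion(prev, curr))  # True — forward this frame for review
```

In a real pipeline the flagged frames would then go to a trained classifier rather than straight to a human.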


From: BeenRetired, 11/14/2016 9:40:59 AM
“ImageWare Debuts First Ever Multimodal Biometric Authentication Solution”............................

Cyber security is another of many Next Big Things. Bit intense Next Big Things.

Cymer/HMI/Zeiss/ASML.

ImageWare Debuts First Ever Multimodal Biometric Authentication Solution for the Microsoft Ecosystem--GoVerifyID® Enterprise Suite

Mon November 14, 2016 8:00 AM|PR Newswire|About: IWSY

PR Newswire

SAN DIEGO, Nov. 14, 2016 /PRNewswire/ -- ImageWare Systems, Inc. (OTCQB: IWSY), a leader in mobile and cloud-based, multi-modal biometric identity management solutions, today introduced GoVerifyID® Enterprise Suite, an innovative, multi-modal, multi-factor biometric authentication solution for the enterprise market. An algorithm-agnostic solution, GoVerifyID Enterprise Suite is the first ever end-to-end biometric platform that seamlessly integrates with an enterprise's existing Microsoft infrastructure, offering businesses a turnkey biometric solution that can be deployed in an afternoon or less.

"Last year nearly 80 percent of businesses reported a data breach. As the digital workforce expands, with data extended to external stakeholders and across numerous types of devices and systems, the need for high-assurance, enterprise-wide protection has intensified," said Jim Miller, chairman & CEO of ImageWare. "The traditional security perimeters have changed and executives are being held accountable for safeguarding data against potentially devastating breaches that can tarnish a brand's reputation. Armed with GoVerifyID Enterprise Suite, corporations have access to a scalable and affordable solution that works with their existing Microsoft infrastructure and gives them the ultimate peace of mind."
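The release doesn’t describe how GoVerifyID combines modalities internally. Generically, multi-modal systems fuse per-modality match scores into one accept/reject decision; here is a purely hypothetical weighted-sum sketch, where the modality names, weights, and threshold are all invented for illustration.

```python
def fuse_scores(scores, weights, threshold=0.7):
    """Combine normalized per-modality match scores (0..1) into one decision
    via a weighted sum. Production systems often use likelihood-ratio or
    learned fusion; this illustrates only the general shape."""
    assert set(scores) == set(weights), "every modality needs a weight"
    total_weight = sum(weights.values())
    fused = sum(scores[m] * weights[m] for m in scores) / total_weight
    return fused, fused >= threshold

# Hypothetical check: strong face match, weaker voice match
fused, accepted = fuse_scores(
    scores={"face": 0.92, "voice": 0.61},
    weights={"face": 2.0, "voice": 1.0},  # trust the face matcher more
)
print(round(fused, 3), accepted)
```

The appeal of an “algorithm-agnostic” platform is precisely that the matchers feeding such a fusion step can be swapped without changing the surrounding workflow.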


From: BeenRetired, 11/14/2016 10:20:09 AM
“Synopsys Advances Test and Yield Analysis Solution for 7-nm”..............................................

7nm will be a huge boon for innovation.

3D interposer will be vital. Lagging edge discrete chips doomed.



Synopsys Advances Test and Yield Analysis Solution for 7-nm Process Node

Mon November 14, 2016 9:05 AM|PR Newswire|About: SNPS

PR Newswire

MOUNTAIN VIEW, Calif., Nov. 14, 2016 /PRNewswire/ --


Innovative slack-based cell-aware test for 7-nm designs increases defect coverage

FinFET SRAM defect modeling and test algorithms enable efficient test and repair of 7-nm memories

New diagnostics and yield analysis support for 7-nm reduce turnaround time

Synopsys, Inc. (SNPS) today announced it expanded its test and yield analysis solution targeting FinFET-specific defects to enable higher quality testing, repair, diagnostics and yield analysis of advanced 7-nanometer (nm) SoCs. To improve defect coverage, Synopsys has been collaborating with several semiconductor companies to advance testing and diagnostics methods for logic, memory and high-speed mixed-signal circuits targeted for manufacture with 7-nm processes. These collaborations are enabling rapid deployment of new functionality within Synopsys' synthesis-based test solution, featuring TetraMAX® II ATPG, DesignWare® STAR Memory System®, and DesignWare STAR Hierarchical System.

Leading semiconductor companies ramping up design capabilities for emerging 7-nm processes are facing increasing test quality and yield management challenges. To address these challenges, Synopsys' test solution delivers several innovative technologies that target defects occurring more frequently at emerging process nodes. For logic circuits, new modeling techniques, such as resistance sweeping, improve the ability of slack-based cell-aware tests to detect defects such as intra-cell partial bridges that are more prevalent with advanced FinFET processes. For embedded memory test and repair, the STAR Memory System solution incorporates custom algorithms based on silicon learning at the industry's top silicon foundries to detect and repair defects exemplified by resistive fin shorts, fin opens and gate-fin shorts. Furthermore, the DesignWare STAR Hierarchical System enables high coverage manufacturing and characterization test patterns for the 7-nm DesignWare PHY IP to be efficiently applied through the SoC hierarchy.

To accelerate diagnosis of 7-nm yield issues, defect isolation to specific areas within design cells is possible through new support of cell-aware descriptions in the database shared between the TetraMAX II ATPG and Yield Explorer® solutions. The combination of test and diagnostic advances increases 7-nm defect detection and speeds up failure analysis and yield ramp in production manufacturing environments.
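The defect-coverage claims above reduce to a simple ratio: the fraction of modeled faults a pattern set detects. The numbers below are illustrative only, not Synopsys data; the point is that cell-aware models enlarge the fault list so ATPG can target intra-cell defects that classic stuck-at models never represent.

```python
def fault_coverage(detected, total):
    """Test quality metric: fraction of modeled faults detected by the patterns."""
    return detected / total

# Illustrative numbers: a richer cell-aware fault model has more faults to
# detect, but catching them raises real defect coverage in silicon
stuck_at = fault_coverage(985_000, 1_000_000)
cell_aware = fault_coverage(1_430_000, 1_500_000)
print(f"stuck-at: {stuck_at:.1%}, cell-aware: {cell_aware:.1%}")
```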

"The growing complexity and process variation found with advanced 7-nm FinFET processes requires improved test and yield technologies," said John Koeter, vice president of marketing for IP and prototyping at Synopsys. "Our IP design teams are leveraging TetraMAX ATPG as well as STAR Memory System and STAR Hierarchical System test, repair and diagnostic solutions to help multiple customers designing with 7-nm IP improve their product quality and yield, while accelerating their time to market."

"As a leading provider of comprehensive test and yield solutions, Synopsys is committed to helping designers meet their growing challenges of higher quality and faster yield ramp," said Bijan Kiani, vice president of product marketing for the Design Group at Synopsys. "Through our ongoing collaborations with leading semiconductor companies worldwide, we are delivering innovative solutions to address the specific requirements for advanced FinFET processes. These innovations will enable our customers to rapidly adopt 7-nm technologies to meet their goals for high-performance SoC products."


From: BeenRetired, 11/14/2016 10:27:43 AM
"Mellanox Drives Virtual Reality To New Levels"............................................

Just the start...of a VR bonanza.


Mellanox Drives Virtual Reality To New Levels With Breakthrough Performance
Mon November 14, 2016 8:30 AM|Business Wire|About: MLNX

Demo of Ultra-Low Latency Long Distance Virtual Reality with 100Gb/s EDR InfiniBand to Take Place at SC’16

SALT LAKE CITY--(BUSINESS WIRE)-- Mellanox® Technologies, Ltd. (NASDAQ: MLNX), a leading supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, today announced that it will showcase a Virtual Reality over 100Gb/s EDR InfiniBand demonstration at the Supercomputing Conference, Nov. 14-17, Salt Lake City, Mellanox booth #2631.

Mellanox and Scalable Graphics will showcase an ultra-low latency solution that presents the ultimate extended virtual reality experience for rapidly growing industry markets including computer aided engineering, oil and gas, manufacturing, medical, gaming and others. By leveraging the high throughput and the low latency of Mellanox 100Gb/s ConnectX®-4 InfiniBand, Scalable Graphics VR-Link Expander provides a near-zero latency streaming solution for bringing an optimal Virtual Reality experience even over long distances.

“As opposed to our Ethernet based solution, which requires H.264 encoding of the video stream to cope with Ethernet bandwidth constraints, our InfiniBand based VR-Link Expander allows us to send raw image data that eliminates the last milliseconds of overhead,” said Christophe Mion, Chief Technology Officer at Scalable Graphics. “Thanks to Mellanox’s 100Gb/s ConnectX-4 InfiniBand it’s impossible to determine if the virtual reality PC is local or remote.”
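Mion’s point that raw (uncompressed) frames need far more bandwidth than typical Ethernet offers is easy to sanity-check with back-of-the-envelope arithmetic. The headset figures below are illustrative of a 2016-era PC headset, not Scalable Graphics’ actual numbers.

```python
# Raw video bandwidth = width * height * bits_per_pixel * frames_per_second
width, height = 2160, 1200   # illustrative combined two-eye resolution
bits_per_pixel = 24          # 8-bit RGB, no compression
fps = 90                     # typical VR refresh rate

raw_bps = width * height * bits_per_pixel * fps
print(f"raw stream: {raw_bps / 1e9:.2f} Gb/s")          # ~5.6 Gb/s
print(f"share of a 100 Gb/s link: {raw_bps / 100e9:.1%}")
```

At roughly 5.6 Gb/s the raw stream swamps gigabit Ethernet, which is why an Ethernet path needs H.264 encoding, yet it uses only a few percent of a 100 Gb/s EDR InfiniBand link, leaving the codec (and its milliseconds of latency) out of the loop.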

“Expanding industry markets are rapidly adopting virtual reality as a business and training tool in the enterprise, as well as consumer segments,” said Scot Schultz, Director HPC/Technical Computing at Mellanox. “From creating large scale construction of a new campus, to emergency response training and education in health-care and military applications, Mellanox InfiniBand has ultra-low latency and the high bandwidth needed to drive real-world VR applications.”


From: BeenRetired, 11/14/2016 10:52:24 AM
“Transform the Customer Experience by Unlocking the Value of IoT Data”....................................

This is just the start…of IoE bonanza.


Oracle Service Cloud Enables Brands to Transform the Customer Experience by Unlocking the Value of Internet of Things Data

Mon November 14, 2016 8:00 AM|PR Newswire|About: ORCL

PR Newswire

REDWOOD SHORES, Calif., Nov. 14, 2016 /PRNewswire/ -- Oracle today announced an innovative new solution that enables brands to quickly and efficiently leverage insights from the Internet of Things (IoT) to power smart and connected customer service experiences. Powered by a packaged integration between Oracle Service Cloud and Oracle IoT Cloud, the new solution helps brands enhance the customer experience, increase operational efficiency and reduce costs by using IoT data to predict customer needs and proactively address customer service issues.

The explosive growth of the Internet of Things gives organizations the opportunity to deliver innovative new services faster and reduce risk by connecting, analyzing and integrating data-driven insights from connected "things" into business processes and applications. To help brands capitalize on this opportunity to drive next-generation customer service experiences, Oracle (ORCL) has introduced a new packaged integration between Oracle Service Cloud and Oracle IoT Cloud. The new IoT Accelerator is an open source integration that also includes implementation documentation to easily configure, extend and deploy.
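The release doesn’t show what the IoT Accelerator integration looks like in practice. As a purely hypothetical sketch of the pattern it describes (device telemetry in, proactive service actions out), here is the shape of such a rule engine; the rules, fields, and device IDs are all invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Telemetry:
    """One hypothetical reading reported by a connected device."""
    device_id: str
    error_code: Optional[str]
    firmware: str

def tickets_for(readings, latest_firmware):
    """Turn raw telemetry into proactive service actions. Hypothetical rules;
    Oracle's actual Service Cloud / IoT Cloud integration is configured
    through their products and is not reproduced here."""
    actions = []
    for r in readings:
        if r.error_code:
            actions.append((r.device_id, f"open ticket: {r.error_code}"))
        elif r.firmware != latest_firmware:
            actions.append((r.device_id, "offer firmware update"))
    return actions

readings = [
    Telemetry("avr-001", "E42", "1.0"),
    Telemetry("avr-002", None, "0.9"),
    Telemetry("avr-003", None, "1.0"),
]
print(tickets_for(readings, latest_firmware="1.0"))
```

The "proactive" part is simply that the ticket exists before the customer ever calls.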

"The Internet of Things is fundamentally changing the way consumers interact with brands and in the process, it is creating volumes of data that organizations can leverage to transform the customer experience," said Meeten Bhavsar, senior vice president, Oracle Service Cloud. "By delivering a packaged integration between Oracle Service Cloud and Oracle IoT Cloud, we are able to accelerate the time to value, while lowering the complexity of IoT projects. For brands, this also means they can easily take advantage of IoT data and make it actionable across engagement channels to deliver exceptional customer service experiences."

Oracle Service Cloud helps brands seamlessly integrate IoT device data into existing omni-channel operations. For example, Denon + Marantz, a leading provider of premium branded equipment, is leveraging customer insights from more than 200,000 connected devices globally to deliver personalized, positive and consistent customer experiences worldwide.

"Denon and Marantz products have always provided a high quality, immersive musical experience and with IoT data, we now have the opportunity to extend that first-class experience to our customer service team," said Scott Strickland, CIO, Denon + Marantz. "Leveraging Oracle Service Cloud's IoT integration capabilities, we have been able to improve our customers' experience and increase our internal efficiency and knowledge base. In addition, we can leverage IoT information via the Oracle Marketing Cloud to target campaigns based on how a consumer actually uses the product and not how we think they use it."


From: BeenRetired, 11/15/2016 12:01:02 PM
June convection oven with Tegra processor...............................................................

Everything will get brainier and more connected...bit intense.

Power in a subtle design

June, the company that makes the oven of the same name, was founded by two tech industry vets: CEO Matt Van Horn, who co-founded Zimride (now Lyft), and CTO Nikhil Bhogal, who previously worked at Apple. The Silicon Valley background is evident when you look at the guts of the June oven. The appliance runs on an Nvidia Tegra processor, which companies commonly use in mobile devices. It connects to your home's Wi-Fi so you can control the June remotely from your iOS device and see a live stream of your food as it cooks. A high-definition camera built into the top of the oven makes the live stream possible.


From: BeenRetired, 11/15/2016 12:08:02 PM
shill outlet cnbc: AI & Big Data huge... Duh....................................

Today, 2 shill parrots.

Years late. This isn't news. This is ancient history.
toooo busy providing forum for shills' contort n distort (presented as "news").

same old krapp.
different day.
with the best government (regulation) money can buy.


From: BeenRetired, 11/15/2016 12:29:42 PM
NVDA: "World’s Most Efficient SuperComputer Powered by Pascal GP100".......................

NVIDIA Unveils DGX SATURNV – World’s Most Efficient SuperComputer Powered by Pascal GP100, Delivers 9.46 Gigaflops/Watt

NVIDIA has announced its latest DGX SATURNV supercomputer, which is designed to help develop smarter cars and next-generation GPUs. The DGX SATURNV is termed the most efficient supercomputer and utilizes NVIDIA Pascal GPUs.

NVIDIA’s DGX SATURNV SuperComputer Is The World’s Most Efficient – Utilizes Tesla P100 GPUs

The DGX SATURNV is ranked 28th on the Top500 list of supercomputers and is also the most efficient of them all. The supercomputer houses several DGX-1 units, NVIDIA’s custom-designed server based on its Tesla P100 graphics chips. Until now, the most efficient machine on the Top500 list was rated at 6.67 GigaFlops/Watt. The NVIDIA-designed DGX SATURNV delivers an incredible 9.46 GigaFlops/Watt, a 42% improvement.
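The 42% figure follows directly from the two efficiency numbers quoted in the article:

```python
saturnv = 9.46        # DGX SATURNV efficiency, GigaFlops per Watt
previous_best = 6.67  # previously most efficient Top500 machine

improvement = (saturnv - previous_best) / previous_best
print(f"{improvement:.0%}")  # 42%
```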
