
Technology Stocks › ASML Holding NV

From: BeenRetired 11/15/2016 1:43:25 PM
"Weebit ReRAM".................................................................................................

...while the slime street shills contort n distort about 20th HDD hotbox PCs...............

Weebit ReRAM technology transferred to Leti

Weebit Nano, the ReRAM start-up, has transferred its SiOx ReRAM from Rice University’s facilities in Houston, Texas, to Leti’s pre-industrialisation facility in Grenoble, France.

By David Manners 15th November 2016

Weebit Nano was founded in 2014 to develop a memory technology invented by Professor James Tour of Rice University with the potential to be 1000 times faster, more reliable, more energy-efficient and cheaper than flash.

Initial SiOx experiments at Leti’s pre-industrialisation facility confirm that Weebit’s nano-porous SiOx process is reproducible.

The next step is the development of a 1,000 bit array, followed by the development of a 1-million-bit array, which Weebit expects will demonstrate the ability to produce memory components for mass-storage applications.

Leti is expected to release a detailed report on the development process and optimisation of the technology in Q1 2017.

The report will outline plans to continue the development of the SiOx technology towards the creation of a 40nm ReRAM cell, which is expected in late 2017.

Weebit believes that achieving this milestone will open discussions with leading players in the semiconductor industry and pave the way towards commercialisation.

Once commercialised, Weebit’s technology will enable devices such as smartphones to have capacities of more than 1 terabyte (TB). The aim is to replace Flash.

Weebit Nano re-listed on the Australian Stock Exchange in August after completing its reverse takeover of iron ore company Radar Iron and raising A$5.04 million in a capital raise.

The money is going towards R&D and fabrication of the Weebit technology, sales and marketing, business development, and expenses associated with the acquisition of Radar Iron.

The company’s executives are based in Tel Aviv, Israel, as are its internal R&D team.

David Perlmutter, formerly of Intel, is Chairman of Weebit.


From: BeenRetired 11/15/2016 2:07:06 PM
"XiP memory...ultimate solution for intelligent IoT".................................................

IoE stuff will simply explode in the 21st.
This is just the start....of the bit intense Age.
The EUV/ArF Age.

EcoXiP – System Accelerating NVM
Overview

Designed from the ground up to solve the challenges of XiP memory designs, Adesto's new EcoXiP (ATXP Series) is the ultimate solution for intelligent IoT systems.

EcoXiP non-volatile memory replaces expensive, energy-inefficient architectures, making power and performance trade-offs unnecessary in a wide range of connected devices. EcoXiP more than doubles processor performance, lowers system power consumption and reduces system cost.

EcoXiP also offers users a range of power management features that provide the best standby power available in a XiP memory solution and features enhanced security with One-Time Programmable security registers.


From: BeenRetired 11/16/2016 6:39:34 AM
Google big machine learning, AI push by adding GPUs.....................................................................

"It will be able to run more efficiently thanks to the addition of GPUs to the CPUs."
Bit intensity....on steroids.
Nuff said.

Google is making a big machine learning and AI push in cloud services

Today, Diane Greene, the SVP for Google Cloud, announced a new push in Machine Learning and AI. There's a new group under her division that will unify some of the disparate teams that had previously been doing machine learning work across Google's cloud. Two women will take charge of the new team: Fei-Fei Li, who was director of AI at Stanford, and Jia Li, who was previously head of research at Snap, Inc. As Business Insider notes, Fei-Fei Li was one of the minds behind the Snapchat feature that lets you attach emoji to real-world objects in your snaps.

The news came at the top of a slew of more announcements about the product roadmap for Google's cloud services and how they're expanding their use of machine learning. The announcements were all aimed at showing how Google's cloud services include more than just renting time on a server — that it can provide services to its enterprise customers that are based on its machine learning algorithms. Those services include easier translation, computer vision, and even hiring.

For example, Google is talking up how it’s improving the infrastructure for Google Cloud. It will be able to run more efficiently thanks to the addition of GPUs to the CPUs its system already uses. Graphical processors are especially good at training machine learning systems more quickly. Google has also added some security layers to the GPU, something it claims isn’t necessarily common on other cloud platforms. So, Google says, there won’t be any data from a previous customer sitting in any of the GPU’s caches when the next customer starts spinning it up for their tasks. They’ll be available in 2017.

Google is also unifying its “cloud vision” API so the same system will be able to identify logos, landmarks, labels, faces, and text for OCR — making it simpler to implement. These systems will run on “Tensor Processing Units,” new hardware that’s optimized for Google’s TensorFlow platform. Google had previously unveiled the TPUs, but the new news today is that it’s cutting the price for “large scale deployments” by 80 percent.

Its natural language API is now globally available. It will be able to detect more “granular sentiment” in English, Spanish, and Japanese and also more “entity types” than it had in beta. Google’s natural language analysis will be able to handle morphology and syntax analysis. There’s also a new “premium translation” service.

Finally, Google is also introducing a new machine learning-based "Jobs API," which will apparently assist companies in doing massive "burst hiring" of hundreds of new employees. It allows computers to match up job openings with potential hires. Career Builder and Dice are signed up to use it, as is FedEx, Google says.


From: BeenRetired 11/16/2016 6:48:07 AM
Google PhotoScan...............................................................................

This is just the start...of The Mother of All Paradigm Shifts.
It will be very, very bit intense.

On the surface, Google Photos has a simple mission: to store all your pictures. Specifically, Google says it wants the service to be a home for all of your photos, and today that mission expanded to encompass the old photos you took on a point-and-shoot back in the '90s. A new app called PhotoScan was just released for iOS and Android, and it promises to make preserving the memories in your old printed photos much easier. That's not all — Google also released a number of updates and refinements to the core Photos app as well. PhotoScan is definitely the star of the show, though. According to engineers from Google who showed the app to the press earlier today, PhotoScan improves on the old "photo of a photo" technique that many now use to quickly get a digital copy of old prints; it's also a lot cheaper than sending pictures out to be scanned by a professional and a lot more convenient and faster than using a flatbed scanner.

When you open up the PhotoScan app, you're prompted to line up your picture within a border. Once you have the picture aligned, pressing the scan button will activate your phone's flash and start the process of getting a high-quality representation of the photo. Four white circles will appear in four different quadrants of the image; you'll be prompted to move your phone over each dot until it turns blue — once all four dots are scanned, the app pulls together the final image.

When moving the phone to scan each dot, the app is taking multiple images of the picture from different angles to effectively eliminate light glare — something Google cited as the biggest culprit that ruins digital pictures of photo prints. In practice, in Google's tightly controlled settings, it worked perfectly. It was easy to see how the lights in the room cast glare on the photo print and equally obvious how the app managed to eliminate it in the final scan. It's a bit of an abstract process to describe, but it worked like a charm. We'll need to test it further outside of Google's demo area, but early results were definitely encouraging.
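Google hasn't published PhotoScan's actual algorithm, but the multi-angle capture described above can be sketched with a naive stand-in: glare is a bright highlight that moves with the camera angle, so at any given pixel at least one of the aligned frames is usually glare-free, and a per-pixel minimum across frames keeps that clean value. The function name and the toy frames below are purely illustrative; the real pipeline presumably does alignment and far more sophisticated fusion.

```python
import numpy as np

def remove_glare(frames):
    """Combine aligned captures of the same print by taking the
    per-pixel minimum across frames. Glare is a bright, localized
    highlight that moves with the camera angle, so at each pixel at
    least one frame is usually glare-free -- the minimum keeps it.
    `frames` is a list of HxWx3 uint8 arrays, already aligned."""
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    return np.min(stack, axis=0).astype(np.uint8)

# Toy example: a flat gray "print" with a bright glare patch in a
# different spot in each of four captures.
h, w = 8, 8
frames = []
for i in range(4):
    f = np.full((h, w, 3), 120, dtype=np.uint8)
    f[i, i] = 255          # simulated glare spot, moves per frame
    frames.append(f)

clean = remove_glare(frames)
print(clean.max())  # prints 120: every glare spot is suppressed
```

A minimum is the crudest possible fusion rule (a median is more robust to sensor noise), but it shows why moving the phone over the four dots matters: each position contributes a frame in which the glare sits somewhere else.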

The app also offers you the ability to adjust the crop to remove any hint of the background surface peeking into the photo, but it's otherwise a pretty minimal experience. Once you're done scanning, the app prompts you to save your scans. They're saved directly to your phone's storage; you can then upload them to Google Photos or the backup service of your choice. Google specifically said that it wanted this app to exist outside of Google Photos so that people could scan images and use whatever service they want to back them up.

Beyond PhotoScan are some noteworthy additions to the proper Google Photos app. The biggest change here is that there are a host of new photo-editing options on board. The Google+ app actually used to have a pretty robust set of editing options, but when Photos was liberated as a standalone app, the editing features were significantly culled.

As of today, Google Photos for both iOS and Android now has an entirely redesigned set of editing tools and filters. The "auto enhance" feature, which tweaks brightness, contrast, saturation and other characteristics of your photo, has been improved thanks to the machine learning technology that is at the core of nearly all of Google's products. It can look at a photo and recognize what a photo editor might do to try and improve the image. Auto Enhance has long been a pretty solid feature, so seeing it continue to get smarter and better is definitely a good thing.

If you want to make further adjustments, the simple "light," "color" and "pop" sliders that were in the previous Google Photos app have been greatly expanded. Now, you can tap a triangle next to "light" or "color" to see a view with a host of more granular editing tools like exposure, contrast, highlights, saturation, warmth and so on. Those tools aren't right in your face, so people who don't want to dive in can still make adjustments — but those who really want to go deep on editing their pictures will surely appreciate the option. I used to be a big fan of the Google+ photo editing tools, so seeing these features come back is very welcome.

Google called out two of those adjustments in particular as things that only it can do with its vast store of photographic information. A new slider called "deep blue" saturates blues in an image like the sky or water to make them more vibrant, and it knows to specifically target those hues while leaving others unchanged. There's also a skin tone filter that can adjust saturation specifically on a subject's skin without altering the rest of the image. Other editing programs have similar filters, but Google says that this one is particularly accurate because of the millions of photos it has analyzed — it just has a better sense of what is skin and what isn't than other editors.
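Google's "deep blue" slider reportedly uses models trained on millions of photos; a crude stand-in for the core idea is to boost saturation only for pixels whose hue falls in a nominal blue band, leaving other hues alone. The function name, the hue window, and the boost factor below are all invented for illustration, not Google's implementation.

```python
import colorsys
import numpy as np

def deep_blue(img, boost=1.5, hue_lo=0.5, hue_hi=0.75):
    """Saturate only pixels whose hue lies in a nominal 'blue' band
    (hue 0.5-0.75 on colorsys's 0-1 scale), leaving other hues alone.
    `img` is an HxWx3 uint8 RGB array."""
    out = img.astype(np.float32) / 255.0
    rows, cols, _ = out.shape
    for y in range(rows):
        for x in range(cols):
            h, s, v = colorsys.rgb_to_hsv(*out[y, x])
            if hue_lo <= h <= hue_hi:
                s = min(1.0, s * boost)   # deepen only blue-ish pixels
            out[y, x] = colorsys.hsv_to_rgb(h, s, v)
    return (out * 255).round().astype(np.uint8)

# A washed-out sky blue deepens; a warm skin-ish tone is untouched.
img = np.array([[[150, 180, 220], [220, 180, 150]]], dtype=np.uint8)
result = deep_blue(img)
```

The per-pixel Python loop is only for clarity; a vectorized HSV conversion would do the same thing faster. The interesting property is in the conditional: hue selection is what makes the slider leave skin and foliage alone while the sky deepens.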

Lastly, Google added 12 new filters (of course it did) that take advantage of machine learning to be a little smarter than the average option. Rather than always slapping a default set of adjustments on a picture, Google Photos will make subtle improvements to the image first — it sounds like a combination of auto enhance as well as a filter. But those enhancements will be optimized to work well with the filter you're adding. It sounds nice, and the filters looked good on the images Google was showing off, but we'll need to spend some time playing around with it to see if they're really any better than what Instagram already offers.

Editing is the main addition to Google Photos, but there are a few other improvements here as well. If you're invited to a shared album, the app will prompt you with suggestions from your own photos to add. It's another place where Google's machine learning comes into play. And the movie maker, which can automatically select related photos and set them to a soundtrack, will be gaining some new event-focused options in the coming months.


From: BeenRetired 11/16/2016 6:53:45 AM
"The Future Is Now: Five Awesome Uses Of Virtual Reality In Marketing"

Just the start.


The Future Is Now: Five Awesome Uses Of Virtual Reality In Marketing

Augmented and virtual reality are becoming far more important to the societal landscape than just gaming. They've found their way into the marketing arena as a heavyweight implementation tool as well. I've written extensively on how Pokemon GO provided small businesses with opportunities to leverage the augmented reality game for their marketing. But this is just the start of a massive influx of virtual and augmented reality into our everyday lives.
In fact, if you’re a marketing professional, it’s time to take notice because the future is now for the use of virtual and augmented reality as a marketing tool. Just consider the following five excellent examples that have already been put into space.

  1. Test Drive: Clearly this is a case where you need to consider your target audience. Volvo introduced a virtual reality test drive for their XC90 SUV. The experience took the user down a breathtaking country backroad with the feeling of being in the leather-preened driver seat of the highly-rated, luxury sports utility vehicle. Would this experience work the same way for a stripped-down Ford Fiesta with cloth seating and 120 hp high revving little engine that could, sort of? Probably not, but for Volvo it makes perfect sense. Not everyone is sure the perceived luxury and performance of their brand is really worth the dollar value. It’s a big obstacle for marketers of luxury products to overcome. Therefore, VR and augmented reality can be used as a great way to overcome this preliminary objection. Just imagine what car manufacturers of even higher levels of luxury could do. Picture yourself perched in the cockpit of a fire red Ferrari or a devilishly attractive Lamborghini, complete with suicide doors and a slightly greater than 120 hp engine, racing down a European motorway at full throttle.
  2. Happy VR: From the high life of luxury automobiles to making the ordinary of fast food into something extraordinary, the VR world of marketing is already working its way through all industries. McDonald's has introduced a Happy Meal box in Sweden that is repurposed as a Google Cardboard VR headset. When Swedish children are done noshing on their cheeseburger, fries, and apple slices, they can strap on the box and play a skiing game called Slope Stars. So much for getting a cheap, plastic Disney character wrapped in a clear plastic bag, eh?
  3. Not Too Sexy for My VR: As early as 2014, VR was making a big impression in the marketing realm of the fashion world when London Fashion Week offered users a front-row view of the runway with a 360 degree panoramic video stream. In many ways, this was a groundbreaker for virtual reality technology. It was a brilliant use for the Oculus Rift headset that showed many industry spectators a good application for how the technology can be used for more than just gaming.
  4. Extreme Hiking without the Hiking: Ever wonder what it would be like to experience the awe-inspiring views and majestic beauty of traversing a dangerous mountainside without that whole threat of death via tragic free-fall thing? Well, that’s exactly what hiking boot manufacturer Merrell did when they created Trailscape, which was a VR marketing campaign that actually allowed users to walk around with their Oculus Rift headset on and gain an even deeper feel of virtual reality. It’s this type of memorable experience that will provide a punch for marketing professionals to utilize going forward that can influence buyers and sway consumer behavior toward their message most effectively.
  5. Beam Me Up … Marriott: Advertising vacation getaways seems an ideal fit for using VR technology in marketing. After all, what better way to advertise a vacation destination like Hawaii or London than by allowing your clients to get a little taste of those locales without ever having to show up at the airport? Marriott Hotels went a little beyond the traditional 360 degree video stream by putting users in telephone booth-like teleportation devices complete with heaters and wind jets for the real feel of a beach destination and more.
Although the preceding list of campaigns is impressive and revolutionary for the current industry standards, marketers have still only scratched the surface of what virtual and augmented reality can accomplish. Just think of some of the areas it can be applied to: space exploration, world history, live music, animation, and more. Now is the future. Now is the time to stay ahead of the curve and start designing your VR or augmented reality marketing campaign. Just be careful not to bump into a Pikachu while you’re at it.


From: BeenRetired 11/16/2016 7:00:06 AM
"Slack is watching how you work"..................................................................................

Slack is watching how you work

If you use Slack for work, you've likely had this overwhelming experience: You come back from a meeting to a dozen Slack rooms waiting for you with unread messages. It's the new-age email inbox, full of stuff from colleagues you may or may not even know.

Slack thinks it has the combination to solve this issue: Learning how you work, and mixing in a little artificial intelligence to pinpoint exactly where you should spend your time.

“What we’re starting to build is what we’re internally calling our work graph,” Noah Weiss, Slack’s head of search and intelligence, said at the Code Enterprise conference in San Francisco on Tuesday.

“[This] is basically looking at you and seeing [which] people you seem to care the most about or respond the quickest to or interact with the most. What are the channels that you’re most active in? And then also, what are the topics that you seem to care the most about?”

Weiss continued: “That allows us to then build this layer of intelligence to then understand not just what your organization looks like, but then ... how do we personalize the service based on that.”

In other words: How can Slack make you more productive by watching how you work and whom you work with?

Slack is already doing some of this. The service will already make channel recommendations to some users, and Weiss says it’s starting to test message rankings, too, so that people know who to respond to first when they’re in a time pinch.
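Weiss doesn't describe the implementation, but the "work graph" idea he sketches — score each contact by how often you interact and how fast you reply, then use those scores to rank unread messages — fits in a few lines. Everything below (names, the scoring weights, the data shapes) is invented for illustration; it is a toy model of the concept, not Slack's system.

```python
from collections import defaultdict

def build_work_graph(events):
    """Toy 'work graph': score each contact by interaction frequency
    and reply speed. `events` is a list of (contact, reply_seconds)
    tuples; the weights are made-up for illustration."""
    score = defaultdict(float)
    for contact, reply_seconds in events:
        # each interaction adds a flat point; fast replies add up to
        # one extra point (60s half-life on the bonus)
        score[contact] += 1.0 + 60.0 / (60.0 + reply_seconds)
    return dict(score)

def rank_unread(unread_senders, graph):
    """Order unread messages so the most important senders come first."""
    return sorted(unread_senders, key=lambda c: graph.get(c, 0.0),
                  reverse=True)

events = [("alice", 30), ("alice", 45), ("bob", 600), ("carol", 20)]
graph = build_work_graph(events)
print(rank_unread(["bob", "carol", "alice"], graph))
# prints ['alice', 'carol', 'bob']: two fast replies to alice outrank
# carol's single fast one, and bob's slow reply ranks last
```

The same scores could drive the channel recommendations mentioned above: rank channels by the summed scores of the people active in them.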

That idea may be a little creepy, but Weiss believes Slack can eliminate a lot of the wasted time that people spend trying to locate information or prioritize their time.

“Most knowledge work is a game of telephone,” Weiss said. “Part of our job at Slack is how do we short-circuit that game of telephone more effectively.”


From: BeenRetired 11/16/2016 7:55:05 AM
Duh! SSS tipping point has arrived..................................................................

"relative price differential (the SSD is 175% more expensive) has narrowed into a small absolute differential. $70 is low enough that the obviously superior SSD technology will overwhelmingly be chosen, given the speed differential"

Jump from today's cutting edge 64-layer to tomorrow's 200-layer.
Some form of "X"RAM that might replace both RAM & NAND.

while shills contort n distort on the 20th HDD hotbox PC.

A few years ago?
slime street shills and sponsored "experts" told us this could never happen.
Then, SSS would only be niche.
Could never be for cold storage.

SSS will totally replace ALL HDDs, a power sucking, heat producing, space hogging, failure prone 20th bottleneck.

Seagate: The SSD Tipping Point
Nov. 15, 2016 11:56 AM

The SSD threat to HDDs has long been known.

However, right now I believe the pieces are in place to produce a tipping point.

As such, I expect consumer HDD demand to accelerate its drop, and for the higher-margin Enterprise business to also suffer.

It's long been known that SSDs (Solid State Drives) would provide a massive challenge to HDDs (Hard Disk Drives). I needn't remind that ultimately SSDs are much faster, more reliable, less power hungry and lighter. The benefits to computing, especially mobile computing, are entirely obvious.

For the longest of times, SSDs faced but one problem. They were a lot more expensive. As a result, they were only available in lower storage sizes, and these prices and lower storage sizes showed up as being a significant negative for many increasingly storage-hungry applications, such as video. Video, of course, was (and still is) enjoying a massive explosion in usage (at ever-higher resolutions, too).

As time went by, SSDs steadily saw their costs head lower, aided by Moore's law in semiconductor manufacturing. Again, this isn't new. What is new is that I believe the market might now be at a tipping point. Prices and capacities have reached a level where, I believe, suddenly desktop/laptop HDDs are going to see an extreme decline.

This decline will be aided not just by the SSDs becoming larger and cheaper, but also because video (and photo storage) has for a large part shifted towards the cloud. Now, this cloud doesn't store content on vapor, it also needs HDDs -- but the cost per bit at the cloud level will be lower than the cost per bit at the consumer level. Plus, of course, having a video in 1000 consumer devices consumes a lot more bits than having the video in 1 central location. Anyway, video started being heavily consumed in streaming form, and photos have gained from such freemium services as Google Photos, reducing the need for local storage.

The end result is SSD capacities continuing to head towards the "critical" 500GB-1TB levels, while local storage needs have, for the most part, stagnated at those same levels as well (for a mainstream consumer). And as these two meet at a decent price level, the tipping point happens: suddenly, consumers will no longer even consider buying a device which carries an HDD instead of an SSD.

Consider prices for components. A 500GB SSD today can easily be found for less than $110. A 500GB HDD is much cheaper, at $40. The price differential is now just $70 between the two. What can seem like a tremendous relative price differential (the SSD is 175% more expensive) has narrowed into a small absolute differential. $70 is low enough that the obviously superior SSD technology will overwhelmingly be chosen, given the speed differential. This is so because an HDD-equipped laptop will behave as a much lower-end device than the same laptop using an SSD. Thus, a 15% or so price differential between the two laptops will appear as minimal versus the performance boost.
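The arithmetic in that paragraph is easy to check. Note the $400 base machine cost in the last step is my own back-of-envelope assumption (the article only quotes the "15% or so" result), chosen to show how the laptop-level premium follows from the $70 component gap.

```python
# Reproducing the article's arithmetic (prices as quoted:
# a 500GB SSD under $110 vs. a $40 500GB HDD).
ssd_price, hdd_price = 110, 40

absolute_gap = ssd_price - hdd_price                       # dollars
relative_gap = (ssd_price - hdd_price) / hdd_price * 100   # percent

print(f"absolute gap: ${absolute_gap}")                    # prints $70
print(f"relative gap: {relative_gap:.0f}% more expensive") # prints 175%

# Assumed: the rest of the laptop costs roughly $400, so the SSD
# premium at the system level is 70 / (400 + 70), about 15%.
laptop_premium = absolute_gap / (400 + absolute_gap) * 100
print(f"laptop premium: {laptop_premium:.0f}%")            # prints 15%
```

The point the numbers make is that the same $70 reads very differently as a component-level percentage (175%) than as a system-level one (~15%).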

The Tipping Point Is Already Becoming Evident

Take Seagate (NASDAQ: STX). Seagate reports its segments as being:

  • Enterprise.
  • Client Compute. This is where all PC desktop and laptop HDDs go.
  • Client Non-Compute. This relates to other devices using HDDs, like DVRs or, recently, gaming consoles (moved from Client Compute).
The tipping point I am talking about affects mostly "Client Compute". The year-on-year performance for this segment is already showing steep contraction, with volume drops of around 30%. These are the drops I actually expect to accelerate.

Moreover, Seagate has shifted some of this segment into the Client Non-Compute by moving gaming consoles there (Xbox One, PS4). Alas, one could argue that these consoles are just 1 generation away from quitting HDDs as well, especially if Nintendo's Switch gains any market traction. The Switch is based on a mobile device which works as a hybrid console (fixed, on-the-go). Obviously, it doesn't use an HDD.

Regarding DVRs, it might take longer since cost is a powerful argument and customers might be more amenable to poor performance.

It should be said that Enterprise HDDs were 41% of revenue in the latest quarter, versus 24% for PC Client revenues. Thus, while in terms of volume Client Compute represented 41%, it's clear that Enterprise ASPs still bridge a lot of that gap. But this brings us to another problem.

Another Problem - Enterprise

The other problem relates to the Enterprise segment. This segment could broadly be divided into high performance HDDs and high capacity HDDs. Here's how Seagate defines Enterprise Performance HDDs:

Enterprise Performance HDDs. Our 10,000 and 15,000 RPM Enterprise Performance disk drives feature increased throughput and improved energy efficiency, targeted at high random performance server application needs. Performance 10,000 RPM HDDs ship in storage capacities ranging from 300GB to 1.8TB, and our 15,000 RPM HDDs ship in storage capacities ranging from 146GB to 600GB.

Now, these are certainly some of the higher ASP, higher margin, Enterprise products. The problem, of course, is obvious. This is the precise segment which gets eaten by SSDs the quickest. SSDs are superior in performance, and HDDs sacrifice capacity for performance. HDDs in such a segment are bound to lose market share particularly quickly.

The other broad segment, focusing on higher capacity, is the one where HDDs will lose the slowest -- indeed, it's the one segment that's helping show stability, as it gains from the mobile->cloud content move as previously described. However, over time 3D NAND improvements and Intel's XPoint are sure to also bring grief there.

Yet Another Problem - SSDs Are A Different Market

It might be that some will put forward the argument that Seagate will be able to sell SSDs as well. Indeed, Seagate already does and has an entire Enterprise segment dedicated to it:

Enterprise SSDs. Available in capacities up to 3.8TB, the SSD features 12Gb-per-second SAS, and delivers the speed and consistency needed for demanding enterprise storage and server applications. We also offer our Nytro family of accelerator cards with capacities up to 4TB.

This, in my view, is a fallacy. It doesn't matter if Seagate will also sell SSDs. The problem here is that HDDs are currently an oligopoly served by a handful of competitors. These competitors control the underlying disk drive technology and don't share it with other potential entrants.

The SSD market, however, is entirely different. The essential building blocks, be they NAND flash memory or controllers, are accessible to every entrant who wants to buy them and build SSDs with them. If there's an oligopoly, this oligopoly would be at the level of the NAND suppliers including Micron (NASDAQ: MU), Samsung, SanDisk, Hynix and Toshiba.

At the SSD level, there's a gazillion competitors, none of which has any cost advantage. Seagate is only one more such competitor. The underlying economics for the SSD business are thus intrinsically horrendous at this point in time, and likely to remain so. Thus, for Seagate, substituting HDD sales for SSD sales, while it can keep the revenue line going, is also sure to lead to a negative margin impact over time.


From: BeenRetired 11/16/2016 8:22:54 AM
KLIC CC: Stacking results in a 44% increase.......................................

It makes no sense for the gadget and thingy guys to use 10nm and below Logic and tolerate bottlenecking lagging edge discrete chips.

The 3D interposer Age is here.
The EUV/ArF Age is here.

It's Cymer time.

The lower price of solid-state storage is accelerating consumer and business adoption of NAND-based solid-state drives. Production of both 2D and 3D NAND relies heavily on stacks of die connected using wire bonding technology. While we have historically served this space, emerging changes within the memory market related to thinner die, taller die stacks and complex stacking arrangements have demanded new wire bonding features and process capabilities.

We directly address this market opportunity with a recently released and feature-rich memory bonder that is well-positioned to meet growing demand for NAND and also DRAM applications into the future. Since its release it has been very well received.

Jonathan Chou

Thanks, Joe. As we announced earlier this morning, we are very pleased to have once again exceeded the high end of our guidance with $216.4 million of revenue for the third fiscal quarter. This steep 38% sequential revenue improvement marks the second consecutive quarterly ramp. As a reminder, our top line increased steeply by 44% during the March quarter.

This solid performance was largely due to continuing demand for our core ball and wedge solutions, as well as consistent strength within our advanced packaging business supporting system-in-package opportunities.


From: BeenRetired 11/16/2016 8:47:22 AM
"GE Digital and SAP Partner to Advance IIoT"...............................................

GE knows the future is bit intense.
Heard their jet engines have chips.
Their locomotives keep getting smarter.
The ONLY path to success goes thru Chipland.


GE Digital and SAP Partner to Advance Industrial Internet of Things (IoT)
Wed November 16, 2016 8:00 AM|PR Newswire

  • The companies intend to collaborate on cloud-to-cloud interoperability
  • Collaboration also to focus on SAP® Asset Intelligence Network
  • Initial focus of collaboration on joint customers within the oil and gas industry
  • Reinforced commitment to standards and reference architectures

PR Newswire SAN FRANCISCO and WALLDORF, Germany, Nov. 16, 2016 /PRNewswire/ -- SAP SE (SAPGF) (NYSE: SAP) and GE Digital (NYSE: GE) today announced their intention to explore collaboration in the area of the Industrial Internet of Things (IoT). The announcement is driven by the fundamental shared belief in the power of the Industrial IoT, and the ability for joint customers to drive further efficiency, savings and utilization from their investments into assets. The announcement was made at GE's Minds + Machines conference being held November 15 and 16, 2016, in San Francisco.


From: BeenRetired 11/16/2016 9:18:41 AM
    PSTG PR: Our SSS is NVMe ready (HDD? Not so much)............................................

    All the R&D cash being thrown at SSS. SSS no longer has to stuffed into 20th HDD configurations or run on HDD centric software.
    Both PCIe and DIMM are specifically for SSS. HDD cannot compete, doomed.
    Newer Intel CPUs are XPoint ready.

    This is not your Dad's 20th word processor PC era.
    This is the 21st All Silicon Solution Age.
    This is the EUV/ArF Age.


    Pure Storage Future Proofs for NVMe; Introduces NVMe-Ready Guarantee
    Wed November 16, 2016 9:00 AM|PR Newswire|About: PSTG

    PR Newswire MOUNTAIN VIEW, Calif., Nov. 16, 2016 /PRNewswire/ -- Pure Storage (NYSE: PSTG), the market's leading independent solid-state array vendor, today announced the introduction of its NVMe-Ready Guarantee. Pure Storage guarantees that every newly purchased FlashArray//M can be upgraded to full NVMe through its Evergreen™ Storage program.

    NVM Express, or NVMe, a next-generation memory-class protocol for CPU-to-flash communication, is poised to drive a shift across the storage industry to NVMe architectures. Leading industry analysts forecast that NVMe, which is enabling the next generation of flash performance and density, will become the leading interface protocol for flash by 2019. A critical mass of consumer devices has already shifted to NVMe, and the enterprise will not be far behind.

    "NVMe protocol is set to dominate enterprise flash, because it allows much greater performance than the SAS and SATA interfaces used currently in data center flash drives. SAS was designed for disk, but NVMe was designed both for flash and the new solid-state, non-volatile memories that are waiting in the wings," said Tim Stammers, senior analyst at 451 Research. "Pure's FlashArray//M systems are already future-proofed for this change, and that is a very unusual and important aspect."

    NVMe is faster and more parallel than the existing Serial-Attached SCSI (SAS) storage protocol, offering up to 64K parallel queues in place of SAS's single command queue, which enables direct communication paths to Solid State Drives (SSDs). This massive parallelism eliminates the "serial-connection" bottleneck – enabling significantly higher performance for coming technological advances, including massively multi-core CPUs, super-dense SSDs, new memory technologies and high-speed interconnects.
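    The scale of that parallelism gap can be sketched with back-of-envelope arithmetic. The NVMe limits below come from the NVM Express specification (up to 65,535 I/O submission queues, each up to 65,536 entries deep); the SAS per-device queue depth of 254 is an assumed typical value, not a figure from the article.

    ```python
    # Rough comparison of outstanding-command capacity: NVMe vs. SAS.
    # NVMe figures are spec maximums; the SAS figure is an assumed
    # typical per-device queue depth, for illustration only.
    NVME_MAX_IO_QUEUES = 65_535   # up to 64K I/O submission queues
    NVME_QUEUE_DEPTH = 65_536     # up to 64K commands per queue
    SAS_QUEUES = 1                # single command queue
    SAS_QUEUE_DEPTH = 254         # assumed typical SAS queue depth

    nvme_outstanding = NVME_MAX_IO_QUEUES * NVME_QUEUE_DEPTH
    sas_outstanding = SAS_QUEUES * SAS_QUEUE_DEPTH

    # NVMe can theoretically track ~4.29 billion in-flight commands,
    # millions of times more than a single SAS queue.
    print(nvme_outstanding)   # 4294901760
    print(sas_outstanding)    # 254
    ```

    Real arrays never approach these ceilings, but the point stands: NVMe's per-core, per-device queues let many CPU cores submit I/O concurrently without contending for one serial channel.
    
    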

    "Seven years ago, we saw the potential of flash as an industry standard. We have reached a similar inflection point with NVMe, and enterprises will need to be prepared for the shift to take full advantage of technological improvements in 2017 and beyond," said Matt Kixmoeller, VP of Product, Pure Storage. "Any organization buying new storage today needs to be NVMe-ready to protect its investment. By proactively engineering for NVMe, Pure has widened the gap between purpose-built arrays and legacy retrofits, and is best positioned to lead the industry transition to NVMe."

    In anticipation of the now inevitable shift to NVMe, Pure Storage engineered FlashArray//M to be NVMe-ready from the beginning, starting more than three years ago. Every FlashArray//M ships with dual-ported and hot-pluggable NVMe NV-RAM devices, engineered by Pure – an industry first when released in 2015. Additionally, the FlashArray//M chassis is wired for both SAS and PCIe/NVMe in every flash module slot, which enables the use of SAS-connected flash modules today as well as a transition to NVMe in the future.

    "As a solution provider and adviser, long-term customer relationships are critical to our ongoing success," said Bruce Poor, Vice President of Sales, Hogan Consulting Group. "Customers deserve investment protection and the NVMe-Ready Guarantee from Pure Storage gives us increased confidence that we are giving customers the best value now, and into the future. A customer shouldn't be penalized for not knowing what the future holds in their technology choice."

    Each of the slots within the FlashArray//M chassis is capable of using NVMe and SAS-capable flash modules, and controllers are non-disruptively upgradable to transition internal and external networks from SAS to NVMe. The Purity Operating Environment is optimized for NVMe with massively parallel and multi-threaded design, as well as global flash management across the entire flash pool. As a result, customers will be able to convert any FlashArray//M to NVMe-enabled controllers and capacity without a forklift upgrade or disruptive migration.

    "End-user experience is the key to attracting and keeping great customers. The greatest challenge is keeping end-user experience first-class and consistent," said Jason Michaud, President of MacStadium. "Pure Storage anticipates future technological needs like NVMe, and offers a clear, non-disruptive upgrade path, which enables us to stay up and running through periods of tremendous growth with our globally dispersed IT staff."

    Upgrades to NVMe-enabled controllers are planned to be generally available prior to December 31, 2017. The full terms of the NVMe-Ready Guarantee are available upon request.
