Technology Stocks: Cloud computing (On-demand software and services)


To: Sam who wrote (1398) 8/30/2017 6:42:29 AM
From: Glenn Petersen
1 Recommendation   of 1419
 
Amazon has a target on its back.

Another anti-Amazon alliance announced yesterday:

Google and VMware are teaming up with a $2.8 billion startup to get an edge in the cloud wars with Amazon

businessinsider.com






From: FUBHO 9/18/2017 9:27:21 PM
   of 1419
 
infoq.com

Using the new Web App for Containers capability, developers can pull container images from GitHub, Docker Hub, or a private Azure Container Registry, and Web App for Containers will deploy the containerized app with its preferred dependencies to production in seconds. The platform automatically takes care of OS patching, capacity provisioning, and load balancing.
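
For orientation, here is a minimal sketch of what that flow can look like when driven from the `az` CLI (wrapped in Python). The resource group, plan, app name, and image are placeholders, it assumes the CLI is installed and logged in, and exact flag names may vary across CLI versions; treat it as an illustration of the workflow, not Microsoft's canonical example.

```python
# Hypothetical Web App for Containers deployment via the az CLI.
# All resource names below are placeholders.
import subprocess

def az(*args: str) -> None:
    """Run an az CLI command, raising if it exits non-zero."""
    subprocess.run(["az", *args], check=True)

# Web App for Containers runs on a Linux App Service plan.
az("appservice", "plan", "create",
   "--name", "demo-plan", "--resource-group", "demo-rg",
   "--is-linux", "--sku", "S1")

# Point the web app at a container image; a public Docker Hub image is
# used here, but a private Azure Container Registry image works the
# same way once registry credentials are configured.
az("webapp", "create",
   "--name", "demo-webapp", "--resource-group", "demo-rg",
   "--plan", "demo-plan",
   "--deployment-container-image-name", "nginx:latest")
```

From there, the OS patching, capacity provisioning, and load balancing described above are the platform's problem rather than the developer's.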




From: FUBHO 9/19/2017 4:39:13 PM
1 Recommendation   of 1419
 
IBM Launches Its Own Shippable Cloud Data Migration Device





Hardened storage device offers 120 TB and uses AES 256-bit encryption


One of the barriers for enterprises storing data in the cloud is data migration, a process that has traditionally been slow and costly, hindered by network limitations. IBM wants to remove this barrier for its customers with a new cloud migration solution designed for moving massive amounts of data to the cloud.

IBM Cloud Mass Data Migration is a shippable storage device that offers 120 TB of capacity and uses AES 256-bit encryption. The device also uses RAID-6 to ensure data integrity and is shock-proof. It is offered at a flat rate that includes overnight round-trip shipping.

The device is about the size of a suitcase and has wheels so it can be easily moved around a data center, said Michael Fork, distinguished engineer and director of cloud infrastructure for IBM Watson and Cloud Platform. Fork said the solution allows customers to migrate 120 TB in seven days.

“When you actually look at the networking aspects of this, for example if you were to transfer 120TB over a 100 Mbps internet connection, that would take 100 or more days,” he said.


Similar options on the market include the AWS Snowball Edge, which was launched last year and offers 100 TB of usable storage capacity. In June, Google introduced Transfer Appliance, which offers up to 480 TB of raw capacity in a 4U form factor or 100 TB in 2U. In a chart accompanying that launch, Google broke down how long data transfer can take over different connections.
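
The arithmetic behind those estimates is simple enough to check. A quick sketch (assuming an idealized, full-rate link and decimal units) reproduces Fork's 100-plus-day figure for the 100 Mbps case:

```python
# How long bulk data takes to move over a network link, ignoring
# protocol overhead and congestion (so these are best-case numbers).
def transfer_days(terabytes: float, link_mbps: float) -> float:
    bits = terabytes * 1e12 * 8            # TB -> bits, decimal units
    seconds = bits / (link_mbps * 1e6)     # ideal full-rate transfer
    return seconds / 86_400                # seconds -> days

for mbps in (10, 100, 1_000, 10_000):
    print(f"120 TB over {mbps:>6} Mbps: {transfer_days(120, mbps):9.1f} days")
# 120 TB over a 100 Mbps link works out to about 111 days, consistent
# with the "100 or more days" quoted above.
```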

“Previously we supported two main transfer methods. One was an IBM solution called IBM Data Transfer service, and this allows you to ship us a USB hard drive or CD/DVD, and so you could migrate in up to 10 TBs of data pretty easily using that service,” Fork said. “The other solution IBM supports is through IBM Aspera, a network-based transfer.”

IBM Cloud Mass Data Migration is designed for any customer that has large amounts of data to migrate to IBM Cloud, Fork said, pointing to customers who move large SAP datasets or datasets for use with IBM Watson or other cognitive services.

“VMware customers are bringing to IBM Cloud large amounts of data, VMDKs, machine images, they need a fast and efficient way to move large amounts of those,” he said.

Beyond Lights-Out: Future Data Centers Will Be Human-Free
A new generation of data centers will be optimized for extreme efficiency, not for human access or comfort.
Critical Thinking, a weekly column on innovation in data center infrastructure.

The idea of a “lights-out” data center is not new, but it is evolving. Operators such as Hewlett Packard Enterprise and AOL have been long-term proponents of remote monitoring and management to reduce, or entirely replace, the need for dedicated on-site staff. The most prominent current advocate is probably colocation provider EdgeConneX, which has integrated a lights-out approach into the fabric of its business.

However, despite the efficiency benefits, lights-out, or “dark,” sites are still viewed with skepticism in some quarters; not having staff readily on hand to deal with outages is deemed just too high-risk. Data center certification body Uptime Institute, for example, recommends that one to two qualified staff members be on-site at all times to support the safe operation of a Tier III or IV facility.

But while lights-out may be a niche option now, developments in remote monitoring, analytics, AI, and robotics could eventually see it taken much further.


These technologies, combined with the elimination of all concessions to human comfort, will enable ever more efficient and available data centers, some experts argue. Technology analyst firm 451 Research recently coined the phrase “Datacenter as a Machine” to define unstaffed facilities that are primarily designed, built, and operated as units of IT rather than buildings. “As data centers become more complex, with tighter software-controlled integration between components, they will increasingly be viewed as complex machines rather than real estate,” the analyst group argues.

A facility designed and optimized exclusively for IT, rather than human operators, could enjoy a range of advantages over more conventional sites:

Improved cooling efficiency: There is good evidence that facilities could be operated at higher temperatures and humidity without impacting the reliability and performance of IT equipment. Progressive operators have made efforts to move into the upper reaches of ASHRAE’s recommended, or even allowable, temperature ranges, but the approach isn’t more pervasive due in part to its impact on human comfort. IT equipment may be functional at 80°F and up, but that is not a pleasant working environment for staff. Other highly efficient forms of cooling could make things even more uncomfortable. For example, close-coupled cooling technologies, such as direct liquid immersion, capture more than 90 percent of the IT heat load in a dielectric fluid but make no concession to the human operator. For the technology to become widely deployed in conventional sites, additional, inefficient perimeter cooling would be required in some locations just to keep the operators cool.

Better capacity management: Everything from rack height to access-aisle width is designed to make it easier for staff to install and maintain equipment rather than to optimize for efficiency. But if this space requirement was eliminated, equipment (power and cooling permitting) could be fitted into a much smaller footprint with, for example, potentially much higher, robot-accessible racks.

Reduced downtime and improved safety: According to a 2016 study by the Ponemon Institute, human error was the second-highest cause (behind power chain failures) of data center downtime. Electrocution – via arc-flash or other causes – also remains a real and present threat without the correct safety precautions. Use of hypoxic fire suppression – lowering oxygen levels – also has benefits for fire safety but again makes for a difficult working environment. A facility that was essentially off-limits to all but periodic or emergency access by qualified specialists could reduce the potential for human error and minimize the risk of injury to inexperienced staff.

But if on-site staff were effectively designed out of facilities, who or what would replace them? The kind of pervasive remote monitoring platforms already used at lights-out sites, such as EdgeConneX’s edgeOS, would likely play an instrumental role. Emerging tools, such as data center management as a service (DMaaS), which is effectively cloud-based data center infrastructure management (DCIM) software, could also enable suppliers to take remote control (including predictive maintenance) of specific equipment or even an entire site. Eventual integration with AI and machine learning could lead to more IT and facilities tasks becoming automated and self-regulating. Robotics is also likely to play a greater role in future data center management. Indeed, if facilities are designed to optimize space, so-called dexterous robots may be the only way to access some parts of the site.
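
To make the monitoring side of this concrete, here is a deliberately simplified sketch of the kind of polling loop a remote operations center might run against an unstaffed site. The gateway URLs, sensor names, and thresholds are hypothetical stand-ins, not any vendor's actual API; real platforms such as edgeOS or a DMaaS offering layer analytics and predictive maintenance on top of this basic pattern.

```python
# Hypothetical remote health-check loop for a lights-out site.
import json
import time
import urllib.request

# sensor name -> (endpoint URL, low threshold, high threshold); all invented
SENSORS = {
    "cold-aisle-temp-c": ("http://site-gw.example/sensors/temp", 18.0, 32.0),
    "ups-load-pct":      ("http://site-gw.example/sensors/ups",   0.0, 80.0),
}

def read(url: str) -> float:
    """Fetch a JSON sensor reading of the form {"value": <number>}."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return float(json.load(resp)["value"])

while True:
    for name, (url, lo, hi) in SENSORS.items():
        try:
            value = read(url)
        except OSError:
            print(f"ALERT: {name} unreachable -- dispatch remote hands")
            continue
        if not lo <= value <= hi:
            print(f"ALERT: {name}={value} outside [{lo}, {hi}]")
    time.sleep(60)  # poll once a minute
```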

But despite the potential, a number of impediments will need to be overcome before unstaffed data centers become widely adopted. The biggest of these is obviously the perception that such designs would introduce additional risk. As such, early adopters would probably be limited to companies that are already comfortable with some form of lights-out approach. Facilitating technologies, such as DMaaS, AI-driven DCIM, and advanced robotics, are also still very nascent.

But there are still good reasons to think that, in specific use cases, unstaffed sites will eventually become the norm. For example, new micro-data center form factors to support edge computing are expected to proliferate in the next five to ten years and are likely to be monitored remotely and only require periodic visits from specialist maintenance staff.

The prognosis doesn’t necessarily have to be all bad for facilities staff. To be sure, there will be fewer in-house positions in the future, but specialist third-party facilities management providers, capable of emergency or periodic visits, could expand headcount to meet the expected growth in new colocation and cloud capacity.

Ironic as it may sound, the future looks rather bright for the next generation of lights-out data centers.



From: FUBHO 10/3/2017 3:38:39 PM
   of 1419
 

1 Million Container Instances in 2 Minutes Draws Rare Applause

sdxcentral.com



October 2, 2017
11:09 am PT
ORLANDO, Florida – Big numbers drew applause at what are typically rather staid affairs during the Microsoft Ignite event last week.

During a panel session titled “Orchestrating 1 million containers with Azure Service Fabric,” Mani Ramaswamy, principal program manager at Microsoft, did indeed show the creation and orchestration of one million containers. Even more impressive, the demonstration took less than two minutes to complete.


Though this drew audience applause in what is typically a sleepy afternoon slot on the last real day of the conference, Ramaswamy seemed to want a bit more.

“I expected dancing in the aisles,” Ramaswamy joked (or at least it seemed like he was joking). He added that the more impressive part of the platform was that it was able to hold the reliability and availability of the instances at hyperscale.

“You never again have to worry about whether the platform can meet scale demands,” he said. “It’s the application that you have to worry about, not the platform.”

A container instance is a single container that is designed to start within seconds and can be billed by the provider in one-second increments. That billing typically includes the cost of turning up an instance plus charges for the processing and memory needed to run it.
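
As a rough illustration of that model, the sketch below prices a single instance from a start-up fee plus per-second CPU and memory charges. The rates are invented for the example and are not any provider's actual prices:

```python
# Illustrative per-second container-instance billing (made-up rates).
def instance_cost(seconds: int, vcpus: float, mem_gb: float,
                  start_fee: float = 0.0025,        # charge to turn up the instance
                  cpu_rate: float = 0.0000125,      # $ per vCPU-second
                  mem_rate: float = 0.0000014) -> float:  # $ per GB-second
    return start_fee + seconds * (vcpus * cpu_rate + mem_gb * mem_rate)

# A 1-vCPU, 1.5 GB instance that runs for five minutes:
print(f"${instance_cost(300, 1, 1.5):.4f}")  # about $0.0069
```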

Containers can run with a public or private IP address, with the former able to support consumer services accessed via the Internet, and the latter typically used for internal processes.

Ramaswamy said some of Microsoft’s competitors have been able to show public demonstrations of “a few hundred thousand” container instances being created. Those rivals would seem to include Amazon, which offers container instances through its ECS service.

The demonstration was the crescendo to Ramaswamy’s presentation on the flexibility and capabilities of Microsoft’s Azure Service Fabric.

Microsoft, during the show, launched general availability of its Azure Service Fabric on Linux. The product is a platform-as-a-service (PaaS) that supports running containerized applications on Service Fabric for Windows Server and Linux.

Developers can manage container images, allocate resources, run service discovery, and tap insights from Operations Management Suite (OMS) integration. This work can then be ported between Windows Server and Linux without needing to alter code.


While the product can support both Windows and Linux, it can’t support both at the same time. Ramaswamy said Microsoft was looking to add that form of support in the coming months.

Microsoft announced initial general availability of Azure Service Fabric last year.



From: The Ox 10/5/2017 8:07:11 AM
   of 1419
 

m.eet.com



From: Glenn Petersen 10/5/2017 11:00:31 PM
   of 1419
 
Switch prices IPO above range at $17, raises more than $500 million

By Jeremy C. Owens
MarketWatch
Published: Oct 5, 2017 7:25 p.m. ET

Data-center operator Switch Inc. (SWCH) priced its initial public offering higher than expected Thursday evening to pull in more than half a billion dollars. The Las Vegas-based data-center company, which owns three large data centers and is developing a fourth, announced that it would sell 31.25 million shares at $17 apiece, after previously stating a target range of $14 to $16. At that price, Switch stands to collect at least $531.25 million at a valuation of about $4.2 billion; underwriters have access to another 4.7 million shares, which could push the take even higher. The company has said it will use the proceeds to buy out investors in Switch Ltd. and take control of that company through Switch Inc., which was just incorporated in 2017. A multi-class share structure will allow founder and Chief Executive Rob Roy to maintain control, as his shares will have 10 times the voting rights of common shares. Switch is expected to begin trading Friday morning on the New York Stock Exchange under the ticker symbol SWCH.
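
The share math in the story is easy to verify; a quick sketch using the reported figures:

```python
# IPO arithmetic from the figures quoted above.
shares = 31_250_000
price = 17.00

print(f"Base proceeds:  ${shares * price:,.2f}")   # $531,250,000.00

# Underwriters hold an option on another 4.7 million shares:
overallotment = 4_700_000
print(f"With greenshoe: ${(shares + overallotment) * price:,.2f}")  # $611,150,000.00
```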

marketwatch.com



From: Glenn Petersen 10/13/2017 9:48:05 AM
   of 1419
 
How Amazon, Google, Microsoft, And IBM Sell AI As A Service

The tech giants with cloud computing businesses are using artificial intelligence offerings to distinguish themselves and win business.

By Fast Company Staff

10.11.17, 12:30 pm




The success of Amazon’s Alexa voice assistant has reverberated throughout the business world, making AI-powered chat the next big thing. [Illustration: Daniel Zender]
_____________________

Alphabet, Amazon, and Microsoft have all discovered that the artificial intelligence they use to make their own products better can be packaged and sold to corporate customers as a value-added service on top of their booming cloud-computing businesses.

Alphabet and its best-known subsidiary, Google, have put considerable resources into machine learning going back to 1999, the first year that Google acknowledged publicly that it used AI to improve Google Search, then its only product. Since deciding to get more serious about its cloud-computing business and serving enterprise customers—Google Cloud Storage officially launched in 2010—the company has found more ways to take its AI investment and acumen and use it to serve others. Diane Greene, SVP of Google Cloud, has admitted that enterprise customers had been wary of Google because the company has been so consumer-focused; its AI capabilities have played a meaningful role in winning them over.

Alphabet has two major divisions working on AI: Google Brain and DeepMind, which it acquired for $500 million in 2014. Both groups have worked on applying AI in healthcare, for example, which then allows Google Cloud to better serve businesses in that field. The company’s efforts in image recognition can become valuable for Airbus and other aerospace businesses that need to process and glean insights from large volumes of satellite imagery. All of Google’s work on Google Translate can now help any global business with a call center. Although most of the value in Google’s AI accrues to its own products and services, the company has stated that Google Cloud is one of its fastest-growing business units.

Amazon has a much more natural synergy between its AI efforts and how it can sell those initiatives to others via its industry-leading cloud computing service. As CEO Jeff Bezos wrote earlier this year in his letter to shareholders, “Much of what we do with machine learning happens beneath the surface . . . quietly but meaningfully improving core operations.” The examples Bezos cites include demand forecasting, fraud detection, and translations—all features that any business would value. As our feature on the Great AI War recounts, a sheriff’s department in Oregon pays Amazon about $6 a month to use Amazon’s facial-recognition service on an ongoing basis.

More than any of its rivals, Amazon has electrified the public with its audacious vision for an AI-powered future. Its line of Echo devices, brought to life by the artificially intelligent Alexa, has defined the path for the next generation of home automation and commerce and made voice-powered speakers arguably the hottest segment in consumer electronics. That success has enabled Amazon to release the technology powering Alexa as its own product so that any company can develop its own intelligent voice applications.

This strategy is central to Amazon’s history of success; the company has always relied upon its ability to transform something it built for itself into something it can then sell to millions of businesses. Amazon started as a mere bookseller and then opened up its marketplace to let other retailers take advantage of its e-commerce platform. After it built warehouses to fulfill orders for customers, it offered Fulfillment by Amazon to those same marketplace businesses. Amazon Web Services started because Amazon had to build excess computing capacity to support its business during the busiest shopping season; it could then sell that capacity to a host of others. This is how Amazon’s famous “flywheel” works, and AI-powered services are its next frontier.

To that end, keep a close eye on the company’s retail concept called Go. It relies on computer vision and machine learning to present a different kind of shopping experience. Amazon has yet to open this new take on the convenience store to the public almost a year after announcing the idea. But once the company gets Go working, do not expect it to roll out thousands of Go stores across the country. It is far more likely that Amazon will offer up this AI-powered retail infrastructure to existing shopkeepers who will pay Amazon a recurring fee to use it.

Also note that Amazon Web Services currently represents almost 10% of the company’s annual revenue and it is a part of Amazon’s business that investors monitor very closely. The more Amazon can keep AWS humming, the more its entire enterprise thrives.

Unlike Alphabet/Google or Amazon, almost all of Microsoft’s business lies in serving enterprise customers. It is the tech giant most focused on converting AI directly into revenue. “Our company’s identity is fundamentally about creating technology so that others can create more technology,” CEO Satya Nadella told Fast Company recently. “And it’s essential that it is being used for empowering more people.”

Artificial intelligence “is at the intersection of our ambitions,” Nadella told an audience of Microsoft partners in September 2016, suggesting that it will let the company “reason over large amounts of data and convert that into intelligence.” A few months later, Microsoft officially closed its $26.2 billion acquisition of LinkedIn, giving the company a large amount of data about employees, companies, and recruiting to reason over and try to make smarter.

In August, it debuted a real-time AI system for its enterprise cloud customers, which could help it win business from companies that want to deploy initiatives such as dynamic pricing and retail personalization. Microsoft’s mission to help companies in a wide range of industries be more productive and effective means that it is the one company whose AI work is most keenly connected to its future prospects.

Similarly, IBM’s approach has been to target specific industries, from healthcare to retail, and learn those domains so that its Watson-branded AI (which IBM calls cognitive computing) can alleviate drudge work and wrangle impossibly large sets of data. “There’s a reason we call it cognitive [computing],” IBM CEO Ginni Rometty told the CNBC personality Jim Cramer in June 2017. “It’s about augmenting what you and I do so we can do what we’re supposed to, our best.”

IBM’s argument to customers is that it is the only company offering sector-based AI solutions, and that businesses in those sectors can own their own AI rather than just rent it. It has also made the most overt effort to connect its industrial internet of things initiative to Watson, as best seen in IBM’s 2016 acquisition of The Weather Company for approximately $2 billion. The deal gave IBM access to 2.2 billion forecast points worldwide, a trove of data that Watson churns through to fuel multiple client services. These efforts have generated a lot of attention, and Watson is arguably the strongest brand in AI, but they haven’t yet turned around IBM’s business.

A version of this article appeared in the November 2017 issue of Fast Company magazine.

fastcompany.com



From: Glenn Petersen 10/24/2017 10:06:57 AM
   of 1419
 
h/t Sr K

U.S. Will Curb ‘Sneak-and-Peek’ Searches Microsoft Sued Over

By Dina Bass and Chris Strohm
Bloomberg
October 23, 2017, 8:46 PM EDT Updated on October 23, 2017, 9:09 PM EDT

> Microsoft had sued Justice Department citing free-speech right

> New federal guidelines call for more selective use of practice

The U.S. Justice Department is moving to scale back the use of orders forcing technology companies to turn over customer data without alerting users to the clandestine interception of their information.

Microsoft Corp., which sued the government over the practice last year, and other internet giants have argued that the future of cloud computing is in jeopardy if customers can’t trust that their data will remain private. Microsoft declined to comment Monday on whether it will drop its lawsuit, which was backed by rivals including Alphabet Inc.’s Google and Amazon.com Inc.

The rapid growth of the cloud, in which customer data is stored by providers like Microsoft, Apple Inc., Amazon and Google in the technology companies’ own data centers, has increased the frequency of warrants seeking data.

Going forward, prosecutors must “conduct an individualized and meaningful assessment” of whether a secrecy order is needed, according to a memo issued by Deputy Attorney General Rod Rosenstein. For internet users whose data is sought, the government shouldn’t delay notifying them for more than a year, “barring exceptional circumstances,” according to the memo. Microsoft argued in court that too many data requests carry secrecy provisions, often of indefinite duration, that violate the company’s free-speech rights.

The Justice Department said the changes will protect the rights of citizens and preserve companies’ relationships with their customers.

“This update further ensures that the department can protect the rights of citizens we serve, while allowing companies to maintain relationships with their customers by notifying those suspected of crimes, or believed to have information relevant to a crime, in a timely manner that information was obtained relating to their user accounts,” the department said in an emailed statement.

The dispute centered on the application of the Stored Communications Act, part of the 1986 Electronic Communications Privacy Act, a law that predates the advent of the World Wide Web. Microsoft contended that while some cases might require secrecy because disclosure could create a risk of harm or endanger the government’s case, the practice had become far too common.

In the 18 months before Microsoft sued in April 2016 in Seattle, the company said 2,756 of the legal demands it received from the U.S. government came with secrecy orders and two-thirds appeared to extend indefinitely. Microsoft defeated the government’s bid for dismissal of the suit in February, though the judge didn’t rule on the merits of the case.

Microsoft in September announced new cloud encryption technology that could offer an end-run around secretive government snooping by enabling customers to control access to content stored in Microsoft data centers.
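
For readers curious what customer-controlled encryption means in practice, here is a minimal sketch of the general idea using the third-party `cryptography` package: the customer encrypts locally with a key the provider never sees, so data stored in the cloud is unreadable without it. This illustrates the concept generically and is not Microsoft's actual implementation.

```python
# Client-held-key encryption sketch (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # generated and kept on the customer's side
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"customer record")  # only this is uploaded
# ... ciphertext sits in the provider's data center, opaque to the provider ...
plaintext = cipher.decrypt(ciphertext)           # decrypted back on-premises
assert plaintext == b"customer record"
```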

bloomberg.com



From: Sam 10/26/2017 4:49:17 PM
   of 1419
 
Microsoft, Alphabet and Amazon all beat and are rising AH. The Cloud is on fire!


Microsoft Rising: FYQ1 Beats, Nadella Says Cloud Ahead of Plan
By Tiernan Ray
Updated Oct. 26, 2017 4:20 p.m. ET

Microsoft (MSFT) this afternoon reported fiscal Q1 revenue and earnings per share that easily topped analysts' expectations, sending its shares higher in late trading.

Revenue in the three months ended in September rose to $24.5 billion, yielding EPS of 84 cents.

Analysts had been modeling $23.52 billion in revenue, and 72 cents a share in earnings.

Chief Executive Satya Nadella said that the company's "commercial cloud" business topped $20 billion in "annualized recurring revenue," reaching that milestone faster than the company had promised two years earlier.

Microsoft's "productivity" division, including its "Office" suite, saw sales rise 28%, to $8.2 billion.

Revenue in the "Intelligent Cloud" group was up 14% at $6.9 billion. That includes Azure cloud computing, which saw sales rise 90%.

The "More Personal Computing" division saw sales decline 1% in constant-currency terms, to $9.4 billion. That includes a 4% increase in the manufacturer revenue for Windows software licenses on PCs, a 12% jump in Surface computer devices, and a 1% rise in video game revenue, including a 21% rise in Xbox software and services.

Microsoft stock is up $1.10, or 1.5%, at $79.86, in late trading.

barrons.com

[Microsoft is now trading at $81.71-81.74 AH.]



From: Sam 10/26/2017 4:59:23 PM
   of 1419
 
Microsoft's Services Revenue Lifts Quarterly Results
DOW JONES & COMPANY, INC. 4:57 PM ET 10/26/2017

Symbol: MSFT | Last Price: $78.76 | Change: +0.13 (+0.17%)
Quotes as of 04:00:00 PM ET 10/26/2017


Microsoft Corp. (MSFT) has ridden the cloud-computing wave for several quarters, and once again its revenues surged on the strength of its emerging business of selling web-based, on-demand computing services.

In the fiscal first quarter, the two biggest pieces of Microsoft's (MSFT) cloud-computing operations -- its Azure infrastructure services and Office 365 online-productivity business -- saw revenue soar 90% and 42%, respectively.

While the software giant doesn't disclose revenue figures for those businesses, it said its commercial-cloud run-rate -- the last month of sales of its Azure and Office 365 products, multiplied by 12 -- hit $20.4 billion.
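
That run-rate definition is easy to express directly; working backward, the $20.4 billion figure implies roughly $1.7 billion of commercial-cloud revenue in the quarter's final month:

```python
# Annualized run-rate as defined above: last month's revenue times 12.
def run_rate(last_month_revenue: float) -> float:
    return last_month_revenue * 12

# Inverting the reported figure:
print(f"${20.4e9 / 12:,.0f} per month")  # $1,700,000,000
```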

"They crushed it again," said Stifel Nicolaus & Co. analyst Brad Reback. "These were really strong growth rates."

Those gains continued to offset the company's Windows PC operating-system franchise, which has slowed in recent years. Revenue in Microsoft's (MSFT) More Personal Computing segment, which includes Windows as well as the mobile-phone and gaming businesses, stayed flat at about $9.4 billion. Microsoft (MSFT) doesn't break out revenue for its Windows business. Earlier in October, International Data Corp. reported world-wide PC shipments fell 0.5% in the third quarter.

Overall, Microsoft (MSFT) posted $6.58 billion in net income, or 84 cents a share, compared with a profit of $5.67 billion, or 72 cents a share, a year ago.

Revenue gained 12% to $24.54 billion.

Analysts surveyed by S&P Global Market Intelligence expected Microsoft to report per-share earnings of 72 cents on $23.56 billion in revenue.
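
For scale, the size of the beat against those consensus figures works out as follows:

```python
# Beat vs. consensus, using the numbers quoted in this story.
actual_eps, est_eps = 0.84, 0.72
actual_rev, est_rev = 24.54e9, 23.56e9

print(f"EPS beat:     {actual_eps / est_eps - 1:.1%}")   # ~16.7%
print(f"Revenue beat: {actual_rev / est_rev - 1:.1%}")   # ~4.2%
```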

Shares rose 2.6% to $80.74 in after-hours trading after results beat expectations. If Microsoft (MSFT) shares stay at these levels when markets open Friday, it would be an all-time high. The stock has gained 27% so far this year.

The engines of Microsoft's (MSFT) growth have been its Intelligent Cloud segment, which includes Azure, and its Productivity and Business Processes segment, which includes the Office franchise. Revenue in Intelligent Cloud rose 14% to $6.92 billion, while revenue in Productivity and Business Processes climbed 28% to $8.24 billion.

The Productivity and Business Processes unit also includes Microsoft's (MSFT) Dynamics business, which sells software and services to help sales representatives manage customer relationships and finance departments manage corporate resources. It is a market where Microsoft (MSFT) competes with Salesforce.com Inc., among others, and one in which the company has placed growing emphasis. Dynamics revenue grew 13%, though the company didn't disclose a revenue figure.

Microsoft (MSFT) purchased LinkedIn, the professional social network, in December for $27 billion, in part, to boost the Dynamics business. In the quarter, LinkedIn added $1.14 billion in revenue and posted a $294 million operating loss.

To support its growing cloud business, Microsoft (MSFT) is doling out huge sums to build expensive data centers around the world. In the quarter, Microsoft (MSFT) spent $2.7 billion in capital expenses, with much of that money going toward its data-center expansion. A year ago, Microsoft (MSFT) recorded $2.3 billion in capital expenses.

Microsoft (MSFT) launched a bevy of new Surface computers earlier this year, including a refreshed Surface Pro tablet-laptop hybrid and a lightweight laptop to compete with Apple's MacBook Air.

Microsoft (MSFT) didn't break out specific revenue figures for the devices, but noted that Surface revenue gained 12% in the quarter.

Write to Jay Greene at Jay.Greene@wsj.com

[MSFT is trading at $81.62-81.65 AH right now.]
