Technology Stocks: Cloud computing (On-demand software and services)
From: Sam 6/19/2017 11:10:48 PM
1 Recommendation   of 1405
 
Gartner confirms what we all know: AWS and Microsoft are the cloud leaders, by a fair way
Paranormal parallelogram for IaaS has Google on the same lap, IBM and Oracle trailing
19 Jun 2017 at 06:01, Simon Sharwood

Gartner has published a new magic quadrant for infrastructure-as-a-service (IaaS) that – surprising nobody – has Amazon Web Services and Microsoft alone in the Leaders quadrant and a few others thought outside of the box.

Here's the Soothsaying Square in all its glory.



Gartner's Magic Quadrant for Cloud Infrastructure as a Service, Worldwide June 2017.


That Oracle and IBM are rated visionaries may turn heads, as both strut like cloud leaders: Oracle regularly says its cloud is superior to Amazon's. Yet Gartner rates Oracle's cloud “a bare-bones 'minimum viable product'” that offers “only the most vitally necessary cloud IaaS compute, storage and networking capabilities.” The analyst firm also worries about the Oracle cloud's “limited operational track record” and warns that “Customers need to have a very high tolerance for risk, along with strong technical acumen.”

continues at theregister.co.uk



To: Sam who wrote (1383) 6/19/2017 11:16:14 PM
From: Sam
   of 1405
 
Microsoft's Azure cloud feels the pinch in price war with Amazon's AWS
Ah, the old 'Windows upsell' one-two
28 Apr 2017 at 14:07, Gavin Clarke

Analysis Sales of Surface, falling 26 per cent year-on-year, weren't the only wrinkle in Microsoft's third-quarter trading period.

Management beat the cloud drum for Wall Street on Thursday, talking across-the-board growth.

Azure, Dynamics 365 and Office 365 commercial saw the biggest revenue growth, according to Microsoft – 93 per cent, 81 per cent and 45 per cent respectively.

Getting a more material breakdown is impossible as these are buried within bigger business units. Microsoft rarely reveals individual numbers – and when it does they are cherry-picked.

Overall, the unit responsible for Azure, Intelligent Cloud, converted that 93 per cent growth into $6.7bn revenue – up 10 per cent year-on-year.

The stable housing Office and Dynamics, Productivity and Business Process, converted the online apps' growth into overall revenue of $7.9bn, up 22 per cent.

It should be noted that Productivity and Business Process includes the on-prem versions of its business suites as well as the cloud-based, 365-branded ones.

Intelligent Cloud includes such popular perennials as SQL Server and Windows Server.

So far, so good – but runaway sales aren't translating into runaway cash growth for Microsoft.

Income for the Azure group is more or less flat year-on-year – growing less than 1 per cent to $2.1bn.

Apps was worse for Microsoft. Income fell 6.6 per cent to $2.7bn for the Productivity and Business Process unit.

Over at Microsoft's number-one cloud competitor, Amazon, things were smaller but rosier. AWS made less than both Microsoft's units combined during the last three months – revenue of $3.6bn, up 23 per cent – but income grew, up 41 per cent to $724m.

The pair have been in an ongoing price war, which seems to be taking its toll on Microsoft.

Redmond made a round of cuts in February, hacking virtual machine and storage prices by up to 51 per cent. AWS had made 53 price cuts by the end of 2016.

Investors confronted Microsoft management about this during its third-quarter conference call. "To what degree do those price cuts actually affect you," asked Morgan Stanley's Keith Eric Weiss.

Top brass swerved a direct answer but made a pitch about "price competition at the lower level".

"Because we're able to continue to move people up the stack, including all the way up to the business process layer, I think you'll continue to see us be confident in our ability to move and create margin and growth," chief financial officer Amy Hood said.

It's the same play as Windows of old: pushing customers to the "premium" SKUs – the versions with the supposedly better features.

Microsoft's strategy is clear – onboard people cheaply and, as they generate data and come to depend on your various cloud services, either raise prices or ensure they take on enough features to push up the overall bill.

Whether that washes in the face of a keen price competitor such as AWS is unclear and it'll depend on whether Microsoft's customers are willing to surrender their choice on cloud infrastructure as they did, or were forced to, on PC and server infrastructure in decades past. ®

theregister.co.uk



To: Sam who wrote (1384) 6/19/2017 11:20:13 PM
From: Sam
1 Recommendation   of 1405
 
Migrating to Microsoft's cloud: What they won't tell you, what you need to know
Of devils and details
19 Jun 2017 at 09:04, Sonia Cuff
theregister.co.uk

“Move it all to Microsoft’s cloud,” they said. “It’ll be fine,” they said. You’ve done your research and the monthly operational cost has been approved. There’s a glimmer of hope that you’ll be able to hit the power button to turn some ageing servers off, permanently. All that stands in the way is a migration project. And that is where the fun starts.

Consultants will admit that their first cloud migration was scary. If they don’t, they’re lying. This is production data we’re talking about, with a limited time window to have your systems down. Do a few migrations and you learn a few tricks. Work in the SMB market and you learn all the tricks, as they don’t always have their IT environments up to scratch to start with. Some of these traps are more applicable to a SaaS migration, particularly to Office 365. Some will trip you up no matter what cloud flavour you’ve chosen.

How much data?

The worst thing you can do is take your entire collection of mailboxes and everything from your file servers and suck it all up to the cloud. Even in small organisations that can be over 250GB of data. If your cloud of choice doesn't have an option to seed your data via disk, that all has to go up via your internet connection. At best, we're talking days. Remember a disk seed isn't always viable if you're not located in a major city close to your cloud's data centre. If it has to go via courier and then a plane, any data on a portable disk had better be encrypted and, again, you're talking days for transport time. How do you put a lock on your production files in the meantime, assuming you'll have no way to sync changes (more applicable to files than mailboxes)?
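
A back-of-the-envelope transfer-time calculation tells you early whether an over-the-wire migration is even realistic. Here's a minimal sketch; the 80 per cent link-efficiency figure is an assumption, not a measured value:

```python
def upload_hours(data_gb, uplink_mbps, efficiency=0.8):
    """Rough upload-time estimate for a cloud migration.

    data_gb     -- dataset size in gigabytes
    uplink_mbps -- upload bandwidth in megabits per second
    efficiency  -- fraction of the link you realistically sustain
    """
    megabits = data_gb * 8 * 1000  # GB -> megabits (decimal units)
    seconds = megabits / (uplink_mbps * efficiency)
    return seconds / 3600

# e.g. that 250GB over a 20 Mbit/s uplink at 80% efficiency
print(round(upload_hours(250, 20), 1))  # roughly 34.7 hours
```

At 20 Mbit/s of usable uplink, 250GB is the better part of a day and a half of continuous uploading – before retries, throttling and everyone else sharing the connection.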

Your two best options (pick one or both) are a pre-cloud migration archiving project and/or a migration tool that will perform a delta sync between the cloud and your original data source. Get ruthless with the business about what will be available in the cloud and what will stay in long-term storage on-prem. You seriously don’t want to suck up the last 15 years of data in this migration project. Once the current, live stuff is in the cloud by all means run a separate project to upload the rest of your older historical data if you wish. Email migrations seem to handle this the best, with tools like SkyKick and BitTitan MigrationWiz throttling the data upload over time, performing delta syncs every 24 hours and even running final syncs after you’ve flipped your MX records to the cloud. No email left behind!
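
Under the hood, the delta-sync approach those tools rely on amounts to hashing what you have and uploading only what changed since the last pass. This is a simplified sketch of the idea, not how SkyKick or MigrationWiz actually implement it; the function names are illustrative:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash file contents so unchanged files can be skipped."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def delta_changes(source_dir: str, uploaded: dict) -> list:
    """Return relative paths whose content differs from the last sync pass.

    `uploaded` maps relative path -> digest recorded when it was uploaded.
    """
    changed = []
    root = Path(source_dir)
    for path in root.rglob("*"):
        if path.is_file():
            rel = str(path.relative_to(root))
            if uploaded.get(rel) != file_digest(path):
                changed.append(rel)
    return changed
```

Each pass uploads only what `delta_changes` returns and records the new digests, so the final sync after cut-over moves minutes of changes rather than days of data.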

Piece of string internet connection

Don’t even start a cloud project until you’re happy with your internet speeds. And don’t ignore your lesser upload speed either. That’s the one doing all the hard work to get your data to the cloud in the first place and on an ongoing basis if you are syncing all the things, all the time. Another tip: don’t sync all the things everywhere all the time. If you’re going to use the cloud, use the cloud, not a local version of it. Contrary to popular belief, working locally does not reduce the impact on your internet connection, it amplifies it with all the devices syncing your changes.

Outlook item limits

Office 365 has inherited some Microsoft Exchange and Outlook quirks that you might hope are magically fixed by the cloud. Most noticeable are performance issues with a large number of items or folders in a mailbox. This includes shared mailboxes you might be opening in addition to your own mailfile. Add up the number of folders across all of your shared mailboxes and you may have issues with searching or syncing changes if you are caching those mailboxes locally. We've seen Microsoft's suggestion to turn off caching (i.e. work with a live connection to the cloud mailbox via Outlook) cause Outlook to run even slower and users to run out of patience.

The answer? You're really left with just the option of a pre-cloud migration tidy-up. Local archiving is fairly easy to implement to shrink the mailbox, then online archiving policies take care of things once you are working in the cloud. If you don't want the cost of an Office 365 E3 licence just to get archiving, look at adding an Exchange Online Archiving plan to the mailboxes that need it. This can include any shared mailboxes, but they'll also need to be allocated their own Exchange Online plan licence before archiving can be added.

DNS updates and TTL

When you are ready to flip your MX records to your new cloud email system, it's going to take time for the updated entry to filter out worldwide across the global network of secondary DNS servers. Usually things will settle down after 24 hours, which is fine if your organisation doesn't work weekends but challenging if you are a 24x7 operation. Some time before cut-over date, check the Time To Live (TTL) setting on your current MX record and bump it down to 3,600 seconds. Older systems can be set to 24 hours, meaning that's how long someone else's system will go with your old record before checking to see if it's changed. Setting your TTL to 3,600 is a nice balance between updating quickly and not querying the authoritative server every five minutes.
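
The arithmetic behind that advice is worth spelling out: the lowered TTL has to be published at least one full old-TTL period before cut-over, or some resolvers will still be serving the cached old record when you flip. A small illustration (the helper name is made up):

```python
from datetime import datetime, timedelta

def ttl_lowering_deadline(cutover: datetime, old_ttl_seconds: int) -> datetime:
    """Latest moment to publish the lowered TTL before cut-over.

    Resolvers may cache the old MX record for a full old-TTL period,
    so the change must land at least that far ahead of the flip.
    """
    return cutover - timedelta(seconds=old_ttl_seconds)

# Old TTL of 24 hours, cut-over Saturday 09:00: lower it by Friday 09:00
print(ttl_lowering_deadline(datetime(2017, 7, 1, 9, 0), 86400))
```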

Missing Outlook Stuff

Lurking in the shadows of a Microsoft Outlook user profile are those little personal touches that are not migrated when a mailfile is sent to the cloud. These are the things you'll get the helpdesk calls about. The suggested list of email addresses (Autocomplete), any text block templates (Quick Parts) and even email signatures all need to be present when accessing the user's new email account. Depending on your version of Outlook, do some research to find out where these live and how to migrate them too, or use a migration tool that includes an Outlook profile migration.

One admin to rule them all

If I had a dollar for every time someone locked themselves out of their admin account and the password recovery steps didn't work, I wouldn't need to be writing this. Often your cloud provider can help, once you've run the gauntlet of their helpdesk. Save yourself the heartache by allocating more than one administrator or setting up a trusted partner with delegated administration rights. Office 365 does this very well, so your local helpful Microsoft Partner can unlock you with their admin access.

Syncing ALL the accounts

Even if your local on-prem directory is squeaky clean (with no users who actually left in 2012), it will contain a number of service accounts. The worst thing you can do is sync all the directory objects to your cloud directory service, which then becomes a crowded mess. Take the time to prepare your Microsoft Active Directory first. Then use filtering options for Azure AD Connect to control what accounts you are syncing with the Cloud.
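
In Azure AD Connect the filtering itself is configured (by domain, OU or attribute), not scripted. Conceptually, though, the selection rule you're aiming for looks something like this; the attribute names here are invented for illustration:

```python
# OUs whose user accounts should reach the cloud directory (example value)
SYNCED_OUS = {"OU=Staff,DC=corp,DC=local"}

def should_sync(account: dict) -> bool:
    """Keep service and disabled accounts out of the cloud sync scope."""
    if account.get("is_service_account"):
        return False
    if account.get("disabled"):
        return False
    return account.get("ou") in SYNCED_OUS

accounts = [
    {"name": "jdoe", "ou": "OU=Staff,DC=corp,DC=local"},
    {"name": "svc_backup", "ou": "OU=Service,DC=corp,DC=local",
     "is_service_account": True},
]
print([a["name"] for a in accounts if should_sync(a)])  # ['jdoe']
```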

Compatibility with existing tech

Older apps don't support the TLS encryption that Office 365 requires for sending email. This can impact software and hardware, such as scanners or multifunction devices. On the other hand, newer scanners can support saving directly to the cloud – Epson devices will back up to OneDrive, but not OneDrive for Business.

Ancient systems

You thought the migration went smoothly, but now someone's office scanner won't email scans or a line of business application won't send its things via email. Chances are those ancient systems don't support TLS encryption. Now things are going to get a little complicated. There are direct send and relay methods, but it might be easier to buy a new scanner.

Metadata preservation

This one’s for the Sharepoint fans. True data nerds love the value in metadata – all the information about a document’s creation, modification history, versions etc. A simple file copy to the cloud is not guaranteed to preserve that additional data or import in into the right places in your cloud system. Learn that before you’re hit with a compliance issue or discovery request. Avoid the problem by investing in a decent document migration tool in the first place, like Sharegate.

Long file names

Once upon a time we had an 8.3 character short file name and we lived with it. Granted, we created far fewer files back then. With the arrival of NTFS we were allowed a glorious 260 characters in a full file path and we use it as much as we can today. Why? Because search sucks and a structure with detailed file names is our only hope of ever finding things again on-prem. Long file names (including long-named and deeply nested folders) will cause you grief with most cloud data migrations.

If you don’t run into migration issues with this, just wait until you start syncing. We’ve seen it both with OneDrive and Google Drive and on Macs too. Re-educate your users and come up with a new, shorter naming standard. And watch out for Microsoft lifting the 260-character limitation in Windows 10 version 1607. Fortunately, it’s opt-in.

Of course, I’ve omitted the need to analyse who needs access to what and ensuring you mimic this in the cloud, because it feels like a given. That is until someone calls to say they can’t see the emails sent to sales@ or access a particular set of documents. There are probably other migration gotchas that have bitten you and you’ll know to avoid next time. What else would be on your list? This kind of discussion among ourselves is more valuable than any vendor migration whitepaper you’ll ever read. ®




From: FUBHO 6/29/2017 11:41:30 AM
   of 1405
 
Wal-Mart Prods Partners, Vendors to Leave AWS for Azure

BY ALDRIN BROWN ON JUNE 28, 2017

One provider of big data management services said it opted to host applications on Microsoft Azure instead of AWS, expressly to win business from a tech firm with a Wal-Mart account.



To: FUBHO who wrote (1386) 6/29/2017 11:43:12 AM
From: FUBHO
1 Recommendation   of 1405
 
Alibaba’s Increasing Cloud Data Center Footprint

BY CHRISTINE HALL ON JUNE 28, 2017

Latest addition is a 50,000-square-foot GDS facility in China.



From: Glenn Petersen 7/2/2017 3:42:31 PM
   of 1405
 
Dropbox Is Getting Ready for the Biggest Tech IPO Since Snapchat

Reuters
12:07 PM ET

Data-sharing business Dropbox Inc is seeking to hire underwriters for an initial public offering that could come later this year, which would make it the biggest U.S. technology company to go public since Snap Inc, people familiar with the matter said on Friday.

The IPO will be a key test of Dropbox's worth after it was valued at almost $10 billion in a private fundraising round in 2014.

Dropbox will begin interviewing investment banks in the coming weeks, the sources said, asking not to be named because the deliberations are private.

Dropbox declined to comment.

Several big U.S. technology companies such as Uber Technologies Inc and Airbnb Inc have resisted going public in recent months, concerned that stock market investors, who focus more on profitability than do private investors, would assign lower valuations to them.

Snap, owner of the popular messaging app Snapchat, was forced to lower its IPO valuation expectations earlier this year amid investor concern over its unproven business model. Its shares have since lingered just above the IPO price, with investors troubled by widening losses and missed analyst estimates. It has a market capitalization of $21 billion.

Still, for many private companies, there is increasing pressure to go public as investors look to cash out.

Proceeds from technology IPOs slumped to $6.7 billion in 2015 from $34 billion in 2014, and shrank further to $2.9 billion in 2016, according to Thomson Reuters data.

Dropbox's main competitor, Box Inc, was valued at roughly $1.67 billion in its IPO in 2015, less than the $2.4 billion it had been valued at in previous private fundraising rounds.

San Francisco-based Dropbox, which was founded in 2007 by Massachusetts Institute of Technology graduates Drew Houston and Arash Ferdowsi, counts Sequoia Capital, T. Rowe Price and Greylock Partners as investors.

Dropbox started as a free service for consumers to share and store photos, music and other large files. That business became commoditized though, as Alphabet Inc's Google, Microsoft Corp and Amazon.com Inc started offering storage for free.

Dropbox has since pivoted to focus on winning business clients, and Houston, the company's CEO, has said that Dropbox is on track to generate more than $1 billion in revenue this year.

The company has expanded Dropbox Business, which charges companies a fee based on the number of employees who use it. The service in January began offering Smart Sync, which allows users to see and access all of their files, whether stored in the cloud or on a local hard drive, from their desktop.

fortune.com



From: FUBHO 7/17/2017 5:03:21 PM
1 Recommendation   of 1405
 
One Click and Voilà, Your Entire Data Center is Encrypted

BY WYLIE WONG ON JULY 17, 2017

IBM says its new encryption engine will allow users to encrypt all data in their databases, applications, and cloud services with no performance hit.



To: FUBHO who wrote (1389) 7/17/2017 5:03:55 PM
From: FUBHO
   of 1405
 
Report: VMware, AWS Mulling Joint Data Center Software Product

BY YEVGENIY SVERDLIK ON JULY 17, 2017

No details yet, but a hybrid cloud product is most likely.



From: Glenn Petersen 7/18/2017 7:38:56 PM
   of 1405
 
Google’s Quantum Computing Push Opens New Front in Cloud Battle

By Mark Bergen
Bloomberg Technology
July 17, 2017

-- Company offers early access to its machines over the internet

-- IBM began quantum computing cloud service earlier this year

For years, Google has poured time and money into one of the most ambitious dreams of modern technology: building a working quantum computer. Now the company is thinking of ways to turn the project into a business.

Alphabet Inc.’s Google has offered science labs and artificial intelligence researchers early access to its quantum machines over the internet in recent months. The goal is to spur development of tools and applications for the technology, and ultimately turn it into a faster, more powerful cloud-computing service, according to people pitched on the plan.

A Google presentation slide, obtained by Bloomberg News, details the company’s quantum hardware, including a new lab it calls an "Embryonic quantum data center." Another slide on the software displays information about ProjectQ, an open-source effort to get developers to write code for quantum computers.

"They’re pretty open that they’re building quantum hardware and they would, at some point in the future, make it a cloud service," said Peter McMahon, a quantum computing researcher at Stanford University.

These systems push the boundaries of how atoms and other tiny particles work to solve problems that traditional computers can’t handle. The technology is still emerging from a long research phase, and its capabilities are hotly debated. Still, Google’s nascent efforts to commercialize it, and similar steps by International Business Machines Corp., are opening a new phase of competition in the fast-growing cloud market.

Jonathan DuBois, a scientist at Lawrence Livermore National Laboratory, said Google staff have been clear about plans to open up the quantum machinery through its cloud service and have pledged that government and academic researchers would get free access. A Google spokesman declined to comment.

Providing early and free access to specialized hardware to ignite interest fits with Google’s long-term strategy to expand its cloud business. In May, the company introduced a chip, called Cloud TPU, that it will rent out to cloud customers as a paid service. In addition, a select number of academic researchers are getting access to the chips at no cost.

While traditional computers process bits of information as 1s or zeros, quantum machines rely on "qubits" that can be a 1, a zero, or a state somewhere in between at any moment. It’s still unclear whether this works better than existing supercomputers. And the technology doesn’t support commercial activity yet.

Still, Google and a growing number of other companies think it will transform computing by processing some important tasks millions of times faster. SoftBank Group Corp.’s giant new Vision fund is scouting for investments in this area, and IBM and Microsoft Corp. have been working on it for years, along with startup D-Wave Systems Inc.

In 2014, Google unveiled an effort to develop its own quantum computers. Earlier this year, it said the system would prove its "supremacy" -- a demonstration that it can perform on par with, or better than, existing supercomputers -- by the end of 2017. One of the presentation slides viewed by Bloomberg repeated this prediction.

Quantum computers are bulky beasts that require special care, such as deep refrigeration, so they’re more likely to be rented over the internet than bought and put in companies’ own data centers. If the machines end up being considerably faster, that would be a major competitive advantage for a cloud service. Google rents storage by the minute. In theory, quantum machines would trim computing times drastically, giving a cloud service a huge effective price cut. Google’s cloud offerings currently trail those of Amazon.com Inc. and Microsoft.

Earlier this year, IBM’s cloud business began offering access to quantum computers. In May, it added a 17 qubit prototype quantum processor to the still-experimental service. Google has said it is producing a machine with 49 qubits, although it’s unclear whether this is the computer being offered over the internet to outside users.

Experts see that benchmark as more theoretical than practical. "You could do some reasonably-sized damage with that -- if it fell over and landed on your foot," said Seth Lloyd, a professor at the Massachusetts Institute of Technology. Useful applications, he argued, will arrive when a system has more than 100 qubits.

Yet Lloyd credits Google for stirring broader interest. Now, there are quantum startups "popping up like mushrooms," he said.

One is Rigetti Computing, which has netted more than $69 million from investors to create the equipment and software for a quantum computer. That includes a "Forest" cloud service, released in June, that lets companies experiment with its nascent machinery.

Founder Chad Rigetti sees the technology becoming as hot as AI is now, but he won’t put a timeline on that. "This industry is very much in its infancy," he said. "No one has built a quantum computer that works."

The hope in the field is that functioning quantum computers, if they arrive, will have a variety of uses such as improving solar panels, drug discovery or even fertilizer development. Right now, the only algorithms that run on them are good for chemistry simulations, according to Robin Blume-Kohout, a technical staffer at Sandia National Laboratories, which evaluates quantum hardware.

A separate branch of theoretical quantum computing involves cryptography -- ways of transferring data with much better security than current machines. MIT’s Lloyd discussed these theories with Google founders Larry Page and Sergey Brin more than a decade ago at a conference. The pair were fascinated and the professor recalls detailing a way to apply quantum cryptography so people could do a Google search without revealing the query to the company.

A few years later, when Lloyd ran into Page and Brin again, he said he pitched them on the idea. After checking with the business side of Google, the founders said they weren’t interested because the company’s ad-serving systems relied on knowing what searches people do, Lloyd said. "Now, seven or eight years down the line, maybe they’d be a bit more receptive," he added.

bloomberg.com



From: FUBHO 7/22/2017 3:32:26 AM
1 Recommendation   of 1405
 



IBM Is Worst Performer on Dow as Cloud Services Unit Falters | Data Center Knowledge




Chairwoman and CEO of IBM Ginni Rometty speaks onstage at the FORTUNE Most Powerful Women Summit in 2013 in Washington, DC. (Photo by Paul Morigi/Getty Images for FORTUNE)

by Bloomberg on July 21, 2017

Gerrit De Vynck (Bloomberg) — IBM fell the most in three months after reporting revenue that missed estimates, with sales in a key unit declining for the second consecutive period.

The quarterly results, released Tuesday after the close of trading, further extend Chief Executive Officer Ginni Rometty’s turnaround plan into its fifth year without significant progress. The company, once considered a bellwether for the tech industry, was the worst performer on the Dow Jones Industrial Average Wednesday.

Revenue in the technology services and cloud platforms segment dropped 5.1 percent from the same period a year earlier, even though executives had said in April that they expected key contracts to come through in the quarter. The unit is a marker for the strength of the company’s push into newer technologies. Total revenue fell to $19.3 billion, the 21st straight quarter of year-over-year declines.

The stock tumbled as much as 4.7 percent in intraday trading Wednesday in New York, the most since April, to $146.71. The shares have lost 7.2 percent this year through the close Tuesday, and have missed out on the technology stock rally that propelled companies like Amazon.com Inc. and Alphabet to records.

International Business Machines Corp. has been working since before Rometty took over in 2012 to steer the company toward services and software, and she has pushed it deeper into businesses such as artificial intelligence and the cloud. Still, legacy products like computers and operating system software have been a drag on overall growth. Some investors are getting tired of waiting for the turnaround to catch on. Warren Buffett’s Berkshire Hathaway Inc. sold about a third of its investment in IBM during the first half of this year.

Several analysts cut their price targets on the company.

James Kisner, an analyst at Jefferies, said the “poor earnings quality aims to mask ongoing secular headwinds” in the software business and competitive pressures in services that may result in more investor disappointment. He rates the stock underperform and cut the price target to $125 from $154.

Better Margins

Gross margins in the second quarter were 47.2 percent, slightly beating the average analyst estimate of 47 percent. That's better than last quarter, when a surprise miss on margins sent the stock tumbling the most in a year.

“We will continue to see, on a sequential basis, margin improvement from the first half to second half,” Chief Financial Officer Martin Schroeter said in an interview.

Operating profit, excluding some items, was $2.97 a share, compared with the average analyst estimate of $2.74 a share. That measure got a boost from tax benefits, which added 18 cents to the per-share number, IBM said.

The company’s cognitive solutions segment, which houses much of the software and services in the newer businesses and includes the Watson artificial intelligence platform, has shown the most promise in recent periods, growing in each of the previous four quarters. Yet sales in the unit fell 2.5 percent in the second quarter.

AI Competition

Watson, for which the company doesn't specifically break out revenue, might never contribute a significant amount to the company, Jefferies' Kisner said in a July 12 note.

Competition in the artificial intelligence market is heating up, with major investments from the world’s biggest technology companies, including Microsoft Corp., Alphabet Inc. and Amazon.com Inc. On top of that, hundreds of startups are jumping in.

“IBM appears outgunned in the ‘war’ for AI talent,” Kisner said. “In our base case, IBM barely re-coups its cost of capital from AI investments.”

The company’s total revenue fell 4.7 percent from the same period a year ago, and missed analysts’ average estimate of $19.5 billion.

Oppenheimer & Co. managing director Ittai Kidron said the results show IBM “isn’t out of the woods yet.”


Copyright © 1995-2017 Knight Sac Media. All rights reserved.