Migrating to Microsoft's cloud: What they won't tell you, what you need to know
Of devils and details
19 Jun 2017 at 09:04, Sonia Cuff
“Move it all to Microsoft’s cloud,” they said. “It’ll be fine,” they said. You’ve done your research and the monthly operational cost has been approved. There’s a glimmer of hope that you’ll be able to hit the power button to turn some ageing servers off, permanently. All that stands in the way is a migration project. And that is where the fun starts.
Consultants will admit that their first cloud migration was scary. If they don’t, they’re lying. This is production data we’re talking about, with a limited time window to have your systems down. Do a few migrations and you learn a few tricks. Work in the SMB market and you learn all the tricks, as they don’t always have their IT environments up to scratch to start with. Some of these traps are more applicable to a SaaS migration, particularly to Office 365. Some will trip you up no matter what cloud flavour you’ve chosen.
How much data?
The worst thing you can do is take your entire collection of mailboxes and everything from your file servers and suck it all up to the cloud. Even in small organisations that can be over 250GB of data. If your cloud of choice doesn’t have an option to seed your data via disk, it all has to go up via your internet connection. At best, we’re talking days. Remember that a disk seed isn’t always viable if you’re not located in a major city close to your cloud’s data centre. If it has to go via courier and then a plane, any data on a portable disk had better be encrypted and, again, you’re talking days of transport time. And how do you put a lock on your production files in the meantime, assuming you have no way to sync changes (more of a problem for files than mailboxes)?
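As a back-of-the-envelope check before you commit to an internet-only upload, the sums are simple. A rough sketch (the 80 per cent efficiency figure is an assumption covering protocol overhead and contention, not a measurement):

```python
def upload_hours(data_gb: float, uplink_mbps: float, efficiency: float = 0.8) -> float:
    """Rough upload-time estimate for `data_gb` of data over an uplink in megabits/s.

    `efficiency` discounts protocol overhead and link contention; 0.8 is an
    assumed figure, not a measured one.
    """
    bits = data_gb * 8 * 1000 ** 3                      # decimal GB -> bits
    seconds = bits / (uplink_mbps * 1000 ** 2 * efficiency)
    return seconds / 3600

# 250GB over a 10Mbit/s uplink works out to roughly 69 hours: days, not hours.
```

Run it against your own uplink before promising the business a weekend cut-over.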
Your two best options (pick one or both) are a pre-migration archiving project and/or a migration tool that will perform a delta sync between the cloud and your original data source. Get ruthless with the business about what will be available in the cloud and what will stay in long-term storage on-prem. You seriously don’t want to suck up the last 15 years of data in this migration project. Once the current, live stuff is in the cloud, by all means run a separate project to upload the older historical data if you wish. Email migrations seem to handle this the best, with tools like SkyKick and BitTitan MigrationWiz throttling the data upload over time, performing delta syncs every 24 hours and even running final syncs after you’ve flipped your MX records to the cloud. No email left behind!
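A delta sync is conceptually simple: fingerprint what’s there now, compare with what you captured on the last pass, and ship only the differences. A minimal sketch of the idea (not how SkyKick or MigrationWiz actually implement it):

```python
import hashlib
from pathlib import Path

def snapshot(root: Path) -> dict[str, str]:
    """Map each file's relative path to a hash of its content."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def delta(previous: dict[str, str], current: dict[str, str]) -> list[str]:
    """Paths that are new or have changed since the last sync pass."""
    return [path for path, digest in current.items() if previous.get(path) != digest]
```

Each pass uploads only what `delta` returns, so the final sync after cut-over moves minutes of changes rather than days of data.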
Piece of string internet connection
Don’t even start a cloud project until you’re happy with your internet speeds. And don’t ignore your slower upload speed either. That’s the one doing all the hard work to get your data to the cloud in the first place, and on an ongoing basis if you are syncing all the things, all the time. Another tip: don’t sync all the things everywhere all the time. If you’re going to use the cloud, use the cloud, not a local version of it. Contrary to popular belief, working locally does not reduce the impact on your internet connection; it amplifies it, with every device syncing your changes.
Outlook item limits
Office 365 has inherited some Microsoft Exchange and Outlook quirks that you might hope would be magically fixed by the cloud. Most noticeable are performance issues with a large number of items or folders in a mailbox. This includes shared mailboxes you might be opening in addition to your own mailfile. Add up the number of folders across all of your shared mailboxes and you may have issues with searching or syncing changes if you are caching those mailboxes locally. We’ve seen Microsoft’s suggestion to turn off caching (i.e. work with a live connection to the cloud mailbox via Outlook) cause Outlook to run even slower and users to run out of patience.
The answer? You’re really left with just one option: a pre-migration tidy-up. Local archiving is fairly easy to implement to shrink the mailbox, then online archiving policies take care of things once you are working in the cloud. If you don’t want the cost of an Office 365 E3 licence just to get archiving, look at adding an Exchange Online Archiving plan to the mailboxes that need it. This can include shared mailboxes, but each will also need its own Exchange Online plan licence before archiving can be added.
DNS updates and TTL
When you are ready to flip your MX records to your new cloud email system, it’s going to take time for the updated entry to filter out worldwide across the caching DNS resolvers everyone else uses. Usually things settle down within 24 hours, which is fine if your organisation doesn’t work weekends but challenging if you are a 24x7 operation. Some time before the cut-over date, check the Time To Live (TTL) setting on your current MX record and bump it down to 3,600 seconds (one hour). Older systems can be set to 24 hours, meaning that’s how long someone else’s system will keep your old record before checking to see if it has changed. A TTL of 3,600 is a nice balance between updating frequently and not querying the authoritative server every five minutes.
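The timing rule is worth spelling out: a resolver that cached your record just before you lowered the TTL will hold the old value for the full old TTL, so lower it at least that long before cut-over. A quick sketch of the arithmetic:

```python
def earliest_cutover(lowered_at_hour: float, old_ttl_seconds: int) -> float:
    """Earliest safe cut-over time (in hours on the same clock) after
    lowering the MX record's TTL.

    Resolvers that cached the record just before the change keep the old
    value for its full original TTL, so you must wait at least that long.
    """
    return lowered_at_hour + old_ttl_seconds / 3600

# Old TTL of 24 hours, lowered Monday 09:00 (hour 9):
# safe to flip MX from hour 33, i.e. Tuesday 09:00.
```

After the flip, the new 3,600-second TTL bounds how long stragglers keep delivering to the old system.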
Missing Outlook stuff
Lurking in the shadows of a Microsoft Outlook user profile are those little personal touches that are not migrated when a mailfile is sent to the cloud. These are the things you’ll get the helpdesk calls about. The suggested list of email addresses (Autocomplete), any text block templates (Quick Parts) and even email signatures all need to be present when accessing the user’s new email account. Depending on your version of Outlook, do some research to find out where these live and how to migrate them too or use a migration tool that includes an Outlook profile migration.
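For recent Outlook builds these artefacts typically live in a handful of per-profile folders. The paths below are the commonly cited locations; treat them as assumptions to verify against your Outlook version, not gospel:

```python
import os

# Commonly cited per-profile locations for Outlook 2013/2016 artefacts.
# These vary by Outlook version -- verify against your own build.
OUTLOOK_ARTIFACTS = {
    "signatures": r"%APPDATA%\Microsoft\Signatures",
    "autocomplete_cache": r"%LOCALAPPDATA%\Microsoft\Outlook\RoamCache",
    "quick_parts": r"%APPDATA%\Microsoft\Templates\NormalEmail.dotm",
}

def artifact_paths() -> dict[str, str]:
    """Expand the environment variables so the folders can be copied off
    as part of a workstation migration script."""
    return {name: os.path.expandvars(path) for name, path in OUTLOOK_ARTIFACTS.items()}
```

Copy these alongside the mailbox move and the helpdesk phone stays a lot quieter on Monday morning.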
One admin to rule them all
If I had a dollar for every time someone locked themselves out of their admin account and the password recovery steps didn’t work, I wouldn’t need to be writing this. Often your cloud provider can help, once you’ve run the gauntlet of their helpdesk. Save yourself the heartache by allocating more than one administrator or setting up a trusted partner with delegated administration rights. Office 365 does this very well, so your local helpful Microsoft Partner can unlock you with their admin access.
Syncing ALL the accounts
Even if your local on-prem directory is squeaky clean (with no users who actually left in 2012), it will contain a number of service accounts. The worst thing you can do is sync all the directory objects to your cloud directory service, which then becomes a crowded mess. Take the time to prepare your Microsoft Active Directory first, then use the filtering options in Azure AD Connect to control which accounts you sync with the cloud.
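Azure AD Connect can filter by OU, domain or attribute. The sketch below mimics that idea in plain Python, with a hypothetical excluded OU purely to illustrate the logic, not the tool’s actual configuration:

```python
from dataclasses import dataclass

@dataclass
class AdAccount:
    name: str
    ou: str                        # e.g. "OU=Staff,DC=corp,DC=example"
    is_service_account: bool = False

# Hypothetical stand-in for an OU-based sync filter.
EXCLUDED_OUS = {"OU=ServiceAccounts,DC=corp,DC=example"}

def accounts_to_sync(accounts: list[AdAccount]) -> list[AdAccount]:
    """Keep only the accounts that should reach the cloud directory:
    nothing from an excluded OU, and no flagged service accounts."""
    return [
        a for a in accounts
        if a.ou not in EXCLUDED_OUS and not a.is_service_account
    ]
```

The real win is the same as in the sketch: decide the exclusion rules once, up front, rather than cleaning up a crowded cloud directory later.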
Compatibility with existing tech
You thought the migration went smoothly, but now someone’s office scanner won’t email scans, or a line-of-business application won’t send anything via email. Chances are those ancient systems don’t support the TLS encryption that Office 365 requires for sending email. This can affect both software and hardware, such as scanners and multifunction devices. (On the other hand, newer scanners can save directly to the cloud – Epson devices will back up to OneDrive, though not OneDrive for Business.) Now things get a little complicated: there are direct send and relay methods, but it might be easier to buy a new scanner.
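Microsoft documents three routes for a device or app to send through Office 365: authenticated SMTP client submission, direct send, and relay via a connector. The helper below is a simplified decision rule to illustrate the trade-offs, not official guidance:

```python
def smtp_option(supports_tls: bool, supports_auth: bool, static_public_ip: bool) -> str:
    """Pick among the three Microsoft-documented ways a device can send
    mail through Office 365. The rules here are a deliberate simplification."""
    if supports_tls and supports_auth:
        # Preferred: the device authenticates as a licensed mailbox.
        return "SMTP AUTH client submission (smtp.office365.com, port 587)"
    if static_public_ip:
        # A connector can trust your fixed public IP instead of credentials.
        return "SMTP relay via an inbound connector"
    # Fallback: unauthenticated send to your own MX endpoint,
    # which only reaches recipients inside your organisation.
    return "direct send to your MX endpoint (internal recipients only)"
```

If the answer keeps coming back as the third option and the scanner needs to mail external recipients, the new-scanner budget line starts looking cheap.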
Metadata matters
This one’s for the SharePoint fans. True data nerds love the value in metadata – all the information about a document’s creation, modification history, versions etc. A simple file copy to the cloud is not guaranteed to preserve that additional data or import it into the right places in your cloud system. Learn this before you’re hit with a compliance issue or discovery request. Avoid the problem by investing in a decent document migration tool in the first place, like Sharegate.
Long file names
Once upon a time we had an 8.3 character short file name and we lived with it. Granted, we created far fewer files back then. With the arrival of NTFS we were allowed a glorious 260 characters in a full file path, and we use it as much as we can today. Why? Because search sucks and a structure with detailed file names is our only hope of ever finding things again on-prem. Long file names (including long-named and deeply nested folders) will cause you grief with most cloud data migrations.
If you don’t run into migration issues with this, just wait until you start syncing. We’ve seen it with both OneDrive and Google Drive, and on Macs too. Re-educate your users and come up with a new, shorter naming standard. And watch out for Microsoft lifting the 260-character limitation in Windows 10 version 1607. Fortunately, it’s opt-in.
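Before migrating, it’s worth sweeping the file share for paths that already blow the classic 260-character budget, so the offenders can be renamed rather than discovered mid-migration. A minimal sketch:

```python
MAX_PATH = 260  # classic Windows full-path limit, including drive letter

def too_long(full_paths: list[str], budget: int = MAX_PATH) -> list[str]:
    """Flag full paths at or over the budget before a migration or
    sync client chokes on them."""
    return [p for p in full_paths if len(p) >= budget]
```

Feed it the output of a file-share walk, hand the resulting list to the data owners, and let them do the shortening before cut-over day.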
Of course, I’ve omitted the need to analyse who needs access to what and to ensure you mimic this in the cloud, because it feels like a given. That is, until someone calls to say they can’t see the emails sent to sales@ or access a particular set of documents. There are probably other migration gotchas that have bitten you and that you’ll know to avoid next time. What else would be on your list? This kind of discussion among ourselves is more valuable than any vendor migration whitepaper you’ll ever read. ®