1. Azure Migration Strategy
So next up we’re going to talk about migrations. Now, migration is a big topic, and it's actually fairly complex, because it's very difficult to go from so many different configurations within your own network to a more standardized configuration within Microsoft Azure. Every company is going to have a slightly different set of applications. The way that applications are hosted, whether it's VMware or other types of virtualized environments or physical environments, the operating systems, there's so much out there that mixes and matches together that coming up with a strategy to get all of that into the cloud is one thing, and just assessing what you have is a challenge in itself. A lot of companies have hundreds or even thousands of servers out there in their environment.
And trying to document and get up-to-date information on what it is that you have running, that is a whole project unto itself, essentially. So let's just start with a statement that might be self-evident: if you've got an application that's being used by users all day, every day, migrating that working application into the cloud, into Azure, is not going to be easy. There's no snapping your fingers so that one second your application is running in your own environment and the next it's suddenly in the cloud. Now, yes, we can copy the code, you can recompile the code, push it out there, test it, things like that. But the more complicated the application, the more work you have to put in up front to make sure you're going to do the right thing.
And in actual fact, the code itself might not be the difficult bit. If you're using modern development practices, and you're using C# or Node.js or any of those other languages that are common out there, getting that to run inside of Azure may not even be difficult. Within Visual Studio, it's just a click of the Publish command. Within other environments, you're just going to zip up that code, copy it over, unzip it, configure IIS or configure Azure App Service, and suddenly, boom, you're running. But the real trick is the data.
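Before we get to the data, here's roughly what that zip-and-publish step can look like from code, using the App Service Kudu zip-deploy endpoint. This is just a minimal sketch; the app name, publish folder, and deployment credentials are placeholders you would replace with your own.

```python
import pathlib
import zipfile
import requests

# Hypothetical values -- substitute your own App Service name and deployment credentials.
APP_NAME = "contoso-web"       # assumption: the App Service name
DEPLOY_USER = "$contoso-web"   # assumption: deployment username from the publish profile
DEPLOY_PASS = "..."            # assumption: deployment password from the publish profile

# Zip up the build output folder (assumed here to be ./publish).
zip_path = pathlib.Path("app.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
    for f in pathlib.Path("publish").rglob("*"):
        if f.is_file():
            z.write(f, f.relative_to("publish"))

# Push the zip to the Kudu zip-deploy endpoint for the App Service.
url = f"https://{APP_NAME}.scm.azurewebsites.net/api/zipdeploy"
with open(zip_path, "rb") as body:
    resp = requests.post(url, data=body, auth=(DEPLOY_USER, DEPLOY_PASS))
resp.raise_for_status()
print("Deployed with status", resp.status_code)
```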
Now, the other thing that comes into play is: are you going from a physical machine in your environment to an Azure virtual machine within Azure? Going from a hosted environment to a hosted environment like that is relatively easy. But if you're going to start taking advantage of Azure's advanced database services or some of their storage capabilities and things like that, if you're going to start incorporating some of the big benefits of moving to Azure at the same time that you're making the migration, then there's a degree of difficulty that comes into play there as well.
Now, the real trick, besides the code, of course, is the data. You can't just take a 1 TB database, upload it over the Internet into the cloud, and expect that nothing about that data has changed in the meantime. There are always going to be people logging in, even just log files, or data that gets processed, or information that comes in and changes the data. Data is constantly changing. And so there's a real strategy to doing the initial upload, then doing the evaluation and testing, and then doing a delta. Even then, there's going to be some sort of downtime involved; I don't think you can avoid it in a big case like that. So what we want to do is minimize the downtime, and put that downtime where it's least inconvenient to the end user. This always ends up being 1:00 a.m. on a Saturday morning or something like that. But that migration happens. The other thing with migrations, of course, is that there are risks. I've been involved in a lot of projects over the years that have involved migrations, and you spend so much time during the initial migration testing, testing, testing, trying to find the surprises: what is it that we thought would work that doesn't actually work? You do not want to be in a position where you migrate everything over, tell everyone everything's great, and then people start finding a bunch of issues and you're scrambling to fix them. It becomes a really disruptive event for the end users. So preparation is key, of course, but because you're physically moving from one environment to the other, maybe doing a refactoring, maybe changing your database back end, you're introducing risk.
And what we want to do is mitigate that. Now, the other challenge with migration, and I guess it's tangential to that, is: are you actually saving money by moving to the cloud? Can we look at what we have running in our data center and say, oh, we're going to go from $100,000 a month down to $50,000 a month? How do you go through and predict those costs and say, with some degree of certainty, that's what we're expecting to pay? There are so many moving parts, and for some of this you just have to make educated guesses about what those parts are. There's just so much that goes into migration. And so in this section we're not going to be able to do a complete migration master class, but we should understand at least what the challenges are and some of the inputs that go into designing a migration.
2. Data Migration Strategy
So we mentioned that when it comes to migrations, one of the most challenging components is: how do you get the data out of your existing environment and into Azure? And how do you do it in a timely fashion such that you don't lose any data in the process, and you also minimize the downtime of the original application? There are a lot of requirements that companies come with when it comes to data import and export. In many of our applications, we just don't have windows where data doesn't move; a lot of us are working in environments where, whether it's batch jobs running overnight or real users using the system, we just don't have several days in a row where the data remains static. And so we just have to accept that even when you do a backup, taking last night's backup from SQL Server and uploading that into Azure,
you're already several hours behind when it comes to the data that you just uploaded. The other component, of course, other than the data changing, is the size. It's one thing to work with databases that are 20 or 30 megabytes; those are files that you can reasonably upload in minutes. But how do you transfer 100 GB of data? How do you transfer 1,000 GB of data? We're dealing with multi-terabyte databases. How do you get that from your system into Azure? Just think about the way the Internet works and what transfer times look like: getting 100 GB from one server to another is slow over a local network, and over the Internet it's super slow. There are also upload limits into Azure, so even if you had a high-speed connection to the Internet, that data still travels over the Internet into the Azure network, and you're going to be facing ten or twenty megabits per second kinds of limits. And so there's a mathematical formula you can run to see how long uploading a file will take.
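Here's roughly what that back-of-the-envelope math looks like; the throughput numbers are just illustrative.

```python
# Rough upload-time estimate: size in gigabytes, effective throughput in megabits per second.
def upload_hours(size_gb: float, mbps: float) -> float:
    size_megabits = size_gb * 1000 * 8   # 1 GB is roughly 8,000 megabits (decimal units)
    return size_megabits / mbps / 3600   # seconds -> hours

print(f"{upload_hours(100, 20):.1f} hours")     # 100 GB at 20 Mbps -> ~11 hours
print(f"{upload_hours(1000, 20):.1f} hours")    # 1 TB at 20 Mbps   -> ~111 hours (4-5 days)
print(f"{upload_hours(10000, 20):.0f} hours")   # 10 TB at 20 Mbps  -> ~1,111 hours (6+ weeks)
```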
One of the cool things that came out within the last couple of years is called Azure Data Box; I got a chance to see one last year at Microsoft Ignite. Basically, it's a physical piece of hardware that you can order online. They'll ship it to you, you copy all the data you want onto it, terabytes if you need, then you ship it back to Microsoft, they plug it in, and you can access that data from your Azure account, copy it, and do whatever you need to do with it. So skipping the Internet portion entirely is one way to get big pieces of data into Azure. Now, if your data is not as big as those tens of terabytes we're talking about, you can upload files into a storage account using AzCopy. If you've got a SQL Server backup of, say, 100 megabytes or a couple of hundred megabytes and you want to get it into a storage account, you can use the command-line tool AzCopy to copy it up, and from there create a new Azure SQL Database based off of that backup.
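AzCopy is the command-line tool for that; if you'd rather do the same upload from code, a rough equivalent using the azure-storage-blob Python SDK looks something like this. The connection string, container, and file names here are made up, and the container is assumed to already exist.

```python
from azure.storage.blob import BlobServiceClient

# Assumption: connection string for your storage account (e.g. from the portal's Access keys blade).
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(CONN_STR)
blob = service.get_blob_client(container="sqlbackups", blob="nightly.bak")

# Upload the local backup file; the SDK chunks large files automatically.
with open("nightly.bak", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print("Uploaded to", blob.url)
```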
So, like I said, if you're dealing with really big files, you could start the process on a Monday and the upload could take you the better part of a week to get all that data into Azure, or you use Azure Data Box. Then a week or two for testing, making sure the code is working, making code fixes, testing again. It could be a month between the time when you first upload the backup of your database and when you're ready to do the final migration. And so what are you going to do then? Are you going to take another backup, wipe away what you had working within Azure, and upload it all again? But then again we're talking about downtime: shutting your application down and putting up a web page that says sorry, we're down for maintenance, for that final migration. Now, Azure does have a feature that will allow you to do what is called a delta.
So if you do an upload of a large amount of data, it'll take a snapshot of that date and time and say, this is from 1:00 p.m. on Thursday the 15th. And if it takes you a week to test it and make sure it's all verified, then the next time you do a data upload, you can just do what is called a delta, which is the difference, which is only a week's worth of data. That should be something you can do online, and it should be a lot less troublesome. We can also look at a tool called Data Migration Assistant, which is specifically for SQL Server and SQL Server-compatible databases. It will assess your database, see if there's anything that would be a challenge in getting it across, help you make those fixes in the script it generates, and upload that script into Azure. Basically, Data Migration Assistant is a piece of software you can install that will help you get that stuff into Azure.
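Just to make the delta idea concrete, here's a purely illustrative sketch, not the actual mechanism the Azure tooling uses internally: given rows stamped with a last-modified time, the delta pass only has to carry the rows that changed after the snapshot taken at the initial upload.

```python
from datetime import datetime

# Illustrative only: the snapshot taken at the time of the initial bulk upload.
snapshot = datetime(2021, 4, 15, 13, 0)   # e.g. 1:00 p.m. on Thursday the 15th

# Hypothetical rows, each with a last-modified timestamp.
rows = [
    {"id": 1, "modified": datetime(2021, 4, 10, 9, 30)},   # already in the initial upload
    {"id": 2, "modified": datetime(2021, 4, 19, 14, 5)},   # changed after the snapshot
    {"id": 3, "modified": datetime(2021, 4, 21, 8, 45)},   # changed after the snapshot
]

# The delta is just whatever changed since the snapshot -- a week's worth of data, not the whole database.
delta = [r for r in rows if r["modified"] > snapshot]
print(f"{len(delta)} of {len(rows)} rows need to go up in the delta pass")
```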
Now, on a bigger scale, there's also the Azure Database Migration Service. This is not specific to SQL Server; it will work with SQL Server, but it also works with Oracle, MySQL, PostgreSQL, IBM DB2, and others. This diagram is a little complex, but you can see that the Azure Database Migration Service can help get your on-premises SQL Servers into SQL Server in a VM, or, if it's already SQL Server in a VM, get that into an Azure SQL Database. Or you can skip the VM entirely and just go from on premises to an Azure SQL Database, et cetera. You can go from Oracle into Oracle in a VM, or you can actually migrate from Oracle to Azure SQL Database if you want to change database vendors; the same goes for MySQL and PostgreSQL. So you can see that the Database Migration Service is the many-to-many hookup between the different source and target services.
Now, we should mention, we talked about Azure Site Recovery in the backup section on business continuity, but Azure Site Recovery is one of those tools that is ideal for migration as well. So if you have a primary site running on a physical server, a Hyper-V virtual machine, or a VMware virtual machine, you can use Azure Site Recovery to get a replica of that posted into the cloud, get the network set up, and get all the data synchronized. It does an ongoing synchronization, so that as the data changes, the data in the cloud is kept up to date. And then you can do, as we saw in the business continuity section, a planned failover, where it shuts the source down, makes sure that no one is accessing the application or the data, and then switches over to use Azure.
3. Application Migration Strategy
So when it comes to migrations, we've talked about the data, but what about the applications themselves? Getting the applications from your environment into Azure could be a fairly simple and straightforward thing, and there's also a chance it could get fairly complex, depending on how many changes you want to make to the application to incorporate some of the cloud features. Now, Microsoft does define four different styles of migration, and we'll talk about them briefly here. The first is what used to be called the lift-and-shift migration, where all you're doing is taking an existing application and making minimal changes to it.
Maybe you change the connection string, or maybe the application uses a lot of local files and you change it to use an Azure file service instead. So you're making small little changes to an application, but generally the application is the same, and then you rebuild it and publish it into the cloud. That's basically application rehosting: you have a physical server in your environment and an Azure VM in the cloud, and you're replicating your existing environment as precisely as possible. That's a fairly low-risk strategy. You're not ultimately maximizing the benefits, you're not using Cosmos DB or any of these really cool cloud services that Microsoft provides, and maybe you're not maximizing savings, but just getting from your own environment into the cloud, rehosting is one way to get there and one way to save money.
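To give a feel for how small those minimal changes can be, here's a hypothetical before-and-after sketch: instead of hard-coding an on-premises connection string and a local file path, the application reads them from configuration, so the rehosted copy in Azure simply gets different settings. The setting names and values are made up.

```python
import os

# Before (on premises): values baked into the code.
# conn_str = "Server=SQLSRV01;Database=Orders;Trusted_Connection=yes;"
# file_root = r"D:\AppData\Invoices"

# After (rehosted): the same app reads its settings from the environment,
# which the Azure VM or App Service configuration can supply; no other code changes needed.
conn_str = os.environ.get(
    "ORDERS_DB_CONNECTION",                                      # hypothetical setting name
    "Server=SQLSRV01;Database=Orders;Trusted_Connection=yes;",   # fallback for local runs
)
file_root = os.environ.get("INVOICE_SHARE_PATH", r"D:\AppData\Invoices")

print("Using database:", conn_str.split(";", 1)[0])
print("Using file root:", file_root)
```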
Of course, there's also the possibility, and this temptation does come up, of doing some refactoring. While you're looking at the code you're about to migrate, you might realize, you know what, why don't we move to an Azure SQL Database back end? And if we do that, then we're going to have to change the way we do this, and maybe we have a whole data layer that needs to be completely rewritten. So you can analyze bits and pieces of your application and say, okay, we're going to take advantage of this moment of migrating to the cloud and do things properly here, or we're going to make it so the application pushes out metrics that we can see within the cloud portal, et cetera. Lots of little options there. It does increase the risk a little bit and requires a bit more testing, but then you do start to reap some of those benefits of the cloud a little bit more, because you're willing to change the way you do things. Now, virtual machines and containers are pretty much lift and shift; if you really want to take real advantage of the cloud, actually rearchitecting your application could be a possibility. For instance, you can shed your whole authentication and identity management layer and just use Azure Active Directory, and so a third or a fifth of your application gets cut out because you're replacing it with a service that Microsoft provides. We can say that about a lot of different parts of your application, or you can get into Service Fabric or Azure Functions and Logic Apps.
So then you can start to say, you know what, we can reuse a lot of the code in an Azure Function or a Logic App. The basic logic of what we're trying to do, what APIs we call, what blobs we create, is pretty much the same; we just need to break it into smaller pieces of code that finish quickly and run within the Azure Functions environment. So when you want to move to a whole new paradigm and get out of the VM space, and even out of traditional monolithic web apps, this is obviously the way you do that.
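To give a taste of what "smaller pieces of code finishing quickly" can look like, here is a minimal HTTP-triggered Azure Function in Python. This assumes the classic programming model, where the HTTP trigger binding is declared in an accompanying function.json file; the function logic is just a placeholder.

```python
import azure.functions as func

# Minimal HTTP-triggered function (classic model: the trigger binding lives in function.json).
# The body is small and finishes quickly; the platform handles hosting and scale.
def main(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```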
Now, finally, there's the scrap-it-and-rebuild-it approach. Depending on the application, you might say, you know what, we were going to migrate this, but we took a look at it and it's just old and fragile and not ideal. You make that assessment and decide it's not worth keeping, so you're going to start fresh, do it the proper way, build a service-oriented architecture, et cetera. You do it a completely different way. So that's certainly another option. Those are the four big options when it comes to migrating your code into the cloud: there's lift and shift (rehosting), there's refactoring, which is a relatively minor reworking, there's rearchitecting, which can even move you into a whole new style, and then there's completely scrapping it and writing fresh new code, which a lot of developers like. But it's good to also keep bits of the past if they still work.
4. *NEW* Storage Migration Service
Now, when it comes to migrating file servers from your local environment into Azure, you have a couple of different options. One is you can do a simple lift-and-shift migration, where you take the machine from your local environment, make an image of it, and restore it directly into Azure, and you get the same operating system and the exact same settings: an identical copy moved from your local environment into Azure. So that is certainly possible. Another way you can do it is to spin up a new virtual machine in Azure and migrate the files, the shares, and the permissions and security from your old machine to your new machine. And that's what the Storage Migration Service is used for. Now, this is not inside of the Azure Portal; this is a Windows Admin Center feature.
And so you're going to download a piece of software, an agent if you will, into your environment, and it's going to do an analysis of that environment. It does what's called an inventory: it finds all of the servers with shares on them and builds a complete picture of what you have. And you can either migrate those file servers from your existing environment to another server in your environment, so let's say you're running on Windows Server 2008, which is past its end of life, and you need to migrate that to Windows Server 2019, you would use this exact software.
It can migrate between operating systems, essentially upgrading the server, while ensuring that you keep all of your files, shares, security, permissions, and all of that. The Storage Migration Service can also work for migrating to Azure. Looking at this diagram, it can take all of your on-premises servers, from Windows Server 2003, 2008, 2012, 2016, and 2019, and pull that content into an Azure virtual machine (IaaS) that hosts a file share. You can then use Azure File Sync to ensure that the files are synchronized between multiple servers, et cetera. Now, I'm not going to be able to demonstrate this to you, I just don't have the Windows environment set up, but just understand that there is this Storage Migration Service that can pull in files from machines that are running as file servers.
The key part here is the SMB shares, which use the Windows file-sharing protocol, and recreating those within Azure. You could also recreate them in other places in your existing environment, but it's set up to be able to push into Azure. The environment running the migration should be Windows Server 2019; it can work with 2012 and 2016, but it's a little slower. Either way, you're migrating from an older environment, or from a different environment, into a more modern one. That would be one of the requirements.