3. AZ-203/204 – More on storage accounts
Hello and welcome back! Now, in this chapter, let’s go over some other aspects of storage accounts. The first account kind is the general-purpose v2 account. This is the most commonly used type of storage account, and it’s the one Microsoft recommends for most scenarios. When you create a storage account of this type, it provides the Blob, File, Queue, and Table services. Now, there’s a lot of stress on using the Blob service when it comes to storage accounts. This is object storage for the cloud: here you can store massive amounts of unstructured data. It’s highly recommended when you want to store objects, or blobs, such as images, documents, videos, and audio files. Now, when you start using the Blob service, you first have to create something known as a container. That container is then used to store the blob objects. Now, there are different types of blobs available. First, you have the basic block blobs. These are used for storing text and binary data. You then have append blobs. These are ideal for logging data, so if you have log files that need to be stored in the Azure Blob service, you can make use of append blobs.
Then there are page blobs. These are used to store virtual hard disk (VHD) files for Azure virtual machines. Now, next we come to replication of data within Azure storage accounts. Replication enables high availability for the underlying data that’s stored in a storage account. The first technique we have is locally redundant storage (LRS). This is the default option that you can choose for your storage account. Here, the underlying data is replicated synchronously three times within a single physical location in the primary region. So within one physical location, your data is replicated three times, so that if any part of that location goes down, you still have other copies of your data in place. Now, if you want higher availability than this, you can choose zone-redundant storage (ZRS). Here, data is replicated synchronously across three Azure Availability Zones. Azure Availability Zones are basically separate collections of data centers. So this is good when you want your data to remain in place even in the event of a data center failure.
See, in locally redundant storage, your entire data, all three copies of it, is within one data center at one physical location. So if the entire data center goes down, then your data is not available. Whereas in zone-redundant storage, your data is replicated across multiple data centers via their logical availability zone representation. So even if one data center were to go down, you would still have a copy of your data available. Then there is geo-redundant storage (GRS). Here, your data is replicated three times synchronously within a primary region and then also replicated asynchronously onto a secondary region. This is required when a company wants to implement a disaster recovery scenario. In case the primary region goes down and they want to have their data in place, they can access their data from the secondary region. Now, with plain geo-redundant storage, the data in the secondary region is only available if the primary region goes down.
Please keep in mind that as we go up this list, the availability increases: geo-redundant storage makes your data more highly available than zone-redundant storage, and zone-redundant storage is in turn more reliable than locally redundant storage. The cost also increases. So when you compare these three replication techniques, the cost is the highest for geo-redundant storage. That’s because your data is being replicated to another region, so you’re paying for your data storage in both regions as well as for the data transfer. So keep this in mind: as you keep adding higher availability to your storage accounts, you are also adding to the cost. However, there are times when businesses require that high availability. Next, we have read-access geo-redundant storage (RA-GRS).
Now, here again, it’s the same as geo-redundant storage. The only difference is that here, the data in both the primary region and the secondary region is available to you. The data in the secondary region is available for read-only access even while the primary region is up. So remember, in plain geo-redundant storage, the data in the secondary region is only available if the primary region goes down. But in read-access geo-redundant storage, even though your primary region is still available, you also have a read-only copy of your data in the secondary region. And then we have two more replication techniques, which are still currently in a preview state.
The first is geo-zone-redundant storage (GZRS). In this case, data is replicated synchronously across three availability zones in the primary region and then replicated asynchronously onto a secondary region. And then we have the same concept with read access on the secondary region, called read-access geo-zone-redundant storage (RA-GZRS). Next, let’s do a quick comparison of the different account kinds. You can see that in general-purpose v2, you have features such as something known as access tiers, which you will see in the next slide. You also have all of the services supported.
You also have all of the replication options. If you look at general-purpose v1, you can see you don’t have all of the replication options in place, and you don’t have access tiers either. In a Blob storage account, you only have access to the Blob service, and you have only some of the replication techniques in place. So again, even if you only want to use Blob storage, it’s recommended to use general-purpose v2. Next, we have the access tiers. We have three access tiers when it comes to Azure Blob storage. First, you have the hot access tier. This is optimized for storing data that is frequently accessed. It can be set at the account level. We then have the cool access tier.
This is optimized for storing data that is accessed infrequently and kept for at least 30 days. This can also be set at the account level. Now, a quick note on the cool access tier: when your blobs are stored in this tier, the storage costs are lower than in the hot tier, but the access costs are higher. So remember, only change the tier of your objects to the cool tier if you are sure they’re not going to be accessed that frequently. And then we finally have the archive tier for archival data. This is optimized for storing data that is rarely accessed and kept for at least 180 days. It can be set only at the blob level. Also note that while a blob is in the archive tier, you can’t access the data in that blob. In order to access the data, you first have to rehydrate the blob. And again, the storage costs are the lowest in the archive access tier, but the access costs are the highest, right? So in this chapter, I wanted to go through the aspects of storage accounts that are important from an exam perspective.
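The replication options and access tiers described above map onto Azure CLI parameters. As a rough sketch (the storage account, resource group, container, and blob names here are placeholders):

```shell
# Choose a replication technique via the --sku parameter when creating
# the account. Valid values include Standard_LRS, Standard_ZRS,
# Standard_GRS, Standard_RAGRS, Standard_GZRS and Standard_RAGZRS.
az storage account create \
    --name mystorageacct123 \
    --resource-group my-rg \
    --location centralus \
    --sku Standard_GRS

# The hot/cool default access tier is set at the account level...
az storage account update \
    --name mystorageacct123 \
    --resource-group my-rg \
    --access-tier Cool

# ...while the archive tier can only be set on an individual blob.
az storage blob set-tier \
    --account-name mystorageacct123 \
    --container-name data \
    --name sample.txt \
    --tier Archive

# Rehydrating an archived blob means moving it back to hot or cool.
az storage blob set-tier \
    --account-name mystorageacct123 \
    --container-name data \
    --name sample.txt \
    --tier Hot \
    --rehydrate-priority Standard
```

These commands assume you are already logged in with az login and that the account and container exist.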
4. AZ-203/204 – Lab – Creating a storage account
Hi, and welcome back. Now, in this chapter, let’s go ahead and see how we can create a storage account. So we can go on to the Storage Accounts section, and we can go ahead and add a storage account. We can choose our resource group, and we can give the storage account a name. This has to be a globally unique name. Now, I’m going to select the Central US region. Next, you can choose the performance as either Standard or Premium. If you’re using the storage account to store disks for a virtual machine, then you can go ahead and choose Premium. Otherwise, you can choose Standard.
Now, in the account kind, if you want to use services such as the File service, the Queue service, the Table service, and the Blob service, you can go ahead and choose either General Purpose v2 or General Purpose v1. Microsoft recommends choosing General Purpose v2 because you get more features in this newer version of storage accounts. Then you have the replication. We have a separate chapter on replication for storage accounts, so we can leave it as locally redundant storage. In the access tier, we keep it as Hot. In the networking section, we can leave it as a public endpoint. And we’re making certain that secure transfer is required.
So all requests on the storage account have to go over HTTPS. You can leave all the other settings as they are, go on to Next for tags, and then let’s go ahead and review and create the storage account. Now, once your storage account is in place, you can go ahead and access the resource. If you go ahead and explore the storage account, firstly, you have all the services in place. You have the Blob service, which you can make use of in terms of containers, and you have the File share service, the Table service, and the Queue service. So these are the four services that are available as part of either General Purpose v1 or General Purpose v2 storage accounts. In this chapter, we just quickly wanted to start off with the creation of a storage account.
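The same portal choices (kind, tier, secure transfer) can be expressed in a single Azure CLI call. A minimal sketch; the account name, resource group, and region are placeholders:

```shell
# Create a General Purpose v2 account with the settings chosen in
# the portal walkthrough: Standard performance, LRS replication,
# Hot access tier, and secure transfer (HTTPS) required.
az storage account create \
    --name mystorageacct123 \
    --resource-group my-rg \
    --location centralus \
    --sku Standard_LRS \
    --kind StorageV2 \
    --access-tier Hot \
    --https-only true
```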
5. AZ-203/204 – Lab – Working with the BLOB service
6. AZ-203/204 – Lab – Using Azure Storage Explorer
Hi, and welcome back. Now, in this chapter, I want to go through the Azure Storage Explorer. This particular tool is also available within Azure itself. So in your storage account, there’s something known as the Storage Explorer, which is currently in a preview state. Here you have access to all of these services: your Blob containers, your file shares, your queues, and your tables. Now, at the same time, you could also go ahead and download the Azure Storage Explorer tool. This is a free tool that’s available for download.
So when you want external customers or other people to go ahead and fetch data from your storage account, you could ask them to use the Azure Storage Explorer. It’s a very simple download-and-install option. Once you’ve gone ahead and downloaded the tool, you can go ahead with the installation. Once the installation is complete, you can go ahead and launch the Azure Storage Explorer. Now, to start working with the Storage Explorer, you can go on to Manage Accounts, and over here, you can add an account. There are different ways in which you can authenticate to an Azure storage account. Let’s go with the first option, which is to add an Azure account. I’ll click on Next, and it will then ask us to authenticate.
So I’ll enter the user ID and password. After you’ve been authenticated and can see your subscription, you can proceed to click Apply. Now you can go ahead and see the storage accounts that you have as part of your subscription. Here you can see your Blob containers. You can go over to a container, and here you can see the objects within it. You can upload blobs to this container, you can download a blob, and you can create a new folder. So there are a lot of operations that you can actually carry out on your storage account using the Azure Storage Explorer, right? So this marks the end of this chapter.
7. AZ-203/204 – Lab – Using Access Keys
Hi, and welcome back. Now, in this chapter, I want to go through some other ways in which you can access storage accounts. So I’ll go on to one of my storage accounts; it’s a general-purpose storage account. Now, one of the ways you can actually connect to a storage account is via the Azure Storage Explorer. But if you go to your storage account and then go to Access Keys, this gives you another way to access your storage account.
Now, by default, you get two keys: key1 and key2. The reason why there are two keys for your storage account is in case one key gets compromised. So if a malicious user has gotten hold of the first key, you can switch your application over to the second key. You can then regenerate the first key, which revokes the compromised key while your application keeps its access. That’s the reason why you have two keys in place. So you can go ahead and take a key and copy it to the clipboard. Now, in Azure Storage Explorer, if I go to Accounts, let me add an account. Over here, I can choose to use a storage account name and key, and go on to Next. So let me add the key, and let me add the name of the account.
So I can go back over here, take the name of the account, and place the account name over here. You can also use it as the display name, go to Next, and click on Connect. And now you have the storage account in place, connected via a key. Notice the difference: when you connect with your Azure account, you can see all of the storage accounts that are part of your subscription, but when you use just a key, you can only see the services of that specific storage account. Now, an important aspect to understand when you are using access keys is that you get access to all of the services that are part of the storage account, whether it’s Blob containers, file shares, queues, or tables, right? So in this chapter, I just wanted to show you how you can access your storage account via access keys.
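The key rotation flow described above can also be scripted with the Azure CLI. A minimal sketch, with placeholder account and resource group names:

```shell
# List the two access keys (key1 and key2) for the account.
az storage account keys list \
    --account-name mystorageacct123 \
    --resource-group my-rg \
    --output table

# Rotation: point your application at key2 first, then regenerate
# key1 so any copy of the compromised key stops working.
az storage account keys renew \
    --account-name mystorageacct123 \
    --resource-group my-rg \
    --key primary
```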
8. AZ-203/204 – Lab – Azure Storage Accounts – Azure CLI
Hi, and welcome back. Now, in this chapter, let’s see how we can use the Azure CLI to create a storage account, create a container in the storage account, upload a blob, set the access permissions on the container, list the blobs in the container, and download a blob. So the first statement goes ahead and actually creates a new Azure storage account. Over here, we are using the az storage account create command.
We’re giving the storage account a name, the name of an existing resource group, the location for the storage account, and the replication setting. Here that’s Standard_LRS, which is locally redundant storage. Next, we go ahead and create a container in our storage account. Here we give the account name and the name of the container that we want to create. Next, we can go ahead and upload a blob to the container. Again, we give the name of the storage account.
We also give the name of the container and the name that we want to give to our blob, which over here is sample.txt, and in the file parameter, we mention the name of the file on our local system that we want to upload as a blob onto the container. Next, we’re going to go ahead and set the permissions on the container so that we can access the blob; we’ll ensure that we enable public access at the blob level. Next, we’ll go ahead and list the blobs in our container. Here we give the storage account name, the name of the container, and how we want to see the output. And finally, we’ll go ahead and download the blob from our container. We give the name of the storage account, the container name, the name of the blob that we want to download, and the name we want to give our local file once the blob is downloaded. So let’s go ahead and execute these commands. First, let me do a simple login in the command prompt. So I’ll go ahead and log into Azure with my Azure admin account. Now let me go on to my storage accounts in Azure. So these are the storage accounts I currently have.
So I’ll go ahead and take the first statement and go on to the command prompt, and you can see we are logged in. Now let me go ahead and execute the command to create a new storage account. Once this is complete, if I go onto my storage accounts and click on Refresh, you can now see our new storage account. Now let’s go on to the next command, which is to create a container in the storage account. If I go onto the storage account and then on to Containers, here we can see that we have no containers in place. So in the command prompt, let me go ahead and execute the command to create a new container. Once you get an output back, if you go ahead and click on Refresh, you can see we have our container. Now, in our container, we don’t have any blobs. So let’s go ahead and execute the next statement. In the command prompt, in this local directory, I have a file known as sample.txt.
It just contains some simple text. So let me go ahead and execute the command to upload this file as a blob onto our container. So that is also done. If I go ahead and click on Refresh, you can see we have our file in place. If you go on to the file, take the URL, and open it in a new tab, you’ll get an access denied error. This is because a container that is created by default has private access permissions. So over here, you can see that the public access level is Private. Now, in order to change the public access level, we have one more command, and that is to go ahead and set the permissions on the container. So let’s go ahead and execute the command. Let me clear the screen. So we are setting the public access level as Blob, which is given over here. If I go back onto my storage account and just click on Refresh for the container now, you can see the public access level is Blob.
And if I now go ahead and refresh the URL, we should be able to see the contents of the file. Next, we can go ahead and list the blobs we have in our container. Here you can see you’re getting the output: the name of the file and what type of blob it is. Now, our last statement is to go ahead and download a blob. So in the command prompt, let me first go ahead and delete the existing local file that we have, so that our directory has no files. Now let me go ahead and download the blob. And if I go ahead and look at the directory, you can see that we have our file in place, right? So in this chapter, I wanted to go through the Azure CLI commands for working with storage accounts.
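The sequence of statements walked through in this chapter might look something like the following; the account, resource group, and container names are placeholders you would replace with your own:

```shell
# Log in to Azure first.
az login

# 1. Create the storage account with locally redundant storage.
az storage account create \
    --name mystorageacct123 \
    --resource-group my-rg \
    --location centralus \
    --sku Standard_LRS

# 2. Create a container in the account.
az storage container create \
    --account-name mystorageacct123 \
    --name mycontainer

# 3. Upload a local file as a block blob named sample.txt.
az storage blob upload \
    --account-name mystorageacct123 \
    --container-name mycontainer \
    --name sample.txt \
    --file sample.txt

# 4. Allow anonymous public read access at the blob level.
az storage container set-permission \
    --account-name mystorageacct123 \
    --name mycontainer \
    --public-access blob

# 5. List the blobs in the container.
az storage blob list \
    --account-name mystorageacct123 \
    --container-name mycontainer \
    --output table

# 6. Download the blob to a local file.
az storage blob download \
    --account-name mystorageacct123 \
    --container-name mycontainer \
    --name sample.txt \
    --file sample_download.txt
```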
9. AZ-203/204 – Lab – AzCopy tool
Hello and welcome back! In this chapter, I’d like to go over another tool known as AzCopy. This is specifically a command-line utility that is used to work with the blobs and files in a storage account. Now, if you scroll down here, you can see the various ways you can download the AzCopy tool, depending on your platform. I’ve gone ahead and downloaded the 64-bit version as a zip file. In the login command, we’re going to mention our tenant ID. Where do we get our tenant ID from? Well, if you go on to Azure Active Directory in your Azure account, this is where you’ll find your tenant ID.
So you can just go ahead and copy it from here. Once you go ahead and log in, we already have a storage account in place that was created as part of our previous demo on working with the Azure CLI. This time, we’re going to go ahead and make a new container known as demo. In order to make the container, we have to go ahead and give the full URL of our storage account, along with the name of our new container, which is demo. Next, we’re going to go ahead and copy a file from our local system, the sample.txt file, onto this demo container, and we’re going to give the name of the blob that will be created from the upload. And then we can go ahead and use the same azcopy copy command. However, this time we provide the blob’s URL as the source.
This is going to be used for downloading our blob onto our local system. So let’s go ahead and see how to work with these commands. First, let me go ahead and do an azcopy login. I’ll be directed to go to a page and enter a code to authenticate. So I’ll go on to the URL, and I’ll manually go and enter the code. Once I enter the code, let me click on Next. It’s showing me that I’ve signed in, so the login has succeeded. Let me go ahead and clear the screen. Let’s go on to our next command, which is to create a container. If I go on to my storage accounts, on to the storage account, and then on to Containers, I’ve got a demo container here from earlier. Let me go ahead and delete this container, so we don’t have any containers in our storage account now. So let me go ahead and issue the command to make the container, and you can see that’s done.
So in the storage account, you can now see our demo container. Now, before we go ahead and upload a blob onto the Azure storage account, there is a step we need to perform. This is different from the time when we worked with storage accounts and uploaded blobs using the Azure CLI. When using the AzCopy tool with an Azure AD login, we have to go ahead and give the user permission to upload blobs onto the storage account. For that, we go on to Access Control. Now, please note, we have separate chapters on role-based access control, but for the purpose of this particular chapter, let’s see what we have to do. We have to go ahead and add a role assignment for the storage account. In terms of the roles, there are various roles available, and these roles grant the required permissions on the storage account. I’m going to go ahead and choose the Storage Blob Data Contributor role. This will allow a user to go ahead and contribute blobs to the storage account. Now, I’m going to go ahead and search for my admin account. Please note that even though I’m using the Azure admin account, I still have to specifically grant this role on the storage account. So let me go ahead and click on Save. Now, please give it around two to three minutes after the role assignment for the role to be propagated onto the storage account before you go ahead and execute the upload command.
Once the role-based access control has taken effect, you can go ahead and execute the upload command. Over here, you can see the total number of completed transfers is one. If you get a failed transfer, that means the file has not been transferred to the storage account. So now, if you go onto the storage account, go onto the container, and click on Refresh, you can see the sample.txt file. Let’s go on to our last command, which is to download the blob. Let me clear the screen and delete my local file, so that I don’t have the file in place. Then let me go to the command line and execute the command to download the blob from our container. So that is also done, and you can see we have our file over here, right? So this marks the end of this chapter, wherein we looked at how to use the AzCopy tool.
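The AzCopy steps above might be sketched as follows; the tenant ID, account name, and file paths are all placeholders:

```shell
# Log in with an Azure AD account. The account needs the Storage Blob
# Data Contributor role on the target storage account.
azcopy login --tenant-id "<your-tenant-id>"

# Create the demo container by giving the full storage account URL.
azcopy make "https://mystorageacct123.blob.core.windows.net/demo"

# Upload a local file as a blob into the container.
azcopy copy "sample.txt" \
    "https://mystorageacct123.blob.core.windows.net/demo/sample.txt"

# Download the blob back by swapping the source and destination.
azcopy copy \
    "https://mystorageacct123.blob.core.windows.net/demo/sample.txt" \
    "sample.txt"
```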
10. AZ-203/204 – Lab – Azure Blob storage – .Net
Hi, and welcome back. Now, in this chapter, we are going to go through how to use a .NET program to work with Azure Blob storage. This program is a .NET Core project, and it’s just a quick sample of how you can work with Azure Blob storage. You can then use this logic in your own applications, whether they are web projects or other .NET Core applications. Now, for this particular project, if I go to the NuGet Package Manager and on to the packages for the solution, I’ve ensured that I’ve installed the Azure.Storage.Blobs package, version 12.4.1. Please note that there is a difference between version 12 and version 11 when working with blobs, so please ensure that you refer to the Microsoft documentation. Version 12 is the most recent version.
That’s why I’m using version 12. In version 12, there are different classes than in version 11 when it comes to interacting with blobs. Now, if I go on to my program, first I have my connection string. This connection string is used to connect to my Azure storage account. So I have an existing Azure storage account in place. If I go on to the storage account and on to Access Keys, I can go ahead and get the connection string. I could go ahead and take either the connection string for key1 or the connection string for key2; either one will do. So I’ve just gone ahead and copied the connection string. This is required for our program to be able to access our storage account.
Next, I have the name of the container. We have a method that will be used to go ahead and create a container known as data. If I go on to the containers in the Blob service, currently I only have a demo container; I don’t have a data container. The create container method itself is quite simple. What we’re doing is using the BlobServiceClient class, invoking its constructor and passing in the connection string. Then, on this class, we are using a method known as CreateBlobContainerAsync and passing in the container name. So let’s go ahead and run this program, and I can see the operation is complete. If I go ahead and click on Refresh, we can see our data container. Currently, in the container, we don’t have any blobs. Let’s go on to the second part of our program.
That is, to go ahead and create a blob. If I go on to that method, again we are using the BlobServiceClient class to connect to our storage account. We are then using the GetBlobContainerClient method to get a handle on the container. Then we are going ahead and using the GetBlobClient method to get a handle on our new blob. So here we are working with another class known as BlobClient, and we get an instance of that class by calling the GetBlobClient method on the container client that we created previously. Then, in a local path, I have a sample.txt file. I’ll go ahead and read that file with a FileStream, and I’m using the UploadAsync method to go ahead and upload that file using that file stream. So let’s go ahead and run this program. And I can see the operation is complete.
If I go ahead and click on Refresh, you can see my file over here, right? So now let’s move on to our next method, which is to get a specific blob in our container. The first three statements are the same: we’re getting a connection with the connection string, a handle onto the container, and then a handle onto the blob itself. We then have another class known as BlobDownloadInfo. Over here, we can invoke the DownloadAsync method, and this time I’m using a file stream opened for write purposes. Right? So in this chapter, I wanted to go through the basic operations when it comes to working with the Blob service from a .NET program.
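Putting the steps from this chapter together, a minimal sketch might look like this. The connection string, container, and file names are placeholders, and it assumes the Azure.Storage.Blobs v12 package:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class Program
{
    // Placeholder: copy this from Access Keys in the portal.
    const string connectionString = "<storage-account-connection-string>";

    static async Task Main()
    {
        // Connect to the storage account via the connection string.
        var serviceClient = new BlobServiceClient(connectionString);

        // Create the "data" container.
        await serviceClient.CreateBlobContainerAsync("data");

        // Get handles on the container and on a (new) blob within it.
        BlobContainerClient containerClient =
            serviceClient.GetBlobContainerClient("data");
        BlobClient blobClient = containerClient.GetBlobClient("sample.txt");

        // Upload a local file as the blob.
        using (FileStream uploadStream = File.OpenRead("sample.txt"))
        {
            await blobClient.UploadAsync(uploadStream);
        }

        // Download the blob back into a local file.
        BlobDownloadInfo download = await blobClient.DownloadAsync();
        using (FileStream writeStream = File.OpenWrite("sample_copy.txt"))
        {
            await download.Content.CopyToAsync(writeStream);
        }

        Console.WriteLine("Operations complete");
    }
}
```

Running this requires a real storage account, so treat it as a template rather than something to execute as-is.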
11. AZ-203/204 – Lab – Azure Blob properties and metadata
Hello and welcome back! In this chapter, I want to go over the properties and metadata for blobs in Azure Storage. So we have a storage account in place. If I go on to Containers, and on to an existing container, over here I have a sample.txt file. This is a blob in my storage account, and over here we have the properties of the blob. Now, I want to show you how you can work with this from .NET as well. So I’ve got a .NET program over here; this is a simple .NET Core console-based application. Again, if I go on to the NuGet Package Manager, I have the same Azure.Storage.Blobs package, version 12.4.1. If I go back onto the program, I have my connection string, the name of my container, and the name of the blob. We already know where to get the connection string from: you get it from Access Keys in your Azure storage account. Now, let me go on to the first method, which is to get the properties of a blob.
First, I go ahead and connect to my storage account using the BlobServiceClient class. Next, I call the GetBlobContainerClient method to get a handle on my container. Then I create a blob object by making use of the GetBlobClient method to get a handle on my blob. Once I have a handle on my blob, I can go ahead and use the BlobProperties class to get the properties of my blob, and then I’m displaying the access tier and the content length. So that’s how you can go ahead and view the properties of your blob. Now, if I go on to the GetMetadata method, most of it remains the same. The only difference is that from the properties, we can also access the metadata, which is a collection, and we can go ahead and get the keys and values. Now let me go ahead and run this program. So you can see the access tier, the content length, and the metadata here. Now, let’s say you want to set the values of the metadata for your particular blob. You can go ahead and do that as well. Here I have a method known as SetMetadata.
This is the key-value pair I want to set for my blob. Again, everything remains the same in terms of getting a connection to your storage account, the container, and your existing blob. You then go ahead and create a dictionary object, add the key-value pair onto the dictionary, and then call the SetMetadata method. So we can go ahead and actually run this as well; I can call SetMetadata before getting the metadata itself. Let me go ahead and run this program. Now you can see that the metadata for your blob has been modified. And if you go on to Azure and click on Refresh here, you can see the key and value have been modified, right? So this marks the end of this chapter, wherein we looked at the properties and metadata for your blobs.
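As a sketch of the methods described in this chapter (the connection string, container, blob name, and the metadata pair are placeholders; the Azure.Storage.Blobs v12 package is assumed):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class Program
{
    const string connectionString = "<storage-account-connection-string>";

    static async Task Main()
    {
        // Handles on the account, the container, and the blob.
        var serviceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient =
            serviceClient.GetBlobContainerClient("demo");
        BlobClient blobClient = containerClient.GetBlobClient("sample.txt");

        // Set a custom key-value pair as blob metadata.
        var metadata = new Dictionary<string, string>
        {
            ["department"] = "finance"   // hypothetical example pair
        };
        await blobClient.SetMetadataAsync(metadata);

        // Read the blob's properties, which also include the metadata.
        BlobProperties properties = await blobClient.GetPropertiesAsync();
        Console.WriteLine($"Access tier:    {properties.AccessTier}");
        Console.WriteLine($"Content length: {properties.ContentLength}");
        foreach (KeyValuePair<string, string> pair in properties.Metadata)
        {
            Console.WriteLine($"{pair.Key} = {pair.Value}");
        }
    }
}
```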
12. AZ-203/204 – Lab – Shared Access Signatures
Hi, and welcome back. Now, in this chapter, we are going to look at another way we can access content in our storage accounts. Earlier on, we saw how we could access storage accounts using our Azure AD identity. So if I log in, let’s say, with my root account, I can access all of the storage accounts that are part of my Azure account. If you want to access a particular storage account, you can also go ahead and use access keys. But remember, access keys will give access to all services within that particular storage account. Now, there is another way of granting access, known as shared access signatures. So let’s go ahead and look at the concept of shared access signatures. These can be applied both at the blob level and at the account level. First, let’s go ahead and look at how to create a shared access signature at the blob level. So in a storage account, I’ll go on to the Containers section. Here I have a container in place, and I have an existing file.
Now, before I go on to the file, let me go ahead and just change the access level. So I want to make sure that it is private, so that there is no anonymous access. I’ll click on OK to confirm. So now we cannot access the blob via its URL. If I just go ahead and open the sample HTML file, take the URL, and open it in a new tab, you can see you’re getting the error that the resource is not found. So now let me go directly to the Generate SAS option. So this is to generate a shared access signature. So the first thing that you can specify is the permissions. So I’ll leave it as read permission. Now, in addition to the permissions you can see over here, you can also specify the start and end dates. See, the shared access signature is going to basically generate a link. That link will allow you to access the sample HTML file. But the best thing about the shared access signature is that you can give a time duration for the link itself. So the link will no longer be valid after the expiry date. So this is an advantage of using the shared access signature.
Apart from that, you also have the ability to specify allowed IP addresses, so you can say that only workstations that fall within that IP address range are allowed. When they go ahead and use the shared access signature link, they will have access to this particular blob. Please note that in order to generate the shared access signature, one of the account keys will be used to sign it. So let me go ahead and generate the SAS token and the URL. Now let me go ahead and take the Blob SAS URL, that is, the shared access signature URL, and open it in a new tab. So now you can see that I have access to the contents of the file. So please note that this link will now only be valid until the expiry date that’s mentioned over here, right? So we can generate a shared access signature at the blob level. Now, we can also generate a shared access signature at the account level. So if I go back into my storage account, go to Settings, and then Shared Access Signature, over here I can now generate a shared access signature at the account level. So over here, you can first decide which services you want to allow as part of the shared access signature. So this marks the end of this chapter, where we have looked at shared access signatures.
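The expiry window and allowed IP range described above can be sketched as a small validation routine in Python. This is a hedged illustration of the kind of checks the storage service performs when a SAS link is presented; the function name and parameters here are hypothetical, not the real SAS query parameters.

```python
# Hypothetical sketch of SAS validity checks: is the request inside the
# start/expiry window, and does the caller's IP fall in the allowed range?
from datetime import datetime, timezone
from ipaddress import ip_address, ip_network

def sas_request_allowed(now, start, expiry, allowed_range, caller_ip):
    """Return True only if the time and caller IP satisfy the SAS limits."""
    if not (start <= now <= expiry):
        return False          # link not yet valid, or already expired
    if allowed_range is not None and ip_address(caller_ip) not in allowed_range:
        return False          # caller outside the permitted IP range
    return True

start = datetime(2024, 1, 1, tzinfo=timezone.utc)
expiry = datetime(2024, 1, 2, tzinfo=timezone.utc)
office = ip_network("203.0.113.0/24")   # example range from RFC 5737

# Inside the window, from an allowed workstation -> access granted.
print(sas_request_allowed(datetime(2024, 1, 1, 12, tzinfo=timezone.utc),
                          start, expiry, office, "203.0.113.7"))   # True
# After the expiry date the same link no longer works.
print(sas_request_allowed(datetime(2024, 1, 3, tzinfo=timezone.utc),
                          start, expiry, office, "203.0.113.7"))   # False
```

The real service evaluates these conditions server-side on every request, which is why a leaked SAS link stops working once its expiry date passes.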
13. AZ-203/204 – Lab – Shared Access Signature – .NET
Hi, and welcome back. Now in this chapter, I want to show you a .NET program that can be used to generate a shared access signature for a particular blob. So in the program, I have variables for the account name and the account key, the container name, and the blob name. So, in an earlier chapter, we saw that in a storage account, we added a container called Data and a blob called sample.txt. Now we are going to go ahead and use a shared access signature to access this blob. Now remember, in order to generate a shared access signature, you still need to have the keys in place. So you can have one module of your system generate shared access signatures, and then you can have other modules of your system make use of those shared access signatures. So I have a method to go ahead and generate the shared access signature. Now, in order to get the account key, again, it’s very easy: if you go on to your storage account and then on to Access keys, you can go ahead and use either key one or key two.
This would authorise your program to access the Azure storage account. So now over here, I’m using the BlobSasBuilder class. So over here, I’m ensuring that I specify my container name, and I’m saying that the shared access signature should only last for five hours. If I scroll down, over here I’m setting the permissions. So we are creating a new object of the StorageSharedKeyCredential class using the account name and the account key. We are then going ahead and getting the SAS token. Now, once we get the SAS token, it is our responsibility to build the full URI, which can be used to access our blob. So that’s what I’m actually doing over here. So I’m going ahead and generating the entire URI, which can be used to access our blob. Once you have the shared access signature, you can proceed to download the blob using the same logic as before, but instead of using account keys or connection strings, you can simply use the SAS URI generated by the previous method. So let me go ahead and run this program. So over here you can see the contents of the file.
You can also see the shared access signature. So, even if I go ahead and copy this and open it in a new tab, I’ll be able to download my sample.txt file. And remember, this is a file that I have in a container known as Data. So this is the sample.txt file, right? So this marks the end of this chapter, where we saw how to use a .NET program to work with shared access signatures.
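As noted earlier, a shared access signature is signed with one of the account keys. The following Python sketch illustrates the general mechanism: an HMAC-SHA256 over a "string to sign" using the base64-decoded key, with the result base64-encoded into a sig query parameter. This is a simplified illustration only; the real Azure string-to-sign has more fields in a strict order, and in .NET the BlobSasBuilder class handles all of this for you.

```python
# Hedged sketch of SAS signing. The field set, field order, and parameter
# names below are simplified illustrations, not the exact Azure format.
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def sign_sas(account_key_b64, permissions, expiry, resource_path):
    # Simplified string-to-sign (illustrative fields only).
    string_to_sign = "\n".join([permissions, expiry, resource_path])
    # The account key is base64; decode it before using it as the HMAC key.
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    # The signature travels in the query string alongside the other fields,
    # which is why anyone holding the link can use it until expiry.
    return urlencode({"sp": permissions, "se": expiry, "sig": sig})

token = sign_sas(base64.b64encode(b"demo-account-key").decode(),
                 "r", "2024-06-01T00:00:00Z", "/blob/demoaccount/data/sample.txt")
print(token)
```

Because the signature covers the permissions and expiry, tampering with either field in the link invalidates the signature, and the service rejects the request.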
14. AZ-203/204 – Lab – Storage Accounts – Access tiers
Hi, and welcome back. Now, in this chapter, I want to go through the point of access tiers when it comes to blobs in storage accounts. So I have a number of storage accounts over here. For the purpose of this lab, I’m going to go ahead and choose Demo Store. So this is a general-purpose v2 storage account. Let me go on to the storage account.
So over here, the first thing you notice is that it has the performance tier of Standard and the access tier of Hot. Now, please note that when you’re setting the access tier, you can set it for the entire storage account. So that means the blobs in the storage account will automatically inherit the access tier that’s assigned to the storage account. At the storage account level, you can only mark the access tier as either hot or cool. So if I go on to the configuration for the storage account, over here you can see that we can only mark the access tier as either cool or hot. And even when you go ahead and create the storage account, it’s the same case. Now if I go on to the containers, I have one container in place over here. So if I click on this container, I already have one file in place, an example HTML file. So here also, you can see the access tier of this file is “hot.” Now we can also go ahead and actually change the access tier of the file itself.
So if I click on “Change tier,” over here you can mark the access tier as “cool,” and then click on “Save.” So now remember that you are now paying less for this particular blob in terms of storage, but if you access this blob frequently, then you’ll be paying more for those access operations. So if you’re going to go ahead and access this blob frequently, then it’s better to have the access tier set to “Hot.” Now you can note that if you click on Edit, you can see the contents of this particular blob. Now let me go ahead and open the file again. Let me now change the access tier to Archive. So remember, when you have blobs that have not been accessed for a long duration of time, and there are cases from a security or compliance perspective wherein you have objects that need to be in archive storage for a period of time, then you can go ahead and archive your objects in your storage account. But keep a note over here; it’s also giving a warning. So by setting the access tier to Archive, your blob will be inaccessible until it is rehydrated back onto the hot or cool tier, which could take several hours. So only if you have the specific requirement wherein you need to archive your data should you consider using the access tier of Archive for your underlying blobs. So if I go ahead and click on “Save,” it’s now been changed to the archive tier. If I go to the edit page, you can see that we don’t have access to edit the object, and it’s giving the message over here. So now the only way to go ahead and access this particular object or this particular blob is to go ahead and change the access tier again. So when we change it back to hot or cool, we also have the rehydrate priority option over here. So these are all the different access tiers that are available when it comes to blobs.
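The hot versus cool trade-off described above can be made concrete with a small Python sketch. All prices here are hypothetical illustration values, not real Azure pricing: cool storage is cheaper per gigabyte but charges more per read, so which tier is cheaper depends on how often the blob is accessed.

```python
# Hypothetical tier-cost sketch. The numbers below are made-up illustration
# values, not real Azure prices; consult the Azure pricing page for actuals.

def monthly_cost(gb, reads, storage_per_gb, cost_per_10k_reads):
    """Storage cost plus per-read cost for one month."""
    return gb * storage_per_gb + (reads / 10_000) * cost_per_10k_reads

HOT  = {"storage_per_gb": 0.020, "cost_per_10k_reads": 0.004}  # hypothetical
COOL = {"storage_per_gb": 0.010, "cost_per_10k_reads": 0.100}  # hypothetical

def cheaper_tier(gb, reads_per_month):
    """Pick the cheaper tier for a given size and access pattern."""
    hot = monthly_cost(gb, reads_per_month, **HOT)
    cool = monthly_cost(gb, reads_per_month, **COOL)
    return "Hot" if hot < cool else "Cool"

print(cheaper_tier(100, 1_000_000))  # frequent access -> Hot
print(cheaper_tier(100, 1_000))      # rarely accessed -> Cool
```

The same reasoning extends to Archive: the storage price drops again, but on top of higher access costs you also pay the rehydration delay of several hours before the data is readable at all.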