Pass Microsoft Azure Data DP-900 Exam in First Attempt Easily
Latest Microsoft Azure Data DP-900 Practice Test Questions, Azure Data Exam Dumps
Accurate & Verified Answers As Experienced in the Actual Test!
Check Our Last Week's Results!
- Premium File: 295 Questions & Answers
- Training Course: 32 Lectures
- Study Guide: 672 Pages
Last Update: Dec 21, 2024
Download Free Microsoft Azure Data DP-900 Exam Dumps, Azure Data Practice Test
File Name | Size | Downloads
---|---|---
microsoft | 1.4 MB | 1354
microsoft | 459.9 KB | 1244
microsoft | 453.9 KB | 1317
microsoft | 368.7 KB | 1296
microsoft | 364.4 KB | 1285
microsoft | 225.3 KB | 1344
microsoft | 225.2 KB | 1630
microsoft | 225.3 KB | 1452
microsoft | 225.3 KB | 1608
microsoft | 208 KB | 1637
microsoft | 185.6 KB | 2155
Free VCE files for Microsoft Azure Data DP-900 certification practice test questions and answers, exam dumps are uploaded by real users who have taken the exam recently. Download the latest DP-900 Microsoft Azure Data Fundamentals certification exam practice test questions and answers and sign up for free on Exam-Labs.
Microsoft Azure Data DP-900 Practice Test Questions, Microsoft Azure Data DP-900 Exam dumps
Welcome to the DP-900 Azure Data Fundamentals course
1. Welcome to the course
Well, hello and thank you for checking out this course on the DP-900 exam, Microsoft Azure Data Fundamentals. My name is Scott Duffy. This course is complete preparation for the DP-900 certification offered by Microsoft Azure. The DP-900 certification is a foundational, fundamentals exam. It is for people who are just beginning to work with data and databases in the cloud. If you take and pass the DP-900 exam, you will be able to call yourself certified in Data Fundamentals, and you'll get a badge that you can put on your resume or on your LinkedIn. Now, this is an optional certification in that there are no other certifications that depend on it. If you're interested in becoming a data engineer or a database administrator, you do not need to take and pass this certification first. That being said, it can be a stepping stone to get you into those more advanced certifications. It is also less expensive than other certifications because it is a fundamentals exam: it's only $99 or equivalent, compared to $165 for the other certifications. If you take and pass the DP-900 exam, it can lead you into several of the database paths within Azure. You can go the database administrator route or the data engineer route. In fact, if we look at one of the recent publications from Microsoft, we can see that we have a fundamentals exam, in this case DP-900, and from there paths toward data scientist, data engineer, AI-related certifications, or even on to Azure Solutions Architect. Now, because this is a Microsoft exam, the databases being discussed are Microsoft databases. The focus is on Microsoft Azure, specifically in the cloud, so there are not going to be questions about other vendors' products on this exam. If you go to your favorite search engine and enter DP-900, you'll be taken to a page such as this, which is the landing page for the Azure Data Fundamentals exam. This page will tell you everything that you need to know, including the specific requirements that are on the exam. We can break down the core areas of the exam as follows: one topic of the exam is core data concepts. These are the absolute basics when you're working with data, whether you're working in the cloud or with on-premises databases. They cover 15% to 20% of the exam. Then we move into relational databases. This is the traditional world of SQL Server tables, columns, keys, etc., and it covers 25% to 30% of the exam. The third section is non-relational data, so we're talking about Cosmos DB and Azure Table Storage; that's 25% to 30% of the exam. And finally, what you do with this data, the data analytics portion, is also 25% to 30%. So we can see that this exam covers the basics of Microsoft Azure data topics, and this course is going to cover them one by one. Thank you so much for checking this out, and we're going to get right into it.
2. DP-900 Exam Requirements
So let's talk about the requirements of the DP-900 exam. As I said in the last video, if you go to your favorite search engine and enter the code DP-900, probably with the word Azure for good measure, you should find this, the landing page for this exam. Now, as I record this, the exam is still in beta, and I've taken this exam in beta. It's going to take a little while for it to go live and for us to get our results. As for candidates for this exam: first and foremost, it is intended to test your knowledge of some fundamental concepts in data, specifically data in the cloud. And so if you do have a long history of working with databases, as I do, you're going to find the core concepts pretty straightforward. And if you've got any experience working in the cloud, then that should basically cover that aspect of it. If you don't have either, or if you have one but not the other, this course and exam will be a good litmus test to get you to the next step. Now again, this is in beta, but it is expected to be a cheaper exam at $99 when it goes live; exams normally cost $165. You can take this exam online or at a testing center if those centers are open near you. So clicking this orange button here will allow you to go through the process of registering for the test. In the last video, I did go over the four topics covered by this exam, and we can see the percentages here. Microsoft does publish a more detailed skills outline, and you can see the link here. I'm going to open it up, and we're going to go into that page. I'll zoom in a little bit. Now here are the detailed requirements of the exam. So where it says Core Data Concepts, we can actually see it's talking about batch and streaming data, data analytics and charting, ETL and ELT. And that accounts for 15% to 20% of the exam. We talked about relational data, and we can see it listing the databases and some of the tools that it wants you to know about. Underneath that is the non-relational data section, also 25% to 30%. Typically this is talking about Cosmos DB, but they also mention table storage, blob storage, and file storage; there are other ways of storing data in a non-relational sense. And finally, the analytics section does talk about how you're going to get data into a data warehouse, the difference between transactional data and analytical data, and the difference between batch and real time. Some of the tools that Microsoft provides for analyzing data include storing it in a data lake, using Synapse Analytics, which has a SQL data warehouse component, using tools like Databricks to run analytics on top of it, et cetera. If you're not familiar with my training style, I go through this basically line by line. I'll start up here. The first section of the course talks about core data workloads, and the first video of the core data workloads will be talking about batch and streaming data and relational data. Then the second video talks about data analytics, and we're going to talk about these concepts in that video. So there's basically going to be one or more videos on each of these sections in this course. Anyway, thank you so much for being here again. I'm so happy to have you here. We're going to go through this, basically, section by section.
Core Concepts
1. Database Core Concepts
So the first topic of the exam says Core Data Concepts, and it's worth 15% to 20%. Now, I should point out that these data concepts are extremely basic, and so if you have any kind of history working with databases, you might find this extremely simplistic. So feel free to skip over the section, put the course player at twice the speed, or what have you. But basically, we're starting at the very beginning here. Now, when you're talking about core data concepts, one of the things you're going to talk about is what's called a workload. There are two basic types of workloads. One would be batch data and the other would be stream data. So let's talk about batch data. Now, batch data is basically any kind of data that already exists and is sitting somewhere static. So it could be a file, like a CSV file, a text file, or some other type of file. It could be another database in another location; any type of data that has already been gathered and is ready to be imported into a database could be considered batch data. And so I put together some examples here. We just talked about comma- or tab-separated files, any kind of JSON data, or XML data. There's a technology called Apache Parquet, which claims to be more efficient than CSV files for large amounts of data. So if you have hundreds of megabytes of data, storing it in CSV format becomes very inefficient to process; Parquet is a more efficient format. You can store files in a Blob storage account and just have them sitting there. Or there could be some other external database, such as SQL Server, Oracle, or any other kind of database on premises, that you need to get into the cloud, and any kind of cache counts too. So, as another example, if you generated something and simply stored it in an Azure RDBMS, that is data that is sitting there batched. Now, the alternative to batch data is what's called streaming data. So streaming data is the concept of data that is always being generated; it basically never has a beginning and never has an end. I took this graphic from Microsoft's website, and you can see some type of construction crane. This is just an example, and the construction crane has sensors on it that are always pushing out data. So every second the crane knows what angle it's facing, how far down the cable goes, how much weight is on it, and it knows its status. It's pushing that out as sensor data. And the red arrow on screen basically shows the path where the sensor data is being pushed out every few seconds, and that's a different type of data. You can't just store that in a file and then import it. What you need is stream processing. So Azure has technologies like IoT Hub leading into Stream Analytics that can do intelligent decision making based on the stream of data. As an example, suppose you want to know when the crane is carrying more than ten tons of weight for more than ten minutes. There's some constraint where it can handle large amounts of weight, but only for short periods of time. Once it exceeds that period of time, you want to raise that as some kind of alert. Basically, you could monitor the sensors, and Stream Analytics would be able to tell you that it had more than ten tons on it for more than ten minutes, and so on. So here are some more examples of streaming data. There's the IoT Hub we just looked at, and even Blob storage can be a source of streaming data. If you have a website that has logs, the log file just keeps appending: as people visit your website, another line gets added to the file.
Well, that's a never-ending file, and that can be stream data. Apache has Kafka, which is basically another type of stream processing engine outside of Azure. I want to point out, of course, that you're watching this course, and this course is generally delivered in a streaming player. You did not have the full MP4 file downloaded to your computer before it started to play, so it's just another example of streaming: Netflix and YouTube deliver video in a streaming format, too. So just think about streaming data as data being interpreted while it's still being generated. Now, another topic in this section would be: what is a relational database? You may have heard this term, relational database. We've got a whole section in this course on it; we saw that 25% to 30% of the exam was going to be based on relational databases. So I'm just going to sort of skim through this here. But relational databases generally have tables. They've got rows and columns, so you can just sort of imagine it like an Excel spreadsheet. You can have different views of that data, so instead of querying the table directly, you can query the view. Every row has a unique identifier. So when you're searching for a particular row, it has a single unique key. It doesn't have to be a single column, but there is a uniquely identifying key. The tables then have relationships; this is what makes it relational. So you have a parent-child relationship where the order is in one table and the details of the order are in another, the customer is in another, and the products are in another. Four or five tables come together to make up an order. And finally, the database enforces integrity. What that means is that you can't insert a record into a table if the other records it relies upon don't exist. So you can't insert an order for a customer that doesn't exist; there is an order of operations. This is the relationship aspect of a relational database. The table on screen is a very simplified example of an order table with order details, and there are basically ID columns, so the order details have an order ID that corresponds to the order table. On the right is the same kind of data in JSON format, so it's not restricted to this tabular layout. But this is effectively relational data, as long as it has structure and a schema.
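As an aside that isn't part of the original lecture: here is a minimal sketch of that integrity rule using Python's built-in sqlite3 module (the course itself doesn't use SQLite, and the table and column names are invented for illustration). It shows that once foreign keys are enforced, an order-detail row cannot reference an order that doesn't exist.

```python
import sqlite3

# In-memory database purely for illustration; any relational engine behaves similarly.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces relationships when this is on

conn.execute("""
    CREATE TABLE orders (
        order_id  INTEGER PRIMARY KEY,   -- unique identifier for each row
        customer  TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE order_details (
        detail_id INTEGER PRIMARY KEY,
        order_id  INTEGER NOT NULL REFERENCES orders(order_id),  -- parent-child link
        product   TEXT NOT NULL,
        quantity  INTEGER NOT NULL
    )""")

# Parent row first, then a child row that references it.
conn.execute("INSERT INTO orders VALUES (1, 'Contoso')")
conn.execute("INSERT INTO order_details VALUES (1, 1, 'Camera', 2)")

# Inserting a detail for order 99, which does not exist, violates integrity.
try:
    conn.execute("INSERT INTO order_details VALUES (2, 99, 'Tripod', 1)")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)  # FOREIGN KEY constraint failed
```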
2. Chart Core Concepts
So let's talk about some of the core concepts of data analytics. Of course, we have a section of this course that goes into detail on Microsoft-specific offerings in the data analytics area, but we're talking about the core concepts in this section. Now, one of the first core concepts is data visualization. In general, data visualization is the process of distilling hundreds of thousands of rows of data into a graphic form so that someone can understand the story that the data is telling at a single glance. Now, right off the bat, I'll tell you that you shouldn't expect on this exam that Microsoft is going to quiz you on the difference between a line chart, a bar chart, and a pie chart. They're not going to ask you what type of chart this is, and they're not even really going to lay out a scenario and ask which is the best kind of chart. This really only comes up when you're in the space of creating reports, and you do need to understand that some charts are better at telling certain stories and other charts are better at telling different stories. But you should not expect that Microsoft is going to spend a lot of time quizzing you on chart types, even though it is listed as a requirement of the exam. The thing with data visualization is that we're getting into this culture now where we collect everything, right? Data storage is so cheap (around two cents per gigabyte per month within Azure storage) that we can just collect every mouse click, every page view, everything. We can just start collecting things, storing them in a table somewhere, and it will get cheaper and cheaper as time goes on. But now the problem becomes trying to interpret that. If I were just to show you a printout of a list of all the visits to your website in text form, how are you going to interpret what that means? Simply looking at numbers makes it difficult to see trends and gain insights. Human beings are not able to process thousands or millions of rows of data just by looking at them. So it's the visualizations that let you see the trends. Here's one example, a geographical representation. It's a picture of the Seattle, Washington, area, and in this case it's ticket sales. It's showing, between members and non-members, how many tickets were sold, and you can sort of see pockets where ticket sales are high. If you're familiar with the Seattle area, or if you're familiar with this company that sells tickets, this could mean something to you. Instead of listing out 50,000 rows of data, or even sorting it or having summary numbers, just looking at this gives you some intelligence. In this case, it's the intersection of that data and the geography: area codes, zip codes, and things like that. There are so many new and innovative ways of displaying data, and we can see some charts on the screen. In the top left, there are relationships between countries; in the top right, some very beautiful graphs. And so data scientists are trying to come up with new ways of showing data. You can go as simple as a word cloud, where the size of the font indicates the frequency of the word. So basically, one of the requirements of this exam is to understand the different chart types. Now, I am not going to go through each of the charts one by one. That would be, first of all, boring for you, and second of all, I've already said Microsoft is not even going to test you on what the different chart types are.
But it would be kind of a disservice not to talk about how different charts lend themselves to different types of data. We can see a pie chart here, for instance, and just by looking at the pie, you can see which are the biggest slices and which are the smallest slices. It does require labels for those slices in order to make any sense, so there has to be text and, in this case, numbers on screen. This chart would not be good if you were trying to show the difference over time. So if the data were camera sales by month, a pie chart would not be an appropriate type of chart for that. But as a moment in time, across a limited number of categories, it works; there are only six numbers being displayed on screen here, so there's an upper limit to the amount of data you can stuff into a pie chart as well. Now, Microsoft has a tool, which we'll talk about later in this course, called Power BI. And we can sort of see this mixture of a line chart over time and then the absolute value of the most recent number on the right. You're able to put a lot of very relevant data on screen at once. You don't need to know the individual values for 2 hours ago, 4 hours ago, or 6 hours ago, but you can see the lines and how they went up and down, and if there was a temperature spike, it would really stand out in this type of chart. So this is a really good dashboard for whatever it's trying to show. And with something like Power BI, you can start to really mix things up. You can have the absolute numbers on the left, a line chart and a bar chart overlapping each other, a pie chart, dots with different sizes, ranges, and things like that. So this person has created a sort of dashboard of many different types of charts, all of which have their own specific uses. So this is to get your head wrapped around which charts are good for which purposes. Now, I found a website called SQL Bit that has a downloadable reference for all the different chart types in Power BI. This is a screenshot from them, but you can sort of see they gather different types of charts together and actually label them with what they are good for. So the comparison charts, like the bar charts and the line charts, are good for comparing two things against each other. For instance, the line chart here is good for changes over time, and if you need to do some type of part-to-whole, distribution, or correlation chart, it sort of outlines the options for each of those as well. So this is downloadable. There's more to it than what I'm showing here, but I would recommend that you go and look into the different chart types. This is one of the basics of understanding what would make a good chart and what would not. We'll pick up the analytics techniques in the next video. Bye.
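To make the chart-choice point a bit more concrete, here is a small hedged sketch using matplotlib, a general-purpose Python plotting library rather than one of the Microsoft tools discussed here; the monthly figures and category shares are made up. It plots a trend over time as a line chart and a single moment-in-time category breakdown as a pie chart.

```python
import matplotlib.pyplot as plt

# Invented figures purely for illustration.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
camera_sales = [120, 135, 128, 150, 165, 180]  # change over time -> line chart
category_share = {"Cameras": 40, "Lenses": 25, "Tripods": 20, "Bags": 15}  # snapshot -> pie chart

fig, (ax_line, ax_pie) = plt.subplots(1, 2, figsize=(10, 4))

# Line chart: good at showing how a value moves month to month.
ax_line.plot(months, camera_sales, marker="o")
ax_line.set_title("Camera sales by month (trend over time)")
ax_line.set_ylabel("Units sold")

# Pie chart: only meaningful as a moment-in-time split across a few categories.
ax_pie.pie(list(category_share.values()), labels=list(category_share.keys()), autopct="%1.0f%%")
ax_pie.set_title("Sales by category (moment in time)")

plt.tight_layout()
plt.show()
```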
3. Analytics Techniques
So next up, we're going to talk about the core analytics techniques. Now, I do want to point out that this is something that can very easily be asked about on the exam, so I would spend some time understanding the different types of analytics. There are five types of analytics. The most basic is called descriptive. Then we move up to diagnostic, predictive, prescriptive, and cognitive analytics. And we're going to go through each of these one by one. Now, descriptive analytics simply tries to answer the question, "What happened?" It's the type of report where you go into the database, sum up columns, get a total, and put that on a report. It could be a sales report or a list of future appointment bookings. It could be some mathematical formula between revenue and cost of goods sold that determines your margins. It's basically existing data records that are summed together, organized, and displayed in a way that is useful. This is also what could be called hindsight. So this is something that's already happened, and you're just trying to get a report out of the system to show you yesterday's sales, last month's sales, year-over-year comparisons, et cetera. This is the most basic type of reporting, and it's called descriptive. One level up from descriptive is diagnostic analytics. That answers the question: why did it happen? So if you're looking at a sales report for yesterday and the sales are lower than they were the day before, why is it different? This type of report can give you an insight into why it's different. And so now you can look at the different regions, the different cities, and the different stores, comparing regions against each other and running reports that are maybe based on some kind of theory, say that customers who get their problem solved on the first visit drive higher revenue than customers who have to come back two and three times to get their problem solved. So any kind of query that you're running to try to figure out why something is happening is diagnostic: answering not just what happened, but why it happened. That's the next level. The third type of analytics is called predictive. Now this is sort of a forecast of what will likely happen in the future based on past trends. And so if you look at your sales and understand that you've been gaining 5% per month pretty consistently over the past twelve months, you can look into the future, add that 5% per month, and say, "Okay, looking into the next twelve months, our sales will be at this point; our service visits will be at this point. We're going to need an extra store over here because the existing store is going to reach certain limits," et cetera. So you're going to start to get reports that show you the future. And by this, I mean this could even be like a weather report. A weather report is predictive because it's basically saying we've got clouds that are traveling from this geography to that geography, and by tomorrow they're going to reach here. So it's being able to look at things over time and then make an assumption about tomorrow, about next week, next month, next year. You can see in the Azure Portal that there's a cost management page, and it does try to predict your bill based on your current usage. It doesn't always get it right, but that is a predictive report. Next is the fourth type of report, and again, we're getting more complicated and more advanced as we go along.
I've seen places online that have talked about descriptive reports being like the 1970s, diagnostic reports being the 1980s and 1990s, and as we get into the 2000s, we get into the predictive stuff. And now we're into what's called prescriptive. Prescriptive reports try to advise you on what you should do. And so there's now an intelligence, a basic level of intelligence, that says something is clearly happening and you should probably do this. Google Maps now has traffic as part of its navigation, and it will actually offer you an alternative route. So based on the data and what it knows, it can expect that you would get to your destination two minutes quicker if you took a side street versus staying on the highway. And that's prescriptive analytics: looking at the data and making a suggestion to you. Some of these prescriptions can be found on Netflix or Amazon: people who like this also like that; if you like this show, then you're probably going to like that one, because we've seen this in our data. And finally, you can use certain SEO (search engine optimization) tools on your website, and they will tell you you're not using the h1 or h2 headers enough, or you're not using this word enough, or you're using it too often. So any tool that can analyze your data and say, "You're not doing what we're looking for," is prescriptive. And the fifth type of analytics is the most advanced. It involves artificial intelligence and machine learning. This is basically modeling. So now we get into things like weather reports that are based on advanced models of the way clouds form, temperatures, the heat of the ocean, et cetera. So if you can generate a model, make future predictions based on that model, and then feed that data back into the system to improve your model and improve your predictions, that's cognitive analytics. And so you can have a Twitter stream coming in and then use natural language understanding to figure out whether people are positive or negative about something. A self-driving car basically operates on data; it happens to be visual data that gets translated into digital data, effectively. So anything that involves artificial intelligence or machine learning to make predictions based on a model could be cognitive analytics. So those are the five basic types of analytics. I do suggest that you learn about this and spend some time here, because you should expect to see this on the test.
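As a back-of-the-envelope illustration of the predictive idea (not something from the lecture itself, and the starting sales figure is invented), the short Python sketch below extrapolates the 5%-per-month growth mentioned above over the next twelve months. A real predictive system would fit a model to historical data rather than hard-code the growth rate.

```python
# Descriptive: what happened (an invented figure for last month's sales).
last_month_sales = 200_000

# Predictive: project forward assuming the observed ~5% month-over-month growth continues.
monthly_growth = 0.05
projected = last_month_sales
forecast = []
for month in range(1, 13):
    projected *= (1 + monthly_growth)
    forecast.append((month, round(projected)))

for month, value in forecast:
    print(f"Month +{month:2d}: projected sales {value:,}")

# After 12 months the projection is roughly 200,000 * 1.05**12, i.e. about 359,000.
```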
4. ELT and ETL
So the next important core concepts that we're going to be talking about are two acronyms. One is ELT and the other is ETL. You may have heard of these if you have been involved in data processing for a while, or maybe not. Now, ELT and ETL are basically descriptive terms for the style in which you take data from an external system, bring it into your internal system, and prepare it for use. Rarely is data sitting outside your company, or in an external data source, ready for use in your application. So typically, you need to do what's called a transformation, and the T in ELT or ETL stands for transformation. So typically, you want to do some type of transformation to take the data from an external source and bring it into your system. ELT stands for extraction, loading, and transformation. If you do the transformation before loading, it's ETL; the letters stand for the same words. Extraction is how you get the data from your outside source. Loading is getting that data into the database that you're going to be using. And the transformation typically involves some type of change to that data. It could be the format: maybe the dates are in a different format (day, month, year) in the external system and you need them in year, month, day in your system. Maybe the names are in two separate columns and you need them as a single full name. Perhaps there is a calculated or computed field. Maybe there's an external database that you need to look up to get a value into a new column that doesn't exist in the source. There are numerous ways and reasons to transform. So typical ELT is basically the process of taking data from an external source, loading it into a data store that you can use, and then performing any kind of transformation that you need to do. The extracted data goes directly into the data store that you're planning to use before you do any manipulation on it. ETL, of course, has a middleman: there's usually some type of transformation engine that takes the data from the data source, manipulates it, and then saves it into your target database. So that's the concept of ELT versus ETL. Now, in this section of the course, we're not going to talk about which is better. It depends on the technologies that you're using, the size of the data, and whether you're using it in some sort of real-time situation or batch loading, et cetera. So the capabilities of the target database and of the software you're using are going to drive whether you do ETL or ELT.
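Here is a minimal sketch, in plain Python, of the difference between the two styles. It assumes a toy CSV extract with invented column names, and the "target store" is simulated with a Python list rather than a real database. In the ETL path the rows are reshaped (date reordered, name columns combined) before they reach the target; in the ELT path they are loaded raw and reshaped afterwards.

```python
import csv
import io
from datetime import datetime

# Extraction: a toy external source with day/month/year dates and split name columns.
source = io.StringIO("order_date,first_name,last_name,amount\n31/01/2024,Ada,Lovelace,99.50\n")
rows = list(csv.DictReader(source))

def transform(row):
    # Reformat the date to year-month-day and combine the two name columns.
    return {
        "order_date": datetime.strptime(row["order_date"], "%d/%m/%Y").strftime("%Y-%m-%d"),
        "full_name": f"{row['first_name']} {row['last_name']}",
        "amount": float(row["amount"]),
    }

# ETL: Extract -> Transform (in a middle engine) -> Load into the target store.
etl_target = [transform(r) for r in rows]

# ELT: Extract -> Load the raw rows into the target -> Transform them there afterwards.
elt_target = list(rows)                          # loaded untouched
elt_target = [transform(r) for r in elt_target]  # transformation happens at the target

print(etl_target == elt_target)  # True: same result, different place and order of work
```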
5. Data Processing Core Concepts
So at the beginning of this section, we talked about how batch processing and stream processing are two types of data processing. Let's take a quick look at a more detailed example. This is a typical batch processing workflow. Now, your workflow could look different from this, but I found this diagram and thought, "Well, that's interesting in terms of showing what types of Azure technologies are being used at each step of the way." On the left, you have your data source. This could be an external system, this could be your website, this could be any source that generates data in a batch format; that data needs to get into Azure somehow. It could be saved in a Blob storage account. A data lake store is just another type of Blob storage account with a different type of file system, specifically designed for large amounts of data. Or you could even store that data in a database like a SQL database, Cosmos DB, etc. In fact, it might be that your application uses Azure and the data is already sitting in a SQL database as a transactional database. Next, we have a batch processing step. This would be the T in ETL that is doing some type of manipulation on that data. Now, these are typically called big data solutions. What I'm seeing on screen is U-SQL, Hive, Pig, and Spark; those last three are Apache projects that are available within Azure. Basically, they allow you to run queries on terabytes, petabytes, and exabytes of data, very large amounts of data. You can use these technologies to process the data, copy it from its source, and move it into the analytical data store. The analytical data store is the data store that is specifically designed for reporting. It used to be called SQL Data Warehouse; now it's Synapse Analytics, which has a data warehouse component to it. You've also got Spark and HBase; these are other types of data stores you can use. And on the right, you're going to have some type of program. It could be Power BI, it could be Excel, or it could be queries that are generated in a query editor. So you're going to be able to pull that data into a reporting mechanism. That's typical big data processing in batch form. Now, the same type of diagram exists for stream data. So you do have your data sources. There may or may not be a data storage element, where your stream data is being pushed into a Blob, into an event queue, or into some other type of data storage format. The data is ingested either directly from the data sources or from your Blob account. The specifically designed Event Hubs and IoT Hub basically take in the stream and push it into the stream processing applications. Kafka is another Apache project, and I believe that it is specifically designed for massive amounts of data in a stream. Now, you're not going to be using U-SQL, Hive, and Pig for stream processing. Stream processing typically involves data coming in that you're either going to want to accumulate and then store all at once in some sort of data store, check over a period of time and take some actions on, or at least manipulate as it comes in. Again, converting data that comes in one format into a new format is the purview of something like Stream Analytics or Apache Storm, and Spark has streaming capability as well. So that data goes through the stream processing and ends up in the same type of data store.
So once it's been pushed into the SQL data warehouse, it's no longer stream data; it's now data in a database. It's not transactional data; it's essentially post-processed data. And at that point, you can run your Power BI and your reporting on the back end of it.
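To tie this back to the crane example from earlier in the section, here is a hedged Python sketch of the kind of windowed check a stream processor performs; this is not Azure Stream Analytics itself, and the sensor readings are invented. It raises an alert once every reading in a trailing ten-minute window is above ten tons.

```python
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
THRESHOLD_TONS = 10

readings = deque()  # (timestamp, load_in_tons) pairs arriving continuously

def on_sensor_reading(timestamp, load_tons):
    """Process one incoming reading; alert if the load stayed above the threshold for the whole window."""
    readings.append((timestamp, load_tons))
    # Drop readings that have fallen out of the trailing ten-minute window.
    while readings and timestamp - readings[0][0] > WINDOW:
        readings.popleft()
    window_covered = readings and (timestamp - readings[0][0]) >= WINDOW
    if window_covered and all(load > THRESHOLD_TONS for _, load in readings):
        print(f"{timestamp:%H:%M:%S} ALERT: load above {THRESHOLD_TONS} tons for over {WINDOW}.")

# Simulated stream: one invented reading every 30 seconds, always above the threshold,
# so alerts start printing once a full ten-minute window has accumulated.
start = datetime(2024, 1, 1, 9, 0, 0)
for i in range(25):
    on_sensor_reading(start + timedelta(seconds=30 * i), 12.0)
```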
Microsoft Azure Data DP-900 Exam Dumps, Microsoft Azure Data DP-900 Practice Test Questions and Answers
Do you have questions about our DP-900 Microsoft Azure Data Fundamentals practice test questions and answers or any of our products? If you are not clear about our Microsoft Azure Data DP-900 exam practice test questions, you can read the FAQ below.
Purchase Microsoft Azure Data DP-900 Exam Training Products Individually