PDFs and exam guides are not so efficient, right? Prepare for your Microsoft examination with our training course. The AI-900 course contains a complete batch of videos that will provide you with profound and thorough knowledge related to the Microsoft certification exam. Pass the Microsoft AI-900 test with flying colors.
Curriculum for AI-900 Certification Video Course
Name of Video | Time |
---|---|
1. Introduction to Azure | 5:00 |
2. The Azure Free Account | 5:00 |
3. Concepts in Azure | 4:00 |
4. Quick view of the Azure portal | 4:00 |
5. Lab - An example of creating a resource in Azure | 11:00 |
Name of Video | Time |
---|---|
1. Machine Learning and Artificial Intelligence | 2:00 |
2. Prediction and Forecasting Workloads | 1:00 |
3. Anomaly Detection Workloads | 1:00 |
4. Natural Language Processing Workloads | 2:00 |
5. Computer Vision Workloads | 1:00 |
6. Conversational AI Workloads | 1:00 |
7. Microsoft Guiding Principles for Responsible AI - Accountability | 2:00 |
8. Microsoft Guiding Principles for Responsible AI - Reliability and Safety | 1:00 |
9. Microsoft Guiding Principles for Responsible AI - Privacy and Security | 1:00 |
10. Microsoft Guiding Principles for Responsible AI - Transparency | 1:00 |
11. Microsoft Guiding Principles for Responsible AI - Inclusiveness | 1:00 |
12. Microsoft Guiding Principles for Responsible AI - Fairness | 1:00 |
Name of Video | Time |
---|---|
1. Section Introduction | 1:00 |
2. Why even consider Machine Learning? | 4:00 |
3. The Machine Learning Model | 9:00 |
4. The Machine Learning Algorithms | 9:00 |
5. Different Machine Learning Algorithms | 3:00 |
6. Machine Learning Techniques | 4:00 |
7. Machine Learning Data - Features and Labels | 5:00 |
8. Lab - Azure Machine Learning - Creating a workspace | 6:00 |
9. Lab - Building a Classification Machine Learning Pipeline - Your Dataset | 11:00 |
10. Lab - Building a Classification Machine Learning Pipeline - Splitting data | 7:00 |
11. Optional - Lab - Creating an Azure Virtual Machine | 9:00 |
12. Lab - Building a Classification Machine Learning Pipeline - Compute Target | 6:00 |
13. Lab - Building a Classification Machine Learning Pipeline - Completion | 6:00 |
14. Lab - Building a Classification Machine Learning Pipeline - Results | 8:00 |
15. Recap on what's been done so far | 2:00 |
16. Lab - Building a Classification Machine Learning Pipeline - Deployment | 7:00 |
17. Lab - Installing the POSTMAN tool | 4:00 |
18. Lab - Building a Classification Machine Learning Pipeline - Testing | 6:00 |
19. Lab - Building a Regression Machine Learning Pipeline - Cleaning Data | 9:00 |
20. Lab - Building a Regression Machine Learning Pipeline - Complete Pipeline | 3:00 |
21. Lab - Building a Regression Machine Learning Pipeline - Results | 3:00 |
22. Feature Engineering | 3:00 |
23. Automated Machine Learning | 6:00 |
24. Deleting your resources | 2:00 |
Name of Video | Time |
---|---|
1. Section Introduction | 2:00 |
2. Azure Cognitive Services | 1:00 |
3. Introduction to Azure Computer Vision solutions | 3:00 |
4. A look at the Computer Vision service | 5:00 |
5. Lab - Setting up Visual Studio 2019 | 4:00 |
6. Lab - Computer Vision - Basic Object Detection - Visual Studio 2019 | 12:00 |
7. Lab - Computer Vision - Restrictions example | 2:00 |
8. Lab - Computer Vision - Object Bounding Coordinates - Visual Studio 2019 | 3:00 |
9. Lab - Computer Vision - Brand Image - Visual Studio 2019 | 2:00 |
10. Lab - Computer Vision - Via the POSTMAN tool | 5:00 |
11. The benefits of the Cognitive services | 2:00 |
12. Another example on Computer Vision - Bounding Coordinates | 2:00 |
13. Lab - Computer Vision - Optical Character Recognition | 5:00 |
14. Face API | 2:00 |
15. Lab - Computer Vision - Analyzing a Face | 3:00 |
16. A quick look at the Face service | 3:00 |
17. Lab - Face API - Using Visual Studio 2019 | 6:00 |
18. Lab - Face API - Using POSTMAN tool | 5:00 |
19. Lab - Face Verify API - Using POSTMAN tool | 7:00 |
20. Lab - Face Find Similar API - Using POSTMAN tool | 8:00 |
21. Lab - Custom Vision | 9:00 |
22. A quick look at the Form Recognizer service | 2:00 |
23. Lab - Form Recognizer | 8:00 |
Name of Video | Time |
---|---|
1. Section Introduction | 1:00 |
2. Natural Language Processing | 3:00 |
3. A quick look at the Text Analytics | 1:00 |
4. Lab - Text Analytics API - Key phrases | 4:00 |
5. Lab - Text Analytics API - Language Detection | 1:00 |
6. Lab - Text Analytics Service - Sentiment Analysis | 1:00 |
7. Lab - Text Analytics Service - Entity Recognition | 3:00 |
8. Lab - Translator Service | 3:00 |
9. A quick look at the Speech Service | 1:00 |
10. Lab - Speech Service - Speech to text | 4:00 |
11. Lab - Speech Service - Text to speech | 1:00 |
12. Language Understanding Intelligence Service | 2:00 |
13. Lab - Working with LUIS - Using pre-built domains | 8:00 |
14. Lab - Working with LUIS - Adding our own intents | 4:00 |
15. Lab - Working with LUIS - Adding Entities | 2:00 |
16. Lab - Working with LUIS - Publishing your model | 2:00 |
17. QnA Maker service | 2:00 |
18. Lab - QnA Maker service | 9:00 |
19. Bot Framework | 2:00 |
20. Example of Bot Framework in Azure | 3:00 |
Name of Video | Time |
---|---|
1. About the exam | 5:00 |
100% Latest & Updated Microsoft Azure AI AI-900 Practice Test Questions, Exam Dumps & Verified Answers!
30 Days Free Updates, Instant Download!
AI-900 Premium Bundle
Free AI-900 Exam Questions & AI-900 Dumps
File Name | Size | Votes |
---|---|---|
microsoft.examlabs.ai-900.v2024-10-12.by.bonnie.78q.vce | 755.69 KB | 1 |
microsoft.pass4sures.ai-900.v2022-01-29.by.max.74q.vce | 793.88 KB | 1 |
microsoft.pass4sures.ai-900.v2021-10-27.by.grace.75q.vce | 752.46 KB | 1 |
microsoft.examcollection.ai-900.v2021-09-24.by.layla.66q.vce | 811.06 KB | 1 |
microsoft.prep4sure.ai-900.v2021-09-06.by.mila.53q.vce | 896.22 KB | 1 |
microsoft.pass4sure.ai-900.v2021-05-15.by.levi.54q.vce | 791.93 KB | 1 |
microsoft.selftestengine.ai-900.v2021-02-12.by.bobby.51q.vce | 786.62 KB | 2 |
Microsoft AI-900 Training Course
Want verified and proven knowledge for Microsoft Azure AI Fundamentals? It's easy when you have ExamSnap's Microsoft Azure AI Fundamentals certification video training course by your side, which, along with our Microsoft AI-900 Exam Dumps & Practice Test questions, provides a complete solution to pass your exam.
Before moving forward, I thought let's just have a quick review of what we have done so far. So we have looked at how to build a pipeline that's based on classification; that's the machine learning type. And we have gone ahead and trained a machine learning model. We have used Azure Machine Learning Studio, which is available after creating an Azure Machine Learning workspace. We used an inbuilt dataset. We also used an inbuilt machine learning algorithm. We ensured to have both a training dataset and a test dataset. We also went ahead and tested our model. All of this ran on a compute cluster; we only had one virtual machine running as part of that compute cluster. So we have our model in place. Now, we can actually go ahead and invoke that model on data that doesn't have a label. So let's say we want to now go ahead and predict what would be the income of a person based on the data that we provide. Because this is the entire purpose of building a machine learning model: to give you information, predictions, etc., based on the data that you provide. So that is what we are going to look at next.
Hi and welcome back. So in the earlier chapter, we saw how to train a model, and let's say we are happy with the model itself. So how do we make use of this trained model? As I said, the entire purpose of having a machine learning model in place is to go ahead and use the model and, let's say, give us a prediction based on the data that we submit to the model. Now for that, we have to go ahead and deploy our machine learning model. So after training the model, we have to go ahead and see what outputs we get from real-time data.
So for that, we have to go ahead and deploy the model. For this, you have to go ahead and convert the pipeline to something known as a real-time inference pipeline. What this does is that it actually goes ahead and removes any sort of training modules in the pipeline, and then it adds the web service inputs and outputs. The web service input allows us to go ahead and invoke the model via a kind of web API, and we can also go ahead and get the output as we desire. So the web service input will actually be used to handle the requests that are made on the pipeline. Now, after creating the inference pipeline, you have to go ahead and create the inference cluster. So remember, we had a virtual machine compute cluster that was used for training our model, but now we want to go ahead and make use of that model in a real-time scenario. And for that, we have to go ahead and create something known as an inference cluster. So this cluster will actually be used to take in requests and perform the required processing to give us the required results.
Now, the cluster that you create is based on the Azure Kubernetes Service. Azure Kubernetes is nothing but a container-based orchestration service. So over here, your model would actually be deployed as containers on Azure Kubernetes. Again, all of this is going to be managed for you. You don't have to go out and manage the underlying infrastructure that will be hosting your model. With the use of Azure Kubernetes, there are a lot of benefits. So you have your models running in containers, and the entire Azure Kubernetes Service can ensure that your models are always up and running. So let's go ahead and deploy our machine learning model. So over here in our pipeline, in the authoring section, we can now go ahead and create an inference pipeline. So let me go ahead and choose a real-time inference pipeline. Once we do that, let me go ahead and just cancel this. So over here, now you can see that you have a real-time inference pipeline itself. Over here, you still have your training pipeline.
So that is still in place. So if you want to go ahead and retrain your pipeline, you can go ahead and do that. And here you have your real-time inference pipeline. Over here you can see that you have the web service input and the web service output. So using the web service input, we can actually go ahead and make a call on this pipeline with our data. And the web service output will give you the results; basically, it will give you an income based on the data that you actually provide. Now, to go ahead and create this real-time inference pipeline, first we have to go ahead and click on Submit. Currently, it's going to go ahead and create that inference pipeline using that same compute target. We are then going to go ahead and deploy this pipeline, remember, onto something known as an Azure Kubernetes cluster; that will be our inference cluster. Here we are just going ahead and ensuring that we have that real-time inference pipeline in place. Now I could go ahead and choose the same experiment. Let me go ahead and click on Submit. So it will actually go ahead and create this inference pipeline. Now what I'll do is let's go on to the compute section.
So in another tab I have my Machine Learning workspace. I'm going on to the compute section, and let's go ahead and create an inference cluster. So let me go ahead and hit Create. I'm going to go ahead and create a new Kubernetes Service. So over here, let me go ahead and choose the location. So I'll go ahead and choose North Europe. Let me go ahead and choose four cores and eight gigs of RAM. Let me go on to the next. Let's go ahead and give this compute a name. I'm going to go ahead and choose the Dev-test purpose, so we just have one node in place. Let's go ahead and hit Create. So now it's going to go ahead and create an Azure Kubernetes cluster. Let's come back once we have our cluster in place and once we have our inference pipeline also in place. This could take around five to ten minutes. Now, once you have the inference cluster in place, and once you also have the inference pipeline in place, you can now go ahead and hit on Deploy, which will go ahead and deploy something known as a new real-time endpoint onto that Azure Kubernetes cluster. So over here you can go ahead and give a name to the endpoint, and over here in the compute name, you can go ahead and choose your Kubernetes cluster, and then you can go ahead and hit on Deploy. Right, so let's mark an end to this chapter. Let's go ahead and see how we can make use of this deployment in the subsequent chapters.
Hi and welcome back. Now, in order to go ahead and make a call to our real-time endpoint, which will be available via our inference cluster, we're going to go ahead and make use of a tool known as Postman. Postman is a very useful tool that is used for API development. So, if you want to have more flexibility when it comes to issuing API calls, you can use Postman as well. This is a freely available tool. So now over here on a Windows 10 machine, I want to go ahead and show you how to install the Postman tool.
It's a very simple installation. So I've gone on to the Download section. Let me go ahead and download the Postman application. So I'll go ahead and choose the Windows 64-bit version. I'll go ahead and click on "Run." So after some time, it will actually go ahead and download the Postman installation and start installing the tool itself. Now, once you have the tool in place, you can either go ahead and create a free account, or you can just go ahead and skip it and directly go on to using the application. So now over here, you can actually go ahead and start issuing API calls. So, for example, I could go ahead and click the plus symbol. Over here, I could go ahead and choose the type of request I am making. Over here, I could go ahead and enter the URL. So let's say Google.com. I could go ahead and click on Send. So over here, I'm getting a response if I go on to the preview. So over here you can see the information itself. So the benefit of using the Postman tool is that you can actually go ahead and issue different types of requests. Over here, you can go ahead and issue a GET request, a POST request, etc. You also have flexibility when it comes to adding authorization or when it comes to adding headers in the request that is being made.
So, for example, if I'm making a request in the browser for Google.com, this is going in and actually issuing a GET request onto the Google servers, so we're getting the information back over here. But let's say that when we send information onto the servers, we want to go ahead and have more flexibility when it comes to the request itself that we are making. So I want to go ahead and change the header information. So for that, I can actually make use of the Postman tool.
So, even in Google Chrome, if I go to More tools, Developer Tools, and over here, the Network section, and just browse for Google.com again, I can see all the requests that have been made. If I go onto the request for Google.com itself, over here, you can see something known as the request headers. So the method is GET over here because you're trying to get information from Google's servers. But let's say you want to go ahead and post some information, and you want to go ahead and have the ability to go out and change these request headers. We want more flexibility when it comes to issuing the request. Then we should go ahead and make use of a tool known as Postman. This is one of the primary reasons why you should use the Postman tool. So we are going to go ahead and use this tool in subsequent chapters. So you can also go ahead and ensure that you have the tool installed on your local system.
Now that we have our inference pipeline in place on our Kubernetes cluster, over here you'll see that you have an option to view the real-time endpoint. So now we can go ahead and click on View real-time endpoint. And over here, you'll actually be directed to the endpoint section, which is available in Azure Machine Learning Studio. So now with the use of this endpoint, if I go ahead and scroll down, you have the REST endpoint in place. You can now go ahead and issue API calls. You can go ahead and submit your data and see the prediction from your machine learning model.
Now, from here itself, you could also go ahead and test the machine learning model. So, as you can see, there is an interface in place where you can go ahead and submit the various values for the age, the workclass, and so on. This is going on to your web service input zero. If you go ahead and scroll down over here, you will see that you have a value for the income as well. But remember, the entire purpose of this machine learning model is to go ahead and tell us what could be the probable income, whether it would be less than or equal to 50K, or whether it would be greater than 50K.
So all we have to do is leave this empty and ask the model to tell us what the income should be. So you can go ahead and change the settings if you want to, and you can go ahead and click on Test. Now, once you go ahead and click on Test, you get your web service output. So remember, these were the input and output points that were added to your inference pipeline. Now, if you go ahead and scroll down over here, you can see the label it has given you. So it has gone ahead and predicted that the income would be less than or equal to 50K. You can see the probability. So it's not such a great probability at all. Nevertheless, we've got an output in place.
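The shape of the web service output described above can be illustrated with a small snippet. The field names here ("Scored Labels", "Scored Probabilities") are assumptions modelled on the designer's Score Model output; check your own endpoint's response for the exact names.

```python
import json

# Hypothetical response body mimicking the web service output described
# above: a predicted label plus a probability. Field names and the
# probability value are assumptions for illustration only.
response_text = json.dumps({
    "Results": {
        "WebServiceOutput0": [
            {"Scored Labels": "<=50K", "Scored Probabilities": 0.55}
        ]
    }
})

result = json.loads(response_text)["Results"]["WebServiceOutput0"][0]
print(result["Scored Labels"])         # <=50K
print(result["Scored Probabilities"])  # 0.55
```

A low probability like this one is the model's way of saying it is not very confident, which matches what we see in the test interface.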
Again, you have to go and ensure that you have a proper machine learning algorithm to go out and train your machine learning model, right? So over here itself, you could go in and test your endpoint. If you go on to Consume, you also have your REST endpoint. And over here you have your authentication types. So again, to authenticate yourself, you have to go out and choose or use an authentication type. And over here, the great thing is that if you're going to use C#, or you're using Python, or you're using any other language, it will actually go ahead and give you the code that you can actually add onto your application for calling that endpoint. So, everything is great over here. So now let's see how we can use the Postman tool to go ahead and issue a request to that API endpoint and get a response back. So the first thing I do is to ensure we convert this into a POST request. So we want to go ahead and post information onto the server and get some information back. Next is the URL. So over here we could go ahead and copy the REST endpoint. I'll paste it over here. Next, we have to go on to the headers. In the headers, we have to go in and add an authorization header. And over here, the value should be the keyword Bearer, a space, and then the key. So what is the key? So over here, this is the key. We could go ahead and take either the primary key or the secondary key. So let's go ahead and take the primary key.
Let me copy it and we will paste it over here. Right, so now we are authorising the Postman tool to go ahead and make a request against this endpoint. Now I'll go on to the body of the request. Over here, I'll go ahead and change the body of the request to Raw. And over here, I have to go ahead and change this to JSON because we have to go in and submit our feature information, and this needs to be in JSON format. So over here, I'm going to go ahead and add the JSON data. So it needs to be in this format. So we have our inputs. We have the web service input zero. We have all of the features over here, which we want to go ahead and add. And over here in the income, again, I'm not specifying any income. Let me go ahead and click on Send. Once this is done, if I go ahead and scroll down, I now have my web service output. Over here, again, I have the scoring label, and over here I have the probability. So over here, the probability is much better because of the data that I have provided. So this data is much closer to the data which the model learned from during training. Based on this data, it predicts that the income would be less than or equal to 50K, right? So in this chapter, we have actually gone ahead and learned how to consume our endpoint.
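The same Postman call can be sketched in Python. The scoring URL, key, and the exact feature names below are placeholders; the real values come from the Consume tab of your endpoint in Azure Machine Learning Studio.

```python
import json

# Placeholders -- copy the real values from the endpoint's Consume tab.
SCORING_URI = "http://<your-cluster>/api/v1/service/<endpoint-name>/score"
PRIMARY_KEY = "<primary-or-secondary-key>"

def build_scoring_request(features):
    """Build the headers and JSON body for one scoring call."""
    headers = {
        "Content-Type": "application/json",
        # The Authorization value is the keyword Bearer, a space, then the key.
        "Authorization": "Bearer " + PRIMARY_KEY,
    }
    body = json.dumps({
        "Inputs": {
            # One row of features for web service input zero; the income
            # label is deliberately left out so the model predicts it.
            "WebServiceInput0": [features],
        }
    })
    return headers, body

# Hypothetical feature names for illustration.
headers, body = build_scoring_request({"age": 39, "hours-per-week": 40})
print(headers["Authorization"].startswith("Bearer "))  # True

# Sending it would then look like:
# urllib.request.urlopen(urllib.request.Request(
#     SCORING_URI, body.encode("utf-8"), headers))
```

This mirrors the three things set up in Postman above: the POST method, the Bearer authorization header, and a raw JSON body with the web service input.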
Hi and welcome back. Now in this chapter we are going to see how to train a model using a regression-based algorithm. So earlier on, we had actually gone ahead and looked at the two-class binary classification algorithm. Now over here, we are going to go ahead and see how to use the regression-based algorithm. So, from the perspective of the exam, this is important. You'll go ahead and use regression algorithms to predict the value of a new data point based on historical data. So, for example, it helps to answer questions such as: what will the average two-bedroom home in my city cost next year? So you can use this to predict or forecast values. So let's go ahead and see how to implement regression-based algorithms in Azure Machine Learning. Now here I am, back in Azure Machine Learning Studio. So you can go on to the home section. Now over here, let me go ahead and create a new pipeline, and let me go ahead and close this. Now, before going on to my sample dataset, let me go ahead and give a name for the pipeline. So I'll go ahead and name this the regression pipeline. Next, again, we are going to go ahead and use one of the inbuilt datasets. So over here we're going to go ahead and look at the automobile price data, which over here is marked as "Raw." So this is in raw format. So we are going to go ahead and do something a little bit extra when it comes to the pipeline.
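The regression idea itself, fitting to historical data and predicting a value for a new point, can be shown outside Azure with a toy example. The numbers below are made up and have nothing to do with the actual automobile dataset.

```python
import numpy as np

# Toy illustration of regression: fit a straight line to historical
# (feature, price) pairs, then predict the price for a new data point.
# The data is made up and perfectly linear (price = 100 * engine size).
engine_size = np.array([80, 100, 120, 150, 200], dtype=float)
price = np.array([8000, 10000, 12000, 15000, 20000], dtype=float)

# Least-squares fit of a degree-1 polynomial (a line).
slope, intercept = np.polyfit(engine_size, price, deg=1)

# Predict the price of a car with an engine size of 130.
predicted = slope * 130 + intercept
print(round(predicted))  # 13000
```

Azure Machine Learning's designer does the same thing at a larger scale, with more capable regression algorithms and no code, but the input/output shape is identical: historical features and labels in, a predicted numeric value out.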
The basis of this pipeline remains the same: you go ahead and use your dataset, you go ahead and train your model, and then evaluate your model. But with this dataset, we have to do something a little bit different, and let me go ahead and show you what we need to do. So first, let me go ahead and drag the dataset onto the canvas. And over here, let me close this. Now let me right-click and let me go ahead and visualise the output. So over here we have 205 rows and we have 26 columns. So over here, this particular dataset has information about different automobiles. And for the different features that we have, such as the width, the height, the engine size, et cetera, it has the label of the price. So over here, based on this historical data, we want a model to be trained to go ahead and predict the price of a car based on the data that gets submitted to the model. Now why is this dataset, which is inbuilt in Azure Machine Learning, shown as having the raw format? That's because when it comes to your own data, there are some things that you need to consider. So when it came to our classification model, I told you about some of these aspects. So for example, if I go on to one of the features, let's say horsepower for the car itself. So over here, if you look at the min and max values, they're fine. So we have a minimum of 48 and a maximum of 288 horsepower.
So some of the aspects are that you have to make sure that the data is accurate. Now over here, you'll notice that you have two missing values in place. So when it comes to this dataset, there are basically two rows that don't have values for the horsepower itself. So this can be an issue. Also, if you scroll on to the left, when it comes to normalised losses, over here you can see there are a lot of cases where you have missing values. Now you can go ahead and make use of the inbuilt modules that are available in Azure Machine Learning Studio to handle these sorts of issues when it comes to our dataset. So the first thing we are going to do is go ahead and use a module to work with the column that has normalised losses. So over here, let me go ahead and search for "column." Now over here, I'm going to go ahead and choose Select Columns in Dataset. Let me go ahead and first ensure there's a connection between the price data and the Select Columns in Dataset module. Now over here, in the properties for Select Columns in Dataset, let me go ahead and hit Edit column. Now over here, I want to go ahead and include all of the columns, but I also want to go ahead and add another condition to exclude a column. So if I go on to column names over here, the column I want to exclude from my dataset is normalised losses.
So, if you have missing values in your dataset, there are some things you can look into. One is to go ahead and remove the entire column. If you feel it does not have an impact on the training of the model itself, you can go ahead and decide to remove the column from the dataset. So over here I'm giving an example of how you can actually go ahead and remove the column from the dataset itself. So let me go ahead and hit Save. So over here I have my dataset, and it's going to go ahead and select all the columns except for the normalised losses column. Now, there are also other columns which don't have data. So for example, if I go on to the horsepower again, over here we have two missing values. Let me go on to the compression ratio. So this seems to be fine. So over here, in terms of the stroke as well, we have missing values. So now, instead of trying to address the missing values column by column, you can actually go ahead and address all of them together. So there is a module known as Clean Missing Data, which you can go ahead and use to clean up the missing data. I can go ahead and drag it onto the canvas. Again, over here, let me go ahead and connect the modules together. Let me go on to Clean Missing Data.
Now over here in the cleaning mode, you could either go ahead and add your own value to be substituted for the missing data, or you could go in and replace it with a mean value; that is, take the mean value of the data within the column itself. Or you could go ahead and remove the entire row, or the entire column. Let me go ahead and choose to remove the entire row just for the purpose of this demo. Then over here, let me go ahead and click on Edit Column. So I want to go ahead and include all of the columns.
So I want to ensure that missing data is cleaned for all of the columns. Now, with all of this in place, let's go ahead and just hit Submit. I'll go ahead and choose my compute target, which is already in place, and let me go ahead and hit on Save so I can use the same compute target for multiple pipelines. Let me go ahead and either choose an existing experiment or create a new experiment, and let me go ahead and hit on Submit. So over here, we are going ahead with the building of the pipeline. But before that, I want to go ahead and show you what's important. The first thing is to go ahead and ensure that you clean your dataset. So over here, we're using two modules: the first is to go ahead and choose the columns that will be part of our dataset, and the second is to clean up any missing data. So let's come back, and in our next chapter, let's go ahead with our pipeline.
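The two cleaning steps above, excluding a column and handling the remaining missing values, can be sketched in pandas. The column names mimic the automobile price dataset, but the rows are made up for the example.

```python
import numpy as np
import pandas as pd

# Made-up rows; the column names mimic the automobile price dataset.
df = pd.DataFrame({
    "normalized-losses": [np.nan, 164.0, np.nan, 158.0],
    "horsepower":        [111.0, 154.0, np.nan, 102.0],
    "engine-size":       [130, 152, 109, 136],
    "price":             [13495, 16500, 13950, 17450],
})

# Step 1: exclude the normalized-losses column entirely
# (like Select Columns in Dataset with an exclude rule).
df = df.drop(columns=["normalized-losses"])

# Step 2, option A: remove every row that still has a missing value.
cleaned = df.dropna()
print(len(cleaned))  # 3

# Step 2, option B: substitute the column mean instead of dropping rows.
substituted = df.fillna({"horsepower": df["horsepower"].mean()})
print(int(substituted["horsepower"].isna().sum()))  # 0
```

Which option to pick is the same judgment call described above: dropping rows is simple but loses data, while mean substitution keeps the rows at the cost of inventing plausible values.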
Prepared by top IT trainers who ensure quality IT exam prep, the ExamSnap Microsoft Azure AI Fundamentals certification video training course goes in line with the corresponding Microsoft AI-900 exam dumps, study guide, and practice test questions & answers, so you can count on it.