AWS Certified Solutions Architect - Professional (SAP-C01) Certification Video Training Course
34h 1m
134 students
4.4 (81)

Do you want efficient and dynamic preparation for your Amazon exam? The AWS Certified Solutions Architect - Professional (SAP-C01) certification video training course is a superb tool for your preparation. The Amazon AWS Certified Solutions Architect - Professional certification video training course is a complete batch of instructor-led, self-paced training that doubles as a study guide. Build your career and learn with the Amazon AWS Certified Solutions Architect - Professional (SAP-C01) certification video training course from Exam-Labs!


Student Feedback

4.4 - Good
5 stars: 44%
4 stars: 56%
3 stars: 0%
2 stars: 0%
1 star: 0%

AWS Certified Solutions Architect - Professional: AWS Certified Solutions Architect - Professional (SAP-C01) Certification Video Training Course Outline

Getting started with the course

Video 1 - 9:00
Video 2 - 11:00

AWS Certified Solutions Architect - Professional: AWS Certified Solutions Architect - Professional (SAP-C01) Certification Video Training Course Info

Gain in-depth knowledge for passing your exam with the Exam-Labs AWS Certified Solutions Architect - Professional (SAP-C01) certification video training course. Exam-Labs is a trusted and reliable name for studying and passing with VCE files that include Amazon AWS Certified Solutions Architect - Professional practice test questions and answers, a study guide, and exam practice test questions, unlike any other AWS Certified Solutions Architect - Professional (SAP-C01) video training course for your certification exam.

New Domain 1 - Design for Organizational Complexity

12. Canned ACL

Hey everyone and welcome back. In today's video we will be discussing canned ACLs, and we'll look into their importance as far as cross-account S3 bucket policies are concerned. So let's go ahead and begin understanding canned ACLs.

As a starting point, remember that every bucket, and every object that we upload into a bucket, has an ACL associated with it. Whenever a request is received for an object within an S3 bucket, S3 checks the ACL associated with that object and, depending upon the ACL, it will either allow or block the request. One important point to remember here is that whenever we create a bucket or an object, S3 by default grants the resource owner full control of that specific resource. This is very important to understand, so let's dig into this point before we go to the next slide.

Within the AWS S3 API documentation there is a get-object-acl command. This command shows you the ACL associated with an object in an S3 bucket. We already said that every object has an ACL associated with it, so in order to see what that ACL actually looks like, we can run this command. Let's go to our terminal and execute it: aws s3api get-object-acl, with the bucket name set to our demo bucket, the key set to demo.txt, which is the file we had uploaded into the bucket earlier, and the profile of account A.

Now, within the output, if you look into the permission, the permission is FULL_CONTROL. We already discussed in the slide that whenever you upload an object, S3 grants a full-control permission to the resource owner, and that is exactly what you see here: FULL_CONTROL, and the display name is Team Fantastic. Team Fantastic is account A, and account A has full control of this specific object.

Now, for cross.txt, which we had uploaded with a different account, if you try to run this command with the account A profile, it will say Access Denied. You basically have to run this command as the resource owner who created that specific file. We know that cross.txt was created with the access and secret key of an IAM user that belongs to account B, and this is the reason why we need to select account B in the profile, because account B is the resource owner. So let me press Enter, and now you see the permission is FULL_CONTROL, and the full control is granted to the account B display name. This is the reason why the account A principal is not able to access this specific file: it gets permission denied because the object's ACL grants full control only to the account B principal, so account A does not really have any permission at all. This is one of the very common occurrences that you will find whenever you are dealing with cross-account access.
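To recap those two checks as commands, a rough sketch of what was run in the demo is shown below; the bucket name and the profile names are illustrative placeholders, not the exact values used in the recording:

```bash
# Inspect the ACL of an object that account A owns (the resource owner).
aws s3api get-object-acl \
    --bucket kplabs-demo-bucket \
    --key demo.txt \
    --profile accountA

# The same call against cross.txt (uploaded by account B) fails from
# account A, because the object ACL grants FULL_CONTROL only to account B.
aws s3api get-object-acl \
    --bucket kplabs-demo-bucket \
    --key cross.txt \
    --profile accountA      # fails with AccessDenied

# Running it with the resource owner's profile (account B) succeeds.
aws s3api get-object-acl \
    --bucket kplabs-demo-bucket \
    --key cross.txt \
    --profile accountB
```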
So, with that base set, let's go ahead and understand canned ACLs. Basically, S3 supports a set of predefined grants on an object, and those are applied through canned ACLs. We have already discussed that every object has an ACL associated with it. Whenever we upload an object to S3, we can set a predefined permission, a predefined ACL, on that object, and that predefined set of permissions is applied through a canned ACL. These canned ACLs are specified in the request using the x-amz-acl header.

There are quite a lot of canned ACLs, and the slide shows some examples. One of the interesting ones that we are looking at is bucket-owner-full-control. If you read the description, both the object owner and the bucket owner get full control over the object. This is very important, because if you are uploading an object from one account into an S3 bucket that lives in a different account, what you need is for both the object owner and the bucket owner to get full control. Currently, for cross.txt, the bucket owner does not have full control, only the object owner does, and this is the reason why we are getting permission denied whenever we try to perform operations through an account A principal. So what we basically have to do is make sure that any file we upload into an S3 bucket that does not belong to our AWS account has an ACL that grants the bucket owner full control. This can be easily achieved.

So let's do one thing: I'll go to our terminal and create a file called canned.txt containing some sample text. Now we will upload this specific file with the access and secret key of account B, and we'll look into how exactly it works. In order to do that, I'll run aws s3 cp, specify canned.txt, and upload it to our demo bucket. Along with that, we will specify the ACL; basically, we need to give the bucket-owner-full-control access, so I'll set the ACL to bucket-owner-full-control. The profile through which we will be uploading this object will be account B. Perfect. So now canned.txt has been uploaded to the demo bucket with the access and secret key that belongs to account B.

Now let me quickly run the same s3api get-object-acl command again, this time for canned.txt. Within the output for canned.txt you will now see that both the account B principal and account A, which is Team Fantastic, have FULL_CONTROL. So both account A and account B have full control over the specific object that was uploaded. And this is the reason why it is very important to understand that if you are transferring a file into an S3 bucket that does not belong to your account, you have to specify a specific ACL that also grants the bucket owner access to the object you are uploading.
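A minimal sketch of that upload step and the follow-up check; again, the bucket name and profile names are placeholders:

```bash
# Upload from account B into a bucket owned by account A, granting the
# bucket owner full control via the canned ACL.
aws s3 cp canned.txt s3://kplabs-demo-bucket/ \
    --acl bucket-owner-full-control \
    --profile accountB

# Verify: the object ACL should now list FULL_CONTROL grants for both
# the object owner (account B) and the bucket owner (account A).
aws s3api get-object-acl \
    --bucket kplabs-demo-bucket \
    --key canned.txt \
    --profile accountB
```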

13. Understanding CloudWatch Logs

Hey everyone and welcome back to the Knowledge Pool video series. So, continuing our journey with CloudWatch, today we'll take a look at CloudWatch Logs. Let's understand it with a simple example.

Generally, if you talk about a typical server, be it Linux, BSD, or even Windows, each and every server has its own log files. A server can contain a lot of log files, which can range from system logs to application logs if an application is running. So whenever you want to debug something, it is necessary that we have access to the log files. Now, in the default behavior, since the log files are stored on the system itself, anyone who wants to debug something that is not working would need access to the server.

So let's take an example. You have a PHP application running on a Linux server. For some reason, the PHP application is not running as expected, and a developer wants to look into the log file. What you'd have to do in the normal approach is give the developer SSH access, and then he will be able to go through the log files. That is not considered a good security practice. Ideally, if you go into organizations that deal with sensitive information, none of the developers have any access to the servers; by default, no developer will have access to the server.

Now you might ask, "If the developer does not have access to the server, how will he debug the log files?", and the answer is a centralized logging solution. What you would ideally like to do is pull the log files from the server and put them into a central place where the developer can look into them.

Let me give you one example in order to understand it better. I'll just log into the server, and if you go into /var/log, you'll see there are a lot of log files. Now, let's assume that you want to debug a log file called /var/log/messages. One approach is to manually log in to the server, do a tail on the messages file, and look into the troubleshooting-related aspect. However, in the second and more ideal approach, you push all of these log files to a central location where anyone can query them. In my case, since we are speaking of CloudWatch Logs, I have pushed these files there. So if you look into the CloudWatch console, you'll see I have a log group named /var/log/messages. If I go in here, I have the instance ID, and now you see all the messages that are part of the /var/log/messages log file. Similarly, you can even push all the application logs to a central log server; it can be CloudWatch, it can be rsyslog, it can be Elasticsearch, et cetera. The main point is that if you send all log files to a central log server, you won't need to give anyone access to the server itself.

So this is what the concept of a CloudWatch log group is all about. We'll wrap up this lecture here, and in the next lecture we'll go ahead and look into how we can push the log files from an EC2 instance to a CloudWatch log group. So this is it for this lecture. I hope this has been informative, and I look forward to seeing you in the next lecture.
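As an illustration of the "no SSH needed" point from this lecture, a developer with read access to the log group could inspect the centralized messages from the CLI; the log group name below mirrors the demo and is a placeholder:

```bash
# List the log streams (one per instance) inside the centralized log group.
aws logs describe-log-streams \
    --log-group-name /var/log/messages

# Pull recent events from the log group without ever touching the server.
aws logs filter-log-events \
    --log-group-name /var/log/messages \
    --limit 25
```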

14. Pushing Linux system logs to CloudWatch

Hey everyone and welcome back to the second lecture on CloudWatch Logs. In this lecture we will do the practical session, and we'll look into how we can push the logs from a Linux server to a CloudWatch log group.

In order to do that, let's switch to the Oregon region, where I have my EC2 instance up and running. Let me just quickly show you: we have one EC2 instance up and running in the Oregon region, and what we'll be doing is pushing the logs generated on this EC2 instance to a central CloudWatch log group.

Perfect. The very first thing that you would need to do is allow this EC2 instance to create the log group and push the logs there. Let me show you an example of what I mean by this. If you go to the CloudWatch console and open Logs, you see there is a log group created by the name /var/log/messages, and within this log group there are a lot of log messages. In order for the EC2 instance to create the log group and push the logs into it, we need to allow the instance to do those things. So the very first thing we have to do is create an IAM role policy that allows EC2 to achieve those use cases.

Let's go ahead and do that. If I go to the IAM role, let's create a new inline policy. Over here I'll select JSON, and the documentation already provides a policy document. Within the policy document, the first action you see is CreateLogGroup, which is the first thing the agent needs to do, and then there is one more action, PutLogEvents, which allows it to put the log events from the system into the log group that was created. We'll use this policy; I'll paste it into the editor (you can just copy it as well), quickly review the policy, name it CloudWatchLogs, and click on Create Policy.

Perfect. So now we have a policy that allows the EC2 instance to create a log group and push events to the created log groups. Now that we have done that, let's go to the CloudWatch console again. This time, since we are working in the Oregon region, we'll be looking at the logs there, and you see there are no log groups created yet.
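The inline policy from the documentation, and one way to attach it from the CLI, look roughly like the sketch below; the role name is a hypothetical placeholder for whatever role is attached to the instance:

```bash
# Attach the CloudWatch Logs permissions to the instance role.
# "ec2-cloudwatch-role" is a placeholder role name.
aws iam put-role-policy \
    --role-name ec2-cloudwatch-role \
    --policy-name CloudWatchLogs \
    --policy-document '{
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "logs:CreateLogGroup",
            "logs:CreateLogStream",
            "logs:PutLogEvents",
            "logs:DescribeLogStreams"
          ],
          "Resource": ["arn:aws:logs:*:*:*"]
        }
      ]
    }'
```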
Perfect. So what we'll do is create our first log group. Before we do that, we have to install the CloudWatch agent on the server; that agent is basically responsible for pushing the logs. If you do yum install awslogs, that is the agent we need to install. I am running Amazon Linux, so this agent is available directly from the yum repository itself; if you're using a different operating system, you can go through the documentation, which has a different approach to achieve the same thing.

Now that I have awslogs installed, let's quickly verify the status of awslogs, and awslogs is stopped. Perfect. So the very first thing we'll do is go to the configuration directory of awslogs, where there are two important configuration files that we need to work with. The first is awscli.conf, and this is the place where we can specify the region in which the log group will be created. So let me change the region from us-east-1 to us-west-2, which is the Oregon region where we are working. Once you have configured awscli.conf, let's also quickly explore awslogs.conf. If you go a bit down, you will see that there is one stanza already configured, which is for /var/log/messages, and the log group name that will be created is /var/log/messages. Here we can add more log files; if you have an application log file, for example, we can add it here as well. For now, let's leave it at the default configuration, and I'll do service awslogs start.

Perfect. Now that awslogs has started, you can follow its progress within the /var/log directory, where you can do a tail on awslogs.log; this is the log file for the awslogs agent running on the EC2 instance. Once the configuration file has been picked up, you now see that there is a new log group created, /var/log/messages, and within this log group you have all the log entries belonging to the files specified within awslogs.conf.

One important thing that I'd like to show you is that generally, when you ship various log files, you might have /var/log/messages and a /var/log/application as well, and it is recommended to have a different log group for each. Once you have separate log groups, you can restrict the permissions associated with each log group. The /var/log/messages group is not required by developers; it is more relevant for the system administrators, so you can create a policy where only system administrators can read the messages within that log group. Similarly, if you have an application-specific log group, you can create a policy where only the developers involved with that application team can access the log files within it. These policies can be defined as long as you have separate log groups created.

So this is it for this lecture. I hope this has been informative for you, and I look forward to seeing you in the next lecture. Bye.
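As a quick reference for the steps in this lecture, here is a condensed sketch of the agent setup on Amazon Linux, assuming the awslogs package and the default /etc/awslogs config layout used in the demo; adjust the region and paths to your environment:

```bash
# Install and inspect the CloudWatch Logs agent (Amazon Linux yum package).
sudo yum install -y awslogs
sudo service awslogs status

# Point the agent at the Oregon region (the default config ships with us-east-1).
sudo sed -i 's/^region = .*/region = us-west-2/' /etc/awslogs/awscli.conf

# /etc/awslogs/awslogs.conf already contains a [/var/log/messages] stanza;
# additional files can be added as extra stanzas with their own log_group_name.

# Start the agent and watch its own log to confirm events are being shipped.
sudo service awslogs start
sudo tail -f /var/log/awslogs.log
```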

15. Overview of Service Catalog

Hey everyone and welcome back. In today's video we will be discussing the AWS Service Catalog. In definitive terms, AWS Service Catalog basically enables an organization to create and manage catalogs of IT services that are approved for use on AWS.

This can be better understood with a sample use case. Let's say you are working as a DevOps engineer or a solutions architect in an enterprise. What typically happens is that there are a lot of developers, and developers need their EC2 instances to do development and testing. Now, if you give them full access, what typically happens is that they launch EC2 instances of various types: someone will launch Ubuntu, someone will launch CentOS, someone will even launch Red Hat. I still remember that one of the developers in an organization I had been working with actually launched a Red Hat instance for a development environment. We did not really know at the time, but at the end of the monthly billing cycle we realized there were OS subscription charges for Red Hat, and when we looked into it, it was quite interesting as well as quite funny. When we asked him why he had launched Red Hat, he said, "Red Hat is known for stability." So you'll have some people with a very non-standard stack; basically, there is no specific standard here at all.

One further issue is that when developers launch their own EC2 instances, they typically attach a security group that is wide open, both inbound and outbound. So not only is there a standardization issue, there is also a security issue. What you need is standardization, you need security, and you also need developers to be able to launch instances quickly, because otherwise you have to give them documentation on how to launch an EC2 instance, and so on. Along with that, developers will typically never launch a t2.micro; they'll launch m5.large, m5.xlarge, and big instances that are not required, and most of them will have underutilized resources. So you also need the cost-saving factor. With AWS Service Catalog you will be able to achieve all of these.

So let's go into the demo. In this demo we will try to launch an instance via Service Catalog with a user whom we'll assume to be a developer. Currently I have a user called Test User. If this test user wanted to create an EC2 instance by hand, he would have to go to the EC2 console and click on Launch Instance. From Launch Instance he would select the appropriate AMI, then select the appropriate instance type; let's keep t2.micro. Then he would not know which VPC he has to launch in, even though he should be launching the EC2 instance in a private subnet; he would not know about Elastic IPs; he would not know about the IAM role that needs to be associated with the EC2 instance; and so on. So what he'd do is put random values in and go ahead and launch the EC2 instance. That is not the correct way.

Instead of doing things manually, what he can now do is go to the Service Catalog. Let's type it out and select Service Catalog. In this console, under the product list, you will see that I have a product named Development EC2 instance. This was created by the administrator; if you look here, the owner is the security team.
All the developer has to do is click on the product name, which is Development EC2 instance, and he can go ahead and launch the product. He'll have to give it a name, so let me say dev-ec2-instance-1; he has to select a version, which is the development version, and click Next. Within the parameters you can give the name of the developer; let me put a value here and do a Next. We'll ignore the remaining options, click on Launch, and that's about it.

Now, what happens in the back end is that there is a CloudFormation template. That CloudFormation template already has a stable definition based on which the resources will be created. So now let's quickly go to the EC2 console. Here we see that we have an EC2 instance which is up and running. If you look into the security group, there is a security group called Dev Security Group which is associated with the EC2 instance. Let me click over here and look at the associated security group: if you look into inbound, inbound is restricted, and if you look into outbound, outbound is also restricted.

So what we have done is standardize the AMI ID from which the EC2 instance will be launched, standardize the instance type based on which the instance will be launched, and standardize the security group as well. You can even standardize various other things, like the IAM role, which is required for the development EC2 instance. So not only does it become easier for you as an administrator or as a security person, it also becomes easier for the developer, because developers do not really need to know how to create an EC2 instance or, if this EC2 instance requires high availability, how to create a load balancer and connect the EC2 instance to it, et cetera. All of that can be defined by the DevOps team, and all the developer has to do is create his own product from the Service Catalog.

So now, if you see over here, the status of this specific provisioned product is Succeeded; this comes from the CloudFormation stack. Once the developer is done working with his EC2 instance, if he wants to terminate it, he can go to Actions, click on Terminate, and confirm. It says the terminate action has been started, and what happens is that you see the instance shutting down and then it gets terminated. So it is quite an easy solution for both the developers and the SysOps team working within your organization.

This is the high-level overview of the Service Catalog. In the next video we'll look into creating a product and a portfolio. I hope this video has been informative for you, and I look forward to seeing you in the next video.
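For reference, the same end-user flow from this demo can be driven from the CLI; this is a rough sketch, with placeholder product, artifact, and provisioned-product names (the console hides these IDs behind the Launch button):

```bash
# Find the approved product the security team has published for us.
aws servicecatalog search-products

# Look up the product details and its available versions (provisioning artifacts).
aws servicecatalog describe-product --id prod-EXAMPLE111

# Launch it; this creates the backing CloudFormation stack.
aws servicecatalog provision-product \
    --product-id prod-EXAMPLE111 \
    --provisioning-artifact-id pa-EXAMPLE222 \
    --provisioned-product-name dev-ec2-instance-1 \
    --provisioning-parameters Key=Name,Value=dev-instance

# Tear it down once the developer is done with it.
aws servicecatalog terminate-provisioned-product \
    --provisioned-product-name dev-ec2-instance-1
```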

16. Creating Product and Portfolio in Service Catalog

Hey everyone and welcome back. In the earlier video we discussed how exactly the AWS Service Catalog works and the benefits that it offers to an organization. So in today's video we will go ahead with the demo and look into how exactly we can configure the Service Catalog.

The first thing that we need to do is go to the Service Catalog console, which you can do from the Services menu by typing "Service Catalog". Once you are there in the Service Catalog console, you have to create a portfolio. So let me click on the portfolio list; currently there are no portfolios created. We'll understand what exactly a portfolio is and what a product is once we actually configure them.

So let me click on Create Portfolio. For the portfolio name you can give something like "Development EC2 instance" so that you have a proper naming convention. Then you have the owner; let's say the owner is the security team, and click on Create. Perfect. Once you have created the portfolio, let's go inside it. Here you have the options for users, groups, and roles, which control who will be able to access this specific portfolio that you have created. Before we do that, let's click on the product list and click on Upload New Product.

Let's give the product a name; we'll call it "EC2 instance stack", with "provided by" set to the security team, and we'll also add a description saying that it is used for development purposes. Once you've done that, click Next. You can give an email contact, a support link, and a support description; we'll just ignore those. Now you have to specify the template. A template is required because this specific product needs to be created from something: when a developer goes ahead and launches a product, in the back end the CloudFormation template that you associate with this product is what gets launched. So this is the location where you either upload a template file or specify the URL where your CloudFormation template is present.

For our testing purposes, what I have done is create a very simple CloudFormation template. This template has an ImageId property, which is the image ID associated with Amazon Linux 2, and it has the InstanceType. We'll be uploading this specific template for our product. Within Select Template, we'll click on Upload a Template File, and I'll choose the ec2-oregon template file. Let's give it a version; you can call it version one, or "development", and within the description you can say that this should be used for all the dev servers. Once that is done, you can go ahead and click Next, which takes you to the review screen, and if everything seems all right, you can go ahead and click Create.

Great. Once your product is created, you can go ahead and do a refresh, and you will be able to see the product name. This specific product, which creates an EC2 instance of type t2.micro, can be used by various people: it can be used by the development team, it can maybe be used by the testing team, depending upon the permissions that you give. Now, that permission, and various other factors, can be configured in the portfolio section. So here we have our "Development EC2 instance" portfolio, and within it there is no product associated yet. So let's go ahead and add a product: I'll select the product that we have created and click on Add Product to Portfolio.
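The template backing the product can be tiny. Below is an illustrative sketch only: the AMI ID, S3 bucket name, and IDs are placeholders, and the same wiring the console does (create the portfolio, create the product from a template URL, associate the two) is shown with the CLI for reference:

```bash
# A minimal CloudFormation template for the product (illustrative only;
# replace the ImageId with your hardened Amazon Linux 2 AMI).
cat > ec2-oregon.yaml <<'EOF'
AWSTemplateFormatVersion: "2010-09-09"
Description: Development EC2 instance product
Parameters:
  Name:
    Type: String
Resources:
  DevInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0123456789abcdef0
      InstanceType: t2.micro
      Tags:
        - Key: Name
          Value: !Ref Name
EOF

# CLI equivalent of the console steps (the template must live in S3 for this path).
aws s3 cp ec2-oregon.yaml s3://my-templates-bucket/ec2-oregon.yaml

aws servicecatalog create-portfolio \
    --display-name "Development EC2 instance" \
    --provider-name "Security Team"

aws servicecatalog create-product \
    --name "EC2 instance stack" \
    --owner "Security Team" \
    --product-type CLOUD_FORMATION_TEMPLATE \
    --provisioning-artifact-parameters '{"Name":"development","Description":"Use for all dev servers","Info":{"LoadTemplateFromURL":"https://my-templates-bucket.s3.amazonaws.com/ec2-oregon.yaml"},"Type":"CLOUD_FORMATION_TEMPLATE"}'

aws servicecatalog associate-product-with-portfolio \
    --product-id prod-EXAMPLE111 \
    --portfolio-id port-EXAMPLE333
```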
So the product has now been added to the portfolio. The next question is: who will be able to launch the product in this environment? This can be configured through Users, Groups and Roles; you also have various options for sharing the portfolio with other accounts, and so on. So let's click on Users, Groups and Roles. Here you can add a user, a group, or a role that will have access to a specific product.

For our testing purposes, let's open up IAM. Within the IAM console we'll go to Groups, and I'll click on Create New Group. For the group name, let me give it "dev", and click Next. Now comes the permission. Any user who wants to launch a product from the Service Catalog needs one managed policy which is mandatory: AWSServiceCatalogEndUserFullAccess. This policy is mandatory; otherwise they will not be able to access and use the catalog. So let's go ahead and create the group; "dev" is the developer group that we'll use for the demo, which is why we chose the shorter name. Next, let's add a user. For the username, let me call him Bob; Bob is a developer. Let's give him AWS Management Console access, select a custom password, and deselect the reset option so that I do not really have to set the password again. Click Next, ignore the permissions step, and go ahead and create the user. Once the user is created, you can add him to a specific group; in my case, I'll add him to the dev group.

All right, so this is the group where the user has been added. The next thing is to quickly log in as Bob via the console sign-in link. I'll copy this sign-in link and, from an incognito window, quickly log in as the Bob user. Once the user has logged in, he needs to access the Service Catalog service, and within this service you'll have noticed that there are only two tabs available: one is the product list and the second is the provisioned product list. Within the product list, Bob cannot really see a product, even though we had already created one for the EC2 instance. The reason Bob is not able to see the product is that we had not associated the portfolio with the group that the Bob user belongs to.

So, coming back to the original console, if you look into the portfolio list, this is the portfolio our product was associated with: our "Development EC2 instance" portfolio is associated with the product called "EC2 instance stack", and this product has the associated CloudFormation template which creates the appropriate resources. Now we need to allow the specific user, group, or role to access the product. Let's click here, add a user, group or role, go to Groups, add the dev group, and click on Add Access. Perfect, it says that the access has been added successfully. And now, if you quickly refresh the page in Bob's console, you see that Bob is able to see one product, the EC2 instance stack. If I quickly click on it, I have the option of launching this specific product.

However, there is one important part that you need to remember, and that is the CloudFormation template. If the Bob user wants to launch this specific template, he needs to have the appropriate IAM policies that allow him to do so. The first thing that he needs is the CloudFormation permission. Now, the CloudFormation-related permission is already part of the AWSServiceCatalogEndUserFullAccess policy that we had added. Let's quickly verify that: if I go to Bob's user and look at the permissions, he has AWSServiceCatalogEndUserFullAccess, and if you look inside that policy, it does have the CloudFormation-associated permissions. But there is a need for additional permissions here, because if Bob wants to launch an EC2 instance, then, since it goes through CloudFormation, the CloudFormation permission is required, but Bob will also need access to EC2. So let's quickly add the EC2 permission: I'll attach AmazonEC2FullAccess, click on Add, and we have added EC2 full access.

Now, there is one more caveat over here; it is more of a bug. Let me quickly show you. Bob now has the access to launch the EC2 instance, so what he can do is click on Launch Product and select a specific version; this is the development version, so let's call it dev-instance-1, and click Next. And now you see it gives you an error saying "failed to process the product version: S3 error, access denied". This can either be a bug related to the Service Catalog or a bug related to the console itself. I have confirmed with the AWS team that this is a bug, and they already have an incident in place. So until this part is resolved, what we need to do is give S3 read access to the user who will be launching a specific product. For our testing purposes, I'll click on Add Policy, attach AmazonS3ReadOnlyAccess, click Next, and add the specific permission.

Once you've done that, you can go back; let's do a cancel and try it out once again. I'll click on Launch Product, name it dev-instance, select the development version, and click Next. Here you can specify the parameter; I'll set the Name parameter to dev-instance. I'll go ahead and click Next, ignore the notification part, review everything, and click on Launch.
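Before we verify the launch, here is roughly what those grants look like when expressed with the CLI; the group name, account ID, and portfolio ID are placeholders taken from this demo's naming:

```bash
# Managed policies the demo attaches to the "dev" group: catalog end-user
# access (mandatory), EC2 access for the launched resources, and S3 read
# access as the workaround for the template-download error.
aws iam attach-group-policy --group-name dev \
    --policy-arn arn:aws:iam::aws:policy/AWSServiceCatalogEndUserFullAccess
aws iam attach-group-policy --group-name dev \
    --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess
aws iam attach-group-policy --group-name dev \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Give the group access to the portfolio so its members can see the product.
aws servicecatalog associate-principal-with-portfolio \
    --portfolio-id port-EXAMPLE333 \
    --principal-type IAM \
    --principal-arn arn:aws:iam::123456789012:group/dev
```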
So the EC2 instance is now getting launched; let's quickly verify that. From the EC2 console screen you should see one t2.micro instance being launched. Now, the EC2 template that we have is a very basic one. Ideally, the image ID should be a hardened AMI, if you have one within your organization; you should have a security group associated with it that has proper inbound as well as outbound rules; and, if required, you can attach load balancers, attach an appropriate IAM role with proper policies, et cetera.

If you go back to the Service Catalog screen, this is how the product looks. If you go to the product list, you will be able to see all the products available to your user. If you go to the provisioned product list, it will basically give you the list of products that you have provisioned; in our case, it is the dev instance. If you want to terminate it, you can go ahead and click on Terminate Provisioned Product and confirm. The status goes to Under Change, and now you see the instance status is shutting down.

So this is the high-level overview of the Service Catalog and how you can configure it. Again, it is pretty simple: in the demo we explored the product list and the portfolio list. A portfolio is associated with products, and within the portfolio we can configure a lot of constraints, we can configure the users, groups, and roles, and we can share it with other AWS accounts and also with AWS Organizations. So this is the high-level overview of the Service Catalog. I hope this video has been informative, and I look forward to seeing you in the next video. Bye.

Pay a fraction of the cost to study with the Exam-Labs AWS Certified Solutions Architect - Professional (SAP-C01) certification video training course. Passing the certification exam has never been easier. With the complete self-paced exam prep solution, including the AWS Certified Solutions Architect - Professional (SAP-C01) certification video training course, practice test questions and answers, and study guide, you have nothing to worry about for your next certification exam.
