CompTIA Pentest+ PT0-002 – Section 13: Cloud Attacks Part 3


129. Auditing the Cloud (OBJ 3.4)

In this lesson, we’re going to explore how you can audit cloud services using some common tools during your engagements. This includes tools like Scout Suite, Prowler, Pacu, CloudBrute, and Cloud Custodian. First, we have Scout Suite. Scout Suite is an open-source tool written in Python that can be used to audit instances and policies created on multi-cloud platforms such as AWS, Microsoft Azure, and Google Cloud by collecting data using API calls. After being run, Scout Suite will compile a report that lists out all the VM instances, storage containers, IAM accounts, data, and firewall ACLs that it’s able to gather information about.
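As a rough illustration of how Scout Suite is launched against each of these providers, here is a sketch of typical command-line invocations. The authentication flags shown are assumptions based on common Scout Suite usage and may differ in the version you install, so check scout --help before relying on them.

    # Audit an AWS account using a named profile from ~/.aws/credentials
    scout aws --profile pentest

    # Audit an Azure subscription, reusing the credentials from an Azure CLI session
    scout azure --cli

    # Audit a GCP project using your gcloud user account credentials
    scout gcp --user-account --project-id my-target-project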

This tool also allows you to write custom rule sets that you can use during your scans. This way, you can highlight or flag things whenever a policy violation exists. For instance, you can create a rule in Scout Suite to identify any users who do not have multi-factor authentication enabled. To do this, you’ll create a rule with a description (Users without MFA), a dashboard name (Users), a path (iam.users.id), and two conditions: the user has a LoginProfile key, and the user’s MFADevices list is empty. If both of those conditions are true, that tells me the user can log in with a password but does not have MFA enabled, so the rule keys on iam.users.id and tags the finding with the id_suffix mfa_enabled.
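Putting those pieces together, a rule file for this check might look something like the sketch below. This is reconstructed from the fields just described and from Scout Suite’s published rule format, so treat the exact syntax as an approximation rather than a drop-in file.

    {
        "description": "Users without MFA",
        "dashboard_name": "Users",
        "path": "iam.users.id",
        "conditions": [ "and",
            [ "iam.users.id.", "withKey", "LoginProfile" ],
            [ "iam.users.id.MFADevices", "empty", "" ]
        ],
        "keys": [ "iam.users.id" ],
        "id_suffix": "mfa_enabled"
    }

The two conditions together flag any IAM user that has a console login profile (a password) but no MFA device enrolled, and the finding shows up on the Users dashboard with the mfa_enabled suffix.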

As you can see, setting up these rules is pretty simple, and they’re basically in JSON format. As long as you understand what the keys and IDs are that you need to add, which you can find using AWS’s documentation, you’ll be able to set up some fairly complex rules pretty quickly. The second tool we have is called Prowler. Prowler is an auditing tool that only works with Amazon Web Services. Prowler is used to evaluate a cloud infrastructure against the Center for Internet Security (CIS) benchmarks for AWS, as well as to scan for compliance against GDPR and HIPAA regulations and checklists. Prowler is an open-source security tool that’s used to perform security best practice assessments, audits, incident response, continuous monitoring, hardening, and forensic readiness for AWS cloud services. This tool is able to check for compliance against over 200 different controls, and it’s really helpful in trying to ensure a target organization is meeting or exceeding its regulatory requirements. Prowler is a command-line tool, and it can create a report in HTML, CSV, or JSON format, as well as submit its findings directly to AWS Security Hub. You can also run specific checks and groups during your assessment, or even create your own checks and rule sets. Prowler is really useful if you need to check multiple AWS accounts in parallel, such as when an organization has multiple accounts or sub-brands that you need to assess in your scope of work.
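To give you a feel for how that looks in practice, here is a sketch of some typical Prowler 2.x invocations. The exact flags and group names vary between Prowler versions, so treat these as illustrative and confirm them against ./prowler -h on the version you are using.

    # Run the full assessment and save the results as HTML, CSV, and JSON reports
    ./prowler -M html,csv,json

    # Run only the checks in a specific compliance group, such as GDPR or HIPAA
    ./prowler -g gdpr
    ./prowler -g hipaa

    # Run a single check by its identifier (check12 covers MFA for console users)
    ./prowler -c check12

    # Forward the findings to AWS Security Hub instead of only writing local reports
    ./prowler -S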

The third tool we have is known as Pacu. Pacu is an exploitation framework that’s used to assess the security configuration of Amazon Web Services (AWS) accounts. This tool includes several modules, so your team can attempt exploits such as obtaining API keys or gaining control over a VM instance. Pacu focuses on the post-compromise phase, so the team can drill down into that system and escalate their privileges, launch additional attacks, or install backdoors. Personally, I like to think about Pacu as a tool like Metasploit, but one that’s focused specifically on attacking AWS-based cloud services and infrastructure.
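To make that more concrete, here is a rough sketch of what a Pacu session might look like once you have a set of compromised keys. The module names shown are real Pacu modules at the time of writing, but modules are added and renamed between releases, so treat this as illustrative rather than a recipe.

    # Start Pacu and create a new session for this engagement
    pacu

    # Inside the Pacu prompt: load the stolen or discovered access keys
    Pacu > set_keys

    # Enumerate what the compromised principal is allowed to do
    Pacu > run iam__enum_permissions

    # Look for privilege escalation paths based on those permissions
    Pacu > run iam__privesc_scan

    # Persist access by adding backdoor keys to other IAM users (post-compromise)
    Pacu > run iam__backdoor_users_keys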

Fourth, we have CloudBrute. CloudBrute is used to find a target’s infrastructure, files, and apps across the top cloud service providers, including Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, and Linode. This tool works essentially like a web crawler or a brute-force directory listing tool, much like DirBuster, but it’s focused specifically on cloud-based resources. These resources include things like object storage, virtual machines, containers, and other cloud resources and services. Fifth, we have Cloud Custodian. Cloud Custodian is an open-source cloud security, governance, and management tool that’s designed to help administrators create policies based on their different resource types. Cloud Custodian is a stateless rules engine that we can use to manage AWS environments by validating and enforcing the environment against set standards. By using Cloud Custodian, you can run a scan against a target’s cloud environment and identify which policies are being enforced and which vulnerabilities may exist. As a network defender, you can also use Cloud Custodian to automatically correct the vulnerabilities that are found by enforcing new security policies in the cloud environment. Cloud Custodian is a great tool for defining rules that enable a well-managed cloud infrastructure that is secure and optimized to save your organization money, for example by turning off resources outside of normal business hours. This also supports garbage collection, where unused or underutilized resources can be deleted to free up more of your budget for the cloud services your organization actually needs.
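As an example of what those rules look like, here is a sketch of a Cloud Custodian policy file covering both ideas from the paragraph above: an off-hours shutdown and a garbage-collection rule for unattached storage. The filter names follow common patterns from the Cloud Custodian documentation, but verify them against the version you deploy before relying on this.

    policies:
      # Stop EC2 instances outside of normal business hours to save money
      - name: ec2-stop-after-hours
        resource: aws.ec2
        filters:
          - type: offhour
            offhour: 19
        actions:
          - stop

      # Garbage collection: delete EBS volumes that are not attached to anything
      - name: ebs-delete-unattached
        resource: aws.ebs
        filters:
          - Attachments: []
          - State: available
        actions:
          - delete

You would then run the policy file against an account with something like custodian run -s output policies.yml, and Cloud Custodian would report on (and enforce) the matching resources.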

130. Conducting Cloud Attacks (OBJ 3.4)

In this lesson, we’re going to spend some time looking at a couple of reports produced when you’re conducting cloud auditing. Now, for the exam you don’t need to be able to go through these reports step by step, but I wanted to give you a good overview so you can explore more on your own. We’re going to take a look at a couple of auditing reports based upon a misconfigured cloud environment. To do this, we’re going to use a tool known as sadcloud. Now, sadcloud is an intentionally vulnerable distribution of a cloud infrastructure, and it’s for use inside of the Amazon Web Services cloud. If you’d like to spin this up yourself, all the instructions are in their GitHub repository at github.com/nccgroup/sadcloud. The makers of sadcloud are the same people who make Scout Suite, and they provided it as a way for us to learn how to use that tool, look at the different auditing reports that are generated from it, and then use other tools against it as well. Think about sadcloud like you would Metasploitable 2. Metasploitable 2 is a vulnerable virtual machine that you can attack using Kali Linux; sadcloud is a vulnerable cloud infrastructure with servers and other misconfigurations that you can examine, using the different cloud tools, to get better at using them in the real world. If you go over to github.com/nccgroup/sadcloud, you can scroll down through their README, where they explain a little bit about what this particular system is. As you can see, there are 22 AWS services with 84 different misconfigurations built into sadcloud. They did this so you can then look at it using tools like Scout Suite, Prowler, and many others.

Now, sadcloud was, as I said, built as a way for security researchers to learn more about cloud vulnerabilities inside of AWS, and it’s a great tool for us as future penetration testers. Now, if you’re going to do this, you need to know that you will incur some cost by running sadcloud. It costs about $10 per day for the cloud compute time, so if you are going to use sadcloud, you don’t want to leave it running 24 hours a day, seven days a week, because it’ll start eating up a lot of the money in your cloud computing budget. Instead, you can turn it on, run your scans or your attacks against it, and then shut it back down again. There are instructions for how to do all of that in the GitHub repository for sadcloud.
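Because sadcloud is deployed from Terraform code (more on that at the end of this lesson), the turn-it-on, turn-it-off workflow is essentially the standard Terraform one. Here is a rough sketch, assuming you have cloned the repository and configured your AWS credentials the way the README describes:

    # From inside the cloned sadcloud directory: initialize Terraform and its providers
    terraform init

    # Build the intentionally vulnerable environment in your AWS account (this is what costs money)
    terraform apply

    # ... run Scout Suite, Prowler, or Pacu against the deployed environment ...

    # Tear everything back down when you are done so you stop paying for it
    terraform destroy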

Now, in this particular lesson, we are going to focus on the actual outputs of those scans that you would get by using something like Scout Suite or Prowler, as opposed to running the scans ourselves. If you want to run those scans yourself, it’s not really difficult to do. If you go over to the GitHub for NCC Group, you’ll also find a repository called nccgroup/ScoutSuite, which is the actual tool for Scout Suite. If you scroll down in its README, there are instructions for how to run the tool, and it really is simple: you’re simply going to type in something like scout aws --profile sadcloud -f, and this will go through and run a scan against the sadcloud infrastructure that you have deployed. But what we’re going to do, instead of running it, is look at the report that comes from it. If you go over to the sadcloud GitHub, there are links to this audit report as well, so you can go through it on your own and explore it a little bit more in depth. When you run Scout Suite, you are going to get a report like this, generated from the cloud you have just scanned. You can see all the services listed on the main dashboard along with the resources, the rules, the findings, and the checks. Each of these is for a different service inside of Amazon Web Services. You’ll notice, if we scroll down, that EC2 is one of the largest areas we have, with the most vulnerabilities. Now, why is that? Well, because EC2 is Elastic Compute Cloud, and that means virtual servers.

These are virtual machines, and there are lots of different vulnerabilities associated with them. The other one that is really high, which you can see marked with the red exclamation point, is IAM, which is Identity and Access Management. So as we go through, you can click on any of these; for instance, let’s go ahead and click on EC2. By clicking on EC2, we can now see all the vulnerabilities we have. As you can see, going across the top we have four filters: Show All, Good, Warning, and Danger. If I click on Good, I’m only going to see the things that passed the vulnerability scan. In this case, the public EBS snapshot check was good. If I want to see the details, I can click on the plus, and here I’ll get a description saying that snapshots should never be public because that could risk exposing sensitive data. In this case, there was one snapshot and it was not public, so this was considered a good check. Now, these snapshots are essentially backups, and they contain all the state information from a particular virtual machine. If I want to see all the things that were warnings, I can click on Warning. You can see there are several warning areas here, including all ports being open, the default security groups being in use, non-empty rulesets for default security groups, and others. Again, if you want to get more details about what it found, just click on the item. In this case it checked 73 rules and there are 26 open ports, so we’d have to dig in further to figure that out as we look through some of the tabs in Scout Suite.

Next, if we go to Danger, we’ll see everything that is a big danger area that we really need to take care of. In this case, again, we have all ports being open, and there were two rules flagged here out of 69 that were checked. If I look at the DNS port being open to all, we can see that one rule was flagged. With EBS not being encrypted, there was one snapshot checked and one flagged, so there was no encryption on that snapshot. Likewise, the EBS volumes themselves were not encrypted: we had three volumes checked and three of them were flagged. So these are things that we want to go and fix as a network defender. Next, we can click back on Scout Suite to see our main dashboard again and pick another area. Let’s go ahead and choose IAM. IAM is our Identity and Access Management, and this is all about rules, roles, and permissions. As we go through, we can see some things that are lacking. For instance, we have a cross-account assume role policy that lacks an external ID and multi-factor authentication. This says that when you’re authorizing cross-account role assumption, you should require multi-factor authentication. In this case, it was checked against 13 roles and one of those was flagged, so we’d want to dive into that one and get it fixed.
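Fixing a finding like that means adding conditions to the role’s trust policy that require both an external ID and MFA on the sts:AssumeRole call. Here is a sketch of what such a trust policy could look like; the account number and external ID are placeholders, not values from the sadcloud environment.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": { "sts:ExternalId": "example-external-id" },
                    "Bool": { "aws:MultiFactorAuthPresent": "true" }
                }
            }
        ]
    }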

Now, in addition to looking at each individual finding on the dashboard, we can actually dive down a little bit deeper as well. To do that, just go up to Security, click on that, and then go to IAM. From the IAM dashboard, we can look at the password policy, the permissions, or the IAM configuration itself. Let’s go ahead and take a look at the roles underneath the IAM config. Underneath the roles you’ll see lots of good information. Here we have an AWS config example role, and in this case we see the ID, the Amazon Resource Name (ARN), the description, the creation date, the path, and the maximum session duration. Underneath Role Trust Policy, we can click to see the details. In this case, we see that this is an assume role policy, the effect is Allow, the service is config.amazonaws.com, and it was created with policy version 2012-10-17. We can see it applies to zero instances, zero inline policies, and zero managed policies. Then we have the next role, and we can keep going through all of these roles as we review them. Additionally, we can go back up to IAM and open the Credential report. Under the Credential report, we can learn information about the particular passwords and authentication methods that are currently in use. Based on this credential report, we can see that the user rami never used their password, there is no data available, and there is no MFA (multi-factor authentication) active. They do have access keys one and two active, and we can see that those were last used back in 2019 and 2020, which was two and three years ago. If we go to the next user, we see sadcloudInlineUser. Again, the password was never used and MFA is false, and in this case there are no active keys. As we scroll to the next entry, we have the root account.

The root account has a password that was last used in 2020, and MFA is true, so it is using multi-factor authentication, but it does not have access keys one and two active. If we go to jdow, we can see this person has MFA turned off, access key one is true, access key two is false, and the password was last used in 2019. All right, as we continue, you can find more and more information as you go through the different services. In this case, I’m looking at S3, which is our buckets that contain our objects, which is another way of saying our folders and files. Here you can see there are two dangers listed: all actions authorized to all principals, and Get actions authorized to all principals.

If I look at that, I can see that seven buckets were checked and one of them was flagged, so only one of them has an issue. Again, we are looking at the summary level when we’re on the dashboard. To dig in deeper and find out which bucket it is, we would click down onto Buckets, and this would give us more information about those particular buckets. That’s where we’d be able to find the information and the ACLs for each of these different buckets, whether they have list, upload, delete, view, or edit permissions, as well as the groups with access, the roles with access, and the users with access for each bucket as you go through all of them.
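For context, a bucket that triggers the “all actions authorized to all principals” finding typically carries a policy roughly equivalent to the sketch below, where the wildcard principal and wildcard action open the bucket to everyone; the bucket name here is a placeholder rather than one of the actual sadcloud buckets.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::example-sadcloud-bucket",
                    "arn:aws:s3:::example-sadcloud-bucket/*"
                ]
            }
        ]
    }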

As you can see, Scout Suite does a really good job of collecting all the information and presenting it graphically so you can start digging through it. All right, let’s go over and take a look at Prowler. Here’s a report that we generated by running Prowler against sadcloud. In this case, Prowler is being used to look at all of the different vulnerabilities that may exist in that particular AWS cloud. As you can see, this HTML file is not nearly as pretty or well laid out as the Scout Suite report, but it is something we can get a lot of information from, all listed in one place. Anything in brown is considered informational, anything in green means the check passed and matched the recommended security value, and anything in red is something that has failed and needs to have a fix applied.

So let’s go down here to 1.1, avoid the use of the root account; this was informational. The root account was last accessed using the password, key one, and key two back in 2019. What this says is that they shouldn’t be logging in as root, and since they did, Prowler flagged it as something we want to be aware of. Number 1.2, which is check 12, tries to ensure that multi-factor authentication is enabled for all IAM users that have a console password. In this case, you can see that all of these particular users have passwords enabled but MFA has been disabled.

So if we were writing a report on this network, we would want to recommend that they start enabling MFA for all of their users, because that is a higher level of security that’ll help prevent data breaches in the future. If we go to number 1.3, it ensures that any credentials unused for 90 days or more are disabled. Here you can see a couple of users have used their credentials in the last 90 days, including jdow, student 10, student 16, student 17, student 18, et cetera. All the ones listed in red haven’t logged in for 90 days or more, so those credentials should be disabled instead of still being active.
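If you were remediating that finding, the cleanup could be done with the AWS CLI. Here is a sketch for a single user; the user name echoes one of the accounts listed above, and the access key ID is a made-up placeholder.

    # Remove the console password so the user can no longer log in interactively
    aws iam delete-login-profile --user-name student16

    # Mark an unused access key as inactive instead of leaving it enabled
    aws iam update-access-key --user-name student16 \
        --access-key-id AKIAEXAMPLEKEYID1234 --status Inactive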

If you go to 1.4, you can see this one passed. It ensures that access keys are rotated every 90 days or less, and in this case nobody has an access key that is older than 90 days, so this was considered a pass. As you continue to go through, there’s lots and lots of other information. I’m not going to sit here and read it all to you, but you can visit this report by going to sadcloud on GitHub and looking at this particular report. You’ll notice there are IAM policies, there is CloudTrail information, there is configuration information, and much more. In this particular case, there are a lot of things wrong with this particular cloud, and that makes sense, because sadcloud was designed to be intentionally vulnerable.

The last thing I want to point out in terms of sadcloud is that when you’re installing it, you can configure how vulnerable you want it to be. If you go down to the configure sadcloud section of the README, it will tell you how you can comment or uncomment different modules in the configuration file to determine what will be found when you run your scans. For example, you can uncomment all of the modules inside of sadcloud/main.tf, and you can then edit all of the findings flags inside the Terraform code.

Terraform is essentially the orchestration code that allows us to create the entire environment inside of AWS with the different findings that we want. Now, if you want only some of the findings to be discoverable when you’re running your scans, you can uncomment only the relevant sections of sadcloud. This allows you to do something like look at just the storage vulnerabilities. You can then go into the AWS environment you just created, try to fix those different vulnerabilities, run the scan again, and see if the things you did actually fixed them. This is a way you can gain additional experience and skill with these different cloud services by figuring out what is vulnerable, how to properly configure it, and then running that scan again.
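As a purely hypothetical illustration of that workflow (the real module names and finding flags live in sadcloud/main.tf and will differ, so check the repository itself), enabling only the storage misconfigurations might look something like this:

    # Uncommented: deploy the S3 misconfigurations so storage findings show up in scans
    module "s3" {
      source                = "./modules/s3"
      no_logging            = true   # hypothetical finding flag
      world_readable_bucket = true   # hypothetical finding flag
    }

    # Still commented out: the IAM misconfigurations will not be deployed
    # module "iam" {
    #   source = "./modules/iam"
    # }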

Again, all of that goes way beyond the scope of this particular exam, but I wanted to introduce you to the way you can read these different reports from Scout Suite and Prowler as you start using tools like sadcloud to gain some experience working in the cloud as a vulnerability analyst or a penetration tester.
