How to Upload Files to Amazon S3 without Access and Secret Key using IAM Roles and STS

Amazon S3 is a popular choice for data storage due to its scalability, data availability, and security features. However, managing long-term access keys is both an operational burden and a security risk: keys can be leaked, hard-coded, or committed to source control. But did you know you can upload files to Amazon S3 without handling a long-term Access Key or Secret Key yourself? In this article, I will walk you through the process using IAM roles and STS.

What is IAM and STS?

Before we delve into the process, let’s understand the two key AWS services we’ll be using: IAM and STS.

IAM, or Identity and Access Management, is the AWS service that controls access to AWS resources. With IAM you create users, groups, and roles, and attach permission policies that allow or deny access to specific resources and actions.

STS, or Security Token Service, is a web service that issues temporary, limited-privilege credentials for IAM users or for users that you authenticate outside of AWS (federated users).
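
To see STS in action, you can ask it which identity your current credentials belong to. This is a minimal sketch assuming boto3 is installed and some credentials are already available to it (for example, from an attached role):

import boto3

# Ask STS which principal the current credentials belong to
sts = boto3.client('sts')
identity = sts.get_caller_identity()

print(identity['Account'])  # AWS account ID
print(identity['Arn'])      # ARN of the calling user or role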

Creating an IAM Role

First, we need to create an IAM role. This role will be assumed by our application, granting it permissions to carry out necessary actions on S3.

aws iam create-role --role-name S3UploaderRole --assume-role-policy-document file://trust-policy.json

The trust-policy.json file defines who is allowed to assume the role. For an application running on an EC2 instance, it should look something like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"Service": "ec2.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }
  ]
}
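
Because the trust policy above only allows the EC2 service to assume this role, an EC2 instance must have the role attached through an instance profile before it can use it. Here is a minimal boto3 sketch of that wiring; the profile name S3UploaderProfile and the instance ID are placeholders, and the same steps can be done with the equivalent AWS CLI commands.

import boto3

iam = boto3.client('iam')
ec2 = boto3.client('ec2')

# An instance profile is the container that lets EC2 carry an IAM role
iam.create_instance_profile(InstanceProfileName='S3UploaderProfile')
iam.add_role_to_instance_profile(
    InstanceProfileName='S3UploaderProfile',
    RoleName='S3UploaderRole'
)

# Attach the profile to a running instance (replace the placeholder instance ID)
ec2.associate_iam_instance_profile(
    IamInstanceProfile={'Name': 'S3UploaderProfile'},
    InstanceId='i-0123456789abcdef0'
)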

Next, we need to attach a policy to this role that allows uploading files to S3:

aws iam put-role-policy --role-name S3UploaderRole --policy-name S3UploaderPolicy --policy-document file://s3-policy.json

The s3-policy.json should look something like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::bucket-name/*"
    }
  ]
}
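
As an optional sanity check, you can read the inline policy back with the IAM API and confirm it grants the actions you expect:

import boto3

iam = boto3.client('iam')

# Fetch the inline policy we just attached to the role
response = iam.get_role_policy(
    RoleName='S3UploaderRole',
    PolicyName='S3UploaderPolicy'
)
print(response['PolicyDocument'])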

Requesting Temporary Credentials with STS

Next, let’s use the STS service to request temporary credentials:

import boto3

# Create an STS client using whatever credentials boto3 can find
# (for example, the EC2 instance profile attached earlier)
sts_client = boto3.client('sts')

# Assume the role we created and receive temporary credentials
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::account-id:role/S3UploaderRole",
    RoleSessionName="S3UploaderSession"
)

credentials = assumed_role_object['Credentials']

# Build an S3 resource that authenticates with the temporary credentials
s3_resource = boto3.resource(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)
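
The temporary credentials also carry an Expiration timestamp, after which you must call assume_role again. And if your code runs on an EC2 instance with S3UploaderRole attached as an instance profile, boto3 can resolve and refresh temporary credentials on its own, so a sketch like this needs no explicit keys at all:

import boto3

# On an EC2 instance with the S3UploaderRole instance profile attached,
# boto3 fetches and refreshes the temporary credentials automatically.
s3_resource = boto3.resource('s3')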

Uploading a File to S3

Finally, we can use these temporary credentials to upload a file to S3:

s3_resource.Bucket('bucket-name').upload_file(Filename='file-path', Key='file-name')

The upload_file method uploads the file at the given local path (Filename) to the bucket under the specified Key, which becomes the object's name in S3.
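
upload_file also accepts an ExtraArgs dictionary for setting object metadata. A small sketch, assuming a hypothetical local file named report.csv:

# Upload a CSV and set its content type explicitly
s3_resource.Bucket('bucket-name').upload_file(
    Filename='report.csv',
    Key='reports/report.csv',
    ExtraArgs={'ContentType': 'text/csv'}
)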

Conclusion

In this post, we learned how to upload files to Amazon S3 without using an Access Key or Secret Key. We used IAM to create a role with the necessary permissions, and then used STS to assume this role and get temporary credentials. This approach can be more secure than managing long-term keys, especially for applications running on EC2 instances. Remember, always follow the principle of least privilege when assigning permissions to your roles.

Remember to replace all placeholders, such as account-id, bucket-name, file-path, and file-name with your actual values. Happy uploading!

Keywords: Amazon S3, upload, IAM, STS, no access key, no secret key, data storage, security, AWS


About Saturn Cloud

Saturn Cloud is your all-in-one solution for data science & ML development, deployment, and data pipelines in the cloud. Spin up a notebook with 4TB of RAM, add a GPU, connect to a distributed cluster of workers, and more. Join today and get 150 hours of free compute per month.