How to set up AWS to receive data from your device using a Python script

Uploading files to Amazon S3 is one of the most common tasks when working with cloud-based applications. In this guide, I’ll walk you through the exact steps needed to set up AWS so your Python code can securely upload data into an S3 bucket.

We’ll cover:

  • Creating an S3 bucket

  • Creating and configuring a KMS key

  • Creating an IAM user

  • Assigning the right permissions

  • Configuring credentials

  • Writing the Python upload script

By the end, your environment will be fully ready for secure, authenticated uploads to S3.


🪣 1. Create an S3 Bucket

Your S3 bucket will be the storage location for the files uploaded by Python.

Steps

  1. In the AWS Console, search for S3.

  2. Click Create bucket.

  3. Enter a globally unique Bucket name.

  4. Choose the AWS Region where your bucket should live.

  5. Ensure Block Public Access remains enabled.

  6. Under Default encryption, choose:

    • AWS KMS (we’ll create the KMS key next)

  7. Click Create bucket.

That’s it—you now have a secure S3 bucket ready for uploads.
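
If you'd rather script this step, here's a minimal boto3 sketch of the same setup (the bucket name and region are placeholders, and it assumes credentials with S3 admin rights are already configured locally):

import boto3

s3 = boto3.client('s3', region_name='eu-west-2')

# Create the bucket (for us-east-1, omit CreateBucketConfiguration)
s3.create_bucket(
    Bucket='your-unique-bucket-name',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-2'}
)

# Keep Block Public Access fully enabled
s3.put_public_access_block(
    Bucket='your-unique-bucket-name',
    PublicAccessBlockConfiguration={
        'BlockPublicAcls': True,
        'IgnorePublicAcls': True,
        'BlockPublicPolicy': True,
        'RestrictPublicBuckets': True
    }
)

Default encryption can be pointed at your KMS key once it exists (see the next step).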


🔐 2. Create a KMS Key for Encryption

Using a customer-managed KMS key gives you full control over encryption and access auditing.

Steps

  1. Open the AWS Console and search for KMS.

  2. Select Customer managed keys → Create key.

  3. Choose Symmetric key.

  4. Give the key a memorable name (e.g., s3-upload-key).

  5. Choose yourself as the Key Administrator.

  6. Under Key Users, add the IAM user that will upload files (we create it in the next step, so you can also finish the wizard now and add the user to the key policy afterwards).

  7. Finish the wizard.

Be sure to copy the KMS Key ARN—your Python code will need it.
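
The key can also be created programmatically. A short sketch (the alias and region are examples; key administrators and users can be added to the key policy afterwards):

import boto3

kms = boto3.client('kms', region_name='eu-west-2')

# Create a symmetric encryption key
response = kms.create_key(Description='Key for encrypting S3 uploads')
key_id = response['KeyMetadata']['KeyId']

# Give it a memorable alias
kms.create_alias(AliasName='alias/s3-upload-key', TargetKeyId=key_id)

# This is the ARN your Python code will need
print(response['KeyMetadata']['Arn'])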


👤 3. Create an IAM User for Programmatic Access

This user will supply the credentials for your Python code.

Steps

  1. In AWS Console, open IAM.

  2. Select Users → Create user.

  3. Name it something like python-s3-uploader.

  4. Skip console access; this user only needs programmatic access via access keys, which we'll create in step 6.

  5. Create the user without permissions for now.

You now have the identity—next, we attach the permissions.
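
For reference, the console steps above boil down to a single API call. A sketch, assuming you run it with admin credentials:

import boto3

iam = boto3.client('iam')

# Create the user; permissions come later
iam.create_user(UserName='python-s3-uploader')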


🛡️ 4. Create an IAM Policy for S3 Uploads

This policy grants the user permission to upload into the bucket and to use your KMS key.

Steps

  1. In IAM, go to Policies → Create policy.

  2. Switch to the JSON tab.

  3. Paste and customize this policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:GenerateDataKey"
      ],
      "Resource": "arn:aws:kms:REGION:ACCOUNT_ID:key/YOUR_KEY_ID"
    }
  ]
}

  4. Click Next, then save the policy with a name like s3-upload-policy.
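
If you prefer to script this step too, the same policy can be created with boto3. A sketch, with the JSON above expressed as a Python dict (placeholders included):

import json
import boto3

iam = boto3.client('iam')

# The policy document from above
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:AbortMultipartUpload"],
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
        },
        {
            "Effect": "Allow",
            "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey"],
            "Resource": "arn:aws:kms:REGION:ACCOUNT_ID:key/YOUR_KEY_ID"
        }
    ]
}

response = iam.create_policy(
    PolicyName='s3-upload-policy',
    PolicyDocument=json.dumps(policy_document)
)

# Note the policy ARN for the next step
print(response['Policy']['Arn'])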


🔗 5. Attach the Policy to the IAM User

Now we give the user permission to use S3 and KMS.

Steps

  1. Go to IAM → Users → python-s3-uploader.

  2. Choose Permissions → Add permissions.

  3. Select Attach policies directly.

  4. Pick your s3-upload-policy.

  5. Save.

Your IAM user can now upload encrypted objects to your bucket.
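
The equivalent API call, as a sketch (ACCOUNT_ID is a placeholder for your 12-digit account number):

import boto3

iam = boto3.client('iam')

# Attach the customer-managed policy to the user
iam.attach_user_policy(
    UserName='python-s3-uploader',
    PolicyArn='arn:aws:iam::ACCOUNT_ID:policy/s3-upload-policy'
)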


🗝️ 6. Create Access Keys for the IAM User

These keys will authenticate your Python code.

Steps

  1. In IAM, open the Security credentials tab of your user.

  2. Click Create access key.

  3. Choose Local code.

  4. Download or copy your:

    • AWS Access Key ID

    • AWS Secret Access Key

Store them securely; you won't be able to view the secret access key again.
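
This step can also be scripted, but only with credentials you already hold (an admin profile, for instance). A sketch:

import boto3

iam = boto3.client('iam')

# The secret is returned exactly once, so store it securely
response = iam.create_access_key(UserName='python-s3-uploader')
print(response['AccessKey']['AccessKeyId'])
print(response['AccessKey']['SecretAccessKey'])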


🗝️ 7. Store Your AWS Credentials in a .env File (Visual Studio Code)

Instead of using aws configure, a great practice—especially when writing Python locally—is to store your credentials securely in a .env file. This prevents your credentials from being accidentally committed to GitHub.

Steps

  1. In Visual Studio Code, create a new file named:

.env

  2. Add your AWS credentials, bucket name, and the KMS key ARN you copied in step 2:

AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
BUCKET_NAME=your-bucket-name
KMS_KEY_ARN=your-kms-key-arn

  3. Open your project’s .gitignore file and add:

.env

This ensures your credentials never get pushed to GitHub or any shared repository.

Your environment is now fully configured and ready for Python to load these credentials securely.


🐍 8. Upload a File to S3 Using Python and Environment Variables

Now that your .env file is set up, you can write Python code that reads your AWS credentials securely and uploads a file to S3.

Below is a simple example using boto3 and python-dotenv.

Python Upload Script

import os

import boto3
from dotenv import load_dotenv

# Load variables from the .env file
load_dotenv()

aws_access_key = os.getenv('AWS_ACCESS_KEY_ID')
aws_secret_key = os.getenv('AWS_SECRET_ACCESS_KEY')
bucket = os.getenv('BUCKET_NAME')
kms_key_arn = os.getenv('KMS_KEY_ARN')

# Create an S3 client using your environment variables
s3_client = boto3.client(
    's3',
    aws_access_key_id=aws_access_key,
    aws_secret_access_key=aws_secret_key
)

# Upload a local file ('data.csv' is a placeholder), encrypted with your KMS key
s3_client.upload_file(
    'data.csv',           # local file to upload
    bucket,               # destination bucket
    'uploads/data.csv',   # object key in S3
    ExtraArgs={
        'ServerSideEncryption': 'aws:kms',
        'SSEKMSKeyId': kms_key_arn
    }
)

print(f"Uploaded data.csv to s3://{bucket}/uploads/data.csv")
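
To try it out, install the two dependencies first (pip install boto3 python-dotenv) and run the script. The ExtraArgs dictionary tells S3 to encrypt the object with your customer-managed KMS key, which is exactly why the policy in step 4 needed kms:GenerateDataKey.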

Author:
Alfie King