Connect to Amazon S3
Amazon S3 (Simple Storage Service) provides scalable object storage for files such as user uploads, backups, and static assets. This guide covers connecting your Appliku application to S3 or any S3-compatible storage service.
Use Cases
- Store user-uploaded media files outside your server
- Serve static assets from a CDN-backed bucket
- Store database backups offsite
- Share files between multiple application instances
Step 1: Install Dependencies
Add the following to your requirements.txt:

```
django-storages
boto3
```
Step 2: Create an S3 Bucket
In the AWS Management Console:
- Go to S3 and click Create bucket
- Choose a unique bucket name and region
- Configure public access settings based on your needs
- Create the bucket
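If you prefer to script this step, the bucket can also be created with boto3. A minimal sketch, assuming your AWS credentials are already configured locally; note the S3 quirk that `us-east-1` must not be passed as an explicit `LocationConstraint`:

```python
def bucket_config(region):
    """Extra create_bucket kwargs; us-east-1 rejects an explicit LocationConstraint."""
    if region == "us-east-1":
        return {}
    return {"CreateBucketConfiguration": {"LocationConstraint": region}}

def create_bucket(name, region="us-east-1"):
    # Deferred import so the helper above is usable even without boto3 installed
    import boto3
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(Bucket=name, **bucket_config(region))

# Example (requires valid AWS credentials):
# create_bucket("your-bucket-name", "eu-west-1")
```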
Step 3: Create IAM Credentials
Create an IAM user with programmatic access and attach a policy that grants access to your bucket:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```
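Since the policy differs from bucket to bucket only in the ARNs, you can generate it rather than hand-edit it. A small sketch (the `bucket_policy` function is ours, not part of any AWS SDK):

```python
import json

ACTIONS = ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"]

def bucket_policy(bucket_name):
    """Return the minimal IAM policy document for one bucket as a JSON string."""
    arn = f"arn:aws:s3:::{bucket_name}"
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ACTIONS,
                # ListBucket applies to the bucket itself; the object actions need /*
                "Resource": [arn, f"{arn}/*"],
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(bucket_policy("your-bucket-name"))
```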
Step 4: Set Environment Variables
In your application's Environment Variables tab, add the following:
| Variable | Value |
|---|---|
| AWS_ACCESS_KEY_ID | Your IAM access key |
| AWS_SECRET_ACCESS_KEY | Your IAM secret key |
| AWS_STORAGE_BUCKET_NAME | Your bucket name |
| AWS_S3_REGION_NAME | Bucket region (e.g., us-east-1) |
Never hard-code credentials in your source code. Always use environment variables.
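A startup check can catch a missing variable before the first upload fails at runtime. A minimal sketch; the function name is ours, and it takes the environment as a mapping so it is easy to test:

```python
import os

REQUIRED_VARS = (
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_STORAGE_BUCKET_NAME",
    "AWS_S3_REGION_NAME",
)

def missing_s3_vars(env=os.environ):
    """Return the names of required S3 variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# For example, call this from settings.py and fail fast:
# missing = missing_s3_vars()
# if missing:
#     raise RuntimeError(f"Missing S3 configuration: {', '.join(missing)}")
```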
Step 5: Configure Django Settings
```python
# settings.py
import os

# Django < 4.2; on Django 4.2+ set STORAGES["default"] instead
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"

AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_STORAGE_BUCKET_NAME")
AWS_S3_REGION_NAME = os.environ.get("AWS_S3_REGION_NAME", "us-east-1")

# Optional settings
AWS_S3_FILE_OVERWRITE = False  # store new files under a unique name instead of overwriting
AWS_DEFAULT_ACL = None         # inherit the bucket's ACL rather than forcing one per object
AWS_QUERYSTRING_AUTH = True    # serve files via signed, expiring URLs
```
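With these settings, `default_storage.url()` returns URLs in S3's virtual-hosted-style format, plus signing query parameters when AWS_QUERYSTRING_AUTH is True. A sketch of that URL layout, assuming no custom domain (the helper name is ours):

```python
def s3_object_url(bucket, region, key):
    """Unsigned virtual-hosted-style URL for an object; signing parameters are appended separately."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# e.g. s3_object_url("your-bucket-name", "us-east-1", "uploads/avatar.png")
```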
Using S3-Compatible Services
django-storages works with any service that implements the S3 API. Add the AWS_S3_ENDPOINT_URL environment variable to point to your provider:
| Provider | Endpoint URL Example |
|---|---|
| DigitalOcean Spaces | https://nyc3.digitaloceanspaces.com |
| MinIO | https://minio.example.com |
| Backblaze B2 | https://s3.us-west-002.backblazeb2.com |
```python
# settings.py
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL")  # only needed for non-AWS providers
```
DigitalOcean Spaces is a popular, cost-effective alternative to AWS S3. The configuration is otherwise identical: set the endpoint URL and use your Spaces credentials.
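As a concrete example, a Spaces setup might look like the fragment below. The bucket and region names are placeholders; Spaces uses its own region codes such as nyc3:

```python
# settings.py, DigitalOcean Spaces example (values are placeholders)
AWS_S3_ENDPOINT_URL = "https://nyc3.digitaloceanspaces.com"
AWS_S3_REGION_NAME = "nyc3"
AWS_STORAGE_BUCKET_NAME = "my-space-name"
# The access key and secret come from your Spaces API keys,
# supplied via the same environment variables as for AWS.
```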
Step 6: Deploy
Save your environment variables and deploy. Your application will now store uploaded files on S3 instead of the local filesystem.