AWS CLI is not available
Since you mentioned the AWS CLI is not available, we will focus on configuring AWS directly in your Django project using environment variables. This keeps your AWS credentials out of your source code and lets you change them without modifying the code itself.
Step 1: Set Up Environment Variables
First, set up environment variables to store your AWS credentials. You can do this at the operating-system level or in a `.env` file if you're using a library like `django-environ`.
Using Operating System Environment Variables
You can set environment variables in your shell configuration file (`.bashrc`, `.zshrc`, etc.) or directly in your terminal session:
```bash
export AWS_ACCESS_KEY_ID='your-access-key-id'
export AWS_SECRET_ACCESS_KEY='your-secret-access-key'
export AWS_STORAGE_BUCKET_NAME='your-bucket-name'
export AWS_S3_REGION_NAME='your-region-name'
```
Using a `.env` File
If you prefer using a `.env` file, first install `django-environ`:
```bash
pip install django-environ
```
Then, create a `.env` file in the root of your Django project:
```
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_STORAGE_BUCKET_NAME=your-bucket-name
AWS_S3_REGION_NAME=your-region-name
```
Step 2: Configure Django to Read Environment Variables
If you’re using `django-environ`, update your `settings.py` to read from the `.env` file. The storage backend referenced below comes from the `django-storages` package, and the upload helper in Step 3 uses `boto3`, so install both if you haven't already (`pip install django-storages boto3`):
```python
import environ

# Initialise environment variables
env = environ.Env()
environ.Env.read_env()

# AWS S3 settings
AWS_ACCESS_KEY_ID = env('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = env('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = env('AWS_STORAGE_BUCKET_NAME')
AWS_S3_REGION_NAME = env('AWS_S3_REGION_NAME')
AWS_S3_SIGNATURE_VERSION = 's3v4'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
```
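If you set the variables at the operating-system level instead (the first option in Step 1) and want to avoid the extra dependency, a minimal sketch using the standard library's `os.environ` looks like this:

```python
import os

# AWS S3 settings read straight from the process environment
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_S3_REGION_NAME = os.environ.get('AWS_S3_REGION_NAME', 'us-east-1')  # fallback region is an example; adjust as needed

AWS_S3_SIGNATURE_VERSION = 's3v4'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
```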
Step 3: Create a Function to Upload Files
Create a function in a utility file (for example, `utils.py` inside your app, since the view below imports from `.utils`) to handle file uploads. This function uses `boto3` to interact with S3:
```python
import boto3
from botocore.exceptions import BotoCoreError, ClientError
from django.conf import settings


def upload_file_to_s3(file, bucket_name=None, object_name=None):
    """Upload a file-like object to S3. Returns True on success, False on failure."""
    # Default to the bucket configured in settings
    if bucket_name is None:
        bucket_name = settings.AWS_STORAGE_BUCKET_NAME

    # If an S3 object_name was not specified, use the file name
    if object_name is None:
        object_name = file.name

    s3_client = boto3.client(
        's3',
        aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
        aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
        region_name=settings.AWS_S3_REGION_NAME,
    )
    try:
        # Upload the file
        s3_client.upload_fileobj(file, bucket_name, object_name)
    except (BotoCoreError, ClientError) as e:
        print(f"Error uploading file to S3: {e}")
        return False
    return True
```
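As a quick sanity check, you can call the helper from the Django shell (`python manage.py shell`) with any local file. The module path `myapp.utils` and the file name `example.txt` below are placeholders:

```python
# Run inside `python manage.py shell`
from myapp.utils import upload_file_to_s3  # "myapp" is a placeholder for your app's name

with open('example.txt', 'rb') as f:  # any local file opened in binary mode works
    # Stored under the given key; without object_name it would fall back to f.name
    print(upload_file_to_s3(f, object_name='test-uploads/example.txt'))
```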
Step 4: Using the Upload Function in a View
In your views, use the upload function to handle file uploads:
```python
from django.http import HttpResponse

from .utils import upload_file_to_s3  # adjust the import according to your project structure


def upload_view(request):
    if request.method == 'POST' and request.FILES.get('file'):
        file = request.FILES['file']
        success = upload_file_to_s3(file)
        if success:
            return HttpResponse("File uploaded successfully")
        return HttpResponse("File upload failed", status=500)
    return HttpResponse("Upload a file")
```
Step 5: Create a Simple HTML Form for Upload
Create a simple HTML form to upload files and save it as a template (for example, `upload.html`); a sketch for serving it from the view follows the form:
```html
<form method="post" enctype="multipart/form-data">
    {% csrf_token %}
    <input type="file" name="file">
    <button type="submit">Upload</button>
</form>
```
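To actually serve this form, the GET branch of `upload_view` from Step 4 can render the template instead of returning plain text. A minimal sketch, assuming the form above is saved as `upload.html` in a directory Django's template loader can find:

```python
from django.http import HttpResponse
from django.shortcuts import render

from .utils import upload_file_to_s3  # adjust the import to your project structure


def upload_view(request):
    if request.method == 'POST' and request.FILES.get('file'):
        if upload_file_to_s3(request.FILES['file']):
            return HttpResponse("File uploaded successfully")
        return HttpResponse("File upload failed", status=500)
    # GET request: render the upload form from Step 5 (assumed to be upload.html)
    return render(request, 'upload.html')
```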
Step 6: Add URL Configuration
Add a URL pattern to access the upload view in your app's `urls.py`:
```python
from django.urls import path

from .views import upload_view

urlpatterns = [
    path('upload/', upload_view, name='upload'),
]
```
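If this pattern lives in an app-level `urls.py`, make sure it is included from the project's root `urls.py`. The `myapp` name below is a placeholder for your app:

```python
# Project-level urls.py
from django.urls import include, path

urlpatterns = [
    path('', include('myapp.urls')),  # "myapp" is a placeholder for your app's name
]
```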
Summary
By following these steps, you will have a Django application configured to upload files to Amazon S3 using environment variables for configuration, eliminating the need for the AWS CLI. This approach ensures that your credentials are managed securely and can be easily updated.