Implementing Serverless Video on Demand Streaming with AWS: A Five-Part Series (Part 2)

Nic Lasdoce
14 Nov 2023 · 5 minutes read

This series guides you through building a serverless Video on Demand (VoD) architecture on AWS, covering the process from video upload to HLS conversion and ensuring secure, scalable streaming with immediate playback for a seamless user experience. This is Part 2, where we detail how secure video uploading works in a serverless VoD system on AWS.

Introduction

Continuing our series on creating a serverless Video on Demand (VoD) system, we now focus on the critical aspect of video uploading. Specifically, we'll explore how to securely upload videos to an Amazon S3 bucket using a signed URL. This process involves the use of AWS API Gateway and Lambda to generate the URL, ensuring both security and efficiency.

Uploading directly to an API server, then having it transfer the file to S3, is a resource-intensive process that can strain your infrastructure. This was a lesson learned the hard way, resulting in significant challenges. To avoid these pitfalls, we will implement a direct upload method to S3, bypassing the API Gateway's 10MB payload limit and enhancing overall architecture efficiency.

The series is structured as follows:

  • Part 1 - Architecture Overview: We lay the groundwork, detailing the architecture and AWS services involved in our serverless VoD solution.

  • Part 2 - Secured File Upload: We'll cover the video upload process and how to securely upload to an S3 bucket using a signed URL.

  • Part 3 - Converting to Stream Format: This part will explain how to handle S3 upload events to trigger a conversion job that transforms videos into a streaming format.

  • Part 4 - Saving, Retrieving, and Playing: We will go through saving and retrieving video links, then play them using an HLS player in Chrome.

  • Part 5 - Securing the Streaming: The final part will focus on securing the application to ensure that only authenticated users can access the videos.

Section 1: Required Resources

To implement this solution, we need to set up the following AWS resources:

  • An S3 bucket for video storage.
  • A Lambda function for signed URL generation.
  • An API Gateway endpoint to request the signed URL.
  • Necessary permissions for the Lambda function to access the S3 bucket.

Note: We will only upload MP4 files, to keep this tutorial simple.

Here is an overview of the process:

Section 2: Configuring AWS Lambda and S3

  1. Create an S3 Bucket: Essential for storing the uploaded videos. Enter a bucket name and keep the default settings.
  2. Lambda Function Setup: Navigate to AWS Lambda, create a new function named "s3-presigner-function", and select Python 3.11 as the runtime.
  3. Deploy the Script: Copy the script below into the Lambda function, replace bucket_name with the S3 bucket we just created, then click the Deploy button.
import json
import boto3
import os
from botocore import client
import uuid


def lambda_handler(event, context):
    # Create an S3 client that signs requests with Signature Version 4
    s3 = boto3.client('s3', config=client.Config(signature_version='s3v4'))
    bucket_name = 's3-bucket-name-created-above'  # replace with the bucket created in step 1
    # Generate a unique object key so uploads never overwrite each other
    object_name = f'{str(uuid.uuid4())}.mp4'
    expiration = 3600  # URL validity in seconds, or your preferred timeout
    # Create a presigned URL that allows a single PUT of this object
    presigned_url = s3.generate_presigned_url(
        'put_object',
        Params={'Bucket': bucket_name, 'Key': object_name},
        ExpiresIn=expiration)
    print(presigned_url)
    return {
        'statusCode': 200,
        'upload_url': presigned_url
    }

4. Make sure to replace bucket_name inside the script with the name of the S3 bucket we created above.
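
Since we only accept MP4 files in this tutorial, an optional variation (not part of the steps above, just a hedged sketch) is to pin the content type into the presigned URL. S3 then rejects any PUT that does not send a matching Content-Type: video/mp4 header:

    # Optional variation: also sign the expected content type.
    # The uploader must then send the header "Content-Type: video/mp4",
    # otherwise S3 rejects the PUT with a signature mismatch.
    presigned_url = s3.generate_presigned_url(
        'put_object',
        Params={
            'Bucket': bucket_name,
            'Key': object_name,
            'ContentType': 'video/mp4'
        },
        ExpiresIn=expiration)

If you use this variant, remember to add the matching Content-Type header in Postman (or in any client code) when uploading.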

Section 3: Creating an API Endpoint

To facilitate the retrieval of the signed URL:

  1. API Gateway Selection: Go to API Gateway and click Build under REST API.
  2. API Gateway Setup: Add a name for your API and click Create API.
  3. Create Endpoint: Click Create resource and create a resource named upload. This will become the /upload endpoint.
  4. Add HTTP GET Method: Click the /upload resource, then click "Create Method".
  5. Integrate with Lambda: Under method type, select "GET" for simplicity, since we will not send any payload for now. Then select Lambda function and choose the "s3-presigner-function" we created earlier. Finally, click "Create Method".
  6. Deploy Dev Stage: Back on the resources page, click "Deploy API". Then select "New Stage" and name it "dev".
  7. Invoke the Upload URL: After deployment, we are redirected to Stages -> dev (the stage we just created). Find the "Invoke URL" and copy it (mine is https://es8m6m6ezd.execute-api.us-east-1.amazonaws.com/dev).
  8. Get the Upload URL: In Postman, paste the invoke URL and add the /upload path, so it looks like (invoke-url)/upload. Send the request and we should get a response containing the "upload_url".
  9. Upload File Settings: Create a new Postman request with PUT as the method. Copy the upload_url from above and paste it as the request URL. Then, under "Body", select "binary" so we can choose a file to upload. See the image in step 10 for more info.
  10. Upload the File: Select any .mp4 file and click Send (a Python alternative to this Postman flow is sketched after this list).
  11. Response: We will get an "Access Denied" error from step 10 because we have not yet set up permissions allowing the Lambda function's role to access the S3 bucket. Let's do that in the next section.
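
If you prefer to script steps 8-10 instead of using Postman, here is a minimal Python sketch using the requests library. It assumes your own invoke URL from step 7, a local sample.mp4 file, and that the endpoint returns the JSON with "upload_url" shown in step 8; until the permissions in Section 4 are added, the PUT will fail with Access Denied just like step 11.

    import requests

    # Step 8 equivalent: ask the API for a presigned upload URL
    api_base = 'https://your-api-id.execute-api.us-east-1.amazonaws.com/dev'  # your Invoke URL from step 7
    resp = requests.get(f'{api_base}/upload', timeout=10)
    resp.raise_for_status()
    upload_url = resp.json()['upload_url']

    # Steps 9-10 equivalent: PUT the file body directly to S3 using the presigned URL
    with open('sample.mp4', 'rb') as f:
        put_resp = requests.put(upload_url, data=f, timeout=300)

    # Expect 200 once the IAM policy from Section 4 is in place; 403 before that
    print(put_resp.status_code)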

Section 4: Troubleshooting Access Issues

  1. Adjust IAM Roles: Go to Security Credentials -> IAM -> Roles and find the role of the Lambda function we created above. It should look like "s3-presigner-function-role-(some-char-here)". Click the role to open its settings.
  2. Add Permission: With the role open, go to Add permissions -> Create inline policy.
  3. Permission Configuration: Add an inline policy with the JSON below, which grants the role access to the S3 bucket. Make sure to replace <bucket-name> with the bucket name we created at the start of this tutorial. (For a production setup, consider narrowing "s3:*" down to only the actions you need, such as s3:PutObject.)
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        }
    ]
}

Try it Again

Now get another upload_url using Section 3 Step 8, then upload a file again using Section 3 Steps 9-10. We should now get a 200 OK response, and the file should be inside the bucket.
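
As an extra sanity check outside the S3 console, here is a small boto3 sketch (an assumption on my part, not a step from the tutorial) that lists the newest objects in the bucket; run it locally with credentials that are allowed to read the bucket.

    import boto3

    s3 = boto3.client('s3')
    bucket_name = 's3-bucket-name-created-above'  # same bucket used in the Lambda function

    # List up to 10 objects to confirm the uploaded .mp4 landed in the bucket
    response = s3.list_objects_v2(Bucket=bucket_name, MaxKeys=10)
    for obj in response.get('Contents', []):
        print(obj['Key'], obj['Size'], obj['LastModified'])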

NOTE: If Postman throws a "signature does not match" error, try using Postman on the web instead; this issue happened to me with some Postman versions on Linux.

Conclusion

By following these steps, we establish a secure, efficient method for uploading videos directly to S3, bypassing potential bottlenecks. This setup not only enhances the performance of your VoD service but also lays the groundwork for a robust, scalable architecture. Stay tuned for the next part of our series, where we will handle the S3 upload events that trigger conversion of the videos into streaming format.

Bonus

If you are a founder who needs help with your software architecture or cloud infrastructure, we offer a free assessment and will tell you whether we can do it or not! Feel free to contact us:

Email: nic@triglon.tech


Tags:
Software Development
TechStack
AWS
Python
