As we continue our project of building a serverless Video on Demand (VoD) system, this installment, "Converting to Stream Format," plays a pivotal role. Here we dig into the mechanics of handling events triggered by uploads to Amazon S3, and specifically how to kick off conversion jobs that transform the uploaded videos into a streaming format. This step is what ensures our VoD system not only stores videos but also prepares them for seamless streaming to end users.
This part of the series is essential because it bridges the gap between simply storing video and actually streaming it, a core capability of any VoD platform. By automating the conversion step, we keep content ready for users without manual work, fulfilling the promise of a robust and responsive VoD service.
The series is structured as follows:
Part 1 - Architecture Overview: We lay the groundwork, detailing the architecture and AWS services involved in our serverless VoD solution.
Part 2 - Secured File Upload: We cover the video upload process and how to upload securely to an S3 bucket using signed URLs.
Part 3 - Converting to Stream Format: This part explains how to handle S3 upload events and trigger a conversion job that transforms the videos into a streaming format.
Part 4 - Saving, Retrieving, and Playing: We go through saving and retrieving the video links, then play them using an HLS player in Chrome.
Part 5 - Securing the Streaming: The final part will focus on securing the application to ensure that only authenticated users can access the videos.
In our approach, we use AWS services to automate the whole pipeline. When a video is uploaded to an S3 bucket, it triggers an event that invokes a Lambda function. This function in turn submits a job to AWS Elemental MediaConvert, which undertakes the critical task of converting the video file into a format suitable for streaming. The converted files are then saved back into an S3 folder whose name mirrors the original file name, keeping the output structure coherent.
In summary: an upload to the S3 bucket emits an event, the event invokes a Lambda function, the Lambda function submits a MediaConvert job, and MediaConvert writes the HLS output back into a folder in the same bucket named after the source file. The complete Lambda function:
import urllib.parse

import boto3

# Account-specific MediaConvert endpoint, IAM role ARN, and queue ARN.
# Fill these in; see the setup notes below.
MEDIA_CONVERT_ENDPOINT_URL = 'Set this up through tutorial below'
ROLE_ARN = 'Set this up through tutorial below'
QUEUE_ARN = 'Set this up through tutorial below'

mediaconvert_client = boto3.client(
    'mediaconvert',
    region_name='us-east-1',
    endpoint_url=MEDIA_CONVERT_ENDPOINT_URL,
)


def generate_parameters(bucket, key):
    # Output goes to a folder named after the source file (minus its extension).
    # rsplit keeps file names that contain dots intact.
    s3_output_object_path = key.rsplit('.', 1)[0]
    s3_input_location = f's3://{bucket}/{key}'
    s3_output_location = f's3://{bucket}/{s3_output_object_path}/'

    # Input settings: follow the source's color space, rotation, and embedded timecode.
    base_input = {
        'VideoSelector': {
            'ColorSpace': 'FOLLOW',
            'Rotate': 'AUTO',
        },
        'FilterEnable': 'AUTO',
        'PsiControl': 'USE_PSI',
        'FilterStrength': 0,
        'DeblockFilter': 'DISABLED',
        'DenoiseFilter': 'DISABLED',
        'TimecodeSource': 'EMBEDDED',
        'FileInput': s3_input_location,
    }

    # HLS output: H.264 video at up to 5 Mbps with QVBR rate control.
    base_output = {
        'ContainerSettings': {
            'Container': 'M3U8',
            'M3u8Settings': {},
        },
        'VideoDescription': {
            'CodecSettings': {
                'Codec': 'H_264',
                'H264Settings': {
                    'MaxBitrate': 5000000,
                    'RateControlMode': 'QVBR',
                    'SceneChangeDetect': 'TRANSITION_DETECTION',
                },
            },
        },
        'OutputSettings': {
            'HlsSettings': {},
        },
        'NameModifier': 'stream',
    }

    # Three output groups: the HLS stream, a poster frame, and a smaller thumbnail.
    job_settings = {
        'Queue': QUEUE_ARN,
        'UserMetadata': {
            'Customer': 'Amazon',
        },
        'BillingTagsSource': 'QUEUE',
        'Role': ROLE_ARN,
        'Settings': {
            'OutputGroups': [
                {
                    'CustomName': 'hls',
                    'Name': 'Apple HLS',
                    'Outputs': [base_output],
                    'OutputGroupSettings': {
                        'Type': 'HLS_GROUP_SETTINGS',
                        'HlsGroupSettings': {
                            'SegmentLength': 10,
                            'Destination': s3_output_location,
                            'MinSegmentLength': 0,
                        },
                    },
                },
                {
                    'CustomName': 'poster-frame',
                    'Name': 'File Group',
                    'Outputs': [
                        {
                            'ContainerSettings': {
                                'Container': 'RAW',
                            },
                            'VideoDescription': {
                                'CodecSettings': {
                                    'Codec': 'FRAME_CAPTURE',
                                    'FrameCaptureSettings': {
                                        'MaxCaptures': 1,
                                        'Quality': 20,
                                    },
                                },
                            },
                            'Extension': 'jpg',
                        },
                    ],
                    'OutputGroupSettings': {
                        'Type': 'FILE_GROUP_SETTINGS',
                        'FileGroupSettings': {
                            'Destination': s3_output_location,
                        },
                    },
                },
                {
                    'CustomName': 'poster-thumbnail',
                    'Name': 'File Group',
                    'Outputs': [
                        {
                            'ContainerSettings': {
                                'Container': 'RAW',
                            },
                            'VideoDescription': {
                                'CodecSettings': {
                                    'Codec': 'FRAME_CAPTURE',
                                    'FrameCaptureSettings': {
                                        'MaxCaptures': 1,
                                        'Quality': 5,
                                    },
                                },
                            },
                            'Extension': 'jpg',
                        },
                    ],
                    'OutputGroupSettings': {
                        'Type': 'FILE_GROUP_SETTINGS',
                        'FileGroupSettings': {
                            'Destination': f'{s3_output_location}thumbnail',
                        },
                    },
                },
            ],
            'AdAvailOffset': 0,
            'Inputs': [base_input],
            'TimecodeConfig': {
                'Source': 'EMBEDDED',
            },
        },
    }
    return job_settings


def lambda_handler(event, context):
    # Entry point: the S3 event carries the bucket name and the (URL-encoded) object key.
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    print(bucket)
    print(key)
    job_settings = generate_parameters(bucket, key)
    response = mediaconvert_client.create_job(**job_settings)
    return response['ResponseMetadata']
Next, fill in the three constants at the top of the function:

MEDIA_CONVERT_ENDPOINT_URL = ''
ROLE_ARN = ''
QUEUE_ARN = ''
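The endpoint is account-specific. If you would rather fetch it programmatically than copy it from the MediaConvert console, a minimal sketch using boto3's describe_endpoints call:

import boto3

# MediaConvert uses an account-specific endpoint; describe_endpoints returns it.
client = boto3.client('mediaconvert', region_name='us-east-1')
response = client.describe_endpoints()
print(response['Endpoints'][0]['Url'])  # use this value as MEDIA_CONVERT_ENDPOINT_URL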
Note that these values are better kept in the function's environment variables or in AWS Secrets Manager, but for simplicity's sake we will hard-code them in the function itself.
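For reference, a minimal sketch of the environment-variable approach; the variable names below are our own choice and must be set under the Lambda function's configuration:

import os

# Read the configuration from Lambda environment variables instead of hard-coding it.
MEDIA_CONVERT_ENDPOINT_URL = os.environ['MEDIA_CONVERT_ENDPOINT_URL']
ROLE_ARN = os.environ['ROLE_ARN']
QUEUE_ARN = os.environ['QUEUE_ARN']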
With that, we have everything this Lambda function needs in order to run.
To trigger the conversion automatically, we set up S3 Event Notifications filtered to .mp4 files, so that each newly uploaded .mp4 object invokes the Lambda function; a scripted version of this setup is sketched below.
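If you prefer to script this step rather than click through the console, here is a sketch using boto3; the bucket name and function ARN are placeholders. Note that S3 must also be granted permission to invoke the function (the console wizard adds this for you, a script has to do it explicitly):

import boto3

# Placeholder values: substitute your own upload bucket and Lambda function ARN.
BUCKET = 'my-vod-bucket'
FUNCTION_ARN = 'arn:aws:lambda:us-east-1:123456789012:function:convert-to-stream'

# Invoke the Lambda function whenever a new .mp4 object is created in the bucket.
s3 = boto3.client('s3')
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': FUNCTION_ARN,
                'Events': ['s3:ObjectCreated:*'],
                'Filter': {
                    'Key': {
                        'FilterRules': [
                            {'Name': 'suffix', 'Value': '.mp4'},
                        ],
                    },
                },
            },
        ],
    },
)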
To check that everything is working so far, upload an .mp4 file to the bucket, then confirm that a job appears in the MediaConvert console and that a folder named after the file shows up in the bucket containing the HLS output and the thumbnails. You can also exercise the handler directly with a hand-built test event, as in the sketch below.
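A minimal, hypothetical test with placeholder bucket and key values; the handler only reads the bucket name and object key, so a stripped-down fake S3 event is enough:

# Fake S3 ObjectCreated event mimicking the fields the handler actually reads.
test_event = {
    'Records': [
        {
            's3': {
                'bucket': {'name': 'my-vod-bucket'},  # placeholder bucket name
                'object': {'key': 'my-video.mp4'},    # placeholder object key
            }
        }
    ]
}

print(lambda_handler(test_event, None))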
That's it, we have finished the conversion part of the series. In the next part, we will show how to save the file links, fetch the data from DynamoDB, and watch our video play in Chrome through an HLS player extension.