How to use the Boto3 library in Python to upload an object to S3 using AWS Resource?

Boto3 is the AWS SDK for Python that allows you to integrate Python applications with AWS services. This tutorial shows how to upload a file to Amazon S3 using the AWS Resource interface in Boto3.

Prerequisites

Before uploading files to S3, ensure you have:

  • AWS credentials configured (via AWS CLI, environment variables, or IAM roles)
  • Boto3 installed: pip install boto3
  • S3 bucket created with appropriate permissions

Approach

The solution involves validating the input paths, creating an S3 resource, parsing the S3 path into a bucket name and key, and uploading the file with the upload_fileobj() method.
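The validation step can be sketched as a small standalone helper. The function name validate_paths below is hypothetical and simply mirrors the checks described above:

```python
def validate_paths(s3_path, filepath):
    # Hypothetical helper mirroring the validation described above:
    # the source must be a local file and the destination an s3:// URI.
    if filepath.startswith('s3://'):
        raise ValueError('SourcePath is not a valid local path: ' + filepath)
    if not s3_path.startswith('s3://'):
        raise ValueError('DestinationPath is not an s3 path: ' + s3_path)

validate_paths('s3://Bucket_1/testfolder', 'c://test.zip')  # passes silently
```

Raising early on malformed paths keeps the later parsing code free of defensive checks.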

Key Steps

Step 1 − Import required libraries including boto3 and botocore exceptions

Step 2 − Validate that the source path is local and the destination path follows the S3 URI format

Step 3 − Create an AWS session and S3 resource

Step 4 − Parse the S3 path to extract the bucket name and key

Step 5 − Upload the file and wait for completion

Complete Example

Here's a complete function to upload a file to S3 −

import boto3
from botocore.exceptions import ClientError
from pathlib import PurePosixPath

def upload_object_into_s3(s3_path, filepath):
    # Validate input paths
    if filepath.startswith('s3://'):
        print('SourcePath is not a valid local path: ' + filepath)
        raise Exception('SourcePath is not a valid path.')
    elif not s3_path.startswith('s3://'):
        print('DestinationPath is not an s3 path: ' + s3_path)
        raise Exception('DestinationPath is not a valid path.')
    
    # Create AWS session and S3 resource
    session = boto3.session.Session()
    s3_resource = session.resource('s3')
    
    # Parse S3 path to extract bucket and key
    # e.g. 's3://Bucket_1/testfolder' -> bucket 'Bucket_1', key 'testfolder'
    tokens = s3_path.split('/')
    target_bucket_name = tokens[2]
    target_key = "/".join(tokens[3:])
    
    # Get filename and construct full key path
    file_name = PurePosixPath(filepath).name
    if target_key != '':
        target_key = target_key.strip()
        key_path = target_key + "/" + file_name
    else:
        key_path = file_name
    
    print("key_path: " + key_path + ", target_bucket: " + target_bucket_name)
    
    try:
        # Upload file to S3
        with open(filepath, "rb") as file:
            s3_resource.meta.client.upload_fileobj(file, target_bucket_name, key_path)
            
        try:
            # Wait until the object exists in S3
            s3_resource.Object(target_bucket_name, key_path).wait_until_exists()
        except ClientError as error:
            print("Object didn't upload successfully to", target_bucket_name)
            raise error
        
        return "Object Uploaded Successfully"
        
    except Exception as error:
        print("Error in upload object function of s3 helper: " + error.__str__())
        raise error

# Example usage
print(upload_object_into_s3('s3://Bucket_1/testfolder', 'c://test.zip'))

Output

key_path: testfolder/test.zip, target_bucket: Bucket_1
Object Uploaded Successfully

How It Works

The function performs several key operations:

  • Path Validation − Ensures the source is a local path and the destination follows the S3 URI format
  • S3 Resource Creation − Creates a boto3 session and S3 resource for API operations
  • Path Parsing − Splits the S3 URI to extract the bucket name and key prefix
  • File Upload − Uses upload_fileobj() to stream file data to S3
  • Verification − Waits until the object exists to confirm a successful upload
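As an alternative to splitting the URI on "/" by hand, the standard library's urllib.parse can extract the bucket and key. The parse_s3_path helper below is a hypothetical sketch, not part of the tutorial's function:

```python
from urllib.parse import urlparse

def parse_s3_path(s3_path):
    # Hypothetical helper: split "s3://bucket/prefix" into (bucket, key prefix)
    parsed = urlparse(s3_path)
    if parsed.scheme != 's3':
        raise ValueError('Not an S3 URI: ' + s3_path)
    return parsed.netloc, parsed.path.lstrip('/')

print(parse_s3_path('s3://Bucket_1/testfolder'))  # ('Bucket_1', 'testfolder')
```

urlparse treats the bucket as the network location and everything after it as the path, which avoids edge cases around missing or extra slashes.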

Error Handling

The function includes comprehensive error handling for:

  • Invalid path formats
  • Upload failures with specific error codes
  • Generic exceptions during the upload process
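botocore's ClientError carries a response dictionary whose "Error" entry holds a code and a message. The explain_client_error helper below is a hypothetical sketch of pulling those fields out for logging; only the dict shape comes from botocore:

```python
def explain_client_error(response):
    # Hypothetical helper: summarize a botocore-style error response dict.
    err = response.get('Error', {})
    return err.get('Code', 'Unknown') + ': ' + err.get('Message', '')

# Example using the dict shape botocore attaches to ClientError.response
print(explain_client_error(
    {'Error': {'Code': 'NoSuchBucket',
               'Message': 'The specified bucket does not exist'}}))
```

In the tutorial's function you would pass error.response from the caught ClientError to produce a readable log line.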

Conclusion

Using boto3's S3 resource interface provides a robust way to upload files to Amazon S3. The function includes proper validation, error handling, and verification to ensure reliable file uploads to your S3 buckets.

Updated on: 2026-03-25T18:09:06+05:30
