
Boto3 with Python3+

Written by Prasannasingh Bayas | Nov 11, 2016 6:14:40 AM

Python3+:

Python 3 is the current major release of core Python, and it improves a great deal of functionality over Python 2. I will not explain every new feature in depth, but will try to provide an overview.

Here are some of the features added in Python 3.5:
a. A dedicated infix operator (@) for matrix multiplication.
b. A new os.scandir() function in the standard library for faster directory iteration (see the sketch below).
c. Automatic retry of interrupted system calls: previously, whenever a system call waiting for I/O was interrupted by a signal, it returned the errno.EINTR error code and Python raised InterruptedError. That meant that, while writing a Python application, the developer had two choices at every call site: ignore InterruptedError, or handle InterruptedError and attempt to restart the interrupted system call. Python 3.5 now retries the call automatically.
Note that newer Python 3 releases are supported only by Boto3; older versions of Boto do not support them.
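For example, os.scandir() iterates over a directory and exposes cached file metadata without an extra stat call per entry. A quick interactive sketch:

>>> import os
>>> for entry in os.scandir('.'):
...     print(entry.name, entry.is_file())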

Boto3:

Boto is a Python package that provides interfaces to Amazon Web Services (AWS). Boto is fully supported on Python 2.6, 2.7, and 3+, and its modules are being ported one at a time with the help of the open source community. It is also known as the AWS library. With it, Python developers can write services that use Amazon S3 and EC2.
Boto3 is the newly released version and has a totally different interface. Before using Boto3 you should configure your AWS credentials, which we will walk through below.

If you are using Ubuntu 16.04, it ships with a built-in Python 3+ package, so there is no need to install extra packages.
I assume you have a virtual environment installed and know the relevant commands. If you don't, please see the virtual environment guide in the reference links below.

How to install Boto3:

Install boto3 in your virtual environment. Once boto3 is installed, we can access the library from the boto3 package.
Syntax:
$pip install boto3
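To verify the installation, you can print the installed version (a quick check; the interpreter name matches the one used later in this post):
$python3.5 -c "import boto3; print(boto3.__version__)"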

Install AWS CLI:

The AWS Command Line Interface installs the aws-cli library, which supports Python 3.5. This step is optional; if you don't want the CLI, skip this command.
Syntax:
$sudo apt-get install awscli

Configure AWS credentials:

After installing aws-cli we need to configure AWS keys and credentials:

$aws configure
AWS Access Key ID [None]: xxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxx
Default region name [None]: xxxxxxxxxx
Default output format [None]: xxxxxxx
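Credentials can also be supplied directly in code through a boto3 session, instead of running aws configure. A minimal sketch; the key values and region here are placeholders:

>>> import boto3
>>> session = boto3.session.Session(
...     aws_access_key_id='xxxxxxxxxx',
...     aws_secret_access_key='xxxxxxxx',
...     region_name='us-west-2')
>>> s3 = session.resource('s3')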

Run Python from the command line:

Start Python from the command line so we can write Python statements interactively at the prompt.

$python3.5
Python 3.5.2 (default, Sep 10 2016, 08:21:44)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>

Import Boto3
Importing boto3 gives us access to all the modules the package provides.

>>> import boto3

Create S3 Bucket using Boto3:

Syntax:

>>> import boto3
>>> s3 = boto3.resource("s3")
>>> bucket = s3.Bucket("<Bucket_name>")
>>> res = bucket.create(ACL='private')

Parameters that can be passed while creating an S3 bucket (a concrete example follows the parameter descriptions below):

bucket.create(
    ACL='private'|'public-read'|'public-read-write'|'authenticated-read',
    CreateBucketConfiguration={
        'LocationConstraint': 'EU'|'eu-west-1'|'us-west-1'|'us-west-2'|'ap-south-1'|'ap-southeast-1'|'ap-southeast-2'|'ap-northeast-1'|'sa-east-1'|'cn-north-1'|'eu-central-1'
    },
    GrantFullControl='string',
    GrantRead='string',
    GrantReadACP='string',
    GrantWrite='string',
    GrantWriteACP='string'
)

Parameters:
ACL (string) -- The canned ACL to apply to the bucket.
CreateBucketConfiguration (dict) --
LocationConstraint (string) -- Specifies the region where the bucket will be created. If you don't specify a region, the bucket is created in the US Standard region (us-east-1).
GrantFullControl (string) -- Allows grantee read, write, read ACP, and write ACP permissions on the bucket.
GrantRead (string) -- Allows grantee to list the objects in the bucket.
GrantReadACP (string) -- Allows grantee to read the bucket ACL.
GrantWrite (string) -- Allows grantee to create, overwrite, and delete any object in the bucket.
GrantWriteACP (string) -- Allows grantee to write the ACL for the applicable bucket.
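For example, to create a private bucket in a specific region, pass a location constraint (a minimal sketch; the region is illustrative):

>>> res = bucket.create(
...     ACL='private',
...     CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'})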

Get List of Buckets:

To list the buckets created in our account:
>>> for bucket in s3.buckets.all():
...     print(bucket.name)

Output:
BucketName1
BucketName2

Deleting Bucket:

We can delete a bucket by using the delete method provided by the boto3 package. Note that delete() is called on the bucket object itself, and the bucket must be empty before it can be deleted (one way to empty it is shown below).

>>> bucket = s3.Bucket("<Bucket_Name>")
>>> res = bucket.delete()
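If the bucket still contains objects, one way to empty it first is the collection API's batch delete (a minimal sketch):

>>> bucket.objects.all().delete()
>>> res = bucket.delete()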

Upload a File to an S3 Bucket:

Uploading a file to S3 using boto3 is the most important point in this blog, so let's upload a file to S3 with a single boto3 call.

Syntax:
upload_file(Filename, Key, ExtraArgs=None, Callback=None, Config=None)

>>> import os  # used to resolve the exact path of the local file
>>> file = os.path.realpath('<File_Name>')
>>> s3.Bucket('b3p3').upload_file(file, '<File_Name>')

Parameters:
Filename: Name of the file which you want to upload to S3.
Key: Name of the key to upload to.
ExtraArgs: Extra arguments that may be passed to the upload.
Callback: A method which takes the number of bytes transferred, called periodically during the upload.
Config: Transfer configuration to be used when performing the upload.
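For instance, ExtraArgs can set object metadata such as the content type at upload time (a sketch; the content type here is illustrative):

>>> s3.Bucket('b3p3').upload_file(
...     file, '<File_Name>',
...     ExtraArgs={'ContentType': 'text/plain'})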

Upload a File Object:

Upload a file-like object to this bucket. The file-like object must be in binary mode. This is a managed transfer which will perform a multipart upload in multiple threads if necessary.
Syntax:
upload_fileobj(Fileobj, Key, ExtraArgs=None, Callback=None, Config=None)

>>> import os
>>> import boto3
>>> s3 = boto3.resource('s3')
>>> bucket = s3.Bucket('mybucket')
>>> file = os.path.realpath('<File_Name>')
>>> with open(file, 'rb') as data:
...     bucket.upload_fileobj(data, '<File_Name>')

Fileobj: A file-like object to upload.
Key: Name of the Key to upload to.
ExtraArgs: Extra arguments that may be passed to the upload.
Callback: A method which takes the number of bytes transferred, called periodically during the upload.
Config: Transfer configuration to be used when performing the upload.
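As a sketch of the Callback parameter, a simple progress printer can be passed in (the progress function name is just for illustration):

>>> def progress(bytes_transferred):
...     print('transferred:', bytes_transferred)
...
>>> with open(file, 'rb') as data:
...     bucket.upload_fileobj(data, '<File_Name>', Callback=progress)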

Download a File from an S3 Bucket:

Syntax:
download_file(Key, Filename, ExtraArgs=None, Callback=None, Config=None)

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> s3.Bucket('b3p3').download_file('<Object_name>', '<Local_storage_path_filename>')

Key: Name of the key to download from.
Filename: The local path to download the file to.
ExtraArgs: Extra arguments that may be passed to the download.
Callback: A method which takes the number of bytes transferred, called periodically during the download.
Config: Transfer configuration to be used when performing the download.

Download a File Object from S3:

Syntax:

download_fileobj(Key, Fileobj, ExtraArgs=None, Callback=None, Config=None)

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> bucket = s3.Bucket('mybucket')
>>> with open('<Local_File_Name>', 'wb') as data:  # write-binary mode for downloads
...     bucket.download_fileobj('mykey', data)

Fileobj: A file-like object to download into; it must be opened in binary write mode.
Key: Name of the key to download from.
ExtraArgs: Extra arguments that may be passed to the download.
Callback: A method which takes the number of bytes transferred, called periodically during the download.
Config: Transfer configuration to be used when performing the download.

Get Uploaded File Object Details:

Syntax:

>>> from boto3 import client
>>> conn = client('s3')
>>> for key in conn.list_objects(Bucket='b3p3')['Contents']:
...     print(key)

Output:

{'LastModified': datetime.datetime(2016, 10, 19, 11, 22, 22, tzinfo=tzutc()), 'Size': 82, 'Owner': {'DisplayName': '<AccountOwnerName>', 'ID': 'cc1f3ad2ac74c75eb258cdcc54bdec5aa29be0e993fad632f9907ea992a859a8'}, 'ETag': '"64f108b15a97f7d6cf5334d478a151c5"', 'StorageClass': 'STANDARD', 'Key': 'Doc.txt'}

We can get details about each uploaded object/key/file on S3. This helps us retrieve the size, owner, name of the file, and so on.
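A single object's details can also be fetched directly through the resource API (bucket and key names taken from the example output above):

>>> obj = s3.Object('b3p3', 'Doc.txt')
>>> print(obj.content_length, obj.last_modified)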

Get Bucket Size:

Compute the bucket size by summing the size of each key present in the bucket. We can also get individual file sizes this way.

Syntax:
>>> bsize = 0
>>> for key in conn.list_objects(Bucket='b3p3')['Contents']:
...     bsize += key['Size']
...
>>> print(bsize)

Output:
xxxxx
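Note that list_objects returns at most 1,000 keys per request, so for larger buckets a paginator is needed; a minimal sketch using the same connection as above:

>>> paginator = conn.get_paginator('list_objects')
>>> bsize = 0
>>> for page in paginator.paginate(Bucket='b3p3'):
...     for key in page.get('Contents', []):
...         bsize += key['Size']
...
>>> print(bsize)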

Reference links:
https://boto3.readthedocs.io/en/latest/ - Boto3
http://docs.python-guide.org/en/latest/dev/virtualenvs/ - Virtual Environments
https://www.python.org/download/releases/3.0/ - Python 3