Boto 3 is the AWS SDK for Python. In fact, this SDK is the reason I picked up Python - it lets me work with AWS services in a few lines of script instead of a full-blown Java setup. It's fun, easy, and pretty much feels like working on a CLI with a rich programming language to back it up. In this post we will use SQS and Boto 3 to perform basic operations on the service.

SQS is a highly available, scalable, fully managed message queuing service. It lets you decouple the components of an architecture and hand off responsibilities between them more cleanly. We use SQS heavily at Marqeta for various integration patterns.

If you’re used to JMS, you may need some pivoting, as SQS is not exactly a JMS provider, but there is a library (the Amazon SQS Java Messaging Library) that can be used as a bridge between JMS and SQS.

Setup

  1. You should already have an AWS account, with credentials and a default region configured on your development machine via the AWS CLI’s aws configure command. Here is a quick tutorial to familiarize yourself with SQS.

  2. You should already have python3 and pip3 installed. Please see this post for details on installing and getting started with Python 3.

  3. To install Boto 3, run pip3 install boto3 at the shell prompt.

  4. Verify that Boto 3 is installed -

bash-3.2$ pip3 show boto3
Name: boto3
Version: 1.4.4
Summary: The AWS SDK for Python
Home-page: https://github.com/boto/boto3
Author: Amazon Web Services
Author-email: UNKNOWN
License: Apache License 2.0
Location: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages
Requires: s3transfer, jmespath, botocore
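
Optionally, you can sanity-check that both the library and your AWS credentials are wired up from Python. This is a minimal sketch, assuming your default profile and region were set up via aws configure -

#!/usr/local/bin/python3
import boto3
# print the installed Boto 3 version, e.g. 1.4.4
print(boto3.__version__)
# creating the client raises an error right away if no region is configured;
# credential problems surface on the first API call
client = boto3.client('sqs')
# returns a dict; the 'QueueUrls' key appears only if your account has queues
print(client.list_queues())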

We will use a standard queue; for FIFO queues, see the notes in the code comments below and the short FIFO sketch that follows.
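
For reference, here is a minimal FIFO sketch. The queue name and MessageGroupId below are just illustrative placeholders, and ContentBasedDeduplication is enabled so we don't have to pass a MessageDeduplicationId with each message.

import boto3

client = boto3.client('sqs')
# FIFO queue names must end in '.fifo', and FifoQueue is passed as a string queue attribute
client.create_queue(QueueName='test_queue.fifo',
                    Attributes={'FifoQueue': 'true', 'ContentBasedDeduplication': 'true'})
fifo_queue_url = client.list_queues(QueueNamePrefix='test_queue.fifo')['QueueUrls'][0]
# MessageGroupId is required for FIFO queues; messages in the same group are delivered in order
client.send_message(QueueUrl=fifo_queue_url,
                    MessageBody='This is a FIFO test message',
                    MessageGroupId='group-1')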

The Code

#!/usr/local/bin/python3
import boto3
# create a boto3 client
client = boto3.client('sqs')
# create the test queue
# for a FIFO queue, the name must end in .fifo, and you must pass Attributes={'FifoQueue': 'true'}
client.create_queue(QueueName='test_queue')
# get a list of queues; we get back a dict with 'QueueUrls' as a key holding a list of queue URLs
queues = client.list_queues(QueueNamePrefix='test_queue') # we filter to narrow down the list
test_queue_url = queues['QueueUrls'][0]
# send 100 messages to this queue
for i in range(100):
    # we set a simple message body for each message
    # for FIFO queues, a 'MessageGroupId' is also required - a string of up to 128 characters
    enqueue_response = client.send_message(QueueUrl=test_queue_url, MessageBody='This is test message #'+str(i))
    # the response contains MD5 of the body, a message Id, MD5 of message attributes, and a sequence number (for FIFO queues)
    print('Message ID : ',enqueue_response['MessageId'])
# next, we dequeue these messages - 10 at a time (the SQS maximum per request) till the queue is exhausted.
# in a production/real setup, I suggest using long polling (see the sketch after this example), as you are billed for each request even when the response is empty
while True:
    messages = client.receive_message(QueueUrl=test_queue_url,MaxNumberOfMessages=10) # adjust MaxNumberOfMessages if needed
    if 'Messages' in messages: # when the queue is exhausted, the response dict contains no 'Messages' key
        for message in messages['Messages']: # 'Messages' is a list
            # process the messages
            print(message['Body'])
            # next, we delete the message from the queue so it is not delivered and processed again
            client.delete_message(QueueUrl=test_queue_url,ReceiptHandle=message['ReceiptHandle'])
    else:
        print('Queue is now empty')
        break
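
As mentioned in the comments, for anything beyond a quick test you would typically switch the receive call to long polling, so SQS holds the request open for up to WaitTimeSeconds (maximum 20 seconds) instead of returning an empty response immediately. A rough sketch of the change, reusing the same queue as above -

import boto3

client = boto3.client('sqs')
test_queue_url = client.list_queues(QueueNamePrefix='test_queue')['QueueUrls'][0]
# long polling - SQS waits up to 20 seconds for messages to arrive before returning,
# which cuts down on empty (but still billed) receive requests
messages = client.receive_message(QueueUrl=test_queue_url,
                                  MaxNumberOfMessages=10,
                                  WaitTimeSeconds=20)
print(messages.get('Messages', []))

You can also set the ReceiveMessageWaitTimeSeconds attribute on the queue itself to make long polling the default for every consumer.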

Please refer to the Boto 3 SQS documentation here.