This document provides an overview of a technical demonstration that uses AWS services (SNS, SQS, DynamoDB, and S3) to mimic an auto-scaling application and generate dynamic dashboard content. It describes how auto-scaling events are captured as SNS notifications and persisted to SQS, then processed to update a DynamoDB table and to generate JSON files in S3 that power dynamic frontend content through JavaScript. The goal is to illustrate how these services can be integrated to build scalable, event-driven applications.
2. Masterclass
A technical deep dive beyond the basics
Help educate you on how to get the best from AWS technologies
Show you how things work and how to get things done
Broaden your knowledge in ~45 mins
3. Application Services & Dynamic Dashboard
A grand title for a demonstration system that ties together services
Show you how to use services like SNS and SQS to carry AWS events
Use S3 as a web server to host dynamically generated content
Why? Show you some tips and tricks you can use in your projects
5. Services & topics
EC2: instances to run our application code
SNS: to publish events from our instances
Auto Scaling: to generate events as our application scales up & down
SQS: to persist the event messages for processing
S3: to store and serve content we create
CloudFormation: to build our system as a managed stack
DynamoDB: to store all events from the application
IAM: to control the creation and management of resources
6. To do what?
Mimic an application that implements auto-scaling
Trap, transport and store the scaling events produced
Use a simple technique to produce pseudo-dynamic content from S3
7. To do what?
Mimic an application that implements auto-scaling
Trap, transport and store the scaling events produced
Use a simple technique to produce pseudo-dynamic content from S3
An exercise beyond compute and storage!
8. There’s a movie you can view: http://youtu.be/lb9qPhxIVNI
I’ll show this link again at the end
9. This demo is just an illustration of what you can do with these services
11. Built in this way…
Auto Scaling group: an arbitrary application that we can scale up and down
12. Built in this way…
Auto Scaling group
SNS notification from the Auto Scaling group, event body as JSON
SQS queue to persist the event
Messages are produced when instances are started or terminated
13. Built in this way…
Auto Scaling group
SNS notification from the Auto Scaling group, event body as JSON
SQS queue to persist the event
Monitoring instance
DynamoDB table holding instance details
14. Built in this way…
Auto Scaling group
SNS notification from the Auto Scaling group, event body as JSON
SQS queue to persist the event
Monitoring instance
DynamoDB table holding instance details
S3 bucket holding the dashboard web content
15. Built in this way… EC2 instance contents
Monitoring instance: a Python script reads the SQS queue and generates data for S3
DynamoDB table holding instance details
Static site: HTML, JavaScript and CSS reading the data file
26. Simple Queue Service
Reliable: queues store messages across availability zones
Scalable: designed for an unlimited number of readers and messages
Simple: CreateQueue, SendMessage, ReceiveMessage, DeleteMessage
Inexpensive: low per-request fees
Secure: authentication
Performance: excellent throughput
29. Python Application A
>>> import boto
>>> conn = boto.connect_sqs()
>>> q = conn.create_queue('myqueue')
>>> from boto.sqs.message import Message
>>> m = Message()
>>> m.set_body('This is my first message.')
>>> status = q.write(m)
30. Python Application A writes, Python Application B reads
>>> import boto
>>> conn = boto.connect_sqs()
>>> q = conn.create_queue('myqueue')
>>> from boto.sqs.message import Message
>>> m = Message()
>>> m.set_body('This is my first message.')
>>> status = q.write(m)
>>> m = q.read(60)
>>> m.get_body()
The message is not visible to other applications for 60 seconds
31. Python Application A writes, Python Application B reads and deletes
>>> import boto
>>> conn = boto.connect_sqs()
>>> q = conn.create_queue('myqueue')
>>> from boto.sqs.message import Message
>>> m = Message()
>>> m.set_body('This is my first message.')
>>> status = q.write(m)
>>> m = q.read(60)
>>> m.get_body()
>>> q.delete_message(m)
The message is not visible to other applications for 60 seconds
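The 60-second visibility timeout is what makes this read-then-delete pattern safe: a consumer that dies before calling delete_message simply lets the message reappear for another reader. A toy in-memory sketch of those semantics (ToyQueue is purely illustrative, not real SQS):

```python
import time

class ToyQueue:
    """A tiny in-memory stand-in for an SQS queue, used only to
    illustrate visibility-timeout semantics (not real SQS)."""
    def __init__(self):
        self._messages = []  # list of [body, invisible_until]

    def write(self, body):
        self._messages.append([body, 0.0])

    def read(self, visibility_timeout, now=None):
        now = time.time() if now is None else now
        for msg in self._messages:
            if msg[1] <= now:           # message currently visible
                msg[1] = now + visibility_timeout
                return msg[0]
        return None                      # nothing visible right now

    def delete_message(self, body):
        self._messages = [m for m in self._messages if m[0] != body]

q = ToyQueue()
q.write('This is my first message.')
first = q.read(60, now=0)    # reader A receives the message
hidden = q.read(60, now=30)  # reader B sees nothing within the timeout
again = q.read(60, now=61)   # timeout expired: the message reappears
q.delete_message(again)      # DeleteMessage removes it for good
```

Until delete_message is called, the message is only hidden, never gone; that is how SQS achieves at-least-once delivery.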
41. {
  "Type" : "Notification",
  "MessageId" : <message id>,
  "TopicArn" : <arn>,
  "Subject" : "Auto Scaling: termination for group \"SNS-Dashboard-ASG\"",
  "Message" : "…",
  "Timestamp" : "2013-05-21T09:13:09.555Z",
  "SignatureVersion" : "1",
  "Signature" : "…",
  "SigningCertURL" : "https://sns.us-east-1.amazonaws.com/SimpleNotificationService-f3ecfb7224c7233fe7bb5f59f96de52f.pem",
  "UnsubscribeURL" : "https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:241861486983:SNS-Dashboard-ASNotifications-7GU41DCQW8HC:ed30bf6e-582c-4fd2-8e07-28f7d1ac6278"
}
{
  "StatusCode": "InProgress",
  "Service": "AWS Auto Scaling",
  "AutoScalingGroupName": "SNS-Dashboard-ApplicationServerGroup-K61R5797WCMA",
  "Description": "Terminating EC2 instance: i-8bb679eb",
  "ActivityId": "dfc1181b-0df8-47dc-aa8d-79e13b8a33d1",
  "Event": "autoscaling:EC2_INSTANCE_TERMINATE",
  "Details": {},
  "AutoScalingGroupARN": "arn:aws:autoscaling:us-east-1:241861486983:autoScalingGroup:77ef2778-ded1-451a-a630-6a35c8e67916:autoScalingGroupName/SNS-Dashboard-ApplicationServerGroup-K61R5797WCMA",
  "Progress": 50,
  "Time": "2013-05-21T09:13:09.442Z",
  "AccountId": "241861486983",
  "RequestId": "dfc1181b-0df8-47dc-aa8d-79e13b8a33d1",
  "StatusMessage": "",
  "EndTime": "2013-05-21T09:13:09.442Z",
  "EC2InstanceId": "i-8bb679eb",
  "StartTime": "2013-05-21T09:12:20.323Z",
  "Cause": "At 2013-05-21T09:12:02Z a user request explicitly set group desired capacity changing the desired capacity from 5 to 1. At 2013-05-21T09:12:19Z an instance was taken out of service in response to a difference between desired and actual capacity, shrinking the capacity from 5 to 1. At 2013-05-21T09:12:19Z instance i-8fdbafed was selected for termination. At 2013-05-21T09:12:19Z instance i-8ddbafef was selected for termination. At 2013-05-21T09:12:20Z instance i-8bb679eb was selected for termination. At 2013-05-21T09:12:20Z instance i-85b778e5 was selected for termination."
}
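Note that the Auto Scaling event arrives as a JSON string inside the SNS envelope's "Message" field, so a consumer has to decode twice. A minimal sketch (the trimmed envelope below is hypothetical, echoing the fields shown above):

```python
import json

# A trimmed, hypothetical SNS envelope of the shape shown above;
# real envelopes carry many more fields.
raw = json.dumps({
    "Type": "Notification",
    "Subject": "Auto Scaling: termination for group \"SNS-Dashboard-ASG\"",
    "Message": json.dumps({
        "Event": "autoscaling:EC2_INSTANCE_TERMINATE",
        "EC2InstanceId": "i-8bb679eb",
        "Progress": 50,
    }),
})

envelope = json.loads(raw)               # first decode: the SNS envelope
event = json.loads(envelope["Message"])  # second decode: the Auto Scaling event

print(event["Event"], event["EC2InstanceId"])
```

Forgetting the second json.loads is a common mistake when wiring SNS to SQS; the queue stores the whole envelope, not just the event body.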
46. We now have events in SQS. Let’s do something with them…
51. Read messages from the SQS queue
Write data to the DynamoDB table
Form a JSON file from the updated results
Write the file to S3 for the JavaScript to interpret
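The "form a JSON file" step can be sketched as follows; the record fields ('instance_id', 'terminated') are illustrative stand-ins for whatever the DynamoDB table actually holds:

```python
import json

def form_dashboard_json(items):
    """Build the dashboard's data file from instance records.
    The field names ('instance_id', 'terminated') are illustrative;
    the real table schema may differ."""
    # Soft-deleted instances carry terminated == 'true'
    live = [i for i in items if i.get('terminated') != 'true']
    return json.dumps({
        'running_count': len(live),
        'instances': sorted(i['instance_id'] for i in live),
    })

items = [
    {'instance_id': 'i-8bb679eb', 'terminated': 'true'},
    {'instance_id': 'i-85b778e5', 'terminated': 'false'},
    {'instance_id': 'i-8fdbafed'},
]
doc = json.loads(form_dashboard_json(items))
# doc['running_count'] is 2; i-8bb679eb was soft-deleted
```

The dashboard's JavaScript only ever fetches this one small file, which is what makes static S3 hosting feel dynamic.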
64. import boto

def delete_instance(instance_id, ddb_table_name):
    # Connect to DynamoDB and get the table
    ddb = boto.connect_dynamodb()
    table = ddb.get_table(ddb_table_name)
    # Get the item to soft delete
    item = table.get_item(instance_id)
    # Update the terminated flag
    item['terminated'] = 'true'
    # Save the item to DynamoDB
    item.put()
69. import boto
from boto.s3.key import Key

def write_instances_to_s3(instances_json, s3_output_bucket, s3_output_key):
    # Connect to S3 and get the output bucket
    s3 = boto.connect_s3()
    output_bucket = s3.get_bucket(s3_output_bucket)
    # Create a key to store the instances_json text
    k = Key(output_bucket)
    k.key = s3_output_key
    k.set_metadata("Content-Type", "text/plain")
    k.set_contents_from_string(instances_json)
82. Want to try this yourself?
View the video tutorial here: http://youtu.be/lb9qPhxIVNI
And grab the CloudFormation template here: http://bootstrapping-assets.s3.amazonaws.com/as-register-instances.template
83. 1. Create security groups
2. Create a notification of type SQS
3. Create the SQS queue
4. Create auto-scaling launch configs & groups
5. Add auto-scaling notifications, delivered to SQS via SNS
6. Create the S3 bucket
7. Create the DynamoDB table
8. Start instances
9. Bootstrap the monitoring application
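Steps 2, 3 and 5 above can be sketched as a CloudFormation fragment. This is illustrative only: the resource names are invented, and required Auto Scaling group properties (MinSize, MaxSize, AvailabilityZones, the launch configuration) are omitted; the real template is linked on the previous slide:

```json
{
  "EventQueue": { "Type": "AWS::SQS::Queue" },
  "ASNotificationTopic": {
    "Type": "AWS::SNS::Topic",
    "Properties": {
      "Subscription": [{
        "Protocol": "sqs",
        "Endpoint": { "Fn::GetAtt": ["EventQueue", "Arn"] }
      }]
    }
  },
  "ApplicationServerGroup": {
    "Type": "AWS::AutoScaling::AutoScalingGroup",
    "Properties": {
      "NotificationConfiguration": {
        "TopicARN": { "Ref": "ASNotificationTopic" },
        "NotificationTypes": [
          "autoscaling:EC2_INSTANCE_LAUNCH",
          "autoscaling:EC2_INSTANCE_TERMINATE"
        ]
      }
    }
  }
}
```

In practice the queue also needs an AWS::SQS::QueuePolicy granting the SNS topic permission to send messages to it.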
85. Summary
Pub/Sub models with SNS
Reliable delivery with SQS
DynamoDB for high performance
S3 & pseudo-dynamic content
More than compute & storage

86. Summary
Given you some ideas?
Introduced you to some handy services?
Helped you with some CloudFormation?