36. 1. Create Web - DB Application
• EC2 Instance x 1
• DB Server running on the EC2 Instance above.
• User can access http://<some domain or ip>:80/
37. 2. Create Web - DB Application
• EC2 Instance x 1
• RDS Instance x 1
• User can access http://<some domain or ip>:80/
38. 3. Create Web - DB Application
• EC2 Instance x 2
• RDS Instance x 1
• Load Balancer x 1
• User can access http://<some domain or ip>:80/
• Top page shows the host name or private IP of the instance (see the sketch below).
• You may use any stack you prefer.
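The following is a minimal sketch of such a top page, assuming Python 3 and the standard library only; the host name comes from the OS, and the private IP from the EC2 instance metadata endpoint (169.254.169.254), which works as shown where IMDSv1 is allowed. Binding port 80 requires root.

```python
import socket
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class TopPage(BaseHTTPRequestHandler):
    def do_GET(self):
        # EC2 instance metadata: the instance's private IPv4 address.
        # (Plain GET works with IMDSv1; IMDSv2 would need a session token.)
        private_ip = urlopen(
            'http://169.254.169.254/latest/meta-data/local-ipv4'
        ).read().decode()
        body = 'host: %s / private IP: %s\n' % (socket.gethostname(), private_ip)
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.end_headers()
        self.wfile.write(body.encode())

# Port 80 per the exercise; run as root or behind a reverse proxy.
HTTPServer(('', 80), TopPage).serve_forever()
```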
39. 4. Create Web - DB Application
• EC2 Instance x 2 in the public subnet of the VPC
• RDS Instance x 1 in a private subnet of the VPC, with Multi-AZ
• Load Balancer x 1 in the VPC (internet-facing)
• User can access http://<some domain or ip>:80/
• Top page shows the host name or private IP of the instance.
• You may use any stack you prefer.
40. 4-1. Create VPC
• Start VPC Wizard
• Select “VPC with Public and Private Subnets”
41. 4-2. Launch EC2 Instance and ELB
• Launch EC2 Instance in your VPC
Note: place it in the Public Subnet.
• Create Load Balancer in your VPC.
• Add your EC2 Instances to the ELB (a scripted alternative is sketched below).
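The slides use the console for this step; if you prefer to script it, here is a sketch with boto3 and the classic ELB API. The names and IDs are placeholders, not values from the deck, and credentials plus a default region are assumed to be configured.

```python
import boto3

elb = boto3.client('elb')  # classic Elastic Load Balancing API

# Create an ELB listening on port 80 in your public subnet.
elb.create_load_balancer(
    LoadBalancerName='web-elb',        # hypothetical name
    Listeners=[{'Protocol': 'HTTP', 'LoadBalancerPort': 80,
                'InstanceProtocol': 'HTTP', 'InstancePort': 80}],
    Subnets=['subnet-xxxxxxxx'],       # your public subnet ID
)

# Register the instances with the ELB.
elb.register_instances_with_load_balancer(
    LoadBalancerName='web-elb',
    Instances=[{'InstanceId': 'i-xxxxxxxx'}],
)
```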
50. Create Your Table and …
• Create your table.
Hash Key => item_id (String)
Range Key => selling_date (String)
Set Read/Write throughput as you like.
• Insert test data however you like (manually or with a program; see the sketch below).
sample program: https://gist.github.com/mryoshio/6744061
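If you prefer code to the console here as well, this boto3 sketch creates the table with the keys above and inserts one test row; the table name and the extra 'price' attribute are assumptions, and the linked gist takes its own approach.

```python
import boto3

dynamodb = boto3.resource('dynamodb')  # credentials/region assumed configured

# item_id as hash key, selling_date as range key, both strings.
table = dynamodb.create_table(
    TableName='items',  # hypothetical name
    KeySchema=[
        {'AttributeName': 'item_id', 'KeyType': 'HASH'},
        {'AttributeName': 'selling_date', 'KeyType': 'RANGE'},
    ],
    AttributeDefinitions=[
        {'AttributeName': 'item_id', 'AttributeType': 'S'},
        {'AttributeName': 'selling_date', 'AttributeType': 'S'},
    ],
    ProvisionedThroughput={'ReadCapacityUnits': 5, 'WriteCapacityUnits': 5},
)
table.wait_until_exists()

# One test row; 'price' is an arbitrary extra attribute for illustration.
table.put_item(Item={'item_id': 'item-001',
                     'selling_date': '2013-09-25',
                     'price': 1200})
```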
51. Light Exercise
• Let’s write code to read data from DynamoDB.
• And to change Read/Write throughput dynamically (see the sketch below).
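One way to do both, again as a boto3 sketch against the hypothetical 'items' table from the previous exercise:

```python
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource('dynamodb').Table('items')  # hypothetical table name

# Read: query every row stored under one hash key.
resp = table.query(KeyConditionExpression=Key('item_id').eq('item-001'))
for item in resp['Items']:
    print(item)

# Change the provisioned throughput at runtime (UpdateTable).
table.update(ProvisionedThroughput={'ReadCapacityUnits': 10,
                                    'WriteCapacityUnits': 10})
```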
54. Concept
• Treat S3 and DynamoDB as HDFS.
• Hive, Pig, and custom programs (e.g. Python) are supported.
• Sample programs are provided in the AWS Console.
• Job flows (jobs) can be handled in the AWS Console, but may be
easier to manage with the command-line client.
56. 1. Hive Program (DynamoDB -> S3)
• Export the DynamoDB table you created earlier to S3 (a HiveQL sketch follows).
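A HiveQL sketch of the export, using EMR's DynamoDB storage handler; the table, bucket, and column names are assumptions that match the earlier DynamoDB exercise.

```sql
-- External Hive table backed by the DynamoDB table created earlier.
CREATE EXTERNAL TABLE ddb_items (item_id string, selling_date string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
  "dynamodb.table.name" = "items",
  "dynamodb.column.mapping" = "item_id:item_id,selling_date:selling_date"
);

-- External Hive table on S3 to receive the export.
CREATE EXTERNAL TABLE s3_items (item_id string, selling_date string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 's3://your-bucket/items-export/';

-- The export itself.
INSERT OVERWRITE TABLE s3_items SELECT * FROM ddb_items;
```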
57. 2. Hive Program (S3 -> DynamoDB)
• Import the S3 files into the DynamoDB table you created earlier.
• You may reuse the S3 files exported in the previous exercise (see the sketch below).
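Reusing the two external tables from the previous sketch, the import is the reverse INSERT:

```sql
-- Write the S3 rows back into the DynamoDB-backed table.
INSERT OVERWRITE TABLE ddb_items SELECT * FROM s3_items;
```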
58. 3. Streaming (S3 -> DynamoDB)
• Design S3 files and DynamoDB table.
• Create new DynamoDB table.
• Generate files on S3.
• Import files on S3 into DynamoDB table.
• This time, the import must run through your custom MapReduce
(streaming) script (see the sketch below).
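One possible shape for that script: a Hadoop Streaming mapper in Python that reads the S3 input line by line and writes each row into DynamoDB with boto3. The tab-separated layout, table name, and fields are assumptions, not part of the deck.

```python
#!/usr/bin/env python
# Hypothetical streaming mapper. Input lines are assumed to look like:
#   item_id <TAB> selling_date <TAB> price
import sys

import boto3

table = boto3.resource('dynamodb').Table('items')  # hypothetical table

for line in sys.stdin:
    fields = line.rstrip('\n').split('\t')
    if len(fields) < 3:
        continue  # skip malformed rows
    item_id, selling_date, price = fields[:3]
    table.put_item(Item={'item_id': item_id,
                         'selling_date': selling_date,
                         'price': int(price)})
    # Hadoop Streaming expects key<TAB>value lines on stdout.
    sys.stdout.write('%s\timported\n' % item_id)
```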