This document discusses how to build an AngularJS application that utilizes Amazon Web Services (AWS) for authentication, storage, and database functionality. It recommends using AWS Identity and Access Management (IAM) to manage user access, AWS Simple Storage Service (S3) for file storage, AWS DynamoDB for the database, and AWS Security Token Service (STS) to generate temporary credentials for unauthenticated users. Code examples are provided for setting up Angular services to interface with each AWS service, handling user login/logout via the STS, and using the services in application controllers.
3. The Cloud is
• Cheap for startup projects
• Ready to scale for growing projects
• Rich in services for complex projects
How can we use the cloud with JS?
6. • With AngularJS we can create
apps that work with RESTful
resources or directly with cloud
services.
7. Next steps
• #1 - sign up on the AWS website
• #2 - access the AWS Web console
• #3 - create an IAM user with access to all services
• #4 - download and use the JavaScript SDK
8. Introducing Amazon Web Services
• Over 25 cloud-based services available
• Several regions across the world
• JavaScript SDK available
• http://aws.amazon.com
11. IAM: Identity and Access Management
AWS Identity and Access Management (IAM) enables us to securely control access
to AWS services and resources for our users, setting users and groups and using
permissions to allow and deny their access to AWS resources.
[Diagram: YOUR APP → AWS IAM → AWS SERVICES — e.g. a marketing app gets PUT & GET access to the AWS Email Service, while a backup system gets FULL ACCESS to AWS Storage]
12. We create a user (or a group of
users) with the Power User Access
level, in order to grant access
to all services.
Then, we have to download the
access and secret keys that we’ll
use with the JS SDK.
13. Now we can use the JS/Browser AWS SDK
Paste in your HTML:
<script src="https://sdk.amazonaws.com/js/aws-sdk-2.0.0-rc6.min.js"></script>
available on http://aws.amazon.com/javascript
Configure with your IAM credentials:
<script>
AWS.config.update({accessKeyId: 'akid', secretAccessKey: 'secret'});
AWS.config.region = 'eu-west-1'; //set your preferred region
</script>
14. Upload a file to Amazon Simple Storage Service with classic JS:
<input type="file" id="file-chooser" />
<button id="upload-button">Upload to S3</button>
<div id="results"></div>
<script type="text/javascript">
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var fileChooser = document.getElementById('file-chooser');
var button = document.getElementById('upload-button');
var results = document.getElementById('results');
button.addEventListener('click', function() {
var file = fileChooser.files[0];
if (file) {
results.innerHTML = '';
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.putObject(params, function (err, data) {
results.innerHTML = err ? 'ERROR!' : 'UPLOADED.';
});
}
else {
results.innerHTML = 'Nothing to upload.';
}
}, false);
</script>
20. Example
Create an app that helps users store incomes/expenses and track cashflow.
So we need:
- a database service to store private cashflow entries
- a storage service to upload private files (receipts, invoices, bills…)
- an authentication service that manages access to the database and storage
- an AngularJS app that merges all of the above
22. Step #1: set up the storage service
Simple Storage Service (S3) is a
cloud storage that lets us PUT and GET
private (backups, private images…)
and public (js, css, public images) files.
We just have to create a bucket
(folder) in S3 where we’ll store the
uploaded files.
23. Step #2: set up the database service
DynamoDB is a fully managed NoSQL
database stored in the cloud.
We pay for the provisioned throughput.
For example:
10 reads / 5 writes per sec = free
100 reads / 25 writes per sec = $31.58/month
We just have to create a new
table to store the user’s
incomes and expenses.
We set a low throughput for
the beginning.
25. We have to choose the indexes for the table.
We set a primary key (string type) called userID that will be useful later.
We also set a range key (numeric type) called timestamp that lets us quickly
query the entries ordered by "insert datetime".
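Later, addressing a single row requires both keys together. A minimal sketch of the composite key object that the SDK's low-level getItem expects, using the attribute names above (plain JS, no AWS calls; the id and timestamp values are illustrative):

```javascript
// Composite key for the "finance" table: both the primary key (userID)
// and the range key (timestamp) identify one row. The low-level
// DynamoDB API wraps every value in a type descriptor:
// S = string, N = number (always passed as a string).
function financeKey(userId, timestamp) {
  return {
    userID: { S: userId },
    timestamp: { N: String(timestamp) }
  };
}

// e.g. handler.getItem({TableName: 'finance', Key: financeKey(id, ts)}, callback);
var key = financeKey('amzn1.account.XYZ', 1391212800000);
```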
26. Step #3: create federated apps
We want to manage the authentication with trusted external identity providers
such as Amazon, Google and Facebook.
Go to http://login.amazon.com and create a new app.
There we can get the code for the login, as follows:
27. When we create the app we get a client ID and we can set the
allowed origins (the URLs of our test/production
web application). HTTPS is required.
29. Step #4: create the IAM role
Go back to http://aws.amazon.com/console,
and add a new role in the IAM area linked to our Amazon app.
31. We add a policy to this role giving
full access to S3 and DynamoDB,
thanks to the policy generator:
32. {
"Version": "2012-10-17",
"Statement": [
{
"Sid": "Stmt1291088462000",
"Effect": "Allow",
"Action": [
"s3:*"
],
"Resource": [
"arn:aws:s3:::financeapptest"
]
},
{
"Sid": "Stmt1291088490000",
"Effect": "Allow",
"Action": [
"dynamodb:*"
],
"Resource": [
"arn:aws:dynamodb:eu-west-1:728936874546:table/finance"
]
}
]
}
This is an example of the policy generated (full access to S3 bucket and DynamoDB table):
33. Get your code and add it after <body>:
<div id="amazon-root"></div>
<script type="text/javascript">
window.onAmazonLoginReady = function() {
amazon.Login.setClientId('YOUR-CLIENT-ID');
};
(function(d) {
var a = d.createElement('script'); a.type = 'text/javascript';
a.async = true; a.id = 'amazon-login-sdk';
a.src = 'https://api-cdn.amazon.com/sdk/login1.js';
d.getElementById('amazon-root').appendChild(a);
})(document);
</script>
Create a button (#LoginWithAmazon) and bind the click event:
<script type="text/javascript">
document.getElementById('LoginWithAmazon').onclick = function() {
var options = { scope : 'profile' };
amazon.Login.authorize(options, 'https://www.example.com/handle_login.php');
return false;
};
</script>
34. The user is redirected to https://www.example.com/handle_login.php?ACCESS_TOKEN=XYZ
new AWS.STS().assumeRoleWithWebIdentity({
RoleArn: 'the-arn-of-the-role',
RoleSessionName: 'the-name-of-the-role',
WebIdentityToken: ACCESS_TOKEN,
ProviderId: 'www.amazon.com'
}, function(err, data) {
if(data && data.Credentials) {
console.log(data); //we get the Amazon User ID
}
});
40. Thanks to that, you have a configuration available throughout your app.
Now we have to find a way to work with the cloud services, integrating
the AWS SDK into our app. There are several ways to do that with
AngularJS. In this case we create factory services to wrap each
needed feature.
First, a service to manage authentication.
41. 'use strict';
angular.module('myApp.services', [])
//provide methods to manage credentials of federated user
.factory('loggerManager', function(configLogger, $location, $rootScope){
var baseFactory = {
handler: new AWS.STS(),
provider: false,
credentials: {},
id: false
};
/**
* logout method (based on ID provider)
*/
baseFactory.logout = function() {
if(baseFactory.provider == "amazon") {
amazon.Login.logout();
}
};
42. /**
* login method (based on provider)
* @param provider the name of provider
* @param data data used for the login
* @param redirect the destination after login
*/
baseFactory.login = function(provider, data, redirect) {
//get the access params from AWS with the amazon login
if(provider == "amazon") {
AWS.config.credentials = new AWS.WebIdentityCredentials({
RoleArn: configLogger.amazonRoleArn,
ProviderId: 'www.amazon.com', // this is null for Google
WebIdentityToken: data.access_token
});
//assume role from AWS
baseFactory.handler.assumeRoleWithWebIdentity({
RoleArn: configLogger.amazonRoleArn,
RoleSessionName: configLogger.amazonRoleName,
WebIdentityToken: data.access_token,
ProviderId: "www.amazon.com"
}, function(err, data){
//login ok
if(data && data.Credentials) {
baseFactory.provider = provider;
baseFactory.credentials = data.Credentials;
baseFactory.id = data.SubjectFromWebIdentityToken;
if(redirect) {
$location.path(redirect);
$rootScope.$apply();
}
}
});
}
};
43. /**
* return the access key provided by amazon, google, fb...
*/
baseFactory.getAccessKeyId = function() {
if(baseFactory.credentials.AccessKeyId) {
return baseFactory.credentials.AccessKeyId;
}
else {
return "";
}
};
/**
* return the secret access key provided by amazon, google, fb...
*/
baseFactory.getSecretAccessKey = function() {
if(baseFactory.credentials.SecretAccessKey) {
return baseFactory.credentials.SecretAccessKey;
}
else {
return "";
}
};
/**
* return the user id
*/
baseFactory.getUserId = function() {
if(baseFactory.id) {
return baseFactory.id;
}
else {
return "";
}
};
return baseFactory;
})
44. Then, a service to work with S3. This is a tiny example:
// provides methods to put and get file on S3
.factory('s3Ng', function(configAWS, loggerManager){
var baseFactory = {
handler:false
};
/**
* start the service
*/
baseFactory.build = function() {
baseFactory.handler = new AWS.S3({params: {Bucket: configAWS.bucketName}});
};
/**
* put file on the cloud storage
* @param fileName
* @param fileBody
*/
baseFactory.put = function(fileName, fileBody) {
var params = {Key: loggerManager.provider + "/" + loggerManager.getUserId()
+ "/" + fileName, Body: fileBody};
baseFactory.handler.putObject(params, function (err, data) {
console.log(data);
});
};
return baseFactory;
})
45. Working with DynamoDB is more complex. This is an example:
.factory('dynamoNg', function (configAWS, loggerManager) {
var baseFactory = { handler:false };
//build the service
baseFactory.build = function() {
baseFactory.handler = new AWS.DynamoDB({region: configAWS.region});
};
/**
* put an element into a DynamoDB table
* @param table the table name
* @param data the item data, JSON formatted for DynamoDB
* @return the AWS request object
*/
baseFactory.put = function(table, data) {
return baseFactory.handler.putItem({
TableName: table,
Item: data
});
};
/**
* Get an element from a DynamoDB table
* @param table the table name
* @param data the key of the item to fetch
* @return the AWS request object
*/
baseFactory.get = function(table, data) {
return baseFactory.handler.getItem({
TableName: table,
Key: data
});
};
46. /**
* parse the dynamo data
* @param the data
* @returns the data extracted
*/
baseFactory.reverseModel = function(response) {
var result = [];
if(response.data.Count) {
for(var ii in response.data.Items) {
var item = response.data.Items[ii];
result[ii] = {};
for(var kk in item) {
if(item[kk].S) {
result[ii][kk] = item[kk].S;
}
if(item[kk].N) {
result[ii][kk] = item[kk].N;
}
//binary type is missing!
}
}
}
return result;
};
return baseFactory;
})
;
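The timestamp range key also enables ordered reads. A hedged sketch of the params such a service could pass to handler.query() — this uses the legacy KeyConditions syntax of this SDK generation; the table and attribute names are the ones from the example, and buildUserQuery is not part of the original app:

```javascript
// Build DynamoDB query params: every row of one user, newest first.
// ScanIndexForward:false walks the timestamp range key in descending order.
function buildUserQuery(table, userId) {
  return {
    TableName: table,
    KeyConditions: {
      userID: {
        ComparisonOperator: 'EQ',
        AttributeValueList: [{ S: userId }]
      }
    },
    ScanIndexForward: false
  };
}

// usage: baseFactory.handler.query(buildUserQuery('finance', loggerManager.getUserId()), cb);
var q = buildUserQuery('finance', 'u1');
```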
48. In a controller, start the auth:
.controller('HomeCtrl', function($scope) {
// login button to auth with amazon.com app
document.getElementById('LoginWithAmazon').onclick = function() {
var options = { scope : 'profile' };
amazon.Login.authorize(options, '/dynamofinance/app/#/logged/amazon');
return false;
};
})
And a controller to manage login (after app auth) and logout:
.controller('LoginCtrl', function($scope, $routeParams, loggerManager) {
//user comes back from amazon.com app login success
if($routeParams.access_token) {
//do the login with the provider got from the url
loggerManager.login($routeParams.provider, $routeParams, "/finance/list");
}
})
.controller('LogoutCtrl', function($scope, $routeParams, loggerManager) {
loggerManager.logout();
})
49. In a controller, how to work with services:
.controller("FinanceCtrl", function($scope, $routeParams, dynamoNg, dynamoFinanceTable, s3Ng,
loggerManager, configLogger, configAWS){
//build services
dynamoNg.build();
s3Ng.build();
//.... More code here
//upload file to S3
$scope.uploadFile = function() {
s3Ng.put("your filename", $scope.upload);
$scope.entryId = false;
};
//store movement
$scope.add = function(el) {
//prepare the data to store
el.date = el.date.toString();
var movement = dynamoFinanceTable.modelAmount(el);
//store the data
$scope.putMovement(movement);
$scope.formReset(false);
};
$scope.putMovement = function(movement) {
dynamoNg.put(configAWS.tableName, movement)
.on('success', function(response) {
$scope.entryId = response.request.params.Item.date.S;
$scope.$apply();
})
.on('error', function(error, response) { console.log(error); })
.send();
};
//... More code here
});
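dynamoFinanceTable.modelAmount isn't shown in these slides; a hypothetical sketch of what such a marshalling helper might look like, assuming the fields visible above (a date read back as Item.date.S, a numeric amount) plus the table keys — field names here are assumptions, not the real app's code:

```javascript
// Hypothetical helper: marshal a plain movement object into a
// DynamoDB-typed item (S = string, N = numeric, passed as a string).
function modelAmount(el, userId, timestamp) {
  return {
    userID: { S: userId },               // table primary key
    timestamp: { N: String(timestamp) }, // table range key
    date: { S: el.date },                // stored as a string (read back via Item.date.S)
    amount: { N: String(el.amount) }
  };
}

var movement = modelAmount({date: '2014-01-31', amount: 42.5}, 'amzn1.account.XYZ', 1391212800000);
```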
50. You can find an example on GitHub: it’s a work-in-progress app,
don’t use it in production. It’s still under development and testing.
https://github.com/gmittica/angularjs-aws-test-app
But our work is not finished.
52. Security problem
The files & data that we’re storing in AWS
are protected from unauthorized users,
but are fully visible to other authorized users:
each user has access to the data of the other ones.
We have to refine the policies by adding fine-grained conditions.
53. Step #5: fix the role policy
In Simple Storage Service, we can limit the access of each user to a
specific subfolder named after his userId.
{
"Effect":"Allow",
"Action":[
"s3:ListBucket"
],
"Resource":[
"arn:aws:s3:::financeuploads"
],
"Condition":{
"StringLike":{
"s3:prefix":[
"amazon/${www.amazon.com:user_id}/*"
]
}
}
},
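The statement above only scopes the ListBucket action. To actually read and write objects, the policy needs a companion statement on the object ARNs; a sketch following the same pattern (same bucket name, same policy variable):

```json
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:PutObject"
],
"Resource": [
"arn:aws:s3:::financeuploads/amazon/${www.amazon.com:user_id}/*"
]
}
```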
55. In DynamoDB, thanks to fine-grained access control,
we can allow access only to the rows
owned by the user (the rows with his userID).
It is also possible to restrict the access of the
role to specific columns.
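AWS documents a condition key for this, dynamodb:LeadingKeys, which compares an item's hash key to the federated user's id. A sketch of such a statement, reusing the table ARN from the earlier policy (actions trimmed to what the app needs):

```json
{
"Effect": "Allow",
"Action": [
"dynamodb:GetItem",
"dynamodb:PutItem",
"dynamodb:Query"
],
"Resource": [
"arn:aws:dynamodb:eu-west-1:728936874546:table/finance"
],
"Condition": {
"ForAllValues:StringEquals": {
"dynamodb:LeadingKeys": [
"${www.amazon.com:user_id}"
]
}
}
}
```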