One of five presentations at Chicago's Day of Cloud mini-conference. Chris McAvoy (http://www.psclistens.com) demonstrates cloud computing with Amazon services.
EC2 Pricing (per instance-hour, with approximate cost for a 720-hour month):

  Small               1.7 GB /  1 CPU    $0.10/hr    $72/month
  Large               7.5 GB /  4 CPU    $0.40/hr    $288/month
  X Large              15 GB /  8 CPU    $0.80/hr    $576/month
  High CPU            1.7 GB /  5 CPU    $0.20/hr    $144/month
  High CPU X Large      7 GB / 20 CPU    $0.80/hr    $576/month
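The monthly figures in the table are just the hourly rate multiplied by 720 hours (a 30-day month). A quick sketch to sanity-check the arithmetic; the instance names mirror the table labels:

```python
# Hourly rates from the table above; monthly cost assumes an instance
# left running for a full 30-day month (720 hours).
HOURS_PER_MONTH = 720

rates = {
    'Small': 0.10,
    'Large': 0.40,
    'X Large': 0.80,
    'High CPU': 0.20,
    'High CPU X Large': 0.80,
}

def monthly_cost(instance_type, hours=HOURS_PER_MONTH):
    """Cost of running one instance of the given type for `hours` hours."""
    return rates[instance_type] * hours

for name in sorted(rates):
    print("%s: $%.2f/month" % (name, monthly_cost(name)))
```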
lots of examples at http://code.google.com/p/boto/wiki/BotoS3Examples

My own S3 backup script:

#!/usr/bin/env python
import os
import dbm

import boto
from boto.s3.key import Key

c = boto.connect_s3()
b = c.get_bucket('mcavoy-photos')

def backup():
    # dbm database logs which files have already been uploaded
    db = dbm.open('picture_backup_log', 'c')
    os.chdir('/Users/cmcavoy/Pictures/iPhoto Library')
    for dirpath, dirnames, filenames in os.walk('Data'):
        for f in filenames:
            try:
                if db[f] == 'True':
                    print "%s logged in database, skipping" % f
                    continue
            except KeyError:
                pass
            k = Key(b)
            f_full = os.path.join(dirpath, f)
            k.key = f_full
            print "%s %s" % (f_full, k.exists())
            if not k.exists():
                print "uploading %s" % f_full
                k.set_contents_from_filename(f_full)
                print "done uploading %s" % f_full
            db[f] = 'True'
    db.close()

if __name__ == '__main__':
    backup()
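The skip-log pattern in the script, isolated from the S3 calls so it is easy to try locally: a small dbm database records which files have been handled, so a re-run skips them. The filenames and path here are made up for illustration.

```python
import dbm
import os
import tempfile

# Hypothetical log location; the real script uses 'picture_backup_log'
# in the working directory.
log_path = os.path.join(tempfile.mkdtemp(), 'backup_log')

def already_backed_up(db, filename):
    # dbm stores values as bytes, so compare against the byte marker
    return db.get(filename) == b'True'

db = dbm.open(log_path, 'c')
for f in ['IMG_001.jpg', 'IMG_002.jpg']:
    if already_backed_up(db, f):
        continue  # a second run takes this branch
    # ... upload to S3 would happen here ...
    db[f] = 'True'
db.close()
```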
This will be discussion only. CloudFront is a CDN service attached to S3: it lets you distribute files stored in S3 across multiple datacenters, like any reasonable CDN service.
Will go pretty quickly on this one as well. I wrote a script that moves a MySQL database into SimpleDB ( http://code.activestate.com/recipes/576548/ ):

import threading

import boto
import MySQLdb
from MySQLdb import cursors

db_user = 'root'
db_name = 'eve'

db = MySQLdb.connect(user=db_user, db=db_name, cursorclass=cursors.DictCursor)
c = db.cursor()
sdb = boto.connect_sdb(access_key, secret_key)

def get_or_create_domain(domain):
    try:
        d = sdb.get_domain(domain)
    except boto.exception.SDBResponseError:
        d = sdb.create_domain(domain)
    return d

def get_primary_key(table_name, cursor):
    """Return the name of the primary key column of the given table."""
    cursor.execute("SHOW INDEX FROM %s" % table_name)
    for row in cursor.fetchall():
        if row['Key_name'] == 'PRIMARY':
            return row['Column_name']
    raise Exception("Table %s does not have a primary key" % table_name)

class BotoWorker(threading.Thread):
    def __init__(self, name, record, domain):
        self.domain = domain
        self.name = name
        self.record = record
        threading.Thread.__init__(self)

    def run(self):
        print "inserting %s into %s" % (self.name, self.domain)
        item = self.domain.new_item(self.name)
        for key, value in self.record.items():
            try:
                item[key] = value
            except UnicodeDecodeError:
                item[key] = 'unicode error'

def main():
    c.execute("show tables like 'invtypes';")
    for table in c.fetchall():
        print table
        table = table["Tables_in_%s (invtypes)" % db_name]
        print "loading data from %s" % table
        total = c.execute("select * from %s" % table)
        print "fetched %s items from mysql" % total
        complete = 0
        for record in c.fetchall():
            name = record.pop(get_primary_key(table, c))
            thread_started = False
            # cap the worker pool at 30 concurrent upload threads
            while not thread_started:
                if threading.activeCount() < 30:
                    print "got a thread %s" % threading.activeCount()
                    BotoWorker(name=name, record=record,
                               domain=get_or_create_domain(table)).start()
                    thread_started = True
                    complete += 1
                    print "%s complete of %s" % (complete, total)

if __name__ == '__main__':
    main()
Another quick one. I’ll use examples from the boto documentation.
The bulk of the presentation is here.
boto is a Python library that wraps the Amazon Web Services APIs, including the cloud computing services. All the examples we'll look at today are written with boto.