
Amazon Elastic Compute Cloud

Original author: 
r_adams

Moving from physical servers to the "cloud" involves a paradigm shift in thinking. In a physical environment you generally care about each individual host: each has its own static IP, you probably monitor them individually, and if one goes down you have to get it back up ASAP. You might think you can just move this infrastructure to AWS and start getting the benefits of the "cloud" straight away. Unfortunately, it's not quite that easy (believe me, I tried). You need to think differently when it comes to AWS, and it's not always obvious what needs to be done.

So, inspired by Sehrope Sarkuni's recent post, here's a collection of AWS tips I wish someone had told me when I was starting out. These are based on things I've learned deploying various applications on AWS, both personally and for my day job. Some are just gotchas to watch out for (and that I fell victim to), some are things I've heard from other people that I ended up implementing and finding useful, but mostly they're just things I've learned the hard way.


[Photo: The tech unit's sign, autographed by its members.]

The reelection of Barack Obama was won by people, not by software. But in a contest as close as last week's election, software may have given the Obama for America organization's people a tiny edge—making them by some measures more efficient, better connected, and more engaged than the competition.

That edge was provided by the work of a group of people unique in the history of presidential politics: Team Tech, a dedicated internal team of technology professionals who operated like an Internet startup, leveraging a combination of open source software, Web services, and cloud computing power. The result was the sort of numbers any startup would consider a success. As Scott VanDenPlas, the head of the Obama technology team's DevOps group, put it in a tweet:

4Gb/s, 10k requests per second, 2,000 nodes, 3 datacenters, 180TB and 8.5 billion requests. Design, deploy, dismantle in 583 days to elect the President. #madops



Not everyone wants to run their applications on the public cloud. Their reasons can vary widely. Some companies don’t want the crown jewels of their intellectual property leaving the confines of their own premises. Some just like having things run on a server they can see and touch.

But there’s no denying the attraction of services like Amazon Web Services or Joyent or Rackspace, where you can spin up and configure a new virtual machine within minutes of figuring out that you need it. So, many companies seek to approximate the experience they would get from a public cloud provider on their own internal infrastructure.

It turns out that a start-up I had never heard of before this week makes the most widely deployed platform for running these “private clouds,” and it’s not a bad business. Eucalyptus Systems essentially enables the same functionality on your own servers that you would expect from a cloud provider.

Eucalyptus said today that it has raised a $30 million Series C round of venture capital funding led by Institutional Venture Partners. Steve Harrick, general partner at IVP, will join the Eucalyptus board. Existing investors, including Benchmark Capital, BV Capital and New Enterprise Associates, are also in on the round. The funding brings Eucalyptus’ total capital raised to north of $50 million.

The company has an impressive roster of customers: Sony, Intercontinental Hotels, Raytheon, and the athletic-apparel group Puma. There are also several government customers, including the U.S. Food and Drug Administration, NASA, the U.S. Department of Agriculture and the Department of Defense.

In March, Eucalyptus signed a deal with Amazon to allow customers of both to migrate their workloads between the private and public environments. The point here is to give companies the flexibility they need to run their computing workloads in a mixed environment, or move them back and forth as needed. They could also operate them in tandem.

Key to this is a provision of the deal with Amazon that gives Eucalyptus access to Amazon’s APIs. What that means is that you can run processes on your own servers that are fully compatible with Amazon’s Simple Storage Service (S3) or its Elastic Compute Cloud, known as EC2. “We’ve removed all the hurdles that might have been in the way of moving workloads,” Eucalyptus CEO Marten Mickos told me. The company has similar deals in place with Wipro Infotech in India and CETC32 in China.
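In practice, API compatibility means tools written against Amazon's interfaces can usually be pointed at a private Eucalyptus front end just by changing the service endpoints and credentials. A rough sketch of what that configuration might look like follows; the hostname is a made-up placeholder, and the port and service paths are typical Eucalyptus-style values, not something to copy verbatim:

```shell
# Hypothetical eucarc-style configuration for a private Eucalyptus cloud.
# "cloud.example.internal" is a placeholder -- substitute the address of
# your own cloud controller. Amazon-compatible tools use these endpoint
# variables instead of the public AWS URLs.
export EC2_URL="http://cloud.example.internal:8773/services/Eucalyptus"
export S3_URL="http://cloud.example.internal:8773/services/Walrus"

# Credentials issued by the private cloud, not by Amazon.
export EC2_ACCESS_KEY="your-access-key"
export EC2_SECRET_KEY="your-secret-key"
```

With a setup like this, the same client tooling can target the private cloud or public AWS depending on which endpoints are loaded, which is what makes the "move workloads back and forth" scenario plausible.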

0
Your rating: None

[Chart: Total objects stored in Amazon S3 by quarter, through Q1 2012.]

Amazon has released some fairly impressive numbers showcasing the growth of Amazon Simple Storage Service (S3) over the years. By the end of the first quarter of 2012, there were 905 billion objects stored, and the service routinely handles 650,000 requests per second for those objects, with peaks that go even higher. To put that in perspective, that’s up from 262 billion objects stored just two years ago, and from 762 billion as of Q4 2011.

Or maybe it’s more impressive when you look further back: 2.9 billion in 2006, for example. And how fast is it growing? Well, says Amazon, every day, over a billion objects are added. That’s how fast.
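Those figures hang together: a quick back-of-the-envelope check on the quarter-over-quarter numbers quoted above (using the calendar length of Q1 2012 as the only assumption) confirms the "over a billion a day" claim:

```python
# Back-of-the-envelope check of S3's growth rate, using the figures
# quoted above: 762 billion objects at the end of Q4 2011 and
# 905 billion at the end of Q1 2012.
objects_q4_2011 = 762e9
objects_q1_2012 = 905e9
days_in_q1 = 91  # Jan 1 through Mar 31, 2012 (leap year)

added_per_day = (objects_q1_2012 - objects_q4_2011) / days_in_q1
print(f"~{added_per_day / 1e9:.2f} billion objects added per day")
```

That works out to roughly 1.5 billion new objects per day over the quarter, comfortably above the billion-a-day figure Amazon cites.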

The S3 object count keeps growing even though Amazon recently added ways to make it easier for objects to leave, including object expiration and multi-object deletion. Objects are added via the S3 APIs, AWS Import/Export, the AWS Storage Gateway, various backup tools, and Direct Connect pipes.
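Object expiration, for what it's worth, is driven by a lifecycle configuration attached to a bucket: rules that tell S3 to delete objects automatically some number of days after creation. A minimal sketch of such a rule follows; the rule ID and key prefix are made-up examples:

```xml
<LifecycleConfiguration>
  <Rule>
    <!-- Hypothetical rule: expire objects under logs/ 30 days after creation -->
    <ID>expire-old-logs</ID>
    <Prefix>logs/</Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>30</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```

With rules like this in place, objects drain out of a bucket on their own, which makes the continued net growth of the object count all the more striking.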

Note that the chart above shows Q4 data up until this year, as Amazon only has data through Q1 2012. So that’s not any sort of slowdown you’re seeing there – by Q4 2012, that number is going to be much, much higher.




Maybe you're a Dropbox devotee. Or perhaps you really like streaming Sherlock on Netflix. For that, you can thank the cloud.

In fact, it's safe to say that Amazon Web Services (AWS) has become synonymous with cloud computing; it's the platform on which some of the Internet's most popular sites and services are built. But just as cloud computing is used as a simplistic catchall term for a variety of online services, the same can be said for AWS—there's a lot more going on behind the scenes than you might think.

If you've ever wanted to drop terms like EC2 and S3 into casual conversation (and really, who doesn't?), we're going to demystify the most important parts of AWS and show you how Amazon's cloud really works.

