The Easiest Way to Compute in the Cloud – AWS Lambda
When AWS launched, it changed how developers thought about IT services: What used to take weeks or months of purchasing and provisioning turned into minutes with Amazon EC2. Capital-intensive storage solutions became as simple as PUTting and GETting objects in Amazon S3. At AWS we innovate by listening to and learning from our customers, and one of the things we hear from them is that they want it to be even simpler to run code in the cloud and to connect services together easily. Customers want to focus on their unique application logic and business needs – not on the undifferentiated heavy lifting of provisioning and scaling servers, keeping software stacks patched and up to date, handling fleet-wide deployments, or dealing with routine monitoring, logging, and web service front ends. So we challenged ourselves to come up with an easy way to run applications without having to manage the underlying infrastructure and without giving up on the flexibility to run the code that developers wanted. Our answer is a new compute service called AWS Lambda.
AWS Lambda makes building and delivering applications much easier: it gives you a simple interface to upload your Node.js code directly to Lambda and to set the triggers that run it (triggers can come from other AWS services such as Amazon S3 or Amazon DynamoDB, to name a couple), and that’s it: you’re ready to go. AWS handles all the administration of the underlying compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, code and security patch deployment, and code monitoring and logging. You can go from code to service in three clicks and then let AWS Lambda take care of the rest.
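To give you a feel for the programming model, here is a minimal sketch of what a Node.js function uploaded to Lambda looks like. The exported handler and the context callbacks follow the Node.js runtime conventions; everything inside the body is just placeholder logic:

```javascript
// A minimal Lambda handler: Lambda invokes the exported function with the
// triggering event and a context object used to signal completion.
exports.handler = function(event, context) {
    console.log('Received event:', JSON.stringify(event, null, 2));
    // ... your application logic goes here ...
    context.succeed('done');   // or context.fail(err) on error
};
```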
One of the most exciting aspects of Lambda is that it helps you create dynamic, event-driven applications in the cloud. Lambda is launching in conjunction with a new Amazon S3 feature called event notifications, which generates events whenever objects are added or changed in a bucket, and with our recently announced Amazon DynamoDB Streams feature, which generates events when a table is updated. Now developers can attach code to Amazon S3 buckets and Amazon DynamoDB tables, and it will run automatically whenever those buckets or tables change. Developers don’t have to poll, proxy, or worry about being over or under capacity – Lambda functions scale to match the event rate and execute only when needed, keeping your costs low.
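As a sketch of what attaching code to a table looks like, here is a function wired to a DynamoDB table through DynamoDB Streams. Each invocation receives a batch of change records; the field names follow the Streams record format, and the processing shown is only illustrative:

```javascript
// Sketch of a Lambda function attached to a DynamoDB table via DynamoDB Streams.
// Lambda passes a batch of change records to each invocation.
exports.handler = function(event, context) {
    event.Records.forEach(function(record) {
        console.log('Change type:', record.eventName);            // INSERT, MODIFY, or REMOVE
        console.log('Keys:', JSON.stringify(record.dynamodb.Keys));
        if (record.dynamodb.NewImage) {
            console.log('New image:', JSON.stringify(record.dynamodb.NewImage));
        }
        // ... verify, audit, copy, or transform the item here ...
    });
    context.succeed('Processed ' + event.Records.length + ' records');
};
```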
Event-driven cloud computing makes it easy to create responsive applications, often without needing to write new APIs. For example, a mobile, tablet, or web application that uploads images to Amazon S3 can automatically trigger the generation of thumbnails with a few lines of code – no servers, queues, or new APIs are needed. Logs are equally easy to process – if you already use AWS CloudTrail to track API calls made to AWS services, you can now easily audit the results just by turning on S3 event notifications for the appropriate bucket and writing a few lines of JavaScript. Data stored in Amazon DynamoDB can be automatically verified, audited, copied, or transformed with an AWS Lambda function through the new Streams feature we announced earlier this week. AWS Lambda is also launching with support for Amazon Kinesis, making it easy to process data in a Kinesis stream…and we’re not stopping there – keep watching for more integration points between AWS Lambda and other AWS services that make it easy to respond to events of all types.
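The thumbnail example really does fit in a few lines. The sketch below is triggered by an S3 event notification, reads the uploaded image, resizes it, and writes the result to a second bucket. The bucket names are hypothetical, and it assumes the gm module (backed by ImageMagick) is available to the function:

```javascript
// Sketch: generate a thumbnail whenever an image is uploaded to a bucket.
var AWS = require('aws-sdk');
var gm = require('gm').subClass({ imageMagick: true });
var s3 = new AWS.S3();

exports.handler = function(event, context) {
    var bucket = event.Records[0].s3.bucket.name;   // source bucket from the S3 event
    var key = event.Records[0].s3.object.key;       // key of the uploaded object

    s3.getObject({ Bucket: bucket, Key: key }, function(err, data) {
        if (err) { return context.fail(err); }
        gm(data.Body).resize(150, 150).toBuffer('jpg', function(err, thumb) {
            if (err) { return context.fail(err); }
            s3.putObject({
                Bucket: bucket + '-thumbnails',     // hypothetical target bucket
                Key: key,
                Body: thumb,
                ContentType: 'image/jpeg'
            }, function(err) {
                if (err) { return context.fail(err); }
                context.succeed('Created thumbnail for ' + key);
            });
        });
    });
};
```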
We’re excited about event-driven computing – using AWS Lambda to extend other AWS services helps developers create applications that are simple, powerful, and inherently scalable. Lambda also excels at another challenge we hear about a lot from customers: turning library code into a scalable, secure, and reliable cloud-based backend. With Lambda, developers can upload any library, even native (“binary”) libraries, so a few lines of JavaScript are enough to turn a library into an AWS-operated cloud service exposed as a Lambda function. AWS Lambda’s “stateless” programming model lets you deploy quickly and scale seamlessly with the incoming request rate, so the same code that works for one request a day also works for a thousand requests a second.
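Wrapping a library this way is mostly a matter of mapping the incoming event onto a library call. In the sketch below, the module name and the request field are hypothetical stand-ins for whatever library you bundle with your function:

```javascript
// Sketch: exposing an existing library as a Lambda-backed service.
// "image-metadata" and event.imageUrl are hypothetical; substitute your own
// library and request shape.
var metadata = require('image-metadata');

exports.handler = function(event, context) {
    metadata.extract(event.imageUrl, function(err, result) {
        if (err) { return context.fail(err); }
        context.succeed(result);   // returned to the caller as the function's result
    });
};
```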
As with other AWS services, AWS Lambda can be accessed programmatically using the AWS SDK, through a RESTful web service API, from the command line interface, or through the AWS Lambda console. The console lets you edit and run code directly from a browser – you can author, debug, and experiment in real time without even needing an IDE. The AWS Lambda console can also create simulated events for Amazon S3 event notifications, Amazon DynamoDB Streams, and other event sources to help you verify how your code handles events from those sources. Once you’ve created and tested your Lambda function, you can monitor its performance and activity in the AWS Lambda console dashboard or through Amazon CloudWatch, including setting alarms on latency or error rates. Logs for your Lambda functions are automatically captured in Amazon CloudWatch Logs.
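For programmatic access, invoking a function from the AWS SDK for JavaScript looks roughly like the sketch below. The function name and payload are hypothetical, and the exact operation names available may vary with your SDK version:

```javascript
// Sketch: invoking a Lambda function with the AWS SDK for JavaScript.
var AWS = require('aws-sdk');
var lambda = new AWS.Lambda({ region: 'us-east-1' });

lambda.invoke({
    FunctionName: 'ProcessS3Event',                 // hypothetical function name
    Payload: JSON.stringify({ hello: 'world' })     // simulated event payload
}, function(err, data) {
    if (err) { console.error(err); return; }
    console.log('Status code:', data.StatusCode);
    console.log('Result:', data.Payload.toString());
});
```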
AWS Lambda is launching as a Preview with support for functions written in JavaScript (more languages to come) and event integration with Amazon S3, Amazon DynamoDB, and Amazon Kinesis. Preview mode lets you try all of AWS Lambda’s features, with a limit on concurrent function requests. We look forward to seeing what our customers will do with AWS Lambda and the new Amazon S3 and DynamoDB event features. We’d like to hear your thoughts on our new event-driven compute service and features, so please connect directly with the product team on the AWS Lambda forum.