Understand The AWS Lambda Cold Start Problem | 3 Ways To Prevent It



The serverless ecosystem has been transforming software development for the past few years, and AWS has been one of its flagship promoters. If you work within the AWS ecosystem, chances are you will use Lambda functions. When using AWS Lambda, the issue of cold starts arises, and it can degrade the user experience. In this blog, we will discuss how to avoid or mitigate the AWS Lambda cold start problem.


What is an AWS lambda cold start?

Cold starts can kill Lambda performance when you are building a customer-facing application that must operate in real time. A cold start happens when your Lambda isn't currently running: AWS needs to deploy your code and spin up a container before the request can be handled. This means a request can take considerably longer to process, because your function only starts running after the container is ready.

A cold start occurs on the first request that a new Lambda worker handles. This request takes longer to process because the Lambda service must:

  1. Find a space in the EC2 fleet to allocate the worker.
  2. Initialize the worker.
  3. Initialize the function module before passing the request to the handler function.
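The third step is where your own code matters most: everything at module scope runs once per new worker, before the first request is served. A minimal Python sketch of that split (the names `RESOURCES` and `_expensive_init` are illustrative, not part of any AWS API):

```python
# Module-level code runs once, during the cold start (step 3 above).
# Heavy imports and client setup belong here so that warm invocations
# reuse them instead of paying the cost again.
INIT_COUNT = 0

def _expensive_init():
    """Stand-in for loading SDK clients, config, DB connections, etc."""
    global INIT_COUNT
    INIT_COUNT += 1
    return {"client": "ready"}

RESOURCES = _expensive_init()  # paid once per worker, at cold start

def handler(event, context):
    # Warm invocations reuse RESOURCES; INIT_COUNT stays at 1.
    return {"init_count": INIT_COUNT, "resources": RESOURCES}
```

Every subsequent invocation routed to the same warm worker skips the initialization entirely, which is exactly why only the first request pays the cold-start penalty.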


The fact is that cold starts are an inherent byproduct of serverless. A cold start refers to the time AWS Lambda spends warming up a container before it can serve a request.

AWS keeps a supply of containers ready to spin up when functions are invoked. Functions are kept warm for a limited time after executing, so that the container is ready if the function is invoked again.

Cold starts account for fewer than 0.25 percent of Lambda requests, but their impact can be significant: a cold start can add up to 5 seconds to a function's execution. This problem matters most for applications that need real-time execution or rely on split-second timing.


What is provisioned concurrency and how can it solve cold starts?

AWS Lambda offers provisioned concurrency, a feature that gives you more control over the performance of serverless applications. By using provisioned concurrency, you can avoid cold starts and startup latency issues for your Lambda functions.

Provisioned concurrency allows you to build scalable serverless applications with predictable latency. You can also set the desired concurrency for any version or alias of a function.

AWS Lambda keeps function containers initialized and guarantees that they can be invoked with double-digit millisecond latency. This means serverless applications can absorb sudden spikes in traffic or large scaling events without a latency penalty.

However, provisioned concurrency comes at a price: you are charged from the moment you enable it, rounded up to the nearest 5 minutes. The price is calculated from the amount of concurrency you configure and the memory allocated to the function. This means you should size provisioned concurrency carefully, specifying just enough concurrency for your workload to avoid unnecessary costs.
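To make the cost trade-off concrete, here is a back-of-the-envelope sketch of the concurrency-times-memory-times-duration calculation. The per-GB-second rate below is purely illustrative, not a quoted AWS price; always check the current pricing page for your region.

```python
# Illustrative rate only -- check current AWS pricing for your region.
RATE_PER_GB_SECOND = 0.0000041667  # assumed USD rate, for the sketch

def provisioned_concurrency_cost(concurrency, memory_gb, hours):
    """Approximate cost of keeping `concurrency` instances warm for `hours`."""
    seconds = hours * 3600
    return concurrency * memory_gb * seconds * RATE_PER_GB_SECOND

# Keeping 10 instances of a 1 GB function warm for a 730-hour month:
monthly = provisioned_concurrency_cost(10, 1.0, 730)
```

Under these assumed numbers the bill lands around $110 per month before any invocation charges, which is why over-provisioning concurrency "just in case" gets expensive quickly.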


 

Also, read AWS Pricing Lambda: How Much Does It Cost To Run An Application Without A Server?

 

How to turn on provisioned concurrency?

The steps below explain how to configure provisioned concurrency for a Lambda function using the AWS Management Console.

  1. Choose an existing Lambda function in the AWS Lambda console.
  2. Choose Publish new version in the Actions dropdown menu. This lets you apply settings to an alias or a published version of the function.
  3. Add a description for the version and choose Publish.
  4. Choose Create alias in the Actions dropdown menu and enter a name for the alias.
  5. Choose a version in the Version dropdown menu, then choose Create.
  6. Find the Provisioned concurrency card and choose Add.
  7. Select the Alias radio button for Qualifier type, choose the alias you created previously in the dropdown menu, define the required value for provisioned concurrency, and choose Save.
  8. Return to the Lambda console; the Provisioned concurrency card should display the In progress status.


After some time the initialization procedure will complete, and you can then use the published alias as a function with provisioned concurrency enabled.

 

These steps apply to the AWS Management Console. AWS CloudFormation, the AWS CLI, and the AWS SDKs can also be used to configure provisioned concurrency.
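As a sketch of the SDK route, the console steps above boil down to a single call to `put_provisioned_concurrency_config` in the AWS SDK for Python (boto3). The function name, alias, and concurrency value here are placeholders; the Lambda client is passed in as a parameter, and in real use it would be `boto3.client("lambda")`.

```python
# Programmatic equivalent of the console steps above, using boto3's
# put_provisioned_concurrency_config API.
def enable_provisioned_concurrency(lambda_client, function_name, alias, units):
    # Qualifier must be a published version or alias, never $LATEST,
    # which is why the console flow publishes a version first.
    return lambda_client.put_provisioned_concurrency_config(
        FunctionName=function_name,
        Qualifier=alias,
        ProvisionedConcurrentExecutions=units,
    )
```

Passing the client in (rather than constructing it inside the function) also makes the call easy to exercise against a stub in tests.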


Ways to prevent AWS lambda cold starts

 

1. Keep an eye on your application to see how cold starts are affecting it

Cold starts can happen even if you use provisioned concurrency appropriately. It's critical to monitor your applications and see how cold starts influence performance. Some requests will have increased latency as a result of cold starts, and you must determine which requests are affected and whether they harm your end users.

Although it involves some active deduction on your side, both CloudWatch Logs and X-Ray can help you discover where and when cold starts occur in your application. It's much easier to monitor how cold starts affect your application using a serverless-focused monitoring tool such as Lumigo.
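One concrete signal in CloudWatch Logs: Lambda writes a REPORT line for every invocation, and the `Init Duration` field appears only on cold starts. A small parsing sketch (the sample log line is illustrative, not taken from a real account):

```python
import re

# Sample REPORT line of the shape Lambda emits to CloudWatch Logs;
# "Init Duration" is present only when the invocation was a cold start.
REPORT = ("REPORT RequestId: 4f02... Duration: 102.25 ms "
          "Billed Duration: 103 ms Memory Size: 128 MB "
          "Max Memory Used: 54 MB Init Duration: 1432.11 ms")

def init_duration_ms(report_line):
    """Return the cold-start init time in ms, or None for a warm start."""
    m = re.search(r"Init Duration: ([\d.]+) ms", report_line)
    return float(m.group(1)) if m else None
```

Filtering your log groups for `Init Duration` is a quick way to count cold starts and measure how much latency they actually add before reaching for a dedicated tool.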

 

2. Reduce the number of packages

The size of the deployment package itself has less influence on the AWS Lambda cold start problem than the initialization time spent loading the package's modules for the first time.

The more items the container has to load, the longer initialization takes. Browserify and Serverless Plugin Optimize are two tools that can help you minimise the number of packages you ship.
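A complementary tactic in your own code is to defer rarely-used heavy imports into the handler branch that actually needs them, so they never run during initialization. A sketch, with the standard-library `json` module standing in for a genuinely heavy dependency:

```python
# Importing everything at module scope makes the cold start pay for all
# of it. Deferring a rarely-used heavy dependency into the branch that
# needs it keeps it off the initialization path entirely.
def handler(event, context):
    if event.get("format") == "json":
        import json  # stand-in for a heavy library such as numpy or pandas
        return json.dumps({"ok": True})
    return "ok"
```

The trade-off is that the first request taking the rare branch pays the import cost at invocation time instead, so this only helps for dependencies most invocations never touch.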

 

3. Use Node.js, Python or Golang

Cold start times can be reduced to an acceptable range (around 500 ms) with minimal effort if Lambda functions are written in Node.js, Python, or Golang. That means that even when a cold start occurs, the application's response time can stay within the SLA.

In one experiment, Nathan Malishev found that Python, Node.js, and Go required considerably less time to initialize than Java or .NET, with Python performing at least twice as fast as Java, depending on memory allocation.


Conclusion

While working with the serverless ecosystem, you will need to deal with the AWS Lambda cold start problem. Cold starts cannot be eliminated, but they can be reduced and managed effectively with careful planning and execution.

Many serverless experts have found unique solutions to prevent AWS Lambda cold starts based on their own experiments. Nevertheless, there is no one-size-fits-all solution for cold starts. You should analyse your workload and carefully pick the approach that suits you best.