By the end of this blog, you will know some DevOps practices that are not obvious in the context of serverless applications. Most of them are lessons we learnt while building and running NotesAlly.
What do we mean by Serverless?
Before we define serverless, let's agree on what a server is: it is simply a process that listens on a port for requests and processes them.
In general, when we run an application in a non-serverless environment, our ownership ranges from the application code all the way down to the hardware. At the minimum, we manage the application code plus some configuration of the server it runs on (in the case of PaaS solutions such as Heroku). At the maximum, we manage the application code, the server, the OS and the hardware itself (in the case of on-prem servers). In a serverless environment, by contrast, we don't manage the server or anything beneath it; we manage only the application code that delivers the functionality we intend to build.
DevOps aspects that are not so obvious in Serverless Applications
#1 Start with Offline Development ASAP
When building a serverless application, start thinking about offline development on your local machine from day one.

This not only speeds up development cycles but also removes the need for a constant internet connection during development. In times when people are encouraged to work from non-office locations, we cannot assume everyone has good connectivity, so offline development spares them some hassle. It also:
- Reduces the wait to test: we don't need to push the entire code to the cloud before verifying functionality.
- Saves cost, since local test runs don't consume cloud infrastructure.
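As one concrete option, if you use the Serverless Framework, the community serverless-offline plugin emulates API Gateway and Lambda on localhost. A sketch of the relevant serverless.yml fragment (the service, function and handler names are placeholders):

```yaml
service: notes-api          # placeholder service name

plugins:
  - serverless-offline      # emulates API Gateway + Lambda locally

functions:
  hello:
    handler: handler.hello  # placeholder handler
    events:
      - http:
          path: hello
          method: get
```

With the plugin installed as a dev dependency, `npx serverless offline` serves the API on localhost so you can iterate without deploying.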
#2 Don’t forget to test it on Cloud
Having an offline development environment does not mean you should never test on the cloud until the production deployment. In our experience, the offline environment is not as restrictive as the cloud platform itself, and there can be surprises if you don't test your work on the cloud.
For example, AWS has a feature called a Lambda authorizer: an API Gateway feature that uses a Lambda function to control access to your API. It has restrictions when executed in the cloud that the offline development environment does not impose.
#3 Don’t forget to monitor DB Connection Count
When dealing with serverless applications, everything can scale up and down automatically based on load. When the application scales up, bear in mind that the number of database connections also grows, to the point where it can become a bottleneck. So there should be a mechanism, such as a monitoring dashboard, to keep track of database connections and take proactive measures before things go south.
#4 Services need to be backward Compatible
We cannot just modify an API and expect everything to keep working: the change could break other APIs that depend on the modified one. And if different clients use different versions of an API, then versioning needs additional thought.
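One safe pattern is to make changes purely additive: keep every field existing clients rely on, and only add new optional ones that old clients can ignore. A sketch with hypothetical response builders for a notes API:

```javascript
// v1 response shape that existing clients depend on.
function noteResponseV1(note) {
  return { id: note.id, text: note.text };
}

// Backward-compatible evolution: every v1 field is preserved,
// and new fields are optional additions old clients simply ignore.
function noteResponseV2(note) {
  return {
    ...noteResponseV1(note),        // never remove or rename v1 fields
    tags: note.tags ?? [],          // new optional field
    updatedAt: note.updatedAt ?? null,
  };
}
```

A v1 client reading a v2 response still finds `id` and `text` exactly where it expects them, so both versions can be served by the same code path.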
#5 Nitpick about the service startup time
AWS Lambda can drop a function's container after a period of inactivity, at which point the function becomes inactive, or cold.
There are two ways AWS starts a Lambda function: a warm start and a cold start.

A warm start happens when there is an available container.

A cold start happens when you execute an inactive Lambda function: there are no available containers, so a new one has to be started. Creating that container introduces a delay, which is why cold starts make serverless applications respond more slowly. We cannot avoid cold starts entirely, but we can reduce their duration and frequency.
We can reduce cold-start time by avoiding dependencies that take a long time to load, by preferring a dynamically typed language (whose runtime tends to start faster), and by increasing the memory allocated to the Lambda, which also grants more CPU capacity for a quicker startup.
#6 Avoid libraries requiring native compilation
Using a library that requires native compilation is trivial in a server-based application: the steps are well documented — install the required npm module, then import and use it. In a serverless environment it is not so trivial and requires additional effort. Because we have no control over the underlying OS that our APIs (AWS Lambdas) run on, we cannot simply compile for a particular OS and assume it will work without issues. So look for alternative libraries that don't need native compilation.
We hope you found this useful!