Serverless Computing - A Primer for Decision Makers
It seems like only yesterday that cloud computing was deemed the next big thing in the business and IT landscape. Service providers scrambled to offer the best cloud services available, while organisations carefully planned how they could best make a smooth transition into the cloud environment.
Now fast forward about a decade. Cloud computing remains a game-changing technology which initiated a paradigm shift in many companies, not only in how they set up their network infrastructure, but also in how they run their operations. Over time though, provisioning resources in the cloud can become a tedious and complex task for IT administrators, especially when a business's primary aim is getting its product to market faster.
This is where serverless computing comes in.
Serverless Computing: What is it and how does it work?
Serverless computing is a form of cloud computing that provides backend services on a pay-per-use basis. This means that a user (in most cases, a developer) can write snippets of code and then run them right away, without having to think about provisioning and managing the underlying infrastructure. This simplifies code deployment and greatly reduces the administrative responsibility and cost of managing physical or virtual servers.
It should be understood, however, that this doesn’t completely remove servers. The term serverless is something of a misnomer in this regard, because servers are still very much in use. The user just doesn’t need to worry about them at all.
In serverless computing, also referred to as “function-as-a-service”, developers simply write and deploy code, period. The service provider automatically allocates the exact amount of resources needed to run the code, then bills the user for these, usually down to the nearest 100 milliseconds of task running time. The resources provided and the subsequent costs charged are very precise—no more and no less than what is actually used.
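To make the “write and deploy code, period” model concrete, here is a minimal sketch of what a serverless function might look like, written in the AWS Lambda style of Python handler. The event shape and the greeting logic are illustrative assumptions, not any provider’s required API; the point is that the developer’s entire deliverable is one small function, with no server process to manage.

```python
import json

def handler(event, context):
    """A minimal Lambda-style function. The cloud provider invokes this
    on demand (HTTP request, queue message, timer, etc.); the developer
    never provisions or manages the machine it runs on."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Called directly here for illustration; in production the provider
# supplies the event and context arguments.
response = handler({"name": "serverless"}, None)
print(response["body"])
```

In a real deployment, this file (plus any dependencies) is all that gets uploaded; scaling, routing, and billing are handled by the platform.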
Why go serverless?
Many service providers would say that serverless computing best exemplifies two of the primary attributes of cloud computing in general: computing resources that can scale at a moment’s notice, and paying only for what you use. While these same features are also definitive of IaaS, serverless takes it a step further.
First, computing resources auto-scale with demand, unlike “regular” cloud computing where the company has to anticipate peak periods in order to allocate more virtual resources, and then scale these down when they are no longer needed. Second, you really do pay only for what is actually used, as there are no server, storage, or networking resources on standby that need to be paid for. Instead, what is billed is simply the time during which a function or task is actually running.
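The pay-per-use maths can be sketched in a few lines. The rates and the 100 ms rounding increment below are hypothetical placeholders, not any real provider’s price list; check your provider’s pricing page for actual figures.

```python
import math

# Hypothetical rates for illustration only -- not a real price list.
PRICE_PER_GB_SECOND = 0.0000166667   # compute, per GB of memory per second
PRICE_PER_MILLION_REQUESTS = 0.20    # flat per-invocation charge

def monthly_cost(invocations, avg_duration_ms, memory_mb, increment_ms=100):
    """Estimate a month's bill when running time is rounded up
    to the provider's billing increment (here, 100 ms)."""
    billed_ms = math.ceil(avg_duration_ms / increment_ms) * increment_ms
    gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
    compute = gb_seconds * PRICE_PER_GB_SECOND
    requests = (invocations / 1_000_000) * PRICE_PER_MILLION_REQUESTS
    return compute + requests

# Example: 3 million requests a month, 120 ms average runtime, 128 MB memory.
print(f"${monthly_cost(3_000_000, 120, 128):.2f}")
```

Note how the bill is driven entirely by invocations and running time: if traffic drops to zero, so does the compute charge, which is the contrast with paying for idle provisioned servers.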
As beneficial as this setup may sound, however, businesses should know that a serverless architecture is not suitable for all types of software application.
Who would benefit most from using serverless?
A serverless architecture is essentially built on small, self-contained chunks of code, or functions, which can be deployed when and where needed. This makes it a suitable option for developers who are designing lightweight and flexible applications, and for existing applications that can be broken down into separate, independent blocks which are easy to update and expand.
On the other hand, sizeable applications with workloads that are relatively predictable, as well as legacy systems built on an entirely different structure, would gain little from being migrated to a serverless environment. In these cases, a traditional setup with dedicated servers, whether physical or virtual, would be more fitting from both a cost-efficiency and a system-architecture standpoint.
What are the pros and cons of serverless computing?
The biggest benefit of serverless computing is readily apparent: it adds a great deal of efficiency and speed to the development lifecycle. Automatic scalability, significantly lower server costs (no idle resources) and the elimination of server maintenance (freeing up more time for developers and admins) are the main reasons to consider adopting a serverless architecture, if the move makes sense.
But it’s not all good news. The drawbacks are real too:
Heavy reliance on vendor ecosystems. You don’t have to manage the servers yourself, but you are also completely dependent on how your vendor manages theirs. This means you have no control over server hardware, runtimes and runtime updates. Plus, you can’t easily switch providers if you need or want to.
Performance issues. Serverless computing makes you prone to ‘cold starts’: when a function has not run recently, the provider must spin up a fresh environment for it, which adds latency to that first invocation. You do have the option of keeping functions ‘warm’ by invoking them at regular intervals. It’s also best to keep serverless code small and focussed to minimise this problem.
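The keep-warm workaround mentioned above is usually implemented by pointing a scheduler at the function. A minimal sketch, assuming a made-up `{"warmup": True}` event shape (in practice you would match whatever payload your scheduler actually sends):

```python
def handler(event, context):
    """Handle both scheduled keep-warm pings and real requests.

    The {"warmup": True} event shape is a convention invented for this
    sketch, not a provider-defined format."""
    if event.get("warmup"):
        # A scheduled ping: the environment is now warm for the next
        # real request, so return immediately without doing any work.
        return {"warmed": True}
    # ... real request handling would go here ...
    return {"statusCode": 200, "body": "handled real request"}
```

The early return matters: warm-up pings are billed like any other invocation, so they should do as little work as possible.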
Security concerns. The issue of security is inherent in the cloud, and it’s no different in the serverless world. With your servers in the hands of the provider, you have no guarantee or full knowledge of their security policies and practices. This can be a huge concern, particularly if you have to deal with personal information or confidential data.
IT talent is scarce. Your company may be ready to go serverless, but are your developers? At this point, relatively few developers have hands-on experience designing and writing serverless applications. However, this may change sooner rather than later, considering the appeal of serverless computing to IT organisations today.
Which providers offer serverless computing?
Most major cloud providers offer serverless computing, namely AWS (Lambda), Google Cloud Platform (Cloud Functions), Microsoft Azure (Azure Functions), and IBM Cloud (Cloud Functions). Each provider has its own feature set, so compare these first to see how you can maximise the benefits and mitigate the risks outlined above.
So, is serverless computing the best option for you? Only you can answer that question, but having a better understanding of the technology should help you make the right call.