The Rise of Serverless Cloud Computing
Serverless cloud computing started as an interesting point of discussion in 2013-14 (think Backend-as-a-Service), but is now center stage in the cloud computing conversation. The fundamental premise of Serverless computing is that developers do not see or manage the server in a virtualized environment (servers still exist, of course; the logic has to run somewhere). Compute resources are not allocated in fixed blocks but on demand, triggered by events. This allows extremely high utilization rates for compute resources, making the model especially efficient for short-lived, asynchronous, or real-time processing. The construct is therefore also known as Function-as-a-Service (FaaS).
With enterprises worldwide adopting cloud computing for diverse needs, it is only natural that they would also gravitate towards Serverless architecture to build greater efficiency and scalability into their asynchronous processes. By eliminating the need to manually provision, maintain, and scale compute resources, Serverless architectures make far more efficient use of IT systems, especially in distributed computing scenarios with highly variable workloads (the telecom, entertainment, and ecommerce industries should find many use cases that fit such workload criteria).
Leading cloud vendors have specific offerings for cloud-native application development using Serverless architecture. From the compute perspective, we have AWS Lambda and Google Cloud Functions (both supporting Node.js, among other runtimes). Microsoft has Azure Functions and Azure Data Lake, which run on the Azure Cloud. Next-generation platforms like Kubernetes (an open-source container orchestration tool originally developed at Google and now maintained by the Cloud Native Computing Foundation) have also been adapted to run Serverless workloads. And while adoption is still nascent, there are a number of scenarios where this architecture has strong benefits:
- Provisioning and auto-scaling resources for voluminous and variable workloads – e.g., data coming in from IoT devices
- Combining traditional and new (Big Data) workloads for advanced analytics and real-time decision support – e.g., real-time risk profiling of insurance members using various data sources such as claims, hospital data, social data, registry data, inflation index, etc.
- Performing powerful analytics and rules-based processing on streaming data – e.g., GPS position data (derived from satellite signals) combined with motion sensor data to calculate speed and project turnaround times (useful in supply chain, transportation, logistics)
- AI-based use cases such as Robotic Process Automation and chatbots that require large-scale, asynchronous data processing and rapid scalability of compute resources
- Distributed computing use cases, such as batch processing of very large data sets (this is the basis of traditional Big Data processing constructs like MapReduce)
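The streaming-analytics use case above (GPS plus motion-sensor data in logistics) maps naturally onto a per-event function: each invocation receives a couple of GPS fixes, estimates speed, and applies a rule. The sketch below is a minimal, self-contained illustration of that processing step; the field names, the haversine helper, and the delay threshold are assumptions for the example, not part of any vendor's API.

```javascript
// Great-circle distance between two lat/lon points in kilometres
// (haversine formula), used to turn consecutive GPS fixes into distance.
function haversineKm(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Each event carries two consecutive fixes: { lat, lon, ts } with ts in ms.
function estimateSpeedKmh(prev, curr) {
  const km = haversineKm(prev.lat, prev.lon, curr.lat, curr.lon);
  const hours = (curr.ts - prev.ts) / 3600000; // ms -> hours
  return hours > 0 ? km / hours : 0;
}

// A trivial business rule: flag vehicles moving below a threshold,
// which downstream logic could use to re-project turnaround times.
function isDelayed(speedKmh, thresholdKmh = 30) {
  return speedKmh < thresholdKmh;
}
```

In a serverless deployment, `estimateSpeedKmh` and `isDelayed` would sit inside an event handler triggered by each batch of sensor readings, so the platform scales the rule engine with the volume of the vehicle fleet.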
In many ways, Serverless computing takes the notion of cloud computing to the highest business-relevant level of abstraction (any lower and we would be back to discussing VMs, hypervisors, and operating systems). It gives unmatched flexibility to developers and IT teams to architect, develop, and deploy cloud-native applications. At the same time, IT teams need to build strong skills around specific tools (such as AWS Lambda, Kubernetes Kubeless, or Google Cloud Functions) as well as supported development platforms (such as Node.js, Python, and C#). And while it has its benefits, organizations need to start slow (with a few strong POCs) and then follow a well-defined, rational, and calibrated strategy to move enterprise applications to Serverless cloud computing.
Like most new technology constructs, Serverless Cloud Computing is seeing its fair share of hype. Cutting through the media hyperbole, however, there are clear benefits driving adoption among developers. Interestingly, according to a July 2018 report by DigitalOcean, developers in India (43%) lead the global average (33%) when it comes to deploying applications in Serverless environments. While there are a few inherent challenges with Serverless computing – such as debugging complexity, latency, and security concerns – the long-term benefits are too big to be ignored. Over time, cloud service providers will innovate to iron out the kinks and work towards a robust development community for cloud-native application development.
Nitin Mishra heads the product management and solutions engineering functions at Netmagic Solutions. During his nine years with the company, he has been responsible for conceptualizing and packaging hosting and managed services focused on IT infrastructure requirements of Internet and Enterprise applications.