What is serverless?
Serverless is a cloud-native development model that allows developers to build and run applications without having to manage servers. The term “serverless” doesn’t mean there are no servers. It means the servers are abstracted away from application development. A cloud provider handles the routine work of provisioning, maintaining, and scaling the server infrastructure.
With serverless, developers package their code in containers for deployment. Once deployed, serverless apps respond to demand and automatically scale up or down as needed. Serverless offerings from public cloud providers are usually metered on demand using an event-driven execution model. As a result, when a serverless function is sitting idle, it doesn’t cost anything.
What’s the difference between serverless computing and serverless architecture?
Serverless computing and serverless architecture are often used interchangeably, but they are different concepts. Serverless computing refers to the application development model, while serverless architecture refers to the design approach of an application.
Serverless computing
Serverless computing differs from other cloud computing models in that the cloud provider is responsible for managing both the cloud infrastructure and the scaling of apps. Instead of provisioning and maintaining servers, applications run on automatically managed computing resources provided by cloud providers. Serverless computing is event-driven, scales automatically, and typically follows a pay-as-you-go pricing model. Serverless apps are deployed in containers that automatically launch on demand when called.
Under a standard Infrastructure-as-a-Service (IaaS) cloud computing model, users prepurchase units of capacity, meaning you pay a public cloud provider for always-on server components to run your apps. It’s the user’s responsibility to scale up server capacity during times of high demand and to scale down when that capacity is no longer needed. The cloud infrastructure necessary to run an app is active even when the app isn’t being used.
Types of serverless computing
Serverless computing can fall into one of several categories:
- Function-as-a-Service (FaaS): FaaS runs event-driven functions in short-lived containers. There’s no need to manage server instances, and code executes only when triggered.
- Backend-as-a-Service (BaaS): BaaS offers fully managed backend services that handle authentication, databases, messaging, and storage. It’s often used in mobile and web applications.
- Serverless databases: Serverless databases scale automatically and don’t require infrastructure management.
- Serverless containers: Serverless containers run without manual provisioning and scale dynamically.
- Serverless edge computing: Serverless edge computing runs code closer to users for lower latency.
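To make the FaaS model above concrete, here is a minimal sketch of an event-driven function in Python. The handler signature mirrors the common FaaS convention (an event payload plus a runtime context, as in AWS Lambda), but the names and payload shape here are illustrative assumptions, not any provider’s exact API:

```python
import json

# Hypothetical FaaS handler: the platform invokes this function once per
# triggering event; the developer never manages the server it runs on.
def handler(event, context=None):
    # 'event' carries the trigger payload; 'context' holds runtime metadata.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The function holds no state between invocations, so the platform is free to start, stop, or replicate its container on demand.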
Serverless architecture
Serverless architecture describes a design approach in which apps are launched only as needed. When an event triggers app code to run, the public cloud provider allocates resources for that code. The user stops paying when the code finishes executing. Serverless also frees developers from tasks associated with app scaling and server provisioning. Routine tasks such as managing the operating system and file system, security patches, load balancing, and capacity management are all offloaded to a cloud services provider. It’s possible to build an entirely serverless app, or an app composed of both serverless and traditional microservices components.
Serverless architecture that uses microservices components is called serverless microservices. With serverless microservices, serverless architecture lets the developer focus on writing code, while microservices break the application down into smaller, manageable components. This combination can result in faster development and deployment.
What are the advantages and disadvantages of serverless?
Advantages
- Offloading of server management tasks: Without the need to provision, maintain, or scale servers, developers have more time to focus on their apps.
- Cost-efficiency: With pay-per-use, you only pay for the actual execution time of functions, reducing idle resource costs.
- Faster development and deployment: Developer productivity increases with less infrastructure management and deployments that require minimal setup.
- Automatic scaling: Functions and services scale up or down automatically based on demand.
- Built-in security: Security updates and patches are managed by the provider, reducing risks associated with misconfigured servers.
Disadvantages
- Complexity in architecture: Event-driven workflows can become complex with multiple functions interacting asynchronously.
- Interaction constraints: Cloud providers may impose limits on how you can use their components, which can reduce your flexibility.
- Vendor lock-in: Serverless services that are tightly integrated with cloud provider ecosystems can make migration difficult.
What are some serverless use cases?
Serverless architecture is ideal for asynchronous, stateless apps that can be started instantaneously. It is a good fit for use cases that see infrequent, unpredictable surges in demand.
Think of a task like batch processing of incoming image files, which might run infrequently but also must be ready when a large batch of images arrives all at once. Or a task like watching for incoming changes to a database and then applying a series of functions, such as checking the changes against quality standards, or automatically translating them.
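The database-change scenario above can be sketched as a stateless function that the platform invokes once per change record. The function and field names below are hypothetical, chosen only to illustrate the pattern of checking changes against quality standards:

```python
# Hypothetical change-stream handler: the serverless platform calls
# process_change() once per database change event.
def passes_quality_check(record):
    # Illustrative quality rule: require a non-empty 'title' field.
    return bool(record.get("title", "").strip())

def process_change(change_event):
    record = change_event["record"]
    if not passes_quality_check(record):
        return {"status": "rejected", "id": record.get("id")}
    # Further steps, such as automatic translation, would be chained here.
    return {"status": "accepted", "id": record.get("id")}
```

Because each invocation is independent, the platform can run many copies in parallel when a surge of changes arrives, then scale back to zero when the stream goes quiet.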
Serverless apps are also a good fit for use cases that involve incoming data streams, chat bots, scheduled tasks, or business logic.
Some other common serverless use cases are back-end application programming interfaces (APIs) and web apps, business process automation, serverless websites, and integration across multiple systems.
What is FaaS and the cloud provider’s role?
In a serverless model, a cloud provider runs physical servers and allocates their resources on behalf of users who can deploy code straight into production.
FaaS, the more common serverless model that focuses on event-driven functions, allows developers to write custom server-side logic that is deployed in containers fully managed by a cloud services provider. These containers are:
- Stateless, making data integration simpler.
- Ephemeral, running only briefly before being torn down.
- Event-triggered, running automatically when needed.
- Fully managed, so you only pay for what is used.
With FaaS, developers have a greater degree of control over application logic and can call functions through APIs, managed by the cloud provider through an API gateway.
Major cloud providers offer FaaS solutions, including AWS Lambda, Azure Functions, Google Cloud Functions, and IBM Cloud Functions. Some companies use open source platforms like Red Hat® OpenShift® Serverless (based on Knative) to run their own FaaS environments.
You can contrast FaaS with its backend services counterpart, BaaS, which gives developers access to third-party services like authentication, encryption, and databases, usually through APIs. BaaS simplifies backend tasks but offers less control over custom application logic.
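Calling a deployed function through an API gateway, as described above, amounts to an ordinary HTTP request. The sketch below shows this with Python’s standard library; the gateway URL is an illustrative placeholder, not a real endpoint:

```python
import json
import urllib.request

# Hypothetical API gateway URL fronting a deployed function (illustrative only).
GATEWAY_URL = "https://api.example.com/prod/hello"

def build_request(payload):
    """Build an HTTP POST request for the function's gateway endpoint."""
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def invoke_function(payload):
    """Send the request; the gateway routes it to the managed function."""
    with urllib.request.urlopen(build_request(payload)) as resp:
        return json.loads(resp.read())
```

From the caller’s point of view, nothing distinguishes this from any other HTTP API; the gateway hides the function’s on-demand container lifecycle entirely.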
What is Knative?
Kubernetes is a popular platform for managing containerized apps, but it doesn’t natively support serverless workloads. Knative is an open source project that adds the necessary components to deploy, run, and manage serverless apps on Kubernetes.
Knative enables serverless environments by letting you deploy code on Kubernetes platforms like Red Hat OpenShift. With Knative, you package your code as a container image, and the system automatically starts and stops instances based on demand.
Knative has three main components:
- Build: Converts source code into containers.
- Serving: Deploys and scales containers automatically based on request-driven demand.
- Eventing: Manages events from various sources, such as apps, cloud services, Software-as-a-Service (SaaS) systems, and Apache Kafka streams, to trigger functions.
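A Knative Serving workload is declared as a Kubernetes Service resource, normally written in YAML. The sketch below expresses a minimal, assumed manifest as a Python dict so its structure is easy to inspect; the container image name is a placeholder:

```python
import json

# A minimal Knative Serving Service definition (normally written as YAML).
# The image reference is illustrative, not a real registry path.
knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "hello"},
    "spec": {
        "template": {
            "metadata": {
                # Allow scale-to-zero when the service is idle.
                "annotations": {"autoscaling.knative.dev/min-scale": "0"},
            },
            "spec": {
                "containers": [
                    {"image": "registry.example.com/hello:latest"}
                ],
            },
        },
    },
}

print(json.dumps(knative_service, indent=2))
```

Applying a manifest like this is all it takes for Knative Serving to deploy the container, route requests to it, and scale instances up or down with demand.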
Unlike traditional serverless solutions, Knative supports a wide range of workloads, from monolithic apps to microservices and small functions. It can run on any Kubernetes-enabled platform, including on-premises environments.
Knative delivers key benefits, including:
- Supported microservices and serverless workloads: You can deploy event-driven, stateless functions within Kubernetes. You can also deploy long-running microservices that scale dynamically without having to maintain server instances manually.
- Optimized cost and resource use: Unlike traditional microservices, where pods may always be running, Knative supports scale-to-zero: if no requests come in, it deallocates resources, reducing costs.
- Standardized APIs and flexibility: Knative follows Kubernetes-native patterns and can work across cloud providers.
Knative brings the power of serverless computing to Kubernetes, simplifying deployment and management of serverless workloads.
Why choose Red Hat?
Red Hat OpenShift Serverless helps you build and deploy serverless applications faster without having to worry about managing infrastructure details. It provides an enterprise-grade serverless platform that brings portability and consistency across hybrid and multicloud environments.
With OpenShift Serverless, developers can create cloud-native, source-centric applications using a series of Custom Resource Definitions (CRDs) and associated controllers in Kubernetes. And for operations teams, OpenShift Serverless offers a simplified experience because it installs easily on Red Hat OpenShift, has been tested with other Red Hat products, and comes with access to award-winning support.
OpenShift Serverless delivers a complete serverless app dev and deployment solution by integrating apps with other Red Hat OpenShift Container Platform services, like Red Hat OpenShift Service Mesh and cluster monitoring. Developers benefit from being able to use a single platform for hosting their microservices, legacy, and serverless applications. Apps are packaged as Linux® containers that can be run anywhere.