The term serverless has been coming up in more conversations recently. Let’s clarify the concept, along with related terms such as serverless computing and serverless platform.

Serverless is often used interchangeably with the term FaaS (Functions-as-a-Service). But serverless doesn’t mean there is no server. In fact, there are plenty of servers, because a public cloud provider supplies the servers that deploy, run, and manage your application.

Serverless computing is an emerging category that represents a shift in the way developers build and deliver software systems. Abstracting application infrastructure away from the code can greatly simplify the development process while introducing new cost and efficiency benefits. I believe serverless computing and FaaS will play an important role in helping to define the next era of enterprise IT, along with cloud-native services and the hybrid cloud.

Serverless platforms provide APIs that allow users to run code functions (also called actions) and return the results of each function. They also provide HTTPS endpoints that let the developer retrieve function results; these endpoints can be used as inputs for other functions, enabling a sequence (or chain) of related functions.

On most serverless platforms, the user deploys (or creates) the functions before executing them. The serverless platform then has all the code it needs to execute a function when it is told to. A serverless function can be invoked manually by the user via a command, or it can be triggered by an event source configured to activate the function in response to events such as cron job alarms, file uploads, or many others.
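
As a minimal sketch of what such a function can look like, here is a Node.js action in the style Apache OpenWhisk uses: the platform calls a main function with the invocation parameters and serializes the returned object as the JSON result. The file name, parameter name, and greeting text below are illustrative only.

## greeter.js
// A minimal OpenWhisk-style action (illustrative sketch): the platform
// invokes main() with the event's parameters and returns the resulting JSON.
function main(params) {
  var name = params.name || "world"; // parameter supplied at invocation time
  return { greeting: "Hello, " + name + "!" };
}

On OpenWhisk, for example, you would deploy and invoke it with commands along the lines of wsk action create greeter greeter.js and wsk action invoke greeter --result; other platforms expose equivalent CLIs and HTTPS endpoints.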

7 open source platforms to get started with serverless computing

  • Apache OpenWhisk is a serverless, open source cloud platform that executes code in response to events at any scale. It’s written in Scala. The framework processes inputs from triggers such as HTTP requests and then fires a snippet of code written in either JavaScript or Swift.

  • Fission is a serverless computing framework that enables developers to build functions on Kubernetes. It lets coders write short-lived functions in any programming language and map them to event triggers, such as HTTP requests.

  • IronFunctions is a serverless computing framework that offers a cohesive microservices platform by integrating its existing services and embracing Docker. The framework itself is written in the Go language.

  • Fn Project is an open source, container-native serverless platform that you can run anywhere, on any cloud or on-premises. It’s easy to use, supports every programming language, and is extensible and performant.

  • OpenLambda is an Apache-licensed serverless computing project, written in Go and based on Linux containers. The primary goal of OpenLambda is to enable exploration of new approaches to serverless computing.

  • Kubeless is a Kubernetes-native serverless framework that lets you deploy small bits of code without having to worry about the underlying infrastructure. It leverages Kubernetes resources to provide autoscaling, API routing, monitoring, troubleshooting, and more.

  • OpenFaaS is a framework for building serverless functions with Docker and Kubernetes that offers first-class support for metrics. Any process can be packaged as a function, enabling you to consume a range of web events without repetitive boilerplate coding.

Kubernetes is the most popular platform for managing serverless workloads and microservice application containers, using a fine-grained deployment model to process workloads more quickly and easily. With Knative Serving, you can build and deploy serverless applications and functions on Kubernetes and use Istio to scale and support advanced scenarios such as:

  • Rapid deployment of serverless containers

  • Automatic scaling up and down to zero

  • Routing and network programming for Istio components

  • Point-in-time snapshots of deployed code and configurations

Knative focuses on the common tasks of building and running applications on cloud-native platforms: orchestrating source-to-container builds, binding services to event ecosystems, routing and managing traffic during deployment, and autoscaling workloads. Istio is an open platform for connecting and securing microservices (effectively a service mesh control plane for the Envoy proxy), and it is designed with multiple personas in mind, including developers, operators, and platform providers.

For example, you can deploy a JavaScript serverless workload using Knative Serving on a local Minishift platform with the following code snippets:

## Dockerfile
# Node.js 10 S2I base image
FROM bucharestgold/centos7-s2i-nodejs:10.x
WORKDIR /opt/app-root/src
# Copy the manifests first so the npm install layer is cached between builds
COPY package*.json ./
RUN npm install
# Copy the application source
COPY . .
EXPOSE 8080 3000
# Launch the app via the start script defined in package.json
CMD ["npm", "start"]


## package.json
{
  "name": "greeter",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "express": "~4.16.0"
  }
}

## app.js
// Minimal Express app that returns a greeting message on GET /
var express = require("express");
var app = express();

// An optional prefix can be injected via the MESSAGE_PREFIX environment variable
var msg = (process.env.MESSAGE_PREFIX || "") + "NodeJs::Knative on OpenShift";

app.get("/", function(req, res, next) {
  res.status(200).send(msg);
});

// Knative routes traffic to the container on port 8080
app.listen(8080, function() {
  console.log("App started on port 8080");
});

## service.yaml
# Knative Serving service definition (v1alpha1 schema)
apiVersion: serving.knative.dev/v1alpha1
kind: Service
metadata:
  name: greeter
spec:
  configuration:
    revisionTemplate:
      spec:
        container:
          # Image built locally in the steps below
          image: dev.local/greeter:0.0.1-SNAPSHOT

Build the Node.js serverless application and deploy the service on a local Kubernetes platform. As prerequisites, install Istio and Knative Serving on your Kubernetes (or Minishift) cluster.

1. Attach your shell to Minishift's Docker daemon and oc client using the following commands:

eval $(minishift docker-env) && eval $(minishift oc-env)

2. Build the serverless application container image from the Dockerfile above, tagging it to match the image referenced in service.yaml:

docker build -t dev.local/greeter:0.0.1-SNAPSHOT .

3. Deploy the Knative service to your Kubernetes (Minishift) cluster:

kubectl apply -f service.yaml
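
To verify the deployment, a quick check like the following can help. The ksvc short name comes from Knative Serving's CRDs; $INGRESS_IP is a placeholder for your Istio ingress gateway address, and greeter.default.example.com assumes Knative's default domain pattern (<service>.<namespace>.<domain>), so adjust both for your cluster:

# Wait until the Knative service reports READY=True and note its domain
kubectl get ksvc greeter

# Call the service through the Istio ingress gateway, passing the
# Knative-generated hostname in the Host header
# ($INGRESS_IP is a placeholder for your ingress gateway address)
curl -H "Host: greeter.default.example.com" http://$INGRESS_IP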

Conclusion

The example above shows where and how to start developing serverless applications with cloud-native platforms such as Kubernetes, Knative Serving, and Istio.

This article was originally published on Opensource.com. CC BY-SA 4.0.

