Last month, I gave a short intro to Node.js and Red Hat’s involvement in the Node.js project.  Today, I am happy to share that the Node.js community is releasing Node.js 16.

As is standard in Node.js releases, this version will not be promoted to long-term support (LTS) until October. We need the greater ecosystem to try it out and give the community feedback, which allows us to address any issues in advance and make sure the release, the ecosystem, and our customers are ready when it’s promoted.

In this post, I will highlight some of the new features and ongoing work in the Node.js 16 release, which include:

  • Updated platform support
  • V8 JavaScript engine version 9
  • Node-API version 8
  • New promise-based APIs
  • Async Local Storage APIs

Platform Support

As with most major releases, this release updates the minimum supported levels for platforms and tooling used to build Node.js. Some examples include updating the minimum supported Xcode version to 11 and the GCC version for Linux and AIX platforms to 8.3. Please check the documentation in Node's building instructions for all the latest minimum levels.

More interesting is the work being done to add support for the new Apple M1 architecture. The Red Hat team is active in the Node.js build working group, helping to keep the infrastructure running to support the Power PC and s390 architectures, and also helping with work across the other architectures.

Red Hat's Ash Cripps, a member of the Node.js build working group, has been actively working to install and configure M1 machines so that we can test and build binaries that are compiled for the M1 and run natively. Node.js 16 will be the first version to provide native M1 support.
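
If you want to confirm which binary you are running, process.arch and process.platform report the architecture and operating system the Node.js build targets; a native M1 build reports arm64 on darwin:

```javascript
// Report the architecture and platform this Node.js binary was built for.
// A native Apple M1 build prints 'arm64 darwin'; an x64 build running
// under Rosetta 2 emulation prints 'x64 darwin' instead.
console.log(process.arch, process.platform, process.version);
```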

V8 JavaScript Engine Version 9

The V8 JavaScript engine is the runtime environment that executes JavaScript code. It's what lets JavaScript run across many platforms so developers don't need to worry about whether their code is running on Linux, Windows, or macOS, or whether the hardware underneath the OS is x64, Arm, or Power PC. However, V8 is written in C++, and the Node.js community must maintain and improve it for assorted operating system and hardware combinations.

Because of this, the Red Hat Node.js team gets a "sneak peek" at what is coming in new versions of the V8 JavaScript engine. As the platform owners for the Power PC and s390 ports within V8, we are hard at work making commits every week to keep V8 running on these platforms. It’s great to see that work pay off as new features come to Node.js. One example this time is ECMAScript RegExp match indices. For more information on all of the features, check out the V8 blog.

Node-API Version 8

Node.js 16 continues the journey of making it easier to create, build, and support native modules (also known as addons). Red Hat is one of the ongoing contributors to Node-API, and Node.js 16 comes with Node-API version 8, which includes new support for:

  • Type tagging objects - napi_type_tag_object/napi_check_object_type_tag
  • Freezing objects - napi_object_freeze
  • Sealing objects - napi_object_seal

The additions in each version of Node-API are driven by real-world use cases brought to the team.

For more information about Node-API, check out the API docs and node-addon-api.

Progress on adding promise-based APIs

There is an ongoing strategic initiative within the project to add promise-based APIs. One of the interesting additions in the Node.js 16 release is Promise-based timer APIs. You can now do:

import {setTimeout} from 'timers/promises';

async function waitForTimer() {
  let curTime = new Date();
  console.log('timer started:' + curTime);
  await setTimeout(2000);
  curTime = new Date();
  console.log('timer expired:' + curTime);
}

waitForTimer();

With the output being:

timer started:Fri Apr 09 2021 13:44:33 GMT-0400
timer expired:Fri Apr 09 2021 13:44:35 GMT-0400

It’s great to see ongoing progress to add promise-based APIs!

Async Local Storage APIs

Observability and problem determination are key focus areas for Red Hat, helping our customers identify and resolve issues they encounter in production. In the Node.js project, we’ve been a long-time member and supporter of the Diagnostics Working Group.

I hosted the first diagnostics summit in Canada back in 2018 and attended the second one in Germany in 2019. One of the key things that came out of the second summit was the concept of providing an AsyncLocalStorage API that would be easier and quicker to get to stable than the underlying Async Hooks.

While we didn't quite get the AsyncLocalStorage API stable for the 16 release (many thanks to the Node.js collaborators still working hard on this), I’m hopeful that will happen before 16 becomes LTS in October, so I’m going to mention it here anyway.

It is a key step toward providing a built-in set of APIs that can be used by packages like OpenTelemetry to support the tracing component of observability in Node.js applications (with logging and metrics being the other components). If you are interested in metrics, you might want to check out the recently updated article on monitoring Node.js applications on Red Hat OpenShift with Prometheus, or our logging suggestions in the Red Hat/IBM Node.js reference architecture: logging.

Now let’s look at some code using AsyncLocalStorage to see the benefit that it brings:

const AsyncLocalStorage = require('async_hooks').AsyncLocalStorage;

const asyncLocal1 = new AsyncLocalStorage();
let sharedFlowId = "";

function asyncflow(flowid) {
  setTimeout(() => {
    asyncLocal1.run(new Object(), () => {
      sharedFlowId = flowid;
      asyncLocal1.getStore()['flowid'] = flowid;
      setTimeout(() => {
        console.log(flowid + ':' + sharedFlowId + ':' + asyncLocal1.getStore()['flowid']);
      }, Math.random() * 500);
    });
  }, Math.random() * 500);
}

asyncflow('flow1');
asyncflow('flow2');
asyncflow('flow3');

With the output for a few runs being:

closure:shared:aslocal
--------------------
flow2:flow2:flow2
flow1:flow1:flow1
flow3:flow1:flow3

flow2:flow2:flow2
flow1:flow2:flow1
flow3:flow2:flow3

flow3:flow3:flow3
flow1:flow2:flow1
flow2:flow2:flow2

The first question I often get about this sample is, “If you can just use variable capture through closures, why do I need AsyncLocalStorage?”

The easy answer is that while it’s possible to get the value for the flow in this simple example, in more realistic cases it’s not. The same goes for passing the flow value through every async call made in the flow: not only is that difficult and likely to touch a lot of code, it is also often impractical for modules like OpenTelemetry, which insert instrumentation into existing code.

In the case of the value taken from the closure, I’ve included it in the example because it is an easy way to show the expected value of the flowid.

The second question is, “If const asyncLocal1 = new AsyncLocalStorage(); creates a global just like let sharedFlowId = ""; then what’s the difference?” The good news is that this example demonstrates why they are different!

You’ll notice that the sharedFlowId (second column) does not always match the value obtained through the closure (first column). This is because it is simply the last flow id written to the shared variable, and it depends on the order of execution.

On the other hand, although asyncLocal1 is a shared global, asyncLocal1.getStore() returns an object which is unique for each asynchronous flow. This is the “magic” that the AsyncLocalStorage API delivers, allowing us to get the right flow id regardless of how many concurrent async flows are running the same code or how deeply the flow is nested. You’ll notice that asyncLocal1.getStore()['flowid'] (third column) always matches the first column, so we get the right flowid for each asynchronous flow.

If you still have any questions I’d suggest you play around a bit with the example to convince yourself that AsyncLocalStorage delivers the goods!

Thanks to all of the project contributors

At this point I’d like to thank the large number of people who contribute to Node.js releases, including Red Hat's Bethany Griggs, a member of the Node.js release team, who created and coordinated the Node.js 16 release. The community has a large cast of contributors (including many across Red Hat and IBM) working across the Node.js project, and each release is the result of all of their hard work.

How you can help

There are many ways to contribute to the Node.js project; committing code and participating in the work of the Node.js organization are not the only ways to support the continued success of Node.js. As a Node.js application developer or end user, a few ways you can help out include:

  • Testing new releases early and often, and providing feedback to the Node.js project.
  • Participating in the project’s surveys, like the recent Next 10 survey.

If you are a package maintainer, there are also a number of things you can do to align with the efforts of the Node.js project.

You can read about all the new features and changes in the community release post. To learn more about the Node.js community and how you can contribute, check out the project repository and website. If you want to read more about what Red Hat is up to on the Node.js front, check out our Node.js developers page. Finally, if you’re a Red Hat customer, check out the customer portal for more information.


About the author

Michael Dawson is an active contributor to the Node.js project and chair of the Node.js Technical Steering Committee (TSC). He contributes to a broad range of community efforts including platform support, build infrastructure, N-API, and releases, as well as tools to help the community achieve quality with speed (e.g., CI jobs, benchmarking, and code coverage reporting). As the Node.js lead for Red Hat and IBM, he works with Red Hat and IBM's internal teams to plan and facilitate their contributions to Node.js and V8 within the Node.js and Google communities. Dawson's past experience includes building IBM's Java runtime, building and operating client-facing e-commerce applications, and building PKI and symmetric-key crypto solutions, as well as a number of varied consulting engagements. In his spare time, he uses Node.js to automate his home and life for fun.
