Node.js is one of the most popular JavaScript runtime environments. Initially released in 2009, it is still going strong in 2023. The Node.js ecosystem keeps evolving, adding new features that give developers a better experience. Here are five new Node.js features every Node.js developer should try this year.
Test Runner
Tests are essential to the modern development process. Before v18, Node.js had no out-of-the-box test runner, so developers depended on third-party runners like Mocha or Jest for unit testing. This complicates environment configuration and CI/CD pipelines and adds maintenance overhead. Choosing a tool is also challenging: some tools integrate with minimal effort but lack advanced features like async testing or auto-mocking, while the tools that do offer those features tend to be harder to set up.
The Node.js “Test Runner” provides a limited subset of the functionality of these testing frameworks, but in a lightweight manner.
How does it work?
- No need to install any npm packages
- The node:test module facilitates the creation of JavaScript tests
- The node:assert module provides the assertions
Node.js Test Runner took inspiration from Mocha, Deno, and Jest.
import test from 'node:test';
import assert from 'node:assert/strict';
const simpleFunc = () => 'really simple';
test('simpleFunc', (t) => {
  assert.equal(simpleFunc(), 'really simple');
});
Node.js Test Runner also provides features like subtests, async tests, mocking, custom reporters, etc., as shown in the sketch below.
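Here is a minimal sketch of a subtest combined with the built-in mock helper (the checkout/discount names are made up for illustration):
import test from 'node:test';
import assert from 'node:assert/strict';
test('checkout', async (t) => {
  // t.mock.fn wraps a function and records its calls.
  const applyDiscount = t.mock.fn((price) => price * 0.9);
  // Subtests are created by awaiting t.test inside the parent test.
  await t.test('applies the discount once', () => {
    assert.equal(applyDiscount(100), 90);
    assert.equal(applyDiscount.mock.callCount(), 1);
  });
});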
For more advanced functionality like module interception, spying and stubbing, or standalone snapshots, we can still use external packages.
Watch Mode
Another cool feature to look out for is the --watch functionality. This is super useful during development.
It automatically restarts the Node.js application whenever a file changes, so there is no need to use Nodemon anymore. Watch mode watches the entry point and any required or imported module, so you never have to stop and start your application yourself.
Use "node --watch index.js" to start your application in watch mode (assuming index.js is your entry point).
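If you only want changes under specific directories to trigger a restart, newer Node versions also accept a --watch-path flag (at the time of writing, this flag is only honored on platforms with recursive file watching, such as macOS and Windows); the ./src and ./tests paths below are placeholders:
node --watch-path=./src --watch-path=./tests index.js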
Stream Accessors
Now you can use map/reduce/filter inside streams; how cool is that? Streams are one of the fundamental concepts powering Node.js applications. However, they are somewhat hard to work with. With streams, we can read input or write output sequentially. Streams read chunks of data and process their contents without keeping everything in memory, which makes it possible to process large datasets or big files even with little free memory available.
With the help of Stream Accessors, we can now apply map/reduce/filter to each chunk of a stream.
How does it work?
Let's find out how we can use these in a Readable stream.
In the following example, we read a few domain names, map over them, and log the resolved DNS addresses for each.
import { Readable } from 'node:stream';
import { Resolver } from 'node:dns/promises';
const resolver = new Resolver();
const dnsResponses = Readable.from([
  'www.perfomatix.com',
  'news.ycombinator.com',
  'www.google.com'
]).map((domain) => resolver.resolve4(domain));
for await (const response of dnsResponses) {
  console.log(response);
}
We can also use the filter operator with a stream; let's find out how using a simple example.
In this example, we keep only the domains whose resolved DNS records have a TTL greater than 60 seconds.
import { Readable } from 'node:stream';
import { Resolver } from 'node:dns/promises';
const resolver = new Resolver();
const dnsResponses = Readable.from([
  'www.perfomatix.com',
  'news.ycombinator.com',
  'www.google.com'
]).filter(async (domain) => {
  const addresses = await resolver.resolve4(domain, { ttl: true });
  // Keep the domain if any of its records has a TTL above 60 seconds.
  return addresses.some((address) => address.ttl > 60);
});
for await (const response of dnsResponses) {
  // Logs domains with more than 60 seconds on the resolved DNS record.
  console.log(response);
}
We can also use operators like find, every, some, take, etc., with streams, as in the sketch below.
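For instance, take and toArray can be combined to collect just the first few chunks (a minimal sketch; the numbers are arbitrary):
import { Readable } from 'node:stream';
// Take the first three chunks and collect them into an array.
const firstThree = await Readable.from([1, 2, 3, 4, 5]).take(3).toArray();
console.log(firstThree); // [1, 2, 3]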
Argument Parsers
Previously, we relied on external modules like Commander.js and Yargs to parse command-line arguments when building CLI programs. These are very popular tools with millions of weekly downloads.
Now Node.js supports built-in argument parsing with the help of parseArgs in the node:util module.
Let’s see this in action.
import { parseArgs } from 'node:util';
const options = {
  org: {
    type: 'string'
  },
  location: {
    type: 'string'
  }
};
const { values } = parseArgs({ options });
console.log(values);
Use the following command to pass org and location as parameters:
node parseArgs.mjs --org=Perfomatix --location=India
You can also define a single-character alias for an option with the help of the short option.
import { parseArgs } from 'node:util';
const options = {
  org: {
    type: 'string',
    short: 'o'
  },
  location: {
    type: 'string',
    short: 'l'
  }
};
const { values } = parseArgs({ options });
console.log(values);
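With the aliases in place, the same values can be passed in the shorter form:
node parseArgs.mjs -o Perfomatix -l India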
Undici
Undici is a spec-compliant HTTP/1.1 client, written from scratch, and represents the next step for the Node.js HTTP stack.
Why Undici?
Node.js already has an HTTP stack that works well, and developers use it extensively. So why did the Node.js team go through the hassle of creating something from scratch? The answer is that the HTTP stack in Node.js has some fundamental design issues that are impossible to resolve without breaking the API, along with a number of bugs and performance bottlenecks. So Undici was born.
Some important features of Undici are:
- Keep-alive by default – no need to install a keep-alive agent
- Unlimited connections
- LIFO scheduling – minimizes timeout errors
- No pipelining by default, but it is configurable (see the sketch after this list)
- Can follow redirects (opt-in)
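For example, connection limits and pipelining can be tuned when constructing a pool (a minimal sketch; api.example.com and /status are placeholders):
import { Pool } from 'undici';
// Up to 10 connections, each allowed to pipeline 2 requests.
const pool = new Pool('https://api.example.com', {
  connections: 10,
  pipelining: 2
});
const { body } = await pool.request({ path: '/status', method: 'GET' });
console.log(await body.text());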
Undici re-implements the HTTP primitives and talks directly to sockets. Mocking approaches like nock won't work with Undici, but it comes with its own built-in mocking system.
Now let’s see some code.
The getDogBreeds function below calls an API to fetch dog breeds using Undici.
const { request } = require('undici');
module.exports.getDogBreeds = async () => {
  const { body } = await request('https://dog.ceo/api/breeds/list/all');
  const data = (await body.json()).message;
  console.log(data);
  return data;
};
Now let’s add some unit tests for this.
The test below verifies that the breed bulldog exists in the response. (Note that we used the built-in test runner and assert modules.)
import test from 'node:test';
import assert from 'node:assert/strict';
import { getDogBreeds } from './dog-client.js';
test('getDogBreeds', async (t) => {
  const dogBreeds = await getDogBreeds();
  assert.ok(dogBreeds.bulldog);
});
Now let's mock the API call using the built-in MockAgent.
const { MockAgent, setGlobalDispatcher } = require('undici');
const dogBreeds = require('./breeds.json');
const agent = new MockAgent();
// Fail fast if a test tries to reach the real network.
agent.disableNetConnect();
const client = agent.get('https://dog.ceo');
client.intercept({
  path: '/api/breeds/list/all',
  method: 'GET'
}).reply(200, {
  message: dogBreeds,
  status: 200
});
// Route all undici requests through the mock agent.
setGlobalDispatcher(agent);
module.exports = agent;
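To activate the mock, import this module before calling getDogBreeds in the test. A sketch, assuming the code above lives in a file named mock-agent.js (a hypothetical name) and that the breeds.json fixture contains a bulldog key:
import test from 'node:test';
import assert from 'node:assert/strict';
import './mock-agent.js'; // registers the MockAgent as the global dispatcher
import { getDogBreeds } from './dog-client.js';
test('getDogBreeds (mocked)', async (t) => {
  const dogBreeds = await getDogBreeds();
  assert.ok(dogBreeds.bulldog);
});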
Conclusion
The Node.js ecosystem is growing tremendously. The Node.js core team keeps adding cool new features that were previously only attainable through external dependencies. These new Node.js features are really compelling, and every Node.js developer should try them out.