Node.js
You can build and test a Node.js application using a Linux platform on Harness Cloud or a self-managed Kubernetes cluster build infrastructure.
This guide assumes you've created a Harness CI pipeline.
Install dependencies
Use a Run step to install dependencies in the build environment.
- Harness Cloud
- Self-managed
- step:
    type: Run
    identifier: dependencies
    name: Dependencies
    spec:
      shell: Sh
      command: |-
        npm install express@4.18.2 --no-save
- step:
    type: Run
    identifier: dependencies
    name: Dependencies
    spec:
      connectorRef: account.harnessImage
      image: node:14.18.2-alpine
      command: |-
        npm install express@4.18.2 --no-save
In addition to Run steps, Plugin steps are also useful for installing dependencies.
You can use Background steps to run dependent services that are needed by multiple steps in the same stage.
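For example, a Background step can start a service, such as a Redis instance, that later steps in the stage connect to. This is a minimal sketch; the identifier and image tag are illustrative, and `account.harnessImage` stands in for your own Docker connector ID:

```yaml
- step:
    type: Background
    identifier: redis
    name: Redis
    spec:
      connectorRef: account.harnessImage # replace with your Docker connector ID
      image: redis:7-alpine              # illustrative service image
```

Subsequent Run steps in the same stage can then reach the service while it runs in the background.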
Cache dependencies
- Cache Intelligence
- Save and Restore Cache steps
Cache Node dependencies with Cache Intelligence.
Add `caching.enabled: true` to your `stage.spec`:
- stage:
    spec:
      caching:
        enabled: true
Alternatively, you can use the built-in Save Cache and Restore Cache steps, such as Save Cache to S3 and Restore Cache from S3.
All Node pipelines must include `node_modules` in the `sourcePaths` for your Save Cache step.
spec:
  sourcePaths:
    - node_modules
If your pipeline uses npm, the `key` value must reference `package-lock.json` in your Save Cache and Restore Cache steps.
spec:
  key: cache-{{ checksum "package-lock.json" }}
If your pipeline uses yarn, the `key` value must reference `yarn.lock` in your Save Cache and Restore Cache steps.
spec:
  key: cache-{{ checksum "yarn.lock" }}
YAML example: Save and restore cache steps
Here's an example of a pipeline with Save Cache to S3 and Restore Cache from S3 steps.
steps:
  - step:
      type: RestoreCacheS3
      name: Restore Cache From S3
      identifier: Restore_Cache_From_S3
      spec:
        connectorRef: AWS_Connector
        region: us-east-1
        bucket: your-s3-bucket
        key: cache-{{ checksum "package-lock.json" }}
        archiveFormat: Tar
  - step:
      type: Run
      ...
  - step:
      type: BuildAndPushDockerRegistry
      ...
  - step:
      type: SaveCacheS3
      name: Save Cache to S3
      identifier: Save_Cache_to_S3
      spec:
        connectorRef: AWS_Connector
        region: us-east-1
        bucket: your-s3-bucket
        key: cache-{{ checksum "package-lock.json" }}
        sourcePaths:
          - node_modules
        archiveFormat: Tar
Build and run tests
Add Run steps to build and run tests in Harness CI.
- Harness Cloud
- Self-managed
- step:
    type: Run
    name: npm test
    identifier: npm_test
    spec:
      shell: Sh
      command: |-
        npm install
        npm run build --if-present
        npm test
      reports:
        type: JUnit
        spec:
          paths:
            - report.xml
- step:
    type: Run
    name: npm test
    identifier: npm_test
    spec:
      connectorRef: account.harnessImage
      image: node:latest
      shell: Sh
      command: |-
        npm install
        npm run build --if-present
        npm test
      reports:
        type: JUnit
        spec:
          paths:
            - report.xml
Visualize test results
If you want to view test results in Harness, make sure your test commands produce reports in JUnit XML format and that your steps include the `reports` specification.
reports:
  type: JUnit
  spec:
    paths:
      - report.xml
Test splitting
You can use test splitting (parallelism) to improve test times.
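As a minimal sketch, test splitting builds on a stage-level `parallelism` looping strategy; the value of `4` here is illustrative. Each parallel run can then use the built-in `<+strategy.iteration>` and `<+strategy.iterations>` expressions in its commands to select its share of the test files:

```yaml
- stage:
    strategy:
      parallelism: 4 # run four copies of this stage's steps concurrently
```

Harness reports from all parallel runs are aggregated, so JUnit results still appear together in the build's test results.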
Specify version
- Harness Cloud
- Self-managed
Node is pre-installed on Harness Cloud runners. For details about all available tools and versions, go to Platforms and image specifications.
If your application requires a specific Node version, add a Run step to install it.
Install one Node version
- step:
    type: Run
    name: Install Node
    identifier: installnode
    spec:
      shell: Sh
      envVariables:
        NODE_VERSION: 18.16.0
      command: |-
        mkdir $HOME/nodejs
        curl -L https://nodejs.org/dist/v${NODE_VERSION}/node-v${NODE_VERSION}-linux-x64.tar.xz | tar xJ -C $HOME/nodejs
        export PATH=$HOME/nodejs/node-v${NODE_VERSION}-linux-x64/bin:$PATH
Install multiple Node versions
- Add the matrix looping strategy configuration to your stage.
- stage:
    strategy:
      matrix:
        nodeVersion:
          - 18.16.0
          - 20.2.0
- Reference the matrix variable in your steps.
- step:
    type: Run
    name: Install node
    identifier: installnode
    spec:
      shell: Sh
      command: |-
        mkdir $HOME/nodejs
        curl -L https://nodejs.org/dist/v${NODE_VERSION}/node-v${NODE_VERSION}-linux-x64.tar.xz | tar xJ -C $HOME/nodejs
        export PATH=$HOME/nodejs/node-v${NODE_VERSION}-linux-x64/bin:$PATH
      envVariables:
        NODE_VERSION: <+matrix.nodeVersion>
Specify the desired Node Docker image tag in your steps. There is no need for a separate install step when using Docker.
Use a specific Node version
- step:
    type: Run
    name: Node Version
    identifier: nodeversion
    spec:
      connectorRef: account.harnessImage
      image: node:18.16.0
      shell: Sh
      command: |-
        npm version
Use multiple Node versions
- Add the matrix looping strategy configuration to your stage.
- stage:
    strategy:
      matrix:
        nodeVersion:
          - 18.16.0
          - 20.2.0
- Reference the matrix variable in the `image` field of your steps.
- step:
    type: Run
    name: Node Version
    identifier: nodeversion
    spec:
      connectorRef: account.harnessImage
      image: node:<+matrix.nodeVersion>
      shell: Sh
      command: |-
        npm version
Full pipeline examples
Here's a YAML example of a pipeline that:
- Tests a Node code repo.
- Builds and pushes an image to Docker Hub.
This pipeline uses Harness Cloud build infrastructure and Cache Intelligence.
If you copy this example, replace the placeholder values with appropriate values for your Harness project, connector IDs, account/user names, and repo names.
Pipeline YAML
pipeline:
  name: nodejs-sample
  identifier: nodejssample
  projectIdentifier: default
  orgIdentifier: default
  tags: {}
  stages:
    - stage:
        name: Build Node App
        identifier: Build_Node_App
        description: ""
        type: CI
        spec:
          cloneCodebase: true
          caching:
            enabled: true
          platform:
            os: Linux
            arch: Amd64
          runtime:
            type: Cloud
            spec: {}
          execution:
            steps:
              - step:
                  type: Run
                  name: npm test
                  identifier: npm_test
                  spec:
                    shell: Sh
                    command: |-
                      npm install
                      npm run build --if-present
                      npm test
              - step:
                  type: BuildAndPushDockerRegistry
                  name: BuildAndPushDockerRegistry_1
                  identifier: BuildAndPushDockerRegistry_1
                  spec:
                    connectorRef: YOUR_DOCKER_CONNECTOR_ID
                    repo: YOUR_DOCKER_HUB_USERNAME/DOCKER_REPO_NAME
                    tags:
                      - <+pipeline.sequenceId>
  properties:
    ci:
      codebase:
        connectorRef: YOUR_CODE_REPO_CONNECTOR_ID
        repoName: YOUR_REPO_NAME
        build: <+input>
Next steps
Now that you have created a pipeline that builds and tests a Node app, you could:
- Create triggers to automatically run your pipeline.
- Add steps to build and upload artifacts.
- Add a step to build and push an image to a Docker registry.