Faster Node.js VS Code containers with RAM disks

Paul Hammond, 23 October 2020

I’ve switched all of my development over to VS Code Remote Containers and it’s working really well. Having every project isolated with its own runtime means I don’t have to upgrade every project at the same time, and I no longer find that half my projects have broken thanks to a macOS or Homebrew upgrade.

The one challenge is that Node.js projects can be much slower when running in a container. This is not surprising, since these projects usually have tens of thousands of files inside the node_modules directory. That directory is inside a Docker bind mount and Hyperkit needs to do a lot of extra work to keep all those files in sync with the host computer.

The VS Code documentation discusses this problem and suggests using a named volume to improve disk performance, but doing this requires managing Docker volumes outside of VS Code, and in my subjective experience it didn’t speed things up much.
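
For reference, the named volume approach looks roughly like this in devcontainer.json (a sketch; the volume name my-project-node_modules is arbitrary and yours will differ):

"mounts": [
  "source=my-project-node_modules,target=${containerWorkspaceFolder}/node_modules,type=volume"
]

Docker creates the volume automatically the first time the container runs, but cleaning it up later means a manual docker volume rm, which is the extra management mentioned above.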

Instead, I’ve started using RAM disks, which are faster and can be managed entirely within the devcontainer.json file:

{
  "name": "node",
  "build": { "dockerfile": "Dockerfile" },
  "runArgs": [
    "--tmpfs",
    "${containerWorkspaceFolder}/node_modules:exec"
  ],
  "postStartCommand":
    "sudo chown node node_modules && npm i",
  …
}

The runArgs setting adds an extra argument to the docker run command. This particular argument tells Docker to create a new tmpfs at /workspaces/project/node_modules. The exec flag is needed by a handful of packages that install helper scripts; without it, Linux will ignore the executable bit on those files.
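
If you were launching the container by hand rather than through VS Code, the equivalent docker run invocation would look something like this (the image name and workspace path here are placeholders):

docker run -it --tmpfs /workspaces/project/node_modules:exec my-dev-image bash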

The postStartCommand then ensures that every time the container is started we give the node user write access to this directory. Because a tmpfs starts out empty each time the container starts, we also run npm i to repopulate node_modules.
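
You can check that the mount is behaving as expected from a terminal inside the container; these commands are just a quick sanity check, not part of the setup:

# should show a tmpfs mounted at the node_modules path, with the exec option
mount | grep node_modules
# after postStartCommand runs, the directory should be owned by the node user
ls -ld node_modules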

The end result is a container where node_modules is stored in RAM, and Docker knows that it’s not important data so doesn’t do extra work to sync it to disk. As a result everything is faster.