node_modules and Docker
Learn how to handle node_modules inside a Docker container, including how to fix errors caused by OS-specific binaries. We will use the Node package bcrypt as a demonstration.
Table of Contents 📖
- What is the node_modules Folder?
- node_modules Issues with Docker
- node_modules Error Demonstration
- How to Handle node_modules with Docker
What is the node_modules Folder?
The node_modules folder is where Node keeps the packages that have been installed locally. When packages are installed from npm, they are copied into the node_modules folder. When a package is then required in code, Node looks inside node_modules to find it.
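For example, when a file imports a package by its bare name, Node resolves it by searching node_modules, starting in the current directory and walking up through parent directories. A minimal sketch, assuming bcrypt has already been installed locally:
// A bare specifier like 'bcrypt' is resolved by searching ./node_modules,
// then ../node_modules, and so on up the directory tree.
import bcrypt from 'bcrypt';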
node_modules Issues with Docker
The node_modules folder can be problematic for Docker if it contains packages with binaries specific to certain operating systems. In other words, certain packages install different files depending on the operating system of the computer they are installed on. This can cause issues when developing an application with Docker, as the Docker container doesn't always use the same OS as the host computer.
node_modules Error Demonstration
As a demonstration, let's use bcrypt, an npm package that helps hash passwords. This package installs different binaries depending on the operating system it is installed on. First, let's set up a simple application and install bcrypt along with nodemon.
npm init es6 -y
npm i bcrypt
npm i nodemon -D
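After these installs, package.json will list bcrypt under dependencies and nodemon under devDependencies. The version ranges below are only illustrative; yours will differ.
"dependencies": {
  "bcrypt": "^5.0.0"
},
"devDependencies": {
  "nodemon": "^3.0.0"
}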
Now, let's create a src folder to hold our source code and place an index.js file inside.
mkdir src
cd src
touch index.js
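At this point the project structure should look roughly like this (node_modules and package-lock.json were created by npm install):
node_modules/
package.json
package-lock.json
src/
  index.js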
Finally, let's create a simple start script in package.json to run our application with nodemon.
"start": "nodemon ./src/index.js"
Now let's create a simple Node Docker image. First, let's set up the Dockerfile.
# Use a small Node 20 image based on Alpine Linux
FROM node:20-alpine
# Run the application in development mode
ENV NODE_ENV=development
# Run all following commands from /app
WORKDIR /app
# Copy package.json and package-lock.json into the image
COPY package*.json ./
# Install dependencies inside the container, against Alpine Linux
RUN npm install
# Start the application with the npm start script (nodemon)
CMD [ "npm", "start" ]
This sets up a Node version 20 image that uses Alpine Linux. It also copies the package.json and package-lock.json files into the container and installs the project dependencies from within it. This matters because the packages are installed against Alpine Linux, the operating system of the container. We can then build this image using the following command.
docker build -t my-node-image:0.0.1 -f Dockerfile .
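If you want to confirm the image was created, you can list it (the image ID and size will vary):
docker image ls my-node-image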
Now that we've built our image, we can create a container from it by using the following command.
docker run -it --name my-node-c -v ./:/app my-node-image:0.0.1
The most important part of this command is the -v flag. It tells Docker to mount our local application code into the container as a volume. This gives us live code updates inside the container, which is very useful during development. However, it also means we are overwriting the node_modules folder in the Alpine Linux container with our local machine's node_modules. Because we are using bcrypt, a dependency with operating-system-specific binaries, this will cause issues. As a demonstration, let's simply import the bcrypt library in index.js and hash a string.
import bcrypt from 'bcrypt';

bcrypt.hash('wittcode', 1).then(data => {
  console.log(data);
});
Nodemon will detect the change and restart, and the following error will be displayed in the console.
node:internal/modules/cjs/loader:1473
return process.dlopen(module, path.toNamespacedPath(filename));
^
Error: Error loading shared library /app/node_modules/bcrypt/lib/binding/napi-v3/bcrypt_lib.node: Exec format error
How to Handle node_modules with Docker
There are a few ways to fix this, but one common way is to map the node_modules folder to either a named volume or an anonymous volume. This lets Docker manage node_modules itself and prevents it from being overwritten by the host. In other words, we now have two volumes: one for the application code and one for the node_modules folder.
docker run -it --name my-node-c -v ./:/app -v my-node-modules:/app/node_modules my-node-image:0.0.1
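Docker creates the my-node-modules named volume automatically the first time the container runs. You can verify it exists, and see where Docker stores it on disk, with the following commands.
docker volume ls
docker volume inspect my-node-modules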
Now both the host and the Docker container have their own OS-specific binaries, as the host's node_modules no longer overwrites the one in the container. However, because the host's node_modules folder is no longer mapped into the container, we need to run npm install both inside and outside the container whenever we add a dependency. Say we want to install the npm package webpack. First, we install it locally with npm i webpack.
npm i webpack
Now, because of our source code volume, the container's package.json file will have webpack listed as a dependency. We then just need to run npm install inside the container, which can be done with the docker exec command.
docker exec my-node-c sh -c "npm i"
This tells Docker to execute "npm i" as a shell command inside the container named my-node-c. After doing this, the container's node_modules will contain webpack and its dependencies, built for the container's environment.
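To confirm the install worked inside the container, you can ask npm to list the package from within it:
docker exec my-node-c sh -c "npm ls webpack"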