Connect Node to Postgres
Learn how to connect a Node application to a PostgreSQL database using the node-postgres (pg) npm library, what connection pooling is, and how to manage environment variables with dotenv.
Table of Contents 📖
- Initializing a Node Project
- Installing pg
- Client vs Pool
- Working with Environment Variables
- Connecting to PostgreSQL with a Pool
Initializing a Node Project
If we want to add persistence to a Node application we can use the relational database PostgreSQL. To begin, create a directory to hold the project and then navigate into it. We will call this project connect-node-to-postgres.
mkdir connect-node-to-postgres
cd connect-node-to-postgres
Now let's initialize this project as an npm project with npm init -y. This command creates a package.json file that holds information about the project. The -y flag accepts the default values for package.json.
npm init -y
Installing pg
To connect Node to Postgres, we are going to use the pg npm library. This library is a pure JavaScript implementation of a PostgreSQL client. A PostgreSQL client is essentially a connection to a PostgreSQL server over which we can issue commands and database operations.
npm i pg
Client vs Pool
There are two different ways to connect to Postgres with the pg library: Client and Pool. Both of these are objects exported from the pg library. To use them, let's create an index.js file and import them.
const { Pool, Client } = require('pg');
A Client is a single static connection to the Postgres server, while a Pool manages a dynamic number of Clients with automatic reconnect functionality. In other words, the Pool object is a connection pool while the Client object is a single connection. Connecting to a PostgreSQL server carries a fair amount of overhead, so a connection pool should be used in applications that handle concurrent requests or frequent queries. For this demonstration we will use the Pool.
Working with Environment Variables
The best way to provide connection information to postgres is through environment variables. This is so we can set up different databases for development and production. To easily work with environment variables in Node we can use the dotenv library.
npm i dotenv
The npm library dotenv loads environment variables from a .env file into Node's process.env. Create a file called development.env and place the following inside.
USER=postgres
HOST=localhost
DATABASE=postgres
PASSWORD=postgres
PORT=5432
- The USER variable is the database user. A fresh Postgres installation has a predefined user called postgres.
- HOST is the location of the PostgreSQL server; here the server is running on our own computer, or localhost.
- DATABASE is the name of the database. Most Postgres servers have a database called postgres defined by default.
- PASSWORD is the password for the database user. The default is postgres, though in practice it is whatever password was set when the PostgreSQL server was installed.
- PORT is the port that the PostgreSQL server listens on. By default this is port 5432.
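Under the hood, dotenv essentially parses these KEY=VALUE lines into an object; a minimal, database-free sketch of the idea (the real library also handles comments, quoting, and more):

```javascript
// A minimal sketch of what dotenv does: parse KEY=VALUE lines.
// The sample string stands in for the contents of development.env.
const sample = 'USER=postgres\nHOST=localhost\nPORT=5432';

const parsed = Object.fromEntries(
  sample
    .split('\n')
    .filter((line) => line.includes('='))
    .map((line) => {
      const idx = line.indexOf('=');
      return [line.slice(0, idx), line.slice(idx + 1)];
    })
);

console.log(parsed.USER); // postgres
console.log(parsed.PORT); // 5432
```

Note that every parsed value is a string, which is why process.env.PORT holds the string "5432" rather than a number.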
Now let's load this environment file into our Node application with dotenv.
const path = require('path');

require('dotenv').config({
  override: true,
  path: path.join(__dirname, 'development.env')
});
The config function from the dotenv library loads the environment variables into process.env. The path option tells it which environment variable file to load, and the override option makes its values override any environment variables already set on the system. Of course, when more environments are added, the path should not be hardcoded but derived dynamically from the chosen environment.
Connecting to PostgreSQL with a Pool
To connect to PostgreSQL with a connection pool we instantiate a Pool object. We then need to supply it with the environment variables that we created inside our development.env file.
const pool = new Pool({
  user: process.env.USER,
  host: process.env.HOST,
  database: process.env.DATABASE,
  password: process.env.PASSWORD,
  port: process.env.PORT
});
The Pool is initially empty, and clients are created when they are needed. Now, to ensure that we are connected, let's query the PostgreSQL server. First, let's check out a client from the pool. This is done with the connect method. We will wrap this all in an asynchronous IIFE so we can use the await keyword.
(async () => {
  const client = await pool.connect();
})();
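The async IIFE wrapper itself is plain JavaScript, unrelated to pg; stripped of the database, the pattern looks like this:

```javascript
// An async IIFE lets us use await in a CommonJS script,
// where top-level await is not available.
(async () => {
  const value = await Promise.resolve('ready');
  console.log(value); // ready
})();
```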
Now that we've checked out a client (connection) from the pool, we can use that client to query the Postgres server. We can query it with the query method and extract the returned rows from the response. Let's get the current user, which here is the postgres user.
try {
  const response = await client.query('SELECT current_user');
  const { rows } = response;
  const currentUser = rows[0]['current_user'];
  console.log(currentUser); // postgres
} catch (err) {
  console.log(err);
} finally {
  client.release();
}
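The reason finally is the right place for client.release() is plain JavaScript control flow: finally runs on both the success and failure paths. A database-free sketch (useResource is a hypothetical stand-in for the checkout/query/release cycle):

```javascript
// finally runs whether the try body throws or not,
// mirroring how client.release() must always run.
function useResource(shouldFail) {
  const log = [];
  try {
    log.push('checkout');
    if (shouldFail) throw new Error('query failed');
    log.push('query ok');
  } catch (err) {
    log.push(err.message);
  } finally {
    log.push('release'); // stands in for client.release()
  }
  return log;
}

console.log(useResource(false)); // [ 'checkout', 'query ok', 'release' ]
console.log(useResource(true));  // [ 'checkout', 'query failed', 'release' ]
```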
We need to wrap the query inside a try, catch, finally block so that we can release the client back into the pool whether the query succeeded or not. If we don't release the client, the pool will eventually be depleted and there will be no clients left to handle requests. However, if we want the node-postgres library to handle checking out and releasing clients for us, we can call the query method directly on the pool instead of on a client checked out from the pool.
try {
  const response = await pool.query('SELECT current_user');
  const { rows } = response;
  const currentUser = rows[0]['current_user'];
  console.log(currentUser); // postgres
} catch (err) {
  console.log(err);
}