Build a Data Access Layer with PostgreSQL and Node.js

Project Scaffold to Build a Node and Postgres DAL

The main project has many sub-folders. So, fire up the console and type:
> mkdir node-postgres-data-layer
> cd node-postgres-data-layer
> mkdir db
> mkdir test
> mkdir db/model
> mkdir db/pg
Be sure to create the following files within the folder structure:
db/model/ActorDto.ts
db/pg/Actor.ts
db/ActorDb.ts
db/index.ts
test/Actor.test.ts
index.ts
Here's what each file is for:
  • ActorDto - Entity type which defines the actor model
  • pg/Actor - Postgres implementation to find and update actor data
  • ActorDb - Interface that's useful outside the DAL
  • Actor.test - Unit tests (no Postgres database necessary)
The index file under db mostly glues the code together, while the index file under the root folder works as the entry point.
Fire up npm init to create a basic package.json file. Once complete, execute the following NPM commands:
> npm i @types/chai @types/chai-as-promised @types/mocha @types/pg @types/sinon chai chai-as-promised mocha sinon ts-node typescript --save-dev
> npm i pg --save
You can initialize the tsconfig.json file via tsc --init. Be sure to run this command at the project's root. The tsc command is available when you install TypeScript globally or in the node_modules/.bin folder. Enable "outDir": "./dist", and add "exclude": ["test"] to the default config file.
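After those two tweaks, the relevant slice of tsconfig.json looks roughly like this (everything else stays exactly as tsc --init generated it):

{
  "compilerOptions": {
    "outDir": "./dist"
    // ...keep the rest of the generated defaults
  },
  "exclude": ["test"]
}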
Most of these packages are developer dependencies that make our lives easier. The only real dependency is pg, which is the node-postgres package that talks to Postgres from a Node codebase.
In the package.json under scripts, put the following commands:
{ "start": "tsc && node ./dist/index", "type-check": "tsc", "test": "mocha --require ts-node/register ./test/*.ts" }
Lastly, be sure to have a working copy of Postgres running on your local machine. The database is a sample relational db called dvdrental.
Go ahead and download the zip file, unpack it, and restore the dvdrental database on your local machine. This DAL only focuses on the actor, film_actor, and film relational tables. It is also possible to explore the rest of the schema via this PDF download.
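The restore itself can be done with the standard Postgres command-line tools. Assuming the unpacked archive is a file called dvdrental.tar and your local superuser is postgres, something along these lines should work:

> createdb -U postgres dvdrental
> pg_restore -U postgres -d dvdrental ./dvdrental.tar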

Set Up the Bare Necessities in TypeScript

Next, write the bare skeleton code to be fleshed out later. To make things easier, the file name goes on top of the implementation. To follow along, simply find the file in the project structure and copy-paste the code. All these files are in the db folder, except Actor.test, which lives in test.
ActorDto:
export type ActorDto = { // data contract
  actorId: number;
  firstName: string;
  lastName: string;
  movie: string;
  rentalRate: number;
  lastUpdate: string;
};
pg/Actor:
import { Pool, QueryResult } from "pg";
import { ActorDto } from "../model/ActorDto";

export class Actor {
  #pool: Pool; // connection pool

  constructor(pool: Pool) {
    this.#pool = pool;
  }

  async findByYearAndLastName(
    year: number,
    lastName: string
  ): Promise<ActorDto[]> {
    return [];
  }

  async updateLastNameByIds(lastName: string, ids: number[]): Promise<number> {
    return 0;
  }

  async updateFirstNameByIds(
    firstName: string,
    ids: number[]
  ): Promise<number> {
    return 0;
  }

  private static mapActorResult = (res: QueryResult): ActorDto[] => // projection
    res.rows.map((r) => ({
      actorId: r.actor_id,
      firstName: r.first_name,
      lastName: r.last_name,
      movie: r.title,
      rentalRate: r.rental_rate,
      lastUpdate: r.last_update,
    }));
}
ActorDb:
import { ActorDto } from "./model/ActorDto";

export interface ActorDb {
  // external interface
  findByYearAndLastName(year: number, lastName: string): Promise<ActorDto[]>;
  updateLastNameByIds(lastName: string, ids: number[]): Promise<number>;
  updateFirstNameByIds(firstName: string, ids: number[]): Promise<number>;
}
index:
import { Pool, types } from "pg";
import { ActorDb } from "./ActorDb";
import { Actor } from "./pg/Actor";

const connectionString =
  "postgres://postgres:postgres@127.0.0.1:5432/dvdrental";

const pool = new Pool({ // single pool
  connectionString,
});

export const actor: ActorDb = new Actor(pool); // strongly-typed
Be sure to include the correct connection string. It follows this format:

<username>:<password>@<server>:<port>/<database>
Actor.test:
import chai, { expect } from "chai";
import chaiAsPromised from "chai-as-promised";
import sinon, { SinonMock, SinonStub } from "sinon";
import { Pool } from "pg";
import { actor } from "../db"; // system under test

chai.use(chaiAsPromised);

class ClientMock {
  query() {}
  release() {}
}

describe("Actor", () => {
  let query: SinonStub;
  let connect: SinonStub;
  let client: SinonMock;

  beforeEach(() => {
    query = sinon.stub(Pool.prototype, "query");
    connect = sinon.stub(Pool.prototype, "connect");
    client = sinon.mock(ClientMock.prototype);
  });

  afterEach(() => {
    query.restore(); // reset stub/mock
    connect.restore();
    client.restore();
  });
});

Clean Architecture Principles in Node

The basic skeleton above follows Clean Architecture: the dependency structure physically pulls the entity away from implementation details.
In Node, dependencies should flow in a single direction, and ActorDto, for example, sits in the innermost layer. This is the Data Transfer Object (DTO), which declares the data contract that the business understands.
At the outermost layer is pg/Actor, which actually talks to the Postgres database. The projection in mapActorResult translates table columns in the db into the DTO. ActorDb declares the external interface, which is useful outside the DAL. This way, it is possible to swap out databases, from Postgres to Mongo for example, without risky invasive surgery. As long as the contracts hold, the changes will theoretically be minimal.
Since pg/Actor is strongly typed in TypeScript, the code becomes testable. In Actor.test, we bring in the index file in db to test the DAL. Stubs and mocks are put in place, so the tests never actually talk to Postgres but only verify the code we wrote.
To illustrate, this is what the "onion" looks like in Clean Architecture:
[Image: the Clean Architecture "onion" diagram]
Source for original diagram: Uncle Bob's blog - The Clean Architecture
In practical terms, this is what the implementation looks like in Node:
[Image: the Clean Architecture layers mapped onto the Node DAL, with the DTO innermost and pg/Actor outermost]
Dependencies are unidirectional, and the DTO is the business entity. Implementation details in pg/Actor adhere to a strict interface that acts much like a contract. This, in some way, follows the dependency inversion principle because the contract declares what the code must do. The use cases outside the DAL can then take the contract and not care about how it finds or updates data.
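To make the swap concrete, here is a rough sketch, not part of the project, of a hypothetical in-memory implementation that honors the same ActorDb contract; anything that satisfies the interface can stand in for pg/Actor without touching the use cases:

import { ActorDb } from "./ActorDb";
import { ActorDto } from "./model/ActorDto";

// Hypothetical stand-in backend; the data store is just an array.
export class InMemoryActor implements ActorDb {
  #actors: ActorDto[] = [];

  async findByYearAndLastName(year: number, lastName: string): Promise<ActorDto[]> {
    // The toy DTO carries no release year, so this sketch filters by last name only.
    return this.#actors.filter((a) => a.lastName === lastName);
  }

  async updateLastNameByIds(lastName: string, ids: number[]): Promise<number> {
    const hits = this.#actors.filter((a) => ids.includes(a.actorId));
    hits.forEach((a) => (a.lastName = lastName));
    return hits.length;
  }

  async updateFirstNameByIds(firstName: string, ids: number[]): Promise<number> {
    const hits = this.#actors.filter((a) => ids.includes(a.actorId));
    hits.forEach((a) => (a.firstName = firstName));
    return hits.length;
  }
}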

Queries in TypeScript

To practice TDD, we simply write the test first, and look for a failure. Put this in Actor.test:
it("findByYearAndLastName", async () => { query.resolves({ rows: [{}], }); const result = await actor.findByYearAndLastName(0, ""); expect(result.length).to.equal(1); });
Add in a passing implementation by replacing findByYearAndLastName in pg/Actor:
async findByYearAndLastName(
  year: number,
  lastName: string): Promise<ActorDto[]> {
  const res = await this.#pool.query(`
    SELECT a.actor_id,
      a.first_name,
      a.last_name,
      f.title,
      f.rental_rate,
      a.last_update
    FROM actor AS a
    INNER JOIN film_actor AS fa ON a.actor_id = fa.actor_id
    INNER JOIN film AS f ON fa.film_id = f.film_id
    WHERE f.release_year = $1 AND a.last_name = $2
  `, [year, lastName]);

  return Actor.mapActorResult(res);
}
Notice the call at the end that returns a Promise<ActorDto[]>. This projection maps rows into the DTO, the external entity, and it is how people who are not engineers reason about the business domain.
The #pool object remains hidden, even during runtime, and abstracts implementation details away from use cases.
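As a quick illustration (this snippet is not part of the project), the ES private field cannot be reached from outside the class, even with a runtime lookup:

const repo = new Actor(pool);

// repo.#pool;                        // compile-time error: #pool is only accessible inside Actor
console.log((repo as any)["#pool"]);  // undefined: private fields are not regular properties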

Transactions in TypeScript

For transactions, let's say there is a list of actor ids with a new last name.
it("updateLastNameByIds#commit", async () => { client.expects("release").once(); // release back to the pool client.expects("query").exactly(4).resolves({ rowCount: 1, }); connect.resolves(new ClientMock()); const count = await actor.updateLastNameByIds("", [0, 0]); client.verify(); expect(count).to.equal(2); }); it("updateLastNameByIds#rollback", async () => { client.expects("release").once(); client.expects("query").twice().rejects().onSecondCall().resolves(); connect.resolves(new ClientMock()); await expect(actor.updateLastNameByIds("", [0, 0])).to.eventually.be.rejected; client.verify(); });
In the rollback test, the client mock expects the code to execute query exactly twice. The first call rejects to throw an exception, and the second call, the ROLLBACK, resolves once the exception gets caught. This technique shows how sinon can really help to keep tests fluent and somewhat easy to follow.
Because a failure can happen mid-change, this tests both the happy path and the rollback. Here are the implementation details:
async updateLastNameByIds(
  lastName: string,
  ids: number[]): Promise<number> {
  let count = 0;
  const client = await this.#pool.connect();

  try {
    await client.query('BEGIN');

    const result = await Promise.all(ids.map(id => // loops
      client.query(`
        UPDATE actor
        SET last_name = $1
        WHERE actor_id = $2
      `, [lastName, id])));

    await client.query('COMMIT');

    count = result.map(r => r.rowCount).reduce((c, v) => c + v, count);
  } catch (e) {
    await client.query('ROLLBACK'); // query must resolve
    throw e;
  } finally {
    client.release();
  }

  return count;
}
The map call iterates through the list of ids, and Promise.all fires all the requests in parallel, so the client waits less. Once every request completes, the result array is reduced into a total count.
If there are any issues mid-change, an exception is thrown that rolls back the changes and then rethrows the same exception to unwind the call stack. It is important not to swallow the error or mask it with a different exception.
The finally block must release the client back to the pool; otherwise, the connection pool might run dry.

Scalability in Postgres

Too many queries in a single transaction can cause a performance bottleneck in Postgres. One technique to combat this is to shorten long-running connections by sending a single query instead.
async updateFirstNameByIds(
  firstName: string,
  ids: number[]): Promise<number> {
  const res = await this.#pool.query(`
    UPDATE actor
    SET first_name = $1
    WHERE actor_id = ANY($2) -- int[]
  `, [firstName, ids]);

  return res.rowCount;
}
This time, the query updates the first name instead of the last name. But it uses an array of integers as a parameter via ANY instead of a transaction. The unit test is similar to what has already been shown, so feel free to take a peek at the GitHub repo.
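The shape of that test is roughly the following sketch; it stubs the pool query just like the earlier tests, while the exact version lives in the repo:

it("updateFirstNameByIds", async () => {
  query.resolves({
    rowCount: 2, // pretend the single UPDATE touched two rows
  });

  const count = await actor.updateFirstNameByIds("", [0, 0]);

  expect(count).to.equal(2);
});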
AppSignal offers instrumentation for Postgres, with a magic dashboard on top that visualizes your database performance metrics. Magic dashboards are automatically created for you when you integrate AppSignal into your app, so there is no need to configure the graphs manually.
You can check for starving pool connections via a Postgres magic dashboard.
[Image: AppSignal's Postgres magic dashboard]
The Postgres Pool metric captures idle and waiting connections in the pool. This is one effective way to monitor database performance and find ways to reduce SQL pressure.
[Image: the Postgres Pool metric showing idle and waiting connections]
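If you want a rough local equivalent of those numbers, node-postgres exposes counters on the pool itself; here is a minimal sketch (the interval and log format are just illustrative):

import { Pool } from "pg";

const pool = new Pool({
  connectionString: "postgres://postgres:postgres@127.0.0.1:5432/dvdrental",
});

// Log pool pressure every five seconds: total clients, idle clients,
// and callers currently waiting for a free connection.
setInterval(() => {
  console.log({
    total: pool.totalCount,
    idle: pool.idleCount,
    waiting: pool.waitingCount,
  });
}, 5000);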

Putting It All Together: Execute the DAL

Lastly, in the index file under the root folder, do:
import { actor } from "./db";

(async () => {
  const actors = await actor.findByYearAndLastName(2006, "Goldberg");
  console.log(actors);

  let count: number;

  count = await actor.updateLastNameByIds("Goldberg", [-1, 0, -1, -1, 0]);
  console.log("Trans update: " + count);

  count = await actor.updateFirstNameByIds("Parker", [-1, 0, -1, -1, 0]);
  console.log("Array update: " + count);
})().then(() => console.log("DONE"));
With this in place, run npm start to execute the entire DAL. Notice we bring in the strongly-typed actor dependency. In Clean Architecture, the business use cases can use this same technique to interact with persisted data.
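For example, a hypothetical use-case module outside the DAL (the function is made up for illustration) only needs the actor contract:

import { actor } from "./db";

// Hypothetical use case: rename a batch of actors.
// It depends only on the ActorDb contract, never on Postgres itself.
export async function renameActors(newLastName: string, ids: number[]): Promise<number> {
  return actor.updateLastNameByIds(newLastName, ids);
}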

Data Types

You may come across a gotcha in the output after running the code:
[Image: console output where rentalRate comes back as a string]
The rentalRate is declared as a number, but the DAL actually returns a string. To tackle this problem, simply add this to db/index:
const NUMERIC_OID = 1700;

types.setTypeParser(NUMERIC_OID, (val) => parseFloat(val));
This tells node-postgres to parse the Postgres numeric type, identified by its OID (1700), into a number type in Node.
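As a quick sanity check (assuming the dvdrental data from earlier is loaded), the same query from the entry point should now yield numbers:

import { actor } from "./db";

(async () => {
  const [first] = await actor.findByYearAndLastName(2006, "Goldberg");
  console.log(typeof first?.rentalRate); // "number" once the parser is registered
})();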

Blog article by Ankur Paul