Integration Testing
supertest, Real Databases, Full Request Flow
LinkedIn Hook
"Your unit tests are all green. Your API is broken in production. How?"
Unit tests mock the database. They mock the auth middleware. They mock the request. They mock everything that actually matters in a real HTTP request. And then they pass — confidently — while your endpoint returns 500s to real users.
Integration tests are the missing layer. They boot your real Express app, hit real routes through a real HTTP client, run real middleware, and talk to a real database. If POST /users is broken because the validation middleware never registered, an integration test catches it. A unit test never will.
The tools are simple: supertest for the HTTP layer, a dedicated test database (Postgres in Docker, or testcontainers), and a clean transaction-rollback pattern for isolation. Set NODE_ENV=test, seed deterministic data, and you have a test suite that catches the bugs unit tests miss — without the flakiness of full end-to-end browser tests.
In Lesson 9.2, I walk through the exact patterns: supertest GET/POST, beforeAll/afterAll database setup, transaction rollback for test isolation, authenticated requests with JWT, and when to choose in-memory SQLite vs real Postgres in Docker.
Read the full lesson -> [link]
#NodeJS #Testing #IntegrationTesting #Express #Postgres #InterviewPrep
What You'll Learn
- How to use supertest to fire HTTP requests at an Express app without binding a port
- How to set up and tear down a real test database with beforeAll/afterAll
- The transaction-rollback pattern for fast, isolated tests
- How to run middleware (auth, validation, logging) inside the request lifecycle
- How to test authenticated routes by minting JWTs in test setup
- Why NODE_ENV=test matters for environment separation
- Seeding deterministic test data with factories and fixtures
- When to choose in-memory SQLite vs real Postgres in Docker vs testcontainers
The Restaurant Analogy — Why Integration Tests Matter
A unit test is like tasting a single ingredient. You taste the salt, the salt is salty — it passes. You taste the tomato, the tomato is tomato-y — it passes. You taste the basil, the basil is fragrant — it passes. Every ingredient is perfect.
Then a customer orders the dish. The chef forgot to actually combine the ingredients. Or the plate was dirty. Or the waiter delivered it to the wrong table. The dish fails — even though every individual ingredient was flawless.
That is exactly the gap between unit tests and integration tests in a Node.js API. Unit tests verify individual functions. Integration tests verify the whole request — the route registration, the body parser, the auth middleware, the validation, the controller, the database query, and the JSON response — running together exactly as they would in production.
+---------------------------------------------------------------+
| UNIT TEST vs INTEGRATION TEST |
+---------------------------------------------------------------+
| |
| UNIT TEST (one ingredient): |
| |
| test('createUser hashes password', () => { |
| const user = createUser({ pw: 'abc' }); |
| expect(user.pw).not.toBe('abc'); |
| }); |
| |
| Mocks: db, req, res, middleware, jwt, everything. |
| Catches: logic bugs in createUser. |
| Misses: route not registered, middleware order wrong, |
| validation skipped, db schema mismatch. |
| |
| INTEGRATION TEST (the whole dish): |
| |
| test('POST /users creates a user', async () => { |
| const res = await request(app) |
| .post('/users') |
| .send({ email: 'a@b.c', pw: 'abc' }); |
| expect(res.status).toBe(201); |
| }); |
| |
| Mocks: nothing (or only third-party APIs). |
| Catches: everything above, plus serialization, headers, |
| status codes, db constraints, real SQL errors. |
| |
+---------------------------------------------------------------+
Napkin AI Visual Prompt: "Dark gradient (#0a1a0a -> #0d2e16). Two columns. LEFT column 'Unit' shows a small green (#68a063) test tube isolating one function with mock walls around it. RIGHT column 'Integration' shows a large amber (#ffb020) arrow flowing HTTP -> Express -> Middleware -> Controller -> Postgres -> JSON response, with green checkmarks at every step. White monospace labels. Subtle grid background."
Tooling Setup — supertest, Jest, and a Real Express App
The key insight behind supertest is that it accepts an Express app object directly. It does not require you to call app.listen(). supertest spins up an ephemeral HTTP server on a random port, fires the request, and tears the server down — all inside the test process. No port conflicts, no flaky cleanup.
# Install testing dependencies
npm install --save-dev jest supertest @types/supertest cross-env
// package.json (scripts section)
{
"scripts": {
"test": "cross-env NODE_ENV=test jest --runInBand",
"test:watch": "cross-env NODE_ENV=test jest --watch --runInBand"
}
}
The --runInBand flag forces Jest to run test files serially in a single process. This matters for integration tests because parallel workers would race against the same test database. We will revisit parallelism strategies later.
cross-env NODE_ENV=test sets the environment variable on every OS (Windows, macOS, Linux). Your config files will read NODE_ENV to decide which database URL to use, whether to send real emails, whether to enable verbose logging, etc.
// src/config/index.js
// Single source of truth for environment-specific configuration
const env = process.env.NODE_ENV || 'development';
const config = {
development: {
dbUrl: 'postgres://localhost:5432/myapp_dev',
jwtSecret: 'dev-secret',
sendRealEmails: false,
},
test: {
// Dedicated test database — never share with dev
dbUrl: process.env.TEST_DATABASE_URL
|| 'postgres://localhost:5432/myapp_test',
jwtSecret: 'test-secret-do-not-use-in-prod',
sendRealEmails: false,
},
production: {
dbUrl: process.env.DATABASE_URL,
jwtSecret: process.env.JWT_SECRET,
sendRealEmails: true,
},
};
module.exports = config[env];
Example 1 — Testing GET and POST with supertest
This is the simplest possible integration test. We import the Express app (note: not the server), pass it to supertest, and assert on the response.
// src/app.js
// Export the Express app WITHOUT calling listen()
// This is the critical pattern that makes supertest work
const express = require('express');
const usersRouter = require('./routes/users');
const app = express();
app.use(express.json());
app.use('/users', usersRouter);
module.exports = app;
// src/server.js
// The actual server file calls listen() — only used in production/dev
const app = require('./app');
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Server listening on ${port}`);
});
// tests/users.test.js
const request = require('supertest');
const app = require('../src/app');
describe('Users API', () => {
// Test a simple GET endpoint
test('GET /users returns 200 and an array', async () => {
const res = await request(app).get('/users');
// supertest returns a response object with status, body, headers
expect(res.status).toBe(200);
expect(Array.isArray(res.body)).toBe(true);
});
// Test a POST endpoint with a JSON body
test('POST /users creates a new user', async () => {
const res = await request(app)
.post('/users')
.send({ email: 'alice@example.com', name: 'Alice' })
.set('Content-Type', 'application/json');
expect(res.status).toBe(201);
expect(res.body).toMatchObject({
email: 'alice@example.com',
name: 'Alice',
});
// The server should assign an id
expect(res.body.id).toBeDefined();
});
// Test a validation failure
test('POST /users returns 400 for missing email', async () => {
const res = await request(app)
.post('/users')
.send({ name: 'Bob' });
expect(res.status).toBe(400);
expect(res.body.error).toMatch(/email/i);
});
});
Notice how every assertion targets the HTTP contract: status code, body shape, error messages. This is what your real API consumers will see, so this is what you test.
Example 2 — Test Database Setup with beforeAll / afterAll
A real integration test needs a real database. The cleanest pattern is:
- beforeAll — connect to the test database, run migrations, ensure a clean state.
- beforeEach — start a transaction (or truncate tables) so each test starts empty.
- afterEach — roll back the transaction (or do nothing if you truncated).
- afterAll — close the database connection so Jest can exit cleanly.
// tests/setup/db.js
// Shared database setup helpers used by all integration tests
const { Pool } = require('pg');
const config = require('../../src/config');
const { runMigrations } = require('../../src/db/migrate');
// One pool for the entire test process
const pool = new Pool({ connectionString: config.dbUrl });
// Called once before any tests run
async function setupDatabase() {
// Safety check — refuse to wipe a non-test database
if (!config.dbUrl.includes('test')) {
throw new Error('Refusing to run tests against non-test database');
}
// Apply schema migrations to the test database
await runMigrations(pool);
}
// Called once after all tests finish
async function teardownDatabase() {
await pool.end();
}
// Called between tests to wipe data
async function truncateAllTables() {
// CASCADE drops dependent rows in foreign-keyed tables
// RESTART IDENTITY resets serial id sequences for deterministic ids
await pool.query(`
TRUNCATE TABLE users, posts, comments
RESTART IDENTITY CASCADE
`);
}
module.exports = { pool, setupDatabase, teardownDatabase, truncateAllTables };
// tests/users.db.test.js
const request = require('supertest');
const app = require('../src/app');
const {
setupDatabase,
teardownDatabase,
truncateAllTables,
} = require('./setup/db');
// Run once before the entire test file
beforeAll(async () => {
await setupDatabase();
});
// Run once after the entire test file
afterAll(async () => {
await teardownDatabase();
});
// Run before EACH test to guarantee isolation
beforeEach(async () => {
await truncateAllTables();
});
describe('Users API with real database', () => {
test('POST /users persists to the database', async () => {
const res = await request(app)
.post('/users')
.send({ email: 'carol@example.com', name: 'Carol' });
expect(res.status).toBe(201);
// Independently verify the row was actually written
const { pool } = require('./setup/db');
const result = await pool.query(
'SELECT email, name FROM users WHERE email = $1',
['carol@example.com'],
);
expect(result.rows).toHaveLength(1);
expect(result.rows[0].name).toBe('Carol');
});
test('GET /users returns previously created users', async () => {
// Seed two users directly via SQL — fast and explicit
const { pool } = require('./setup/db');
await pool.query(`
INSERT INTO users (email, name) VALUES
('a@x.com', 'A'),
('b@x.com', 'B')
`);
const res = await request(app).get('/users');
expect(res.status).toBe(200);
expect(res.body).toHaveLength(2);
});
});
Example 3 — The Transaction Rollback Pattern (Fast Isolation)
Truncating tables works, but it is slow when you have many tables. A faster pattern: wrap each test in a database transaction and roll it back at the end. Nothing ever gets committed, so the next test starts on a perfectly clean slate.
The catch: your application code must use the same connection that the test started the transaction on. This requires a request-scoped database client, usually injected via middleware or a per-request context.
// src/db/context.js
// AsyncLocalStorage gives us a per-request "current connection" without
// passing it through every function signature
const { AsyncLocalStorage } = require('node:async_hooks');
const storage = new AsyncLocalStorage();
function runWithClient(client, fn) {
return storage.run({ client }, fn);
}
function getClient(defaultPool) {
// Return the request-scoped client if one exists, otherwise the pool
const store = storage.getStore();
return store?.client || defaultPool;
}
module.exports = { runWithClient, getClient };
// tests/setup/transaction.js
const { pool } = require('./db');
const { runWithClient } = require('../../src/db/context');
let currentClient;
// Before each test: check out a client and BEGIN a transaction
async function beginTestTransaction() {
currentClient = await pool.connect();
await currentClient.query('BEGIN');
}
// After each test: ROLLBACK and release the client back to the pool
async function rollbackTestTransaction() {
await currentClient.query('ROLLBACK');
currentClient.release();
currentClient = null;
}
// Wrap the supertest request so the app uses our transactional client
function withTransaction(testFn) {
return () => runWithClient(currentClient, testFn);
}
module.exports = {
beginTestTransaction,
rollbackTestTransaction,
withTransaction,
};
// tests/users.tx.test.js
const request = require('supertest');
const app = require('../src/app');
const { setupDatabase, teardownDatabase } = require('./setup/db');
const {
beginTestTransaction,
rollbackTestTransaction,
withTransaction,
} = require('./setup/transaction');
beforeAll(setupDatabase);
afterAll(teardownDatabase);
// Wrap every test in a transaction that gets rolled back
beforeEach(beginTestTransaction);
afterEach(rollbackTestTransaction);
describe('Users API (transactional)', () => {
test(
'POST then GET returns the created user',
withTransaction(async () => {
// Create a user — this INSERT lives inside the transaction
const create = await request(app)
.post('/users')
.send({ email: 'dave@x.com', name: 'Dave' });
expect(create.status).toBe(201);
// Read it back — same transaction sees the row
const list = await request(app).get('/users');
expect(list.body).toHaveLength(1);
expect(list.body[0].email).toBe('dave@x.com');
}),
);
test(
'second test starts with an empty database',
withTransaction(async () => {
// The previous test's INSERT was rolled back — we start clean
const list = await request(app).get('/users');
expect(list.body).toHaveLength(0);
}),
);
});
Why this is fast: BEGIN and ROLLBACK are essentially free at the SQL level. No DELETE, no TRUNCATE, no recreating the schema. On a real project, switching from truncate-per-test to rollback-per-test typically cuts integration suite times by 3x to 10x.
Example 4 — Authenticated Requests with JWT
Most real endpoints require authentication. You should not be calling POST /login before every test — that is slow and couples unrelated tests. Instead, mint a JWT directly in your test helper and attach it to the Authorization header.
// tests/setup/auth.js
const jwt = require('jsonwebtoken');
const config = require('../../src/config');
const { pool } = require('./db');
// Insert a user row and return a signed JWT for that user
async function createAuthenticatedUser(overrides = {}) {
const user = {
email: 'test-user@example.com',
name: 'Test User',
role: 'user',
...overrides,
};
// Insert the user and capture the generated id
const { rows } = await pool.query(
`INSERT INTO users (email, name, role)
VALUES ($1, $2, $3)
RETURNING id, email, name, role`,
[user.email, user.name, user.role],
);
const dbUser = rows[0];
// Sign a JWT exactly like the production /login endpoint would
const token = jwt.sign(
{ sub: dbUser.id, role: dbUser.role },
config.jwtSecret,
{ expiresIn: '1h' },
);
return { user: dbUser, token };
}
module.exports = { createAuthenticatedUser };
// tests/posts.test.js
const request = require('supertest');
const app = require('../src/app');
const { setupDatabase, teardownDatabase, truncateAllTables } = require('./setup/db');
const { createAuthenticatedUser } = require('./setup/auth');
beforeAll(setupDatabase);
afterAll(teardownDatabase);
beforeEach(truncateAllTables);
describe('Posts API (authenticated)', () => {
test('POST /posts requires authentication', async () => {
// No Authorization header -> 401
const res = await request(app)
.post('/posts')
.send({ title: 'Hi', body: 'Hello' });
expect(res.status).toBe(401);
});
test('POST /posts creates a post for the authenticated user', async () => {
const { user, token } = await createAuthenticatedUser();
const res = await request(app)
.post('/posts')
.set('Authorization', `Bearer ${token}`)
.send({ title: 'My Post', body: 'Lorem ipsum' });
expect(res.status).toBe(201);
expect(res.body.title).toBe('My Post');
// The server should attach the JWT subject as the author
expect(res.body.authorId).toBe(user.id);
});
test('Admin-only routes reject regular users', async () => {
const { token } = await createAuthenticatedUser({ role: 'user' });
const res = await request(app)
.delete('/posts/123')
.set('Authorization', `Bearer ${token}`);
// Authentication succeeded but authorization failed
expect(res.status).toBe(403);
});
test('Admin-only routes allow admin users', async () => {
// Seed a post we are allowed to delete, authored by the admin user
// (avoid hardcoding author_id — identity values depend on seed order)
const { user, token } = await createAuthenticatedUser({ role: 'admin' });
const { pool } = require('./setup/db');
await pool.query(
`INSERT INTO posts (id, title, body, author_id) VALUES (123, 't', 'b', $1)`,
[user.id],
);
const res = await request(app)
.delete('/posts/123')
.set('Authorization', `Bearer ${token}`);
expect(res.status).toBe(204);
});
});
The big win here: the auth middleware runs for real. If you accidentally remove requireAuth from a route, this test catches it. If you mistype the JWT secret in config, this test catches it. A unit test that mocks req.user will not.
Example 5 — Seeding Test Data with Factories
Inline INSERT statements work, but they get noisy. A small factory module keeps tests readable and centralizes default values.
// tests/factories/index.js
const { pool } = require('../setup/db');
let counter = 0;
// Generate a unique value to avoid UNIQUE constraint collisions
const uniq = () => `${Date.now()}-${++counter}`;
// User factory — accepts overrides for fields the test cares about
async function makeUser(overrides = {}) {
const data = {
email: `user-${uniq()}@example.com`,
name: 'Default Name',
role: 'user',
...overrides,
};
const { rows } = await pool.query(
`INSERT INTO users (email, name, role)
VALUES ($1, $2, $3)
RETURNING *`,
[data.email, data.name, data.role],
);
return rows[0];
}
// Post factory — automatically creates an author if none provided
async function makePost(overrides = {}) {
const author = overrides.author || (await makeUser());
const data = {
title: `Post ${uniq()}`,
body: 'Lorem ipsum dolor sit amet.',
authorId: author.id,
...overrides,
};
const { rows } = await pool.query(
`INSERT INTO posts (title, body, author_id)
VALUES ($1, $2, $3)
RETURNING *`,
[data.title, data.body, data.authorId],
);
return rows[0];
}
module.exports = { makeUser, makePost };
// tests/posts.list.test.js
const request = require('supertest');
const app = require('../src/app');
const { setupDatabase, teardownDatabase, truncateAllTables } = require('./setup/db');
const { makeUser, makePost } = require('./factories');
beforeAll(setupDatabase);
afterAll(teardownDatabase);
beforeEach(truncateAllTables);
test('GET /users/:id/posts returns posts authored by that user', async () => {
// Arrange — two users, one with two posts, one with one post
const alice = await makeUser({ name: 'Alice' });
const bob = await makeUser({ name: 'Bob' });
await makePost({ author: alice, title: 'Alice 1' });
await makePost({ author: alice, title: 'Alice 2' });
await makePost({ author: bob, title: 'Bob 1' });
// Act
const res = await request(app).get(`/users/${alice.id}/posts`);
// Assert — only Alice's posts come back
expect(res.status).toBe(200);
expect(res.body).toHaveLength(2);
expect(res.body.map((p) => p.title).sort()).toEqual(['Alice 1', 'Alice 2']);
});
Provisioning the Test Database — Three Strategies
You need a real database somewhere. Three common options, ordered from fastest to most realistic:
+---------------------------------------------------------------+
| TEST DATABASE STRATEGIES |
+---------------------------------------------------------------+
| |
| 1. IN-MEMORY SQLITE |
| Speed: ***** (instant startup, zero IO) |
| Realism: * (different SQL dialect, no JSONB, |
| no advisory locks, no extensions) |
| Use when: pure CRUD, no Postgres-specific features |
| |
| 2. DOCKER-COMPOSE POSTGRES |
| Speed: **** (one container, lives across runs) |
| Realism: ***** (same engine as production) |
| Use when: most projects, especially in CI |
| |
| 3. TESTCONTAINERS |
| Speed: *** (spins a fresh container per test run) |
| Realism: ***** (same engine, perfect isolation) |
| Use when: parallel CI jobs, multi-version matrix testing |
| |
+---------------------------------------------------------------+
Option A: docker-compose Postgres
# docker-compose.test.yml
# A dedicated Postgres instance for running integration tests
services:
postgres-test:
image: postgres:16-alpine
environment:
POSTGRES_USER: test
POSTGRES_PASSWORD: test
POSTGRES_DB: myapp_test
ports:
# Use a non-standard host port so it does not collide with dev pg
- '55432:5432'
tmpfs:
# Store data in RAM for maximum speed — wiped on container stop
- /var/lib/postgresql/data
# Bring it up before running tests (CI or local)
docker compose -f docker-compose.test.yml up -d
TEST_DATABASE_URL=postgres://test:test@localhost:55432/myapp_test npm test
docker compose -f docker-compose.test.yml down
Option B: testcontainers (programmatic)
// tests/setup/testcontainers.js
// Boot a fresh Postgres container from inside the test process
const { PostgreSqlContainer } = require('@testcontainers/postgresql');
let container;
async function startPostgres() {
container = await new PostgreSqlContainer('postgres:16-alpine')
.withDatabase('myapp_test')
.withUsername('test')
.withPassword('test')
.start();
// Expose the connection string to the rest of the test process
process.env.TEST_DATABASE_URL = container.getConnectionUri();
}
async function stopPostgres() {
await container.stop();
}
module.exports = { startPostgres, stopPostgres };
// jest.global-setup.js
// Runs ONCE before the entire test run (not per file)
const { startPostgres } = require('./tests/setup/testcontainers');
module.exports = async () => {
await startPostgres();
};
// jest.config.js
module.exports = {
globalSetup: './jest.global-setup.js',
globalTeardown: './jest.global-teardown.js',
testEnvironment: 'node',
};
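The jest.config.js above also registers globalTeardown: './jest.global-teardown.js', a file the lesson does not show. Given the startPostgres/stopPostgres helpers, a minimal sketch would mirror the global setup:

```javascript
// jest.global-teardown.js
// Runs ONCE after the entire test run — stops the Postgres container so
// the process exits cleanly. (Sketch: mirrors jest.global-setup.js.)
const { stopPostgres } = require('./tests/setup/testcontainers');
module.exports = async () => {
  await stopPostgres();
};
```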
testcontainers is the most ergonomic option for CI: zero external setup, perfect isolation between runs, and easy to test against multiple Postgres versions in a matrix.
Common Mistakes
1. Calling app.listen() in the file you import for tests.
If app.js calls listen(), every test run will try to bind a port — and your CI will fail with EADDRINUSE the moment you run two suites in parallel. Always export the bare app from one file and call listen in a separate server.js.
2. Sharing the dev database with the test database.
Tests truncate tables. If NODE_ENV is unset and your config falls back to the dev database URL, your first test run wipes your local data. Always set NODE_ENV=test and add a hard guard in your setup helper that throws if the URL does not contain test.
3. Using --maxWorkers parallelism with a single shared test database.
Jest defaults to running test files in parallel. Multiple workers hitting the same Postgres database with TRUNCATE will race and corrupt each other's state. Either pass --runInBand, or give each worker its own database with JEST_WORKER_ID in the connection string.
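The JEST_WORKER_ID approach can be sketched in a few lines. The workerDbUrl helper and the myapp_test_<n> naming scheme are assumptions for illustration, not a Jest convention; Jest only guarantees that each worker process sees a distinct JEST_WORKER_ID ('1', '2', ...).

```javascript
// Hypothetical helper: derive a per-worker database URL from JEST_WORKER_ID
// so parallel Jest workers never truncate each other's tables.
function workerDbUrl(baseUrl, workerId = process.env.JEST_WORKER_ID || '1') {
  // myapp_test -> myapp_test_1, myapp_test_2, ...
  return `${baseUrl}_${workerId}`;
}

// Worker 2 would connect to its own database; each myapp_test_<n> database
// must be created up front (e.g. in a Jest globalSetup script)
const url = workerDbUrl('postgres://localhost:5432/myapp_test', '2');
// url === 'postgres://localhost:5432/myapp_test_2'
```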
4. Logging in via POST /login in every test.
This couples unrelated tests to the auth flow, slows the suite, and makes debugging painful. Mint JWTs directly in a test helper. Only test the actual /login endpoint in its own dedicated suite.
5. Forgetting to close the database pool in afterAll.
Jest will print "A worker process has failed to exit gracefully" and hang for 10 seconds before force-killing. Always call pool.end() (or the equivalent for your ORM) in afterAll. In CI this matters — hung workers eat minutes.
6. Asserting on database internals instead of the HTTP response.
The whole point of an integration test is the HTTP contract. If you find yourself querying the database to verify a test, you usually wanted to assert on res.body instead. Use database queries only when the endpoint returns no body (e.g., a 204 DELETE).
Interview Questions
1. "What is the difference between a unit test and an integration test in a Node.js API?"
A unit test exercises a single function or module in isolation, with all of its collaborators (database, HTTP, third-party APIs) replaced by mocks or stubs. It runs in milliseconds and proves the logic of one piece. An integration test boots the real Express app, fires real HTTP requests through supertest, runs the real middleware chain, and talks to a real database. It proves that all the pieces are wired together correctly: routes are registered, middleware order is right, validation runs, the controller calls the model, the model writes to the database, and the response shape matches the contract. Unit tests catch logic bugs; integration tests catch wiring bugs. You need both.
2. "How does supertest work, and why don't you call app.listen() in your test files?"
supertest accepts an Express app object directly and internally creates an ephemeral HTTP server bound to a random free port for the duration of each request, then tears it down. Because it allocates a fresh port every time, you never have to manage port lifecycle, deal with EADDRINUSE, or worry about parallel test files colliding. The pattern is to export the bare app from app.js and have a separate server.js (used only in production and dev) that calls app.listen(). Tests import app.js, never server.js. This separation is what makes the test suite reliable and parallel-safe.
3. "Explain the transaction rollback pattern for test isolation. What are its tradeoffs versus truncating tables?"
In the rollback pattern, each test starts a database transaction in beforeEach and rolls it back in afterEach. Nothing ever gets committed, so the next test sees a perfectly empty database. The tradeoff is that your application code must use the same connection that started the transaction — typically achieved with AsyncLocalStorage or dependency injection of a request-scoped client. Compared to truncating tables between tests, rollback is dramatically faster: BEGIN and ROLLBACK are essentially free, while TRUNCATE has to scan and reset every table. The downside is that you cannot test code paths that themselves use transactions (nested transactions need savepoints), and you cannot assert on data from outside the test process because nothing is actually committed.
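The savepoint caveat can be made concrete with a stub client that only records the SQL it is asked to run. The withNestedTransaction helper below is illustrative, not the lesson's API; ORMs and query builders commonly implement nesting this way internally.

```javascript
// Why app-level transactions inside a test transaction need savepoints:
// a second BEGIN would fail, so nesting is emulated with
// SAVEPOINT / RELEASE SAVEPOINT / ROLLBACK TO SAVEPOINT. (Sketch.)
const issued = [];
const client = { query: async (sql) => { issued.push(sql.trim()); } };

let depth = 0;
async function withNestedTransaction(db, fn) {
  depth += 1;
  const sp = `sp_${depth}`;
  // Outermost call opens a real transaction; nested calls use savepoints
  await db.query(depth === 1 ? 'BEGIN' : `SAVEPOINT ${sp}`);
  try {
    const result = await fn();
    await db.query(depth === 1 ? 'COMMIT' : `RELEASE SAVEPOINT ${sp}`);
    return result;
  } catch (err) {
    await db.query(depth === 1 ? 'ROLLBACK' : `ROLLBACK TO SAVEPOINT ${sp}`);
    throw err;
  } finally {
    depth -= 1;
  }
}

const done = withNestedTransaction(client, () =>
  withNestedTransaction(client, () => client.query('INSERT INTO users ...')),
);
// issued ends up: BEGIN, SAVEPOINT sp_2, INSERT ..., RELEASE SAVEPOINT sp_2, COMMIT
```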
4. "Should you use in-memory SQLite or real Postgres in Docker for integration tests? Why?"
Use real Postgres in Docker (or testcontainers) whenever you can. The whole value of integration testing is realism — running the same SQL against the same database engine your production environment uses. SQLite has a different SQL dialect, no JSONB, no advisory locks, no RETURNING syntax in older versions, no gen_random_uuid, no extensions, and no real concurrency model. The first time you ship a query that works on SQLite but breaks on Postgres, you have lost the entire benefit of the test suite. Real Postgres in Docker starts in a couple of seconds, runs in CI with a single service container, and gives you complete fidelity. The only legitimate use case for in-memory SQLite is a small library or CLI tool that supports multiple databases and explicitly avoids vendor-specific features.
5. "How do you handle authenticated endpoints in integration tests without making the suite slow and brittle?"
You mint JWTs (or session tokens) directly in test helpers — never by hitting the real /login endpoint. A createAuthenticatedUser helper inserts a row into the users table and signs a JWT with the same secret your production code uses, then returns both. Tests attach the token via request(app).post(...).set('Authorization', 'Bearer ' + token). This keeps each test fast (no extra HTTP round-trip), independent (a broken /login does not break unrelated tests), and explicit about which user context is being tested. The actual /login endpoint gets its own dedicated test file where it is the subject under test, not a setup step. The auth middleware itself still runs for real on every authenticated request, so middleware bugs are still caught.
Quick Reference — Integration Testing Cheat Sheet
+---------------------------------------------------------------+
| INTEGRATION TESTING CHEAT SHEET |
+---------------------------------------------------------------+
| |
| EXPORT APP WITHOUT LISTEN: |
| // app.js |
| module.exports = app; |
| // server.js |
| app.listen(PORT) |
| |
| SUPERTEST BASICS: |
| const request = require('supertest') |
| const res = await request(app).get('/users') |
| const res = await request(app) |
| .post('/users').send({...}).set('Header', 'val') |
| expect(res.status).toBe(201) |
| expect(res.body).toMatchObject({...}) |
| |
| LIFECYCLE HOOKS: |
| beforeAll -> setupDatabase, runMigrations |
| beforeEach -> beginTransaction OR truncate |
| afterEach -> rollbackTransaction |
| afterAll -> pool.end(), stopContainer |
| |
| AUTH IN TESTS: |
| const token = jwt.sign({ sub: user.id }, secret) |
| .set('Authorization', `Bearer ${token}`) |
| |
| ENV SEPARATION: |
| cross-env NODE_ENV=test jest --runInBand |
| if (!dbUrl.includes('test')) throw new Error(...) |
| |
+---------------------------------------------------------------+
+---------------------------------------------------------------+
| KEY RULES |
+---------------------------------------------------------------+
| |
| 1. Export app, listen() in a separate file |
| 2. NODE_ENV=test always, with a hard guard in setup |
| 3. Real Postgres > SQLite for fidelity |
| 4. Transaction rollback > truncate for speed |
| 5. --runInBand unless workers have isolated databases |
| 6. Mint JWTs in helpers, never POST /login per test |
| 7. Use factories for readable seeding |
| 8. Always pool.end() in afterAll |
| |
+---------------------------------------------------------------+
| Concern | Unit Test | Integration Test |
|---|---|---|
| Scope | Single function | Full request -> response |
| Database | Mocked | Real (Postgres in Docker) |
| Middleware | Mocked or skipped | Runs for real |
| HTTP layer | Bypassed | Real via supertest |
| Speed | <1 ms each | 5-50 ms each |
| Catches | Logic bugs | Wiring + contract bugs |
| Isolation tool | Mocks | Transactions or truncate |
| Run command | jest | NODE_ENV=test jest --runInBand |
Prev: Lesson 9.1 -- Unit Testing with Jest Next: Lesson 9.3 -- Testing Strategies and Patterns
This is Lesson 9.2 of the Node.js Interview Prep Course -- 10 chapters, 42 lessons.