
Express REST API — CRUD, Database & Auth

This lecture builds on the Express setup from the previous lecture. We extend the same task-api project with full CRUD operations, a database layer, authentication, testing, and Docker deployment.

Lecture roadmap:

  1. Types & validation — DTOs, manual field validation
  2. In-memory CRUD — tasks and categories with array storage
  3. Database — SQLite, migrations, repository pattern, refactored routes
  4. Authentication — bcrypt, JWT access/refresh tokens, protected routes
  5. Testing — integration tests with supertest
  6. Docker — multi-stage build, compose

Additional dependencies

npm install jsonwebtoken bcrypt better-sqlite3 uuid
npm install -D @types/jsonwebtoken @types/bcrypt @types/better-sqlite3

Types & DTOs

// src/types/index.ts
export type TaskStatus = "todo" | "in_progress" | "done";
export type TaskPriority = "low" | "medium" | "high";

export interface ITask {
id: string;
title: string;
description: string;
status: TaskStatus;
priority: TaskPriority;
dueDate: string | null;
categoryId: string | null;
userId: string;
createdAt: string;
updatedAt: string;
}

export interface ICreateTaskDTO {
title: string;
description?: string;
status?: TaskStatus;
priority?: TaskPriority;
dueDate?: string | null;
categoryId?: string | null;
}

// PUT requires all fields — full replacement semantics
export interface IUpdateTaskDTO {
title: string;
description: string;
status: TaskStatus;
priority: TaskPriority;
dueDate: string | null;
categoryId: string | null;
}

export interface ICategory {
id: string;
name: string;
userId: string;
createdAt: string;
}

export interface IUser {
id: string;
email: string;
passwordHash: string;
firstName: string;
lastName: string;
createdAt: string;
}

Validation

We validate manually here to understand what happens under the hood. NestJS uses class-validator decorators to automate this.

// src/routes/tasks.ts — validation helpers
interface ValidationResult {
valid: boolean;
errors: string[];
}

// Shared field-level checks (used by both create and update)
function validateTaskFields(body: Record<string, unknown>, errors: string[]): void {
if (body.status !== undefined && !["todo", "in_progress", "done"].includes(body.status as string)) {
errors.push("status must be one of: todo, in_progress, done");
}
if (body.priority !== undefined && !["low", "medium", "high"].includes(body.priority as string)) {
errors.push("priority must be one of: low, medium, high");
}
if (body.dueDate !== undefined && body.dueDate !== null && isNaN(Date.parse(body.dueDate as string))) {
errors.push("dueDate must be a valid ISO date string or null");
}
if (body.description !== undefined && typeof body.description !== "string") {
errors.push("description must be a string");
}
if (body.categoryId !== undefined && body.categoryId !== null && typeof body.categoryId !== "string") {
errors.push("categoryId must be a string or null");
}
}

// POST — title required, everything else optional (has defaults)
function validateCreateTask(body: Record<string, unknown>): ValidationResult {
const errors: string[] = [];
if (!body.title || typeof body.title !== "string" || body.title.trim() === "") {
errors.push("title is required and must be a non-empty string");
}
validateTaskFields(body, errors);
return { valid: errors.length === 0, errors };
}

// PUT — all fields required (full replacement)
function validateUpdateTask(body: Record<string, unknown>): ValidationResult {
const errors: string[] = [];
if (!body.title || typeof body.title !== "string" || body.title.trim() === "") {
errors.push("title is required");
}
if (typeof body.description !== "string") {
errors.push("description is required");
}
if (!["todo", "in_progress", "done"].includes(body.status as string)) {
errors.push("status is required and must be one of: todo, in_progress, done");
}
if (!["low", "medium", "high"].includes(body.priority as string)) {
errors.push("priority is required and must be one of: low, medium, high");
}
// dueDate: required field, must be a valid date string or null
if (body.dueDate !== null && (body.dueDate === undefined || isNaN(Date.parse(body.dueDate as string)))) {
errors.push("dueDate is required and must be a valid ISO date string or null");
}
// categoryId: required field, must be a string or null
if (body.categoryId !== null && (body.categoryId === undefined || typeof body.categoryId !== "string")) {
errors.push("categoryId is required and must be a string or null");
}
return { valid: errors.length === 0, errors };
}

REST API — Full CRUD (in-memory)

Checkpoint 1 — in-memory CRUD

This section builds task and category routes with in-memory arrays. The database replaces these arrays later in this lecture.

Tasks router

// src/routes/tasks.ts
import { Router, Request, Response } from "express";
import { v4 as uuidv4 } from "uuid";
import { ITask, ICreateTaskDTO } from "../types/index.js";

const router = Router();

// In-memory store (replaced with database later in this lecture)
let tasks: ITask[] = [];

// GET /api/v1/tasks — list with optional filtering
router.get("/", (req: Request, res: Response) => {
let result = [...tasks];

if (req.query.status) {
result = result.filter((t) => t.status === req.query.status);
}
if (req.query.priority) {
result = result.filter((t) => t.priority === req.query.priority);
}

res.json(result);
});

// GET /api/v1/tasks/:id
router.get("/:id", (req: Request, res: Response) => {
const task = tasks.find((t) => t.id === req.params.id);
if (!task) {
res.status(404).json({ error: "Task not found" });
return;
}
res.json(task);
});

// POST /api/v1/tasks
router.post("/", (req: Request, res: Response) => {
const { valid, errors } = validateCreateTask(req.body);
if (!valid) {
res.status(400).json({ errors });
return;
}

const body: ICreateTaskDTO = req.body;
const now = new Date().toISOString();
const task: ITask = {
id: uuidv4(),
title: body.title,
description: body.description ?? "",
status: body.status ?? "todo",
priority: body.priority ?? "medium",
dueDate: body.dueDate ?? null,
categoryId: body.categoryId ?? null,
userId: "temp-user", // replaced with req.user.userId after auth
createdAt: now,
updatedAt: now,
};

tasks.push(task);
res.status(201).json(task);
});

// PUT /api/v1/tasks/:id — full replace (all fields required)
router.put("/:id", (req: Request, res: Response) => {
const index = tasks.findIndex((t) => t.id === req.params.id);
if (index === -1) {
res.status(404).json({ error: "Task not found" });
return;
}

const { valid, errors } = validateUpdateTask(req.body);
if (!valid) {
res.status(400).json({ errors });
return;
}

const body = req.body;
tasks[index] = {
...tasks[index],
title: body.title,
description: body.description,
status: body.status,
priority: body.priority,
dueDate: body.dueDate,
categoryId: body.categoryId,
updatedAt: new Date().toISOString(),
};

res.json(tasks[index]);
});

// PATCH /api/v1/tasks/:id — partial update
router.patch("/:id", (req: Request, res: Response) => {
const index = tasks.findIndex((t) => t.id === req.params.id);
if (index === -1) {
res.status(404).json({ error: "Task not found" });
return;
}

// Validate only the fields that are present
const errors: string[] = [];
validateTaskFields(req.body, errors);
if (req.body.title !== undefined && (typeof req.body.title !== "string" || req.body.title.trim() === "")) {
errors.push("title must be a non-empty string");
}
if (errors.length > 0) {
res.status(400).json({ errors });
return;
}

const { id, userId, createdAt, ...allowedFields } = req.body;
tasks[index] = {
...tasks[index],
...allowedFields,
updatedAt: new Date().toISOString(),
};

res.json(tasks[index]);
});

// DELETE /api/v1/tasks/:id
router.delete("/:id", (req: Request, res: Response) => {
const index = tasks.findIndex((t) => t.id === req.params.id);
if (index === -1) {
res.status(404).json({ error: "Task not found" });
return;
}

tasks.splice(index, 1);
res.status(204).send();
});

export default router;
Common mistake

Always return proper status codes. A common error: returning 200 for creation (should be 201), or returning 200 with { success: false } instead of a 4xx status. REST clients and frontends depend on status codes — your Axios interceptor in Vue/React checks response.status, not a custom success field.

Categories router

Same CRUD pattern, shorter:

// src/routes/categories.ts
import { Router, Request, Response } from "express";
import { v4 as uuidv4 } from "uuid";
import { ICategory } from "../types/index.js";

const router = Router();
let categories: ICategory[] = [];

router.get("/", (req: Request, res: Response) => {
res.json(categories);
});

router.get("/:id", (req: Request, res: Response) => {
const cat = categories.find((c) => c.id === req.params.id);
if (!cat) {
res.status(404).json({ error: "Category not found" });
return;
}
res.json(cat);
});

router.post("/", (req: Request, res: Response) => {
if (!req.body.name || typeof req.body.name !== "string" || req.body.name.trim() === "") {
res.status(400).json({ errors: ["name is required and must be a non-empty string"] });
return;
}

const category: ICategory = {
id: uuidv4(),
name: req.body.name.trim(),
userId: "temp-user",
createdAt: new Date().toISOString(),
};

categories.push(category);
res.status(201).json(category);
});

router.put("/:id", (req: Request, res: Response) => {
const index = categories.findIndex((c) => c.id === req.params.id);
if (index === -1) {
res.status(404).json({ error: "Category not found" });
return;
}

if (!req.body.name || typeof req.body.name !== "string" || req.body.name.trim() === "") {
res.status(400).json({ errors: ["name is required and must be a non-empty string"] });
return;
}

categories[index] = { ...categories[index], name: req.body.name.trim() };
res.json(categories[index]);
});

router.delete("/:id", (req: Request, res: Response) => {
const index = categories.findIndex((c) => c.id === req.params.id);
if (index === -1) {
res.status(404).json({ error: "Category not found" });
return;
}

categories.splice(index, 1);
res.status(204).send();
});

export default router;

Mount both routers in app.ts:

// src/app.ts
import express from "express";
import tasksRouter from "./routes/tasks.js";
import categoriesRouter from "./routes/categories.js";
import { errorHandler } from "./middleware/errorHandler.js";

const app = express();

app.use(express.json());
app.use(express.static("public"));
app.use("/api/v1/tasks", tasksRouter);
app.use("/api/v1/categories", categoriesRouter);
app.use(errorHandler);

export default app;

Database Layer

Checkpoint 2 — database

This section replaces the in-memory arrays with SQLite. The task and category routers are refactored to use repository classes.

SQLite with better-sqlite3

SQLite is an embedded database — it runs inside your Node.js process, stores data in a single file, and requires zero configuration. That makes it perfect for development and testing. For production with concurrent users, use PostgreSQL (separate server, connection pooling, proper concurrency).

better-sqlite3 is synchronous — no callbacks, no promises, just function calls. This is fine because SQLite runs in-process (no network I/O).

// src/db/database.ts
import Database from "better-sqlite3";
import path from "path";
import fs from "fs";

// Lazy initialization — the DB opens when getDb() is first called, not at import time.
// This lets tests set process.env.DB_PATH before the connection opens.
let db: Database.Database | null = null;

export function getDb(): Database.Database {
if (!db) {
const dbPath = process.env.DB_PATH || "task-api.db";
db = new Database(dbPath);
db.pragma("journal_mode = WAL");
db.pragma("foreign_keys = ON");
}
return db;
}

export function closeDb(): void {
if (db) {
db.close();
db = null;
}
}

export function initializeDatabase(): void {
const database = getDb();
const migrationsDir = path.join(import.meta.dirname, "migrations");

// Create migrations tracking table
database.exec(`
CREATE TABLE IF NOT EXISTS migrations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT UNIQUE NOT NULL,
applied_at TEXT DEFAULT (datetime('now'))
)
`);

// Read and apply pending migrations
const applied = database
.prepare("SELECT name FROM migrations")
.all()
.map((row: any) => row.name);

const files = fs.readdirSync(migrationsDir).filter((f) => f.endsWith(".sql")).sort();

for (const file of files) {
if (applied.includes(file)) continue;

const sql = fs.readFileSync(path.join(migrationsDir, file), "utf-8");
database.exec(sql);
database.prepare("INSERT INTO migrations (name) VALUES (?)").run(file);
console.log(`Applied migration: ${file}`);
}
}

Schema & Migrations

-- src/db/migrations/001-initial.sql

CREATE TABLE IF NOT EXISTS users (
id TEXT PRIMARY KEY,
email TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
first_name TEXT NOT NULL,
last_name TEXT NOT NULL,
created_at TEXT DEFAULT (datetime('now'))
);

CREATE TABLE IF NOT EXISTS categories (
id TEXT PRIMARY KEY,
name TEXT NOT NULL,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
created_at TEXT DEFAULT (datetime('now'))
);

CREATE TABLE IF NOT EXISTS tasks (
id TEXT PRIMARY KEY,
title TEXT NOT NULL,
description TEXT DEFAULT '',
status TEXT NOT NULL DEFAULT 'todo'
CHECK(status IN ('todo', 'in_progress', 'done')),
priority TEXT NOT NULL DEFAULT 'medium'
CHECK(priority IN ('low', 'medium', 'high')),
due_date TEXT,
category_id TEXT REFERENCES categories(id) ON DELETE SET NULL,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
created_at TEXT DEFAULT (datetime('now')),
updated_at TEXT DEFAULT (datetime('now'))
);

CREATE TABLE IF NOT EXISTS refresh_tokens (
id TEXT PRIMARY KEY,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
token TEXT UNIQUE NOT NULL,
expires_at TEXT NOT NULL,
created_at TEXT DEFAULT (datetime('now'))
);

Run migrations on startup — update index.ts:

// src/index.ts
import "dotenv/config"; // side-effect import — must be FIRST so .env is loaded
// before other modules read process.env at import time
import app from "./app.js";
import { initializeDatabase } from "./db/database.js";

initializeDatabase();

const PORT = process.env.PORT || 3000;
app.listen(PORT, (error?: Error) => {
if (error) throw error;
console.log(`Server running on http://localhost:${PORT}`);
});
ESM import order

Why import "dotenv/config" instead of dotenv.config() in the module body?

In ESM, all import declarations are hoisted — imported modules are evaluated before the importing module's body code runs. If you write import dotenv from "dotenv"; dotenv.config(); followed by import app from "./app.js", the app.js module tree (including auth.ts which reads JWT_SECRET) is evaluated before dotenv.config() executes. Your environment variables are missing.

import "dotenv/config" is a side-effect import. It causes dotenv.config() to run during that module's evaluation, which happens before subsequent imports in the same file. This is the correct ESM pattern.

Repository pattern

Encapsulate database queries in a repository class. Each method uses prepared statements.

// src/db/taskRepository.ts
import { v4 as uuidv4 } from "uuid";
import { getDb } from "./database.js";
import { ITask, ICreateTaskDTO, IUpdateTaskDTO } from "../types/index.js";

// Map snake_case DB rows to camelCase interface
function mapRow(row: any): ITask {
return {
id: row.id,
title: row.title,
description: row.description,
status: row.status,
priority: row.priority,
dueDate: row.due_date,
categoryId: row.category_id,
userId: row.user_id,
createdAt: row.created_at,
updatedAt: row.updated_at,
};
}

export class TaskRepository {
findAll(userId: string, filters?: { status?: string; priority?: string }): ITask[] {
const db = getDb();
let sql = "SELECT * FROM tasks WHERE user_id = ?";
const params: unknown[] = [userId];

if (filters?.status) {
sql += " AND status = ?";
params.push(filters.status);
}
if (filters?.priority) {
sql += " AND priority = ?";
params.push(filters.priority);
}

sql += " ORDER BY created_at DESC";
return db.prepare(sql).all(...params).map(mapRow);
}

findById(id: string, userId: string): ITask | null {
const row = getDb()
.prepare("SELECT * FROM tasks WHERE id = ? AND user_id = ?")
.get(id, userId);
return row ? mapRow(row) : null;
}

create(dto: ICreateTaskDTO, userId: string): ITask {
const id = uuidv4();
const now = new Date().toISOString();

getDb().prepare(`
INSERT INTO tasks (id, title, description, status, priority, due_date, category_id, user_id, created_at, updated_at)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(
id,
dto.title,
dto.description ?? "",
dto.status ?? "todo",
dto.priority ?? "medium",
dto.dueDate ?? null,
dto.categoryId ?? null,
userId,
now,
now
);

return this.findById(id, userId)!;
}

update(id: string, dto: IUpdateTaskDTO, userId: string): ITask | null {
const existing = this.findById(id, userId);
if (!existing) return null;

getDb().prepare(`
UPDATE tasks
SET title = ?, description = ?, status = ?, priority = ?, due_date = ?, category_id = ?, updated_at = ?
WHERE id = ? AND user_id = ?
`).run(
dto.title,
dto.description,
dto.status,
dto.priority,
dto.dueDate,
dto.categoryId,
new Date().toISOString(),
id,
userId
);

return this.findById(id, userId);
}

partialUpdate(id: string, fields: Partial<ICreateTaskDTO>, userId: string): ITask | null {
const existing = this.findById(id, userId);
if (!existing) return null;

const merged = {
title: fields.title ?? existing.title,
description: fields.description ?? existing.description,
status: fields.status ?? existing.status,
priority: fields.priority ?? existing.priority,
dueDate: fields.dueDate !== undefined ? fields.dueDate : existing.dueDate,
categoryId: fields.categoryId !== undefined ? fields.categoryId : existing.categoryId,
};

getDb().prepare(`
UPDATE tasks
SET title = ?, description = ?, status = ?, priority = ?, due_date = ?, category_id = ?, updated_at = ?
WHERE id = ? AND user_id = ?
`).run(
merged.title,
merged.description,
merged.status,
merged.priority,
merged.dueDate,
merged.categoryId,
new Date().toISOString(),
id,
userId
);

return this.findById(id, userId);
}

delete(id: string, userId: string): boolean {
const result = getDb()
.prepare("DELETE FROM tasks WHERE id = ? AND user_id = ?")
.run(id, userId);
return result.changes > 0;
}
}

export const taskRepository = new TaskRepository();

Same pattern for categories:

// src/db/categoryRepository.ts
import { v4 as uuidv4 } from "uuid";
import { getDb } from "./database.js";
import { ICategory } from "../types/index.js";

function mapRow(row: any): ICategory {
return {
id: row.id,
name: row.name,
userId: row.user_id,
createdAt: row.created_at,
};
}

export class CategoryRepository {
findAll(userId: string): ICategory[] {
return getDb()
.prepare("SELECT * FROM categories WHERE user_id = ? ORDER BY created_at DESC")
.all(userId)
.map(mapRow);
}

findById(id: string, userId: string): ICategory | null {
const row = getDb()
.prepare("SELECT * FROM categories WHERE id = ? AND user_id = ?")
.get(id, userId);
return row ? mapRow(row) : null;
}

create(name: string, userId: string): ICategory {
const id = uuidv4();
getDb().prepare(`
INSERT INTO categories (id, name, user_id) VALUES (?, ?, ?)
`).run(id, name.trim(), userId);
return this.findById(id, userId)!;
}

update(id: string, name: string, userId: string): ICategory | null {
const existing = this.findById(id, userId);
if (!existing) return null;

getDb().prepare("UPDATE categories SET name = ? WHERE id = ? AND user_id = ?")
.run(name.trim(), id, userId);
return this.findById(id, userId);
}

delete(id: string, userId: string): boolean {
const result = getDb()
.prepare("DELETE FROM categories WHERE id = ? AND user_id = ?")
.run(id, userId);
return result.changes > 0;
}
}

export const categoryRepository = new CategoryRepository();
Common mistake

Always use parameterized queries (? placeholders). Never interpolate user input into SQL strings:

// WRONG — SQL injection vulnerability
getDb().prepare(`SELECT * FROM tasks WHERE id = '${req.params.id}'`).get();

// CORRECT — parameterized
getDb().prepare("SELECT * FROM tasks WHERE id = ?").get(req.params.id);

SQL injection remains one of the most common web application vulnerabilities — injection appears in every edition of the OWASP Top 10 (ranked third in the 2021 list). Prepared statements are the primary defense.

Refactor routes to use database

Replace the in-memory array in tasks.ts with the repository:

// src/routes/tasks.ts — updated to use database
import { Router, Request, Response } from "express";
import { taskRepository } from "../db/taskRepository.js";

const router = Router();

router.get("/", (req: Request, res: Response) => {
const userId = req.user!.userId;
const tasks = taskRepository.findAll(userId, {
status: req.query.status as string | undefined,
priority: req.query.priority as string | undefined,
});
res.json(tasks);
});

router.get("/:id", (req: Request, res: Response) => {
const task = taskRepository.findById(req.params.id as string, req.user!.userId);
if (!task) {
res.status(404).json({ error: "Task not found" });
return;
}
res.json(task);
});

router.post("/", (req: Request, res: Response) => {
const { valid, errors } = validateCreateTask(req.body);
if (!valid) {
res.status(400).json({ errors });
return;
}

const task = taskRepository.create(req.body, req.user!.userId);
res.status(201).json(task);
});

router.put("/:id", (req: Request, res: Response) => {
const { valid, errors } = validateUpdateTask(req.body);
if (!valid) {
res.status(400).json({ errors });
return;
}

const task = taskRepository.update(req.params.id as string, req.body, req.user!.userId);
if (!task) {
res.status(404).json({ error: "Task not found" });
return;
}
res.json(task);
});

router.patch("/:id", (req: Request, res: Response) => {
const errors: string[] = [];
validateTaskFields(req.body, errors);
if (req.body.title !== undefined && (typeof req.body.title !== "string" || req.body.title.trim() === "")) {
errors.push("title must be a non-empty string");
}
if (errors.length > 0) {
res.status(400).json({ errors });
return;
}

const task = taskRepository.partialUpdate(req.params.id as string, req.body, req.user!.userId);
if (!task) {
res.status(404).json({ error: "Task not found" });
return;
}
res.json(task);
});

router.delete("/:id", (req: Request, res: Response) => {
const deleted = taskRepository.delete(req.params.id as string, req.user!.userId);
if (!deleted) {
res.status(404).json({ error: "Task not found" });
return;
}
res.status(204).send();
});

export default router;

Replace the in-memory categories router the same way:

// src/routes/categories.ts — updated to use database
import { Router, Request, Response } from "express";
import { categoryRepository } from "../db/categoryRepository.js";

const router = Router();

router.get("/", (req: Request, res: Response) => {
const categories = categoryRepository.findAll(req.user!.userId);
res.json(categories);
});

router.get("/:id", (req: Request, res: Response) => {
const category = categoryRepository.findById(req.params.id as string, req.user!.userId);
if (!category) {
res.status(404).json({ error: "Category not found" });
return;
}
res.json(category);
});

router.post("/", (req: Request, res: Response) => {
if (!req.body.name || typeof req.body.name !== "string" || req.body.name.trim() === "") {
res.status(400).json({ errors: ["name is required and must be a non-empty string"] });
return;
}

const category = categoryRepository.create(req.body.name, req.user!.userId);
res.status(201).json(category);
});

router.put("/:id", (req: Request, res: Response) => {
if (!req.body.name || typeof req.body.name !== "string" || req.body.name.trim() === "") {
res.status(400).json({ errors: ["name is required and must be a non-empty string"] });
return;
}

const category = categoryRepository.update(req.params.id as string, req.body.name, req.user!.userId);
if (!category) {
res.status(404).json({ error: "Category not found" });
return;
}
res.json(category);
});

router.delete("/:id", (req: Request, res: Response) => {
const deleted = categoryRepository.delete(req.params.id as string, req.user!.userId);
if (!deleted) {
res.status(404).json({ error: "Category not found" });
return;
}
res.status(204).send();
});

export default router;

PostgreSQL for production

SQLite is single-writer — one write at a time. For production with concurrent users, use PostgreSQL:

| Feature | SQLite | PostgreSQL |
| --- | --- | --- |
| Setup | Zero config, file-based | Separate server process |
| Concurrency | Single writer, multiple readers | Full concurrent reads/writes |
| API | Synchronous (better-sqlite3) | Async (pg, Prisma) |
| Deployment | Embedded in app | Docker container or managed service |
| Best for | Dev, testing, small apps | Production, multi-user apps |

Minimal pg connection example:

import pg from "pg";

const pool = new pg.Pool({
connectionString: process.env.DATABASE_URL,
// e.g. "postgresql://user:pass@localhost:5432/taskdb"
});

const result = await pool.query("SELECT * FROM tasks WHERE user_id = $1", [userId]);
const tasks = result.rows;
info

For A6, SQLite is sufficient. When you deploy with Docker, add a PostgreSQL container. NestJS uses Prisma ORM with PostgreSQL — it generates the SQL for you and handles migrations.


Authentication

Checkpoint 3 — auth

This section adds JWT authentication. After this, task and category routes require a Bearer token. The public/index.html test page from lecture 66 no longer works for protected endpoints — use curl, Postman, or your Vue/React frontend instead.

Password hashing

Never store plaintext passwords. bcrypt hashes passwords with a random salt — even identical passwords produce different hashes.

import bcrypt from "bcrypt";

// Hash a password (cost factor 12 = ~250ms on modern hardware)
const hash = await bcrypt.hash("user-password", 12);

// Verify a password against a hash
const isValid = await bcrypt.compare("user-password", hash); // true
info

OWASP recommends Argon2id as the modern first choice for password hashing, with bcrypt as an acceptable alternative. We use bcrypt here because it has mature Node.js support and is widely deployed. Two things to know: (1) bcrypt truncates passwords at 72 bytes — extremely long passwords are silently cut off; (2) if your project can use argon2 (npm package), prefer it for new applications.
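The 72-byte limit is easy to trip over with long passphrases or multi-byte characters. A guard like the following sketch — `exceedsBcryptLimit` is a hypothetical helper, not part of the lecture project — rejects passwords that bcrypt would silently truncate. Note that byte length matters, not character count:

```typescript
// Hypothetical guard against bcrypt's silent 72-byte truncation.
// Buffer.byteLength measures the UTF-8 byte count, which is what bcrypt sees.
const BCRYPT_MAX_BYTES = 72;

function exceedsBcryptLimit(password: string): boolean {
  return Buffer.byteLength(password, "utf8") > BCRYPT_MAX_BYTES;
}

console.log(exceedsBcryptLimit("short password"));  // false
console.log(exceedsBcryptLimit("a".repeat(73)));    // true — 73 bytes
console.log(exceedsBcryptLimit("€".repeat(25)));    // true — 25 chars, but 75 UTF-8 bytes
```

One place such a check could live is the `/register` validation, returning a 400 before hashing.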

JWT access + refresh tokens

The dual-token pattern:

  • Access token — short-lived (15 minutes), sent with every API request in the Authorization header. Contains user identity claims.
  • Refresh token — long-lived (7 days), stored in the database, used only to get a new access token when the current one expires.

Why not a single long-lived access token? If stolen, an attacker has access until it expires. With a 15-minute access token, the damage window is small. The refresh token can be revoked server-side (delete from database) to immediately cut off access.
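On the client side, the access token's `exp` claim (a Unix timestamp in seconds) can be checked locally before each request, so the client refreshes proactively instead of waiting for a 401. The sketch below is illustrative, not part of the server code — `isExpired` is a hypothetical helper, and the tokens are fake, unsigned stand-ins for what `/login` would return:

```typescript
// Sketch: decide locally whether the access token needs refreshing.
// A JWT is three base64url segments; the middle one is the JSON payload.
function isExpired(accessToken: string, skewSeconds = 30): boolean {
  const payloadB64 = accessToken.split(".")[1];
  const payload = JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
  // Treat tokens within skewSeconds of expiry as expired to absorb clock drift
  return payload.exp * 1000 <= Date.now() + skewSeconds * 1000;
}

// Fake unsigned tokens for demonstration — a real one comes from /login
const b64 = (o: object) => Buffer.from(JSON.stringify(o)).toString("base64url");
const fresh = `${b64({ alg: "HS256" })}.${b64({ exp: Math.floor(Date.now() / 1000) + 900 })}.sig`;
const stale = `${b64({ alg: "HS256" })}.${b64({ exp: Math.floor(Date.now() / 1000) - 10 })}.sig`;

console.log(isExpired(fresh)); // false — ~15 minutes left
console.log(isExpired(stale)); // true — call /refresh before retrying
```

When `isExpired` returns true, the client would POST its refresh token to `/api/v1/account/refresh` and swap in the new pair.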

Account routes

// src/routes/account.ts — uses inline queries for simplicity (tasks use the repository pattern)
import { Router, Request, Response } from "express";
import bcrypt from "bcrypt";
import { v4 as uuidv4 } from "uuid";
import { getDb } from "../db/database.js";
import { generateTokens } from "../auth.js";

const router = Router();

// POST /api/v1/account/register
router.post("/register", async (req: Request, res: Response) => {
const { email, password, firstName, lastName } = req.body;

if (!email || !password || !firstName || !lastName) {
res.status(400).json({ errors: ["All fields are required"] });
return;
}

if (typeof password !== "string" || password.length < 8) {
res.status(400).json({ errors: ["Password must be at least 8 characters"] });
return;
}

// Check if user already exists
// Note: returning 409 reveals whether an email is registered (user enumeration).
// This is a UX/security tradeoff — acceptable for most apps, but high-security
// apps may return a generic success and send a confirmation email instead.
const db = getDb();
const existing = db.prepare("SELECT id FROM users WHERE email = ?").get(email);
if (existing) {
res.status(409).json({ error: "Email already registered" });
return;
}

const id = uuidv4();
const passwordHash = await bcrypt.hash(password, 12);

db.prepare(`
INSERT INTO users (id, email, password_hash, first_name, last_name)
VALUES (?, ?, ?, ?, ?)
`).run(id, email, passwordHash, firstName, lastName);

const tokens = generateTokens({ userId: id, email });
storeRefreshToken(id, tokens.refreshToken);

res.status(201).json(tokens);
});

// POST /api/v1/account/login
router.post("/login", async (req: Request, res: Response) => {
const { email, password } = req.body;

const db = getDb();
const user: any = db.prepare("SELECT * FROM users WHERE email = ?").get(email);
if (!user) {
res.status(401).json({ error: "Invalid email or password" });
return;
}

const valid = await bcrypt.compare(password, user.password_hash);
if (!valid) {
res.status(401).json({ error: "Invalid email or password" });
return;
}

const tokens = generateTokens({ userId: user.id, email: user.email });
storeRefreshToken(user.id, tokens.refreshToken);

res.json(tokens);
});

// POST /api/v1/account/refresh
router.post("/refresh", (req: Request, res: Response) => {
const { refreshToken } = req.body;

if (!refreshToken) {
res.status(400).json({ error: "Refresh token required" });
return;
}

const db = getDb();

// Verify the refresh token exists in the database and is not expired
const stored: any = db
.prepare("SELECT * FROM refresh_tokens WHERE token = ? AND expires_at > datetime('now')")
.get(refreshToken);

if (!stored) {
res.status(401).json({ error: "Invalid or expired refresh token" });
return;
}

// Delete the old refresh token (single use)
db.prepare("DELETE FROM refresh_tokens WHERE token = ?").run(refreshToken);

// Issue a new pair
const user: any = db.prepare("SELECT id, email FROM users WHERE id = ?").get(stored.user_id);
const tokens = generateTokens({ userId: user.id, email: user.email });
storeRefreshToken(user.id, tokens.refreshToken);

res.json(tokens);
});

// POST /api/v1/account/logout — revoke the refresh token
router.post("/logout", (req: Request, res: Response) => {
const { refreshToken } = req.body;
if (refreshToken) {
getDb().prepare("DELETE FROM refresh_tokens WHERE token = ?").run(refreshToken);
}
res.status(204).send();
});

function storeRefreshToken(userId: string, token: string): void {
const id = uuidv4();
const expiresAt = new Date(Date.now() + 7 * 24 * 60 * 60 * 1000).toISOString();
getDb().prepare(`
INSERT INTO refresh_tokens (id, user_id, token, expires_at) VALUES (?, ?, ?, ?)
`).run(id, userId, token, expiresAt);
}

export default router;

Token generation

// src/auth.ts
import jwt from "jsonwebtoken";
import { v4 as uuidv4 } from "uuid";

// process.env is populated by "dotenv/config" in index.ts (loaded before this module)
if (!process.env.JWT_SECRET) {
throw new Error("JWT_SECRET environment variable is required");
}
const JWT_SECRET: string = process.env.JWT_SECRET;

interface TokenPayload {
userId: string;
email: string;
}

export function generateTokens(payload: TokenPayload) {
const accessToken = jwt.sign(payload, JWT_SECRET, { expiresIn: "15m" });
const refreshToken = uuidv4(); // opaque string, not a JWT

return { jwt: accessToken, refreshToken };
}

export function verifyAccessToken(token: string): TokenPayload {
return jwt.verify(token, JWT_SECRET) as unknown as TokenPayload;
}

Store your secret in .env:

# .env
JWT_SECRET=your-secret-key-at-least-32-chars-long
PORT=3000
Common mistake

Never put sensitive data in the JWT payload — it is base64-encoded, not encrypted. Anyone can decode it with atob(token.split('.')[1]). Only include non-sensitive identifiers like userId and email. Never include passwords, credit cards, or internal role details that could be exploited.
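A quick way to convince yourself of this in Node (a demonstration, not project code — the token is hand-built with a fake signature, shaped like the payload our generateTokens signs):

```typescript
// Demonstration: a JWT's signature proves integrity, it does not hide contents.
const b64url = (o: object) => Buffer.from(JSON.stringify(o)).toString("base64url");

// A hypothetical token with the same payload shape generateTokens signs
const token = [
  b64url({ alg: "HS256", typ: "JWT" }),
  b64url({ userId: "abc-123", email: "alice@example.com" }),
  "fake-signature",
].join(".");

// Anyone holding the token can do this — no JWT_SECRET involved:
const claims = JSON.parse(Buffer.from(token.split(".")[1], "base64url").toString("utf8"));
console.log(claims); // { userId: 'abc-123', email: 'alice@example.com' }
```

Verification with the secret only tells the *server* the token is untampered; readers need no secret at all.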

Auth middleware

// src/middleware/authenticate.ts
import { Request, Response, NextFunction } from "express";
import { verifyAccessToken } from "../auth.js";

// Extend Express Request type to include user
declare global {
namespace Express {
interface Request {
user?: { userId: string; email: string };
}
}
}

export function authenticate(req: Request, res: Response, next: NextFunction): void {
const header = req.headers.authorization;

if (!header || !header.startsWith("Bearer ")) {
res.status(401).json({ error: "Authorization header required" });
return;
}

const token = header.slice(7); // Remove "Bearer "

try {
const payload = verifyAccessToken(token);
req.user = { userId: payload.userId, email: payload.email };
next();
} catch {
res.status(401).json({ error: "Invalid or expired token" });
}
}

Protecting routes

Apply the authenticate middleware to routes that require authentication. Two approaches:

// Approach 1: Per-router — all routes in the router require auth
router.use(authenticate);
router.get("/", handler); // protected
router.post("/", handler); // protected

// Approach 2: Per-route — selective protection
router.get("/public", handler); // public
router.get("/private", authenticate, handler); // protected

For our API, tasks and categories require auth, account routes do not:

// src/app.ts — final version
import express from "express";
import cors from "cors";
import tasksRouter from "./routes/tasks.js";
import categoriesRouter from "./routes/categories.js";
import accountRouter from "./routes/account.js";
import { authenticate } from "./middleware/authenticate.js";
import { errorHandler } from "./middleware/errorHandler.js";

const app = express();

// Logging middleware — runs for every request
app.use((req, res, next) => {
const start = Date.now();
res.on("finish", () => {
const ms = Date.now() - start;
console.log(`${req.method} ${req.url} ${res.statusCode} ${ms}ms`);
});
next();
});

app.use(express.json({ limit: "1mb" }));
app.use(cors({
origin: process.env.CORS_ORIGIN?.split(",") || ["http://localhost:5173"],
credentials: true,
}));
app.use(express.static("public"));

// Public routes
app.use("/api/v1/account", accountRouter);

// Protected routes
app.use("/api/v1/tasks", authenticate, tasksRouter);
app.use("/api/v1/categories", authenticate, categoriesRouter);

// JSON 404 for unmatched API routes (before error handler)
app.use("/api", (req, res) => {
res.status(404).json({ error: "Endpoint not found" });
});

app.use(errorHandler);

export default app;

Now the tasks router uses req.user!.userId — each user only sees and modifies their own data.

Common mistake

Cross-user category binding. Our schema lets tasks.category_id reference any category — including one belonging to a different user. If a user guesses another user's category ID, they can bind their task to it. In production, validate that the referenced category belongs to req.user.userId before allowing the association. The same applies to any foreign key that crosses tenant boundaries.
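In the lecture's route code, that validation would be a single SQLite query (`SELECT id FROM categories WHERE id = ? AND user_id = ?`) before the task is written. The sketch below models the same ownership check with an in-memory Map so it runs standalone; the function and data names are illustrative, not the lecture's exact code.

```typescript
// Ownership check for a cross-table reference: the category must belong
// to the requesting user, or the association is rejected.
type CategoryRow = { id: string; userId: string };

// Stand-in for the categories table
const categories = new Map<string, CategoryRow>([
  ["c1", { id: "c1", userId: "alice" }],
  ["c2", { id: "c2", userId: "bob" }],
]);

function categoryBelongsToUser(categoryId: string, userId: string): boolean {
  const row = categories.get(categoryId); // ≈ SELECT ... WHERE id = ?
  return row !== undefined && row.userId === userId; // ≈ AND user_id = ?
}

console.log(categoryBelongsToUser("c1", "alice")); // true: safe to bind
console.log(categoryBelongsToUser("c2", "alice")); // false: reject with 400
```

In the real handler, a `false` result would produce a 400 response before any INSERT runs.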

Refresh token flow

info

This is the server side of the flow you already implemented in Vue (lecture 53) and React. The client's Axios interceptor catches the 401, calls /account/refresh, gets new tokens, and retries the original request — all transparently. Now you see what happens on the server end of that exchange.
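The core of the server side is token rotation: look the refresh token up, delete it, issue a new pair. Here is a minimal runnable sketch of that logic, with a Map standing in for the refresh_tokens table; the names are illustrative rather than the lecture's exact code.

```typescript
// Refresh tokens are single-use: each successful refresh revokes the old
// token and issues a new one. The Map stands in for the database table.
import crypto from "node:crypto";

const store = new Map<string, { userId: string; expiresAt: number }>();

function issueRefreshToken(userId: string): string {
  const token = crypto.randomUUID(); // opaque string, not a JWT
  store.set(token, { userId, expiresAt: Date.now() + 7 * 24 * 3600 * 1000 });
  return token;
}

function rotate(oldToken: string): { userId: string; refreshToken: string } | null {
  const row = store.get(oldToken);
  if (!row || row.expiresAt < Date.now()) return null; // unknown or expired: 401
  store.delete(oldToken); // single-use: the old token is revoked here
  return { userId: row.userId, refreshToken: issueRefreshToken(row.userId) };
}

const t1 = issueRefreshToken("u1");
console.log(rotate(t1) !== null); // true: first use succeeds
console.log(rotate(t1) === null); // true: reusing the same token fails
```

In the real endpoint, a `null` result maps to 401 and a non-null result also carries a freshly signed access JWT.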

Production checklist

This lecture builds a working API with real auth, but several production concerns are intentionally simplified for teaching:

  • HTTPS — Always use TLS in production. Tokens sent over plain HTTP can be intercepted. In deployment, terminate TLS at nginx or a load balancer (lecture 80), not in Express itself.
  • Rate limiting — Login and register endpoints should be rate-limited to prevent brute-force attacks. Use express-rate-limit or a reverse proxy rate limiter.
  • Auth event logging — Log failed login attempts. Monitor for unusual patterns (many failures from one IP, credential stuffing).
  • Refresh token cleanup — Expired tokens accumulate in the database. Run a periodic cleanup job (DELETE FROM refresh_tokens WHERE expires_at < datetime('now')), or clean up during each refresh request.
  • Token storage — Storing JWTs in localStorage is the common SPA pattern (and what your Vue/React frontends do). It is vulnerable to XSS — any injected script can steal the token. HttpOnly cookies are more secure but require a different auth architecture. Know the tradeoff.
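To make the rate-limiting bullet concrete, here is a hand-rolled fixed-window counter. It is a runnable sketch of what express-rate-limit does for you; in real code use the library, and treat the names and limits below as made-up examples.

```typescript
// Fixed-window rate limiter: at most LIMIT requests per client per window.
const hits = new Map<string, { count: number; windowStart: number }>();
const WINDOW_MS = 15 * 60 * 1000; // 15-minute window
const LIMIT = 10;                 // e.g. 10 login attempts per window

function allow(ip: string, now = Date.now()): boolean {
  const entry = hits.get(ip);
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    hits.set(ip, { count: 1, windowStart: now }); // new window for this client
    return true;
  }
  entry.count += 1;
  return entry.count <= LIMIT;
}

for (let i = 0; i < 10; i++) console.log(allow("1.2.3.4")); // true, ten times
console.log(allow("1.2.3.4")); // false: 11th attempt inside the window
```

Wrapped as Express middleware, this would run before the login handler and respond 429 Too Many Requests when `allow()` returns false.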

Quick manual test

With the server running (npm run dev), verify the auth flow end-to-end:

# Register
curl -s -X POST http://localhost:3000/api/v1/account/register \
-H "Content-Type: application/json" \
-d '{"email":"test@test.com","password":"password123","firstName":"Test","lastName":"User"}'
# → 201 {"jwt":"eyJ...","refreshToken":"..."}

# Copy the jwt value, then:
curl -s http://localhost:3000/api/v1/tasks \
-H "Authorization: Bearer <paste-jwt-here>"
# → 200 []

# Without the token:
curl -s http://localhost:3000/api/v1/tasks
# → 401 {"error":"Authorization header required"}

Integration Test

We split app.ts from index.ts for testability — now let's use it. supertest lets you make HTTP requests against the Express app without starting a real server.

npm install -D vitest supertest @types/supertest

// src/__tests__/tasks.test.ts
import { describe, it, expect, beforeAll, afterAll } from "vitest";
import request from "supertest";
import fs from "fs";

// Configure env BEFORE importing app modules.
// Static imports are hoisted in ESM — they run before module body code.
// Dynamic import() is evaluated at runtime, so env is set first.
const TEST_DB = "test-task-api.db";
process.env.DB_PATH = TEST_DB;
process.env.JWT_SECRET = "test-secret-key-at-least-32-chars-long";

const { default: app } = await import("../app.js");
const { initializeDatabase, closeDb } = await import("../db/database.js");

let jwt: string;

beforeAll(async () => {
initializeDatabase();

const res = await request(app)
.post("/api/v1/account/register")
.send({ email: "test@test.com", password: "password123", firstName: "Test", lastName: "User" });

jwt = res.body.jwt;
});

afterAll(() => {
// Close the database connection before deleting files
closeDb();

// Clean up test database
if (fs.existsSync(TEST_DB)) fs.unlinkSync(TEST_DB);
if (fs.existsSync(`${TEST_DB}-wal`)) fs.unlinkSync(`${TEST_DB}-wal`);
if (fs.existsSync(`${TEST_DB}-shm`)) fs.unlinkSync(`${TEST_DB}-shm`);
});

describe("Tasks API", () => {
it("returns 401 without auth header", async () => {
const res = await request(app).get("/api/v1/tasks");
expect(res.status).toBe(401);
});

it("creates and retrieves a task", async () => {
const createRes = await request(app)
.post("/api/v1/tasks")
.set("Authorization", `Bearer ${jwt}`)
.send({ title: "Test task" });

expect(createRes.status).toBe(201);
expect(createRes.body.title).toBe("Test task");
expect(createRes.body.id).toBeDefined();

const getRes = await request(app)
.get(`/api/v1/tasks/${createRes.body.id}`)
.set("Authorization", `Bearer ${jwt}`);

expect(getRes.status).toBe(200);
expect(getRes.body.title).toBe("Test task");
});

it("returns 404 for unknown endpoint", async () => {
const res = await request(app).get("/api/v1/nonexistent");
expect(res.status).toBe(404);
expect(res.body.error).toBeDefined();
});
});
npx vitest run

Testing is covered in depth in lecture 55. This is just enough to verify the app/index split actually works and the API behaves as expected.


Docker

You already know Docker from lecture 80 — there we containerized frontend apps with nginx. For a Node.js API server, the pattern is different: we run Node.js directly instead of serving static files.

Dockerfile

Multi-stage build — stage 1 compiles TypeScript, stage 2 runs the compiled JavaScript:

# Stage 1: Build
FROM node:22-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY tsconfig.json ./
COPY src ./src
RUN npm run build

# Stage 2: Run
FROM node:22-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
COPY src/db/migrations ./dist/db/migrations
COPY public ./public
EXPOSE 3000
CMD ["node", "dist/index.js"]

Note the two extra COPY lines: tsc only compiles .ts files, so the .sql migration files and the public/ folder must be copied in separately. Without them, initializeDatabase() cannot find its migration files and express.static("public") serves nothing inside the container.

.dockerignore

node_modules
dist
*.db
.env
.git

docker-compose.yml

Since our app code uses SQLite, the simplest compose file mounts a volume for the database file:

services:
api:
build: .
ports:
- "3000:3000"
environment:
- PORT=3000
- JWT_SECRET=${JWT_SECRET}
- DB_PATH=/data/task-api.db
- CORS_ORIGIN=http://localhost:5173
volumes:
- dbdata:/data

volumes:
dbdata:

For production with PostgreSQL, you would swap better-sqlite3 for the pg library (as shown in the PostgreSQL section) and use a compose file like this:

services:
api:
build: .
ports:
- "3000:3000"
environment:
- PORT=3000
- JWT_SECRET=${JWT_SECRET}
- DATABASE_URL=postgresql://taskuser:taskpass@db:5432/taskdb
- CORS_ORIGIN=http://localhost:5173
depends_on:
db:
condition: service_healthy

db:
image: postgres:16-alpine
environment:
- POSTGRES_USER=taskuser
- POSTGRES_PASSWORD=taskpass
- POSTGRES_DB=taskdb
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U taskuser -d taskdb"]
interval: 5s
timeout: 5s
retries: 5

volumes:
pgdata:

docker compose up --build

info

For local development, just use npm run dev with SQLite — no Docker needed. The SQLite compose file above is for testing your Docker build. The PostgreSQL compose requires switching the database driver in your app code — the NestJS lecture covers this with Prisma.


Endpoint Reference

All endpoints in the final project:

| Method | Path | Auth | Request Body | Success Response | Status |
| --- | --- | --- | --- | --- | --- |
| POST | /api/v1/account/register | No | {email, password, firstName, lastName} | {jwt, refreshToken} | 201 |
| POST | /api/v1/account/login | No | {email, password} | {jwt, refreshToken} | 200 |
| POST | /api/v1/account/refresh | No | {refreshToken} | {jwt, refreshToken} | 200 |
| POST | /api/v1/account/logout | No | {refreshToken} | — | 204 |
| GET | /api/v1/tasks | Yes | — | Task[] | 200 |
| GET | /api/v1/tasks/:id | Yes | — | Task | 200 / 404 |
| POST | /api/v1/tasks | Yes | {title, description?, status?, priority?, dueDate?, categoryId?} | Task | 201 |
| PUT | /api/v1/tasks/:id | Yes | {title, description, status, priority, dueDate, categoryId} (all required) | Task | 200 / 404 |
| PATCH | /api/v1/tasks/:id | Yes | partial fields | Task | 200 / 404 |
| DELETE | /api/v1/tasks/:id | Yes | — | — | 204 / 404 |
| GET | /api/v1/categories | Yes | — | Category[] | 200 |
| GET | /api/v1/categories/:id | Yes | — | Category | 200 / 404 |
| POST | /api/v1/categories | Yes | {name} | Category | 201 |
| PUT | /api/v1/categories/:id | Yes | {name} | Category | 200 / 404 |
| DELETE | /api/v1/categories/:id | Yes | — | — | 204 / 404 |

All protected endpoints return 401 if the Authorization header is missing or the JWT is expired.


Defence Preparation

Be prepared to explain topics like these:

  1. What is SQL injection and how do parameterized queries prevent it? — SQL injection occurs when user input is concatenated directly into SQL strings, allowing an attacker to manipulate the query. For example, an ID of '; DROP TABLE tasks; -- could delete your data. Parameterized queries (prepared statements) send the SQL structure and data separately — the database treats parameters as literal values, never as SQL commands. better-sqlite3 uses ? placeholders; PostgreSQL uses $1, $2.

  2. Why use both access tokens and refresh tokens instead of a single long-lived token? — Security tradeoff. A short-lived access token (15 min) limits the damage window if it is stolen — the attacker can only use it until it expires. The refresh token is stored in the database and can be revoked instantly (delete the row) to cut off access. With a single long-lived token, you cannot revoke it without maintaining a blacklist (which defeats the stateless benefit of JWTs). The dual-token pattern gives you both stateless request authentication (JWT) and server-side session control (refresh token).

  3. What is the purpose of bcrypt over a simple hash like SHA-256? — SHA-256 is fast by design — an attacker can compute billions of hashes per second to brute-force passwords. bcrypt is intentionally slow (tunable via the cost factor). With a cost factor of 12, one hash takes ~250ms, making brute-force impractical. bcrypt also automatically generates and embeds a unique random salt per password, preventing rainbow table attacks. SHA-256 with no salt means identical passwords produce identical hashes. Note: OWASP now recommends Argon2id as the modern first choice; bcrypt is an acceptable fallback but has a 72-byte password input limit.

  4. Walk through the JWT refresh flow end-to-end: what happens when the access token expires? — (1) Client sends a request with the expired JWT. (2) Server returns 401 Unauthorized. (3) Client's HTTP interceptor (Axios in Vue/React) catches the 401 and sends a POST /account/refresh with the refresh token. (4) Server looks up the refresh token in the database, verifies it exists and is not expired. (5) Server deletes the old refresh token (single-use) and generates a new JWT + refresh token pair. (6) Server returns the new pair. (7) Client stores the new tokens and retries the original request with the new JWT.

  5. Why should you never put sensitive data in a JWT payload? — JWT payloads are base64url-encoded, not encrypted. Anyone who intercepts the token can decode it with atob(token.split('.')[1]) and read the payload in plain text. The signature only ensures the payload was not tampered with — it does not hide the content. Only include non-sensitive identifiers (userId, email, roles). Never include passwords, financial data, or internal system details.

  6. What is the difference between SQLite and PostgreSQL, and when would you use each? — SQLite is an embedded database that runs inside your application process and stores data in a single file. It requires zero configuration and supports one writer at a time. PostgreSQL is a client-server database that runs as a separate process, supports fully concurrent reads and writes, and scales to millions of rows. Use SQLite for development, testing, small single-user apps, and embedded systems. Use PostgreSQL for production multi-user applications, especially when deployed in containers (each container gets its own SQLite file, which breaks shared state).