Template for Small Projects: A Full TypeScript Stack
In my company, we have many small projects that typically consist of a few pages with simple business logic. Each time we start developing a project like this, we spend time on project setup with boilerplate code for different tech stacks. When we assign projects to different teams, they might use different technologies or set up the development environment differently. This repetitive task consumes valuable time that could be spent on actual development.
We Need a Template for Small Projects
To reduce time spent on repeated project setup, we want a template that bootstraps everything with a single command. By doing this, we can save significant time and accelerate the development phase, allowing us to quickly move on to other projects.
After evaluating our needs, we decided on a full TypeScript stack. Why? Because:
- Single language across the entire stack - Less context switching for developers
- Type safety everywhere - From frontend to backend to infrastructure
- Shared types between services - No more API contract mismatches
- Better IDE support - IntelliSense, refactoring, and error detection across the project
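To make the shared-types point concrete, here's a minimal sketch of what a `packages/shared/src/types.ts` could contain. The field shapes are illustrative assumptions; only the `CreateUserDto` and `UserResponse` names come from the rest of this post:

```typescript
// packages/shared/src/types.ts (illustrative sketch)
// Both the Express handlers and the React components import these,
// so renaming a field is a compile error on both sides of the API.

export interface CreateUserDto {
  email: string
  name: string
  password: string
}

export interface UserResponse {
  id: string
  email: string
  name: string
  createdAt: string // dates are serialized as ISO strings over the wire
}

// Optional runtime guard for data crossing the network boundary,
// since TypeScript types are erased at runtime.
export function isUserResponse(value: unknown): value is UserResponse {
  if (typeof value !== 'object' || value === null) return false
  const v = value as Record<string, unknown>
  return (
    typeof v.id === 'string' &&
    typeof v.email === 'string' &&
    typeof v.name === 'string' &&
    typeof v.createdAt === 'string'
  )
}
```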
The Tech Stack
Here's what our template includes:
Backend: Express 5 + TypeScript
Express 5 brings native Promise support, which means better async/await handling without wrapper libraries. Combined with TypeScript, we get:
// packages/server/src/routes/users.ts
import { Router, Request, Response } from 'express'
import { UserService } from '../services/user.service'
import { CreateUserDto, UserResponse } from '@shared/types'
const router = Router()
router.post('/users', async (req: Request<{}, UserResponse, CreateUserDto>, res: Response<UserResponse>) => {
const user = await UserService.create(req.body)
res.json(user)
})
export default router

The key configuration in tsconfig.json for the backend:
{
"compilerOptions": {
"target": "ES2022",
"module": "NodeNext",
"moduleResolution": "NodeNext",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"outDir": "./dist",
"rootDir": "./src",
"declaration": true,
"paths": {
"@shared/*": ["../shared/src/*"]
}
}
}

Frontend: React 19 + TypeScript + ShadCN UI + React Router
React 19 introduces features like Actions and improved concurrent rendering. Combined with ShadCN UI, we get beautiful, accessible components without the bloat of traditional component libraries.
// packages/client/src/pages/Dashboard.tsx
import { useQuery } from '@tanstack/react-query'
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'
import { Button } from '@/components/ui/button'
import { UserList } from '@/components/UserList'
import type { User } from '@shared/types'
export function Dashboard() {
const { data: users, isLoading } = useQuery<User[]>({
queryKey: ['users'],
queryFn: () => fetch('/api/users').then(res => res.json())
})
if (isLoading) return <div>Loading...</div>
return (
<Card>
<CardHeader>
<CardTitle>Users</CardTitle>
</CardHeader>
<CardContent>
<UserList users={users ?? []} />
<Button variant="outline">Add User</Button>
</CardContent>
</Card>
)
}

React Router handles navigation with type-safe route parameters:
// packages/client/src/routes.tsx
import { createBrowserRouter } from 'react-router-dom'
import { Dashboard } from './pages/Dashboard'
import { UserDetail } from './pages/UserDetail'
export const router = createBrowserRouter([
{ path: '/', element: <Dashboard /> },
{ path: '/users/:id', element: <UserDetail /> }
])

Database: PostgreSQL
For small projects, we deploy PostgreSQL on the same VM as the application. This keeps infrastructure simple and cost-effective. For local development, we use Docker Compose to spin up a PostgreSQL instance.
We use Drizzle ORM for type-safe database operations:
// packages/server/src/db/schema.ts
import { pgTable, uuid, varchar, timestamp } from 'drizzle-orm/pg-core'
export const users = pgTable('users', {
id: uuid('id').primaryKey().defaultRandom(),
email: varchar('email', { length: 255 }).notNull().unique(),
name: varchar('name', { length: 255 }).notNull(),
passwordHash: varchar('password_hash', { length: 255 }).notNull(),
createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at').defaultNow().notNull()
})
export type User = typeof users.$inferSelect
export type NewUser = typeof users.$inferInsert

// packages/server/src/db/index.ts
import { drizzle } from 'drizzle-orm/node-postgres'
import { Pool } from 'pg'
import * as schema from './schema'
const pool = new Pool({
connectionString: process.env.DATABASE_URL
})
export const db = drizzle(pool, { schema })

Database configuration in the server package:
// packages/server/drizzle.config.ts
import type { Config } from 'drizzle-kit'
export default {
schema: './src/db/schema.ts',
out: './src/db/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!
}
} satisfies Config

Infrastructure: Pulumi + TypeScript + Azure VM
This is where TypeScript really shines. Instead of writing YAML or HCL, we define our infrastructure with real code:
// packages/infra/src/index.ts
import * as pulumi from '@pulumi/pulumi'
import * as azure from '@pulumi/azure-native'
import { createVmSetupScript } from './vm-setup'
const config = new pulumi.Config()
const environment = pulumi.getStack()
// Resource Group
const resourceGroup = new azure.resources.ResourceGroup('rg', {
resourceGroupName: `rg-myapp-${environment}`,
location: 'Southeast Asia'
})
// Virtual Network
const vnet = new azure.network.VirtualNetwork('vnet', {
resourceGroupName: resourceGroup.name,
virtualNetworkName: `vnet-myapp-${environment}`,
addressSpace: { addressPrefixes: ['10.0.0.0/16'] },
subnets: [{
name: 'default',
addressPrefix: '10.0.1.0/24'
}]
})
// Network Security Group - Allow SSH, HTTP, HTTPS
const nsg = new azure.network.NetworkSecurityGroup('nsg', {
resourceGroupName: resourceGroup.name,
securityRules: [
{ name: 'SSH', priority: 1000, access: 'Allow', direction: 'Inbound',
protocol: 'Tcp', sourcePortRange: '*', destinationPortRange: '22',
sourceAddressPrefix: '*', destinationAddressPrefix: '*' },
{ name: 'HTTP', priority: 1001, access: 'Allow', direction: 'Inbound',
protocol: 'Tcp', sourcePortRange: '*', destinationPortRange: '80',
sourceAddressPrefix: '*', destinationAddressPrefix: '*' },
{ name: 'HTTPS', priority: 1002, access: 'Allow', direction: 'Inbound',
protocol: 'Tcp', sourcePortRange: '*', destinationPortRange: '443',
sourceAddressPrefix: '*', destinationAddressPrefix: '*' }
]
})
// Public IP
const publicIp = new azure.network.PublicIPAddress('pip', {
resourceGroupName: resourceGroup.name,
publicIPAllocationMethod: 'Static',
sku: { name: 'Standard' }
})
// Network Interface
const nic = new azure.network.NetworkInterface('nic', {
resourceGroupName: resourceGroup.name,
ipConfigurations: [{
name: 'ipconfig',
subnet: { id: vnet.subnets.apply(s => s![0].id!) },
publicIPAddress: { id: publicIp.id }
}],
networkSecurityGroup: { id: nsg.id }
})
// VM Setup Script - installs Node.js, PostgreSQL, Nginx, PM2
const vmSetupScript = createVmSetupScript({
dbName: config.require('dbName'),
dbUser: config.require('dbUser'),
dbPassword: config.requireSecret('dbPassword')
})
// Virtual Machine
const vm = new azure.compute.VirtualMachine('vm', {
resourceGroupName: resourceGroup.name,
vmName: `vm-myapp-${environment}`,
vmSize: config.get('vmSize') || 'Standard_B2s',
osProfile: {
computerName: 'myapp',
adminUsername: config.require('adminUsername'),
adminPassword: config.requireSecret('adminPassword'),
customData: vmSetupScript // Base64-encoded cloud-init script
},
storageProfile: {
imageReference: {
publisher: 'Canonical',
offer: '0001-com-ubuntu-server-jammy',
sku: '22_04-lts-gen2',
version: 'latest'
},
osDisk: {
createOption: 'FromImage',
managedDisk: { storageAccountType: 'Standard_LRS' },
diskSizeGB: 30
}
},
networkProfile: {
networkInterfaces: [{ id: nic.id, primary: true }]
}
})
export const vmPublicIp = publicIp.ipAddress
export const resourceGroupName = resourceGroup.name
export const vmName = vm.name

The VM setup script installs all required software including PostgreSQL:
// packages/infra/src/vm-setup.ts
import * as pulumi from '@pulumi/pulumi'
interface VmSetupConfig {
dbName: string
dbUser: string
dbPassword: pulumi.Output<string>
}
export function createVmSetupScript(config: VmSetupConfig): pulumi.Output<string> {
return config.dbPassword.apply(dbPassword => {
const script = `#!/bin/bash
set -e
# Update system
apt-get update && apt-get upgrade -y
# Install Node.js 20
curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
apt-get install -y nodejs
# Install PostgreSQL 16
install -d /usr/share/postgresql-common/pgdg
curl -fsSL https://www.postgresql.org/media/keys/ACCC4CF8.asc -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc
sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
apt-get update
apt-get install -y postgresql-16
# Configure PostgreSQL
sudo -u postgres psql -c "CREATE USER ${config.dbUser} WITH PASSWORD '${dbPassword}';"
sudo -u postgres psql -c "CREATE DATABASE ${config.dbName} OWNER ${config.dbUser};"
sudo -u postgres psql -c "GRANT ALL PRIVILEGES ON DATABASE ${config.dbName} TO ${config.dbUser};"
# Allow local connections
echo "host all all 127.0.0.1/32 md5" >> /etc/postgresql/16/main/pg_hba.conf
systemctl restart postgresql
# Install Nginx
apt-get install -y nginx
# Configure Nginx as reverse proxy
cat > /etc/nginx/sites-available/myapp << 'NGINX'
server {
listen 80;
server_name _;
# Frontend (static files)
location / {
root /app/client;
try_files $uri $uri/ /index.html;
}
# Backend API
location /api {
proxy_pass http://127.0.0.1:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_cache_bypass $http_upgrade;
}
}
NGINX
ln -sf /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/
rm -f /etc/nginx/sites-enabled/default
systemctl restart nginx
# Install PM2
npm install -g pm2
# Create deploy user
useradd -m -s /bin/bash deploy
echo "deploy ALL=(ALL) NOPASSWD: /bin/systemctl restart nginx" >> /etc/sudoers.d/deploy
# Create app directory owned by the deploy user (user must exist before chown)
mkdir -p /app/{client,server}
chown -R deploy:deploy /app
echo "VM setup complete!"
`
return Buffer.from(script).toString('base64')
})
}

The benefits of infrastructure-as-code with TypeScript:
- Refactoring support - Rename resources safely across your codebase
- Type checking - Catch configuration errors before deployment
- Reusable components - Create factory functions for common patterns
- Testing - Write unit tests for your infrastructure
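As a small illustration of the reusable-components point, the three near-identical security rules above can come from a factory function. This helper is hypothetical (it produces plain data objects, so it plugs straight into the NetworkSecurityGroup args; it is not part of the Pulumi SDK):

```typescript
// Hypothetical factory for the repetitive inbound-rule objects.
interface InboundRuleArgs {
  name: string
  priority: number
  port: number
}

function inboundTcpRule({ name, priority, port }: InboundRuleArgs) {
  return {
    name,
    priority,
    access: 'Allow',
    direction: 'Inbound',
    protocol: 'Tcp',
    sourcePortRange: '*',
    destinationPortRange: String(port),
    sourceAddressPrefix: '*',
    destinationAddressPrefix: '*',
  }
}

// Replaces the three hand-written rule objects in the NSG definition:
const securityRules = [
  inboundTcpRule({ name: 'SSH', priority: 1000, port: 22 }),
  inboundTcpRule({ name: 'HTTP', priority: 1001, port: 80 }),
  inboundTcpRule({ name: 'HTTPS', priority: 1002, port: 443 }),
]
```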
Monorepo Structure
We use a monorepo structure to keep everything together while maintaining separation of concerns:
project-root/
├── packages/
│ ├── client/ # React frontend
│ │ ├── src/
│ │ ├── package.json
│ │ ├── vite.config.ts
│ │ ├── vitest.config.ts
│ │ └── tsconfig.json
│ ├── server/ # Express backend
│ │ ├── src/
│ │ │ └── db/
│ │ │ ├── schema.ts
│ │ │ ├── migrations/
│ │ │ └── seeds/
│ │ ├── package.json
│ │ ├── drizzle.config.ts
│ │ ├── jest.config.ts
│ │ └── tsconfig.json
│ ├── shared/ # Shared types and utilities
│ │ ├── src/
│ │ ├── package.json
│ │ └── tsconfig.json
│ ├── e2e/ # End-to-end tests
│ │ ├── tests/
│ │ ├── package.json
│ │ └── playwright.config.ts
│ └── infra/ # Pulumi infrastructure
│ ├── src/
│ ├── package.json
│ ├── Pulumi.yaml
│ ├── Pulumi.dev.yaml
│ ├── Pulumi.staging.yaml
│ ├── Pulumi.prod.yaml
│ └── tsconfig.json
├── scripts/ # Workspace automation scripts
│ ├── setup-env.js
│ ├── setup-azure-pipeline.js
│ ├── deploy.js
│ ├── wait-for-db.js
│ └── init-db.sql
├── docker-compose.yml # Local PostgreSQL for development
├── package.json # Root package.json with workspaces
├── tsconfig.base.json # Shared TypeScript config
├── turbo.json # Turborepo configuration
└── azure-pipelines.yml # CI/CD configuration

Local Development with Docker Compose
For local development, we use Docker Compose to run PostgreSQL:
# docker-compose.yml
services:
postgres:
image: postgres:16-alpine
container_name: myapp-postgres
restart: unless-stopped
environment:
POSTGRES_USER: ${POSTGRES_USER:-myapp}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-myapp_secret}
POSTGRES_DB: ${POSTGRES_DB:-myapp_dev}
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
- ./scripts/init-db.sql:/docker-entrypoint-initdb.d/init.sql:ro
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-myapp} -d ${POSTGRES_DB:-myapp_dev}"]
interval: 10s
timeout: 5s
retries: 5
volumes:
postgres_data:

The initialization script creates the database structure:
-- scripts/init-db.sql
-- This runs only on first container creation
-- Create extensions
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
-- Grant privileges
GRANT ALL PRIVILEGES ON DATABASE myapp_dev TO myapp;

The root package.json configures npm workspaces with comprehensive scripts for every workflow:
{
"name": "myapp-monorepo",
"private": true,
"workspaces": [
"packages/*"
],
"scripts": {
"// --- Bootstrap Scripts ---": "",
"bootstrap": "npm ci && npm run bootstrap:env",
"bootstrap:env": "node scripts/setup-env.js",
"bootstrap:local": "npm run bootstrap && npm run db:start && npm run db:wait && npm run db:migrate && npm run db:seed",
"bootstrap:azure:dev": "npm run infra:up:dev && npm run deploy:dev",
"bootstrap:azure:staging": "npm run infra:up:staging && npm run deploy:staging",
"bootstrap:azure:prod": "npm run infra:up:prod && npm run deploy:prod",
"bootstrap:pipeline": "node scripts/setup-azure-pipeline.js",
"// --- Development Scripts ---": "",
"dev": "turbo run dev --parallel",
"dev:client": "npm run dev --workspace=@myapp/client",
"dev:server": "npm run dev --workspace=@myapp/server",
"// --- Build Scripts ---": "",
"build": "turbo run build",
"build:client": "npm run build --workspace=@myapp/client",
"build:server": "npm run build --workspace=@myapp/server",
"build:shared": "npm run build --workspace=@myapp/shared",
"// --- Test Scripts ---": "",
"test": "turbo run test",
"test:unit": "turbo run test:unit",
"test:unit:client": "npm run test:unit --workspace=@myapp/client",
"test:unit:server": "npm run test:unit --workspace=@myapp/server",
"test:e2e": "npm run test:e2e --workspace=@myapp/e2e",
"test:e2e:headed": "npm run test:e2e:headed --workspace=@myapp/e2e",
"test:coverage": "turbo run test:coverage",
"// --- Lint & Format Scripts ---": "",
"lint": "turbo run lint",
"lint:fix": "turbo run lint:fix",
"format": "prettier --write \"packages/**/*.{ts,tsx,json}\"",
"typecheck": "turbo run typecheck",
"// --- Database Scripts ---": "",
"db:start": "docker compose up -d postgres",
"db:stop": "docker compose down",
"db:wait": "node scripts/wait-for-db.js",
"db:migrate": "npm run db:migrate --workspace=@myapp/server",
"db:migrate:generate": "npm run db:migrate:generate --workspace=@myapp/server",
"db:seed": "npm run db:seed --workspace=@myapp/server",
"db:reset": "docker compose down -v && npm run db:start && npm run db:wait && npm run db:migrate && npm run db:seed",
"db:studio": "npm run db:studio --workspace=@myapp/server",
"// --- Infrastructure Scripts ---": "",
"infra:login": "cd packages/infra && pulumi login",
"infra:preview:dev": "cd packages/infra && pulumi stack select dev && pulumi preview",
"infra:preview:staging": "cd packages/infra && pulumi stack select staging && pulumi preview",
"infra:preview:prod": "cd packages/infra && pulumi stack select prod && pulumi preview",
"infra:up:dev": "cd packages/infra && pulumi stack select dev && pulumi up --yes",
"infra:up:staging": "cd packages/infra && pulumi stack select staging && pulumi up --yes",
"infra:up:prod": "cd packages/infra && pulumi stack select prod && pulumi up",
"infra:destroy:dev": "cd packages/infra && pulumi stack select dev && pulumi destroy",
"infra:outputs": "cd packages/infra && pulumi stack output --json",
"// --- Deploy Scripts ---": "",
"deploy:dev": "node scripts/deploy.js dev",
"deploy:staging": "node scripts/deploy.js staging",
"deploy:prod": "node scripts/deploy.js prod",
"// --- Utility Scripts ---": "",
"clean": "turbo run clean && rm -rf node_modules",
"prepare": "husky install"
},
"devDependencies": {
"turbo": "^2.0.0",
"typescript": "^5.4.0",
"prettier": "^3.2.0",
"husky": "^9.0.0"
}
}

Workspace Scripts Deep Dive
Let's break down each script category and what it does.
Bootstrap Scripts
These scripts set up your environment from scratch:
# Full local setup - install deps, create .env files, start PostgreSQL, run migrations
npm run bootstrap:local
# Bootstrap Azure VM for each environment
npm run bootstrap:azure:dev # Creates dev infrastructure + deploys
npm run bootstrap:azure:staging # Creates staging infrastructure + deploys
npm run bootstrap:azure:prod # Creates prod infrastructure + deploys
# Set up Azure DevOps pipeline configuration
npm run bootstrap:pipeline

The bootstrap:local command does the following:
- Installs all npm dependencies
- Creates .env files from examples
- Starts PostgreSQL container via Docker Compose
- Waits for the database to be ready
- Runs database migrations
- Seeds the database with initial data
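scripts/setup-env.js itself isn't shown in this post, but its core idea is simple: copy each package's .env.example to .env, never overwriting an existing file. Here's a hypothetical sketch (written in TypeScript for consistency with the rest of the stack; the real script may also prompt for values):

```typescript
// Sketch of the core logic in scripts/setup-env.js (illustrative).
import { copyFileSync, existsSync, mkdirSync, mkdtempSync, readdirSync, writeFileSync } from 'node:fs'
import { join } from 'node:path'
import { tmpdir } from 'node:os'

function setupEnvFiles(root: string): string[] {
  const created: string[] = []
  for (const pkg of readdirSync(join(root, 'packages'))) {
    const example = join(root, 'packages', pkg, '.env.example')
    const target = join(root, 'packages', pkg, '.env')
    // Only create .env when an example exists and no .env is present yet,
    // so developer overrides are never clobbered.
    if (existsSync(example) && !existsSync(target)) {
      copyFileSync(example, target)
      created.push(target)
    }
  }
  return created
}

// Quick demonstration against a throwaway directory:
const root = mkdtempSync(join(tmpdir(), 'myapp-'))
mkdirSync(join(root, 'packages', 'server'), { recursive: true })
writeFileSync(
  join(root, 'packages', 'server', '.env.example'),
  'DATABASE_URL=postgres://myapp:myapp_secret@localhost:5432/myapp_dev\n'
)
const created = setupEnvFiles(root)
// created now holds the path of the freshly written server .env
```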
Development Scripts
Start your local development environment:
# Start all services (client + server) with hot reload
npm run dev
# Start individual services
npm run dev:client # Only frontend on port 5173
npm run dev:server # Only backend on port 3000

Unit Test Scripts
Run unit tests for frontend and backend independently:
# Run all unit tests
npm run test:unit
# Run frontend tests (React Testing Library + Vitest)
npm run test:unit:client
# Run backend tests (Jest + Supertest)
npm run test:unit:server
# Run with coverage report
npm run test:coverage

E2E Test Scripts
End-to-end tests using Playwright:
# Run e2e tests headlessly
npm run test:e2e
# Run e2e tests with browser visible (for debugging)
npm run test:e2e:headed

Database Scripts
Manage the local PostgreSQL database:
# Start PostgreSQL container
npm run db:start
# Stop PostgreSQL container
npm run db:stop
# Wait for database to be ready (used in scripts)
npm run db:wait
# Run migrations
npm run db:migrate
# Generate new migration from schema changes
npm run db:migrate:generate
# Seed database with initial data
npm run db:seed
# Reset database (destroys all data, recreates from scratch)
npm run db:reset
# Open Drizzle Studio (database GUI)
npm run db:studio

Infrastructure Scripts
Manage Azure infrastructure with Pulumi:
# Login to Pulumi (required once)
npm run infra:login
# Preview changes before applying
npm run infra:preview:dev
npm run infra:preview:staging
npm run infra:preview:prod
# Apply infrastructure changes
npm run infra:up:dev # Auto-confirms for dev
npm run infra:up:staging # Auto-confirms for staging
npm run infra:up:prod # Requires manual confirmation (safety)
# Tear down infrastructure (use with caution!)
npm run infra:destroy:dev
# Get outputs (IP addresses, resource names, etc.)
npm run infra:outputs

Deploy Scripts
Deploy application code to Azure VMs. The deployment includes database migrations:
// scripts/deploy.js
import { execSync } from 'child_process'
const environment = process.argv[2]
if (!['dev', 'staging', 'prod'].includes(environment)) {
console.error('Usage: node deploy.js <dev|staging|prod>')
process.exit(1)
}
// Get VM IP and DB credentials from Pulumi outputs
process.chdir('packages/infra')
execSync(`pulumi stack select ${environment}`)
const outputs = JSON.parse(execSync('pulumi stack output --json').toString())
const vmIp = outputs.vmPublicIp
process.chdir('../..')
console.log(`Deploying to ${environment} (${vmIp})...`)
// Build artifacts
console.log('Building packages...')
execSync('npm run build', { stdio: 'inherit' })
// Copy server files (including migrations)
console.log('Copying server files...')
execSync(`rsync -avz --delete packages/server/dist/ deploy@${vmIp}:/app/server/dist/`)
execSync(`rsync -avz packages/server/src/db/migrations/ deploy@${vmIp}:/app/server/src/db/migrations/`)
execSync(`rsync -avz packages/server/package.json deploy@${vmIp}:/app/server/`)
execSync(`rsync -avz packages/server/drizzle.config.ts deploy@${vmIp}:/app/server/`)
// Copy client files
console.log('Copying client files...')
execSync(`rsync -avz --delete packages/client/dist/ deploy@${vmIp}:/app/client/`)
// Copy PM2 ecosystem config
execSync(`rsync -avz ecosystem.config.js deploy@${vmIp}:/app/`)
// Install dependencies and run migrations on VM
console.log('Running migrations...')
execSync(`ssh deploy@${vmIp} "cd /app/server && npm ci --omit=dev"`, { stdio: 'inherit' })
execSync(`ssh deploy@${vmIp} "cd /app/server && npx drizzle-kit migrate"`, { stdio: 'inherit' })
// Restart services
console.log('Restarting services...')
execSync(`ssh deploy@${vmIp} "cd /app && pm2 restart ecosystem.config.js --update-env"`, { stdio: 'inherit' })
console.log(`Deployment to ${environment} complete!`)

TypeScript 5 Configuration
We leverage TypeScript 5 features for better developer experience. The base configuration:
// tsconfig.base.json
{
"compilerOptions": {
"target": "ES2022",
"lib": ["ES2022"],
"strict": true,
"strictNullChecks": true,
"noUncheckedIndexedAccess": true,
"noImplicitOverride": true,
"moduleDetection": "force",
"esModuleInterop": true,
"skipLibCheck": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true,
"composite": true,
"verbatimModuleSyntax": true
}
}

Each package extends this base configuration:
// packages/client/tsconfig.json
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
"target": "ES2022",
"lib": ["ES2022", "DOM", "DOM.Iterable"],
"module": "ESNext",
"moduleResolution": "bundler",
"jsx": "react-jsx",
"outDir": "./dist",
"rootDir": "./src",
"paths": {
"@/*": ["./src/*"],
"@shared/*": ["../shared/src/*"]
}
},
"include": ["src"],
"references": [{ "path": "../shared" }]
}

CI/CD with Azure DevOps + Pulumi
The CI/CD pipeline automates testing, building, and deployment with three stages:
- Build & Test - Lint, typecheck, unit tests, and build artifacts
- Deploy to Dev/Staging - Auto-deploy on branch push with E2E tests
- Deploy to Production - Manual approval required for infrastructure changes
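The three stages above map onto a fairly compact azure-pipelines.yml. Here's an illustrative skeleton; the stage names, branch conditions, and the production approval gate are assumptions to adapt, not the exact pipeline we run:

```yaml
# azure-pipelines.yml (illustrative skeleton)
trigger:
  branches:
    include: [main, develop]

stages:
  - stage: BuildAndTest
    jobs:
      - job: CI
        steps:
          - script: npm ci
          - script: npm run lint && npm run typecheck
          - script: npm run test:unit
          - script: npm run build

  - stage: DeployDev
    dependsOn: BuildAndTest
    condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'develop'))
    jobs:
      - job: Deploy
        steps:
          - script: npm run deploy:dev
          - script: npm run test:e2e

  - stage: DeployProd
    dependsOn: DeployDev
    jobs:
      # Gate this with an environment approval check in Azure DevOps
      - deployment: Production
        environment: prod
        strategy:
          runOnce:
            deploy:
              steps:
                - script: npm run deploy:prod
```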
Each environment has its own Pulumi stack configuration with secrets for database credentials:
# Set secrets for each environment
pulumi stack select dev
pulumi config set --secret adminPassword "your-secure-password"
pulumi config set --secret dbPassword "your-db-password"

Getting Started with the Template
Starting a new project from the template takes just a few commands:
# Clone the template
npx degit our-org/typescript-fullstack-template my-new-project
# Navigate to the project
cd my-new-project
# Full local setup (installs deps, starts PostgreSQL, runs migrations)
npm run bootstrap:local
# Start development servers
npm run dev

Alternatively, for manual setup:
# Install dependencies
npm install
# Create environment files
npm run bootstrap:env
# Start PostgreSQL with Docker Compose
npm run db:start
# Wait for database to be ready
npm run db:wait
# Run migrations
npm run db:migrate
# Seed with test data (optional)
npm run db:seed
# Start development servers
npm run dev

The development experience includes:
- Hot reloading for both frontend and backend
- Shared type watching - Changes in shared types reflect immediately
- Concurrent development - Run all services with a single command
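Turborepo orchestrates the concurrent tasks and build caching behind these commands. An illustrative turbo.json for the tasks used in this post might look like the following (Turborepo 2.x uses a tasks key; treat the exact entries as a starting point):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    },
    "dev": {
      "cache": false,
      "persistent": true
    },
    "test": { "dependsOn": ["build"] },
    "lint": {},
    "typecheck": { "dependsOn": ["^build"] }
  }
}
```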
Benefits We've Seen
Since adopting this template, we've experienced:
- Faster project kickoff - From days to hours
- Consistent code quality - Same linting rules and TypeScript config across projects
- Easier onboarding - New team members understand the structure immediately
- Fewer runtime errors - TypeScript catches issues at compile time
- Simplified deployment - One pipeline pattern for all projects
Conclusion
A well-designed template can dramatically improve team productivity. By using TypeScript across the entire stack—frontend, backend, and infrastructure—we've created a cohesive development experience that reduces errors and speeds up delivery.
The key takeaways:
- TypeScript everywhere provides consistency and type safety
- Monorepo structure enables code sharing and unified tooling
- Infrastructure as code with Pulumi gives us the same benefits in deployment
- Automated CI/CD ensures quality and enables rapid iteration
Feel free to adapt this template to your team's needs. The specific tools may vary, but the principles of type safety, code sharing, and automation will serve you well regardless of your exact tech choices.
What template patterns have worked well for your team? I'd love to hear about your experiences!