A slow database is the most common cause of slow web applications. No amount of frontend optimisation or CDN caching can compensate for queries that take seconds to execute. The good news is that most database performance issues can be resolved with targeted indexing, query restructuring, and sensible architecture decisions. Here is how we approach database optimisation at Born Digital.
Identifying Slow Queries
Before optimising, you need to know what is slow. Enable slow query logging in your database — both PostgreSQL and MySQL support this natively. Set the threshold low (for example, 100 milliseconds) during an audit period to capture queries that are slow but not obviously broken. Tools like pg_stat_statements for PostgreSQL or the Performance Schema in MySQL provide aggregated statistics showing which queries consume the most total time across all executions.
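The same idea works at the application layer too. Here is a minimal sketch of a query wrapper that logs anything over the audit threshold, using SQLite purely for illustration — the function and threshold names are our own, not part of any particular framework:

```python
import logging
import sqlite3
import time

SLOW_QUERY_MS = 100  # audit-period threshold, as suggested above

logging.basicConfig(level=logging.WARNING)

def timed_query(conn, sql, params=()):
    """Run a query and log a warning if it exceeds the slow-query threshold."""
    start = time.perf_counter()
    rows = conn.execute(sql, params).fetchall()
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms >= SLOW_QUERY_MS:
        logging.warning("slow query (%.1f ms): %s", elapsed_ms, sql)
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
rows = timed_query(conn, "SELECT * FROM orders WHERE status = ?", ("paid",))
```

In a real audit you would lean on the database's own logs and pg_stat_statements rather than ad-hoc timing, but a wrapper like this is useful when you cannot touch the database configuration.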
Application Performance Monitoring (APM) tools like Datadog, New Relic, or the open-source alternative SigNoz can trace slow requests back to specific database queries, showing you exactly which endpoint is affected and how the query fits into the broader request lifecycle.
Indexing Strategies
Indexes are the single most impactful optimisation for read-heavy workloads. An index allows the database to find rows without scanning the entire table, reducing query time from seconds to milliseconds. However, indexes are not free — they consume storage, slow down writes, and can become counterproductive if applied indiscriminately.
Key indexing principles:
- Index columns used in WHERE clauses: If you frequently filter by status, date, or user ID, these columns should have indexes.
- Composite indexes for multi-column queries: A query filtering on both user_id and created_at benefits from a single composite index on (user_id, created_at), not two separate indexes.
- Covering indexes: Include all columns needed by a query in the index itself so the database can satisfy the query entirely from the index without touching the table data.
- Use EXPLAIN: Always check the query execution plan with EXPLAIN (or EXPLAIN ANALYSE in PostgreSQL) to verify that your index is actually being used.
Query Optimisation Techniques
Beyond indexing, the way you write queries significantly affects performance. Avoid SELECT * — fetch only the columns you need. Replace subqueries with JOINs where possible, as the query planner can often optimise JOINs more effectively. For large datasets, use keyset (cursor-based) pagination instead of OFFSET, because OFFSET forces the database to scan and discard rows, getting slower as the offset increases.
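Keyset pagination looks like this in practice — a sketch against SQLite with invented table names, where the "cursor" is simply the last id seen on the previous page:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO articles (id, title) VALUES (?, ?)",
    [(i, f"Article {i}") for i in range(1, 101)],
)

PAGE_SIZE = 10

def next_page(conn, after_id=0):
    """Keyset pagination: seek past the last-seen id instead of using OFFSET."""
    return conn.execute(
        "SELECT id, title FROM articles WHERE id > ? ORDER BY id LIMIT ?",
        (after_id, PAGE_SIZE),
    ).fetchall()

page1 = next_page(conn)
page2 = next_page(conn, after_id=page1[-1][0])  # cursor = last id of page 1
```

Because the `WHERE id > ?` seek lands directly on the right rows via the primary key, page 1,000 costs the same as page 1 — unlike `OFFSET 9990`, which scans and discards 9,990 rows first.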
Be cautious with ORMs. While they improve developer productivity, they can generate inefficient queries — particularly N+1 query patterns where a list endpoint makes one query per item instead of a single JOIN. Use eager loading (such as Eloquent's with() in Laravel or Prisma's include) to batch related data fetching.
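Under the hood, eager loading replaces the per-item queries with one batched query. This sketch shows the pattern directly in SQL (via SQLite, with illustrative table names), which is roughly what `with()` or `include` generates for you:

```python
import sqlite3
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO posts VALUES (1, 1, 'First'), (2, 1, 'Second'), (3, 2, 'Third');
""")

# N+1 pattern: one query for authors, then one query PER author for posts.
# Eager loading: one query for authors, then ONE batched query for all posts.
authors = conn.execute("SELECT id, name FROM authors").fetchall()
author_ids = [a[0] for a in authors]
placeholders = ",".join("?" for _ in author_ids)
posts = conn.execute(
    f"SELECT author_id, title FROM posts WHERE author_id IN ({placeholders})",
    author_ids,
).fetchall()

# Group the batched results back onto their parents in memory.
posts_by_author = defaultdict(list)
for author_id, title in posts:
    posts_by_author[author_id].append(title)
```

Two queries instead of N+1: for a list of 50 authors that is 2 round trips rather than 51, which is usually the difference people notice when an endpoint "suddenly" gets fast.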
Caching and Read Replicas
Not every request needs to hit the primary database. Application-level caching with Redis or Memcached can serve frequently accessed data — product catalogues, category listings, user sessions — without touching the database at all. For read-heavy applications, read replicas distribute query load across multiple database instances while the primary handles writes.
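The read-through caching pattern is simple enough to sketch. Here an in-process dict with expiry timestamps stands in for Redis — the function names are our own, and in production you would use a Redis client with its native TTL support instead:

```python
import time

CACHE_TTL_SECONDS = 60
_cache = {}  # key -> (value, expires_at); Redis replaces this in production

def get_cached(key, loader, ttl=CACHE_TTL_SECONDS):
    """Read-through cache: return a fresh entry, or fall back to the loader."""
    entry = _cache.get(key)
    if entry and entry[1] > time.monotonic():
        return entry[0]
    value = loader()  # e.g. the real database query
    _cache[key] = (value, time.monotonic() + ttl)
    return value

calls = []
def load_catalogue():
    calls.append(1)  # count how often the "database" is actually hit
    return ["widget", "gadget"]

first = get_cached("catalogue", load_catalogue)
second = get_cached("catalogue", load_catalogue)  # served from cache
```

The second call never touches the loader, which is the whole point: frequently read, rarely changed data is served without a database round trip.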
Implement cache invalidation carefully. Stale data is worse than slow data in many contexts. Use time-based expiry for data that can tolerate brief staleness, and event-driven invalidation (clearing cache entries when the underlying data changes) for data that must always be current.
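The event-driven half of that advice can be sketched as a cache drop on every write path, again using an in-process dict as a stand-in for Redis and hypothetical function names:

```python
import time

_cache = {}

def cache_set(key, value, ttl=60):
    _cache[key] = (value, time.monotonic() + ttl)

def cache_get(key):
    entry = _cache.get(key)
    if entry and entry[1] > time.monotonic():
        return entry[0]
    return None

def invalidate(key):
    """Event-driven invalidation: drop the entry when the source data changes."""
    _cache.pop(key, None)

def update_product(product_id, name):
    # ... write the new name to the database here ...
    invalidate(f"product:{product_id}")  # next read repopulates the cache

cache_set("product:7", {"name": "Old name"})
update_product(7, "New name")
stale = cache_get("product:7")  # the stale entry was cleared on write
```

The key design choice is that invalidation lives in the write path itself, so no reader can observe the old value for longer than the write takes to complete.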
Ongoing Monitoring
Database optimisation is not a one-time task. As your data grows and usage patterns change, new bottlenecks emerge. Set up automated alerts for queries exceeding acceptable thresholds, monitor connection pool usage, and review slow query logs regularly. Schema changes and new features should always include a review of their database impact before deployment.
At Born Digital, we include database performance analysis as a standard part of our development process. Whether you are launching a new application or troubleshooting a slow existing system, targeted database optimisation often delivers the most dramatic performance improvements for the least effort.