If you're running a data-heavy application, you need visibility into what's happening at the database layer. PostHog lets you capture SQL query execution as events, giving you the data to spot performance bottlenecks and understand which queries your users are actually hitting.
Capture SQL Query Events
The foundation is sending query metadata to PostHog whenever a query executes. This gives you a complete audit trail of what ran, how long it took, and whether it succeeded.
Install and initialize the PostHog Node SDK
Set up the PostHog client with your project API key. You'll use this to capture events whenever a query runs.
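If the SDK isn't installed yet, add it from npm first:

```shell
npm install posthog-node
```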
const { PostHog } = require('posthog-node');
const client = new PostHog(
'phc_YOUR_PROJECT_API_KEY',
{ host: 'https://us.posthog.com' }
);

Wrap your database queries to capture events
Intercept query execution and send an event to PostHog with details like execution time, query type, and table names. This works with any database driver.
const { Pool } = require('pg'); // or your database driver
const db = new Pool();
async function executeQuery(sql, params) {
const startTime = Date.now();
try {
const result = await db.query(sql, params);
const duration = Date.now() - startTime;
client.capture({
distinctId: 'backend-service',
event: 'sql_query_executed',
properties: {
query_type: sql.trim().split(/\s+/)[0].toUpperCase(),
execution_time_ms: duration,
rows_affected: result.rowCount,
table: extractTableName(sql)
}
});
return result;
} catch (error) {
client.capture({
distinctId: 'backend-service',
event: 'sql_query_failed',
properties: {
query_type: sql.trim().split(/\s+/)[0].toUpperCase(),
error_message: error.message
}
});
throw error;
}
}
function extractTableName(sql) {
const match = sql.match(/FROM\s+(\w+)/i) || sql.match(/INTO\s+(\w+)/i) || sql.match(/UPDATE\s+(\w+)/i);
return match ? match[1] : 'unknown';
}

Flag slow queries for easier filtering
Add an is_slow property to queries that exceed a threshold (e.g., 200ms). This makes it easy to filter slow queries later in the Events tab.
const SLOW_QUERY_THRESHOLD = 200; // milliseconds
client.capture({
distinctId: 'backend-service',
event: 'sql_query_executed',
properties: {
query_type: 'SELECT',
execution_time_ms: duration,
rows_affected: result.rowCount,
is_slow: duration > SLOW_QUERY_THRESHOLD,
table: 'orders'
}
});

Analyze Query Performance in PostHog
Once you're sending events, use PostHog's Insights to surface slow queries and performance trends.
Create an Insight to track slow queries
In Insights, create a new event-based Insight. Filter for sql_query_executed events where is_slow = true. Group by table name to see which tables are causing bottlenecks.
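If you prefer SQL over the UI, the same breakdown can be sketched as a HogQL query in a SQL insight. This is an illustrative sketch assuming the event and property names used above; note that custom event properties in HogQL are typically stored as strings, so the boolean is compared as text:

```sql
SELECT properties.table AS table_name,
       count() AS slow_query_count
FROM events
WHERE event = 'sql_query_executed'
  AND properties.is_slow = 'true' -- custom properties compare as strings
GROUP BY table_name
ORDER BY slow_query_count DESC
```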
// In PostHog Insights UI:
// Event: 'sql_query_executed'
// Filter: properties.is_slow = true
// Display: Count of events
// Group by: properties.table
// Breakdown: properties.query_type (SELECT, UPDATE, INSERT, etc.)

Track average execution time over time
Create a second Insight that shows the average execution_time_ms trend across all queries. This reveals performance regressions as they happen.
// In PostHog Insights UI:
// Event: 'sql_query_executed'
// Display: Average of properties.execution_time_ms
// Trend by: Day or Hour
// Compare: Add a second series for 'sql_query_failed' count

Build a monitoring dashboard
Pin multiple Insights to a Dashboard: total queries by type, avg execution time by table, slow query percentage, and failure rate. Refresh every 5 minutes for real-time visibility.
// Dashboard suggestions:
// Card 1: Count of queries (total volume)
// Card 2: Avg execution_time_ms (overall health)
// Card 3: Count where is_slow=true (problem queries)
// Card 4: Count of 'sql_query_failed' (errors)
// Card 5: Queries by table (breakdown)
// Set refresh interval to 5 minutes

Correlate Queries with User Sessions
Link SQL events to user sessions so you can see which users hit slow queries and whether those queries impact their behavior.
Pass user ID to capture calls
Instead of using 'backend-service' as distinctId, pass the actual user ID. This links each query to a session in PostHog's Session Recordings and Funnels.
// In your request handler:
async function handleUserRequest(sql, params, userId) {
const startTime = Date.now();
const result = await db.query(sql, params);
const duration = Date.now() - startTime;
client.capture({
distinctId: userId, // Link query to user session
event: 'sql_query_executed',
properties: {
query_type: sql.trim().split(/\s+/)[0].toUpperCase(),
execution_time_ms: duration,
is_slow: duration > 200,
table: extractTableName(sql)
}
});
return result;
}

Check if slow queries impact conversions
In Funnels, create a funnel with user actions (e.g., page view → form submit → purchase). Filter by is_slow = true to see if users hitting slow queries drop off at higher rates.
// In PostHog Funnels UI:
// Step 1: event = 'page_view' AND properties.page = '/checkout'
// Step 2: event = 'purchase_initiated'
// Step 3: event = 'purchase_completed'
// Filter: properties.is_slow = true
// Compare conversion rate with and without the filter

Set up performance alerts
Create a Subscription on your avg execution time Insight. Alert via Slack if the average spikes more than 50% above baseline, giving you early warning of regressions.
// In PostHog: Create Insight > Subscriptions
// Metric: Average of properties.execution_time_ms
// Condition: Value is more than 50% higher than baseline
// Notification: Slack webhook or email
// Frequency: Daily or on change

Common Pitfalls
- Capturing raw SQL text with parameter values exposes credentials and customer PII—always strip sensitive data before sending to PostHog
- Capturing every single query without filtering creates noise and inflates costs. Focus on slow queries (duration > 200ms) or queries that fail
- Using a static distinctId like 'backend-service' for all queries breaks user correlation. Always pass the actual user ID when available
- Forgetting to call client.flush() before process exit means your last batch of events is lost; add it to shutdown handlers
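Two of these pitfalls can be sketched in code: a redaction helper that strips literal values out of SQL before it leaves your infrastructure, and a shutdown hook that flushes buffered events. Both are illustrative sketches; redactSql is a hypothetical helper (a real deployment should rely on parameterized queries and an allowlist of captured fields), and client.shutdown() assumes a recent posthog-node version.

```javascript
// Hypothetical helper: replace literal values in SQL with placeholders
// so parameter values and PII are never sent to PostHog.
function redactSql(sql) {
  return sql
    .replace(/'(?:[^']|'')*'/g, '?')   // quoted string literals
    .replace(/\b\d+(\.\d+)?\b/g, '?'); // numeric literals
}

// Register handlers so buffered events are flushed before the process exits.
// Assumes `client` is the PostHog instance created earlier.
function registerShutdown(client) {
  const shutdown = async () => {
    await client.shutdown(); // flushes the event queue, then closes the client
    process.exit(0);
  };
  process.on('SIGTERM', shutdown);
  process.on('SIGINT', shutdown);
}
```

With this in place, you would capture redactSql(sql) instead of the raw statement, and call registerShutdown(client) once at startup.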
Wrapping Up
You now have SQL query visibility in PostHog—execution times, failure rates, and which users trigger expensive queries. This data surfaces database bottlenecks faster than logs alone. If you want to track this automatically across tools, Product Analyst can help.