GA4 gives you aggregated reports in the UI, but what if you need the raw data? BigQuery Export lets you stream every user interaction—every event, every parameter—directly to Google's data warehouse. From there, you can query, combine with your product data, and build custom analysis GA4's interface can't handle.
What BigQuery Export Is
BigQuery Export is a direct pipeline from GA4 to BigQuery, Google's serverless data warehouse.
Raw event data, not aggregated reports
While GA4's interface shows you aggregated metrics like "total events" or "unique users," BigQuery Export gives you event-level data. Every page view, every custom event, every user parameter lives in BigQuery tables where you control how to analyze it. You get user_pseudo_id, event_timestamp, event_params—the building blocks of your analytics.
// Example: Query raw events from BigQuery
const query = `
SELECT
event_name,
event_timestamp,
user_pseudo_id,
param.key,
param.value.string_value
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_*\`
WHERE _TABLE_SUFFIX = '20240115'
AND event_name = 'purchase'
LIMIT 1000;
`;
// Execute with BigQuery client
const [rows] = await bigquery.query({ query });
Requires a Google Cloud project
You can't just enable BigQuery Export in GA4 and go. You need a Google Cloud project with BigQuery API enabled, service account credentials, and permissions to create datasets. GA4 pushes data to a dataset you own, so there are storage costs—usually small for most sites, but worth checking BigQuery's pricing if you have high event volume.
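To ballpark that storage cost before enabling the link, you can do the arithmetic up front. A rough sketch, assuming an average exported event of about 1 KB and an active logical storage price of roughly $0.02 per GB per month (both are assumptions; check current BigQuery pricing and your real row sizes):

```javascript
// Rough monthly storage cost estimate for GA4 export.
// ASSUMPTIONS: ~1 KB per exported event, ~$0.02 per GB-month for
// active logical storage. Verify against current BigQuery pricing.
function estimateMonthlyStorageCost(eventsPerDay, avgBytesPerEvent = 1024, pricePerGbMonth = 0.02) {
  const bytesPerMonth = eventsPerDay * avgBytesPerEvent * 30; // new data added each month
  const gbPerMonth = bytesPerMonth / 1e9;
  return gbPerMonth * pricePerGbMonth; // dollars for that month's new data
}

// e.g. 1M events/day adds ~30 GB of new data a month
console.log(estimateMonthlyStorageCost(1_000_000).toFixed(2)); // → "0.61"
```

For most sites the number is pocket change; query costs (billed per bytes scanned) usually dominate long before storage does.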
// Check if BigQuery export is enabled via Admin API
import { google } from 'googleapis';
const analyticsAdmin = google.analyticsadmin('v1beta');
const response = await analyticsAdmin.properties.bigQueryLinks.list({
parent: 'properties/YOUR_PROPERTY_ID',
auth: authClient
});
console.log('BigQuery Links:', response.data.bigqueryLinks);
Why Teams Use It
GA4's UI is good for quick reporting, but BigQuery Export becomes essential when you need deeper control.
Custom analysis GA4 doesn't support
GA4's Explore tool caps how many dimensions and metrics you can combine in a single exploration, and some combinations aren't available at all. With BigQuery, you write SQL to combine any fields, cross-reference custom dimensions, apply custom attribution logic, or build cohort analysis. You're not bound by GA4's interface constraints anymore.
// Calculate repeat purchase rate (not in GA4 UI)
const query = `
WITH purchase_events AS (
SELECT
user_pseudo_id,
COUNT(*) as purchase_count
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_*\`
WHERE event_name = 'purchase'
AND _TABLE_SUFFIX >= '20240101'
GROUP BY user_pseudo_id
)
SELECT
COUNTIF(purchase_count >= 2) / COUNT(*) as repeat_purchase_rate
FROM purchase_events;
`;
Cost analysis tied to revenue
Join GA4 events with your payment system data (Stripe, payment processor) to connect acquisition cost to customer lifetime value. GA4 doesn't have direct integration with your billing system, but BigQuery lets you match them using shared customer identifiers.
// Match GA4 users to Stripe customer data
const query = `
SELECT
ga.user_pseudo_id,
ga.event_timestamp,
stripe.customer_id,
stripe.mrr
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_*\` ga
LEFT JOIN \`project_id.stripe_export.customers\` stripe
-- assumes you store the GA4 user_id on each Stripe customer record
ON ga.user_id = stripe.ga_user_id
WHERE ga.event_name = 'purchase'
LIMIT 1000;
`;
How It Works Technically
Understanding the data flow helps you avoid setup mistakes and write efficient queries.
GA4 pushes events to BigQuery automatically
Once you link a BigQuery project in GA4 (Admin > Product links > BigQuery links), GA4 starts pushing events. Each day, a new table appears: events_20240115, events_20240116, and so on. Daily tables typically finalize within about 24 hours of the day ending, so don't point live dashboards at them. If you need fresher data, the link's optional streaming export writes to events_intraday_* tables throughout the day, though intraday data isn't final.
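One caveat to the daily cadence: links with the streaming option enabled also get same-day events in events_intraday_* tables. A sketch of building a query for today's (not-yet-final) table; the project and property IDs are placeholders as in the other examples, and GA4 names tables by the property's reporting timezone while this sketch uses UTC:

```javascript
// Build today's intraday table suffix (YYYYMMDD).
// NOTE: uses UTC; GA4 names tables by the property's reporting timezone.
const today = new Date().toISOString().slice(0, 10).replace(/-/g, '');

const intradayQuery = `
SELECT event_name, COUNT(*) AS event_count
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_intraday_${today}\`
GROUP BY event_name
ORDER BY event_count DESC;
`;
```

Intraday rows are replaced once the final daily table lands, so treat them as provisional.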
// Query events from the last 7 days
const query = `
SELECT
event_name,
COUNT(*) as event_count,
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) as event_date
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_*\`
WHERE PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY event_name, event_date
ORDER BY event_date DESC, event_count DESC;
`;
Data structure is different from the GA4 UI
GA4's interface shows things like "User ID" and "Session ID," but the raw data uses user_pseudo_id (GA4's anonymous identifier), and there is no prebuilt session table: the session identifier arrives as the ga_session_id event parameter, so you stitch sessions together yourself. Event parameters and user properties are nested, repeated records (not flat columns), stored as key-value pairs you must UNNEST to query.
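Since sessions aren't prebuilt, one common approach is to stitch them yourself, in SQL via ga_session_id or in application code. A minimal in-memory sketch using the classic 30-minute inactivity rule (event_timestamp is in microseconds in the export); the simplified event shape here is an assumption:

```javascript
// Group raw export rows into sessions with a 30-minute inactivity gap.
// ASSUMPTION: each event is { user_pseudo_id, event_timestamp } with
// event_timestamp in microseconds, as in the export schema.
function sessionize(events, gapMicros = 30 * 60 * 1e6) {
  // Sort by user, then by time, so each user's events are contiguous
  const sorted = [...events].sort(
    (a, b) =>
      a.user_pseudo_id.localeCompare(b.user_pseudo_id) ||
      a.event_timestamp - b.event_timestamp
  );
  const sessions = [];
  let current = null;
  for (const e of sorted) {
    const newUser = !current || current.user_pseudo_id !== e.user_pseudo_id;
    // Start a new session on a new user or after a long gap
    if (newUser || e.event_timestamp - current.lastTs > gapMicros) {
      current = { user_pseudo_id: e.user_pseudo_id, events: [], lastTs: e.event_timestamp };
      sessions.push(current);
    }
    current.events.push(e);
    current.lastTs = e.event_timestamp;
  }
  return sessions;
}
```

In practice you would group by ga_session_id in SQL instead; the inactivity rule is mainly useful when that parameter is missing or you want custom session definitions.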
// Unnest event parameters to access custom properties
const query = `
SELECT
event_name,
event_timestamp,
user_pseudo_id,
param.key as parameter_name,
param.value.string_value as parameter_value
FROM \`project_id.analytics_YOUR_PROPERTY_ID.events_*\`,
UNNEST(event_params) as param
WHERE event_name = 'view_item'
AND _TABLE_SUFFIX = '20240115'
LIMIT 100;
`;
Common Pitfalls
- BigQuery export starts from setup date forward—you won't get historical data. Plan ahead if you need historical analysis.
- User IDs in BigQuery (user_pseudo_id) don't match your product database. You need a shared identifier such as user_id, email, or customer_id to cross-reference.
- Daily tables can lag by a day or more. Real-time dashboards won't work from them; use GA4's Realtime report (or the streaming export) instead.
- GA4 occasionally updates the export schema (new fields, renamed columns), so watch your scheduled queries for errors when schema changes are announced.
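For the schema-drift pitfall, a cheap guard is to diff the columns your queries depend on against the live table schema before running them. A sketch; it assumes metadata shaped like the Node BigQuery client's table.getMetadata() result, where schema.fields is an array of objects with a name property:

```javascript
// Return the expected columns missing from a table's schema.
// ASSUMPTION: actualFields looks like metadata.schema.fields from the
// @google-cloud/bigquery client: [{ name: 'event_name', ... }, ...]
function missingColumns(expectedColumns, actualFields) {
  const actual = new Set(actualFields.map((f) => f.name));
  return expectedColumns.filter((col) => !actual.has(col));
}

// Run against yesterday's table before your scheduled queries, and
// alert if anything you rely on has disappeared or been renamed.
const missing = missingColumns(
  ['event_name', 'event_timestamp', 'user_pseudo_id'],
  [{ name: 'event_name' }, { name: 'event_timestamp' }]
);
console.log(missing); // → [ 'user_pseudo_id' ]
```

Wiring this into the job that runs your queries turns a silent schema change into a loud, early failure.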
Wrapping Up
BigQuery Export gives you the raw material GA4's interface hides. You get direct access to event data, custom query power, and the ability to combine analytics with your product data. If you want to track this automatically across tools, Product Analyst can help.