GA4 exports your raw event data to BigQuery automatically—but only if you set it up. Most teams miss this and leave performance data locked inside GA4's limited UI. Once enabled, you get full SQL access to every event your users trigger, and at scale it's often cheaper than working within GA4's own reporting.
Enable BigQuery Linking in GA4
This one-time setup connects your GA4 property to a BigQuery project.
Create a BigQuery project (if you don't have one)
In Google Cloud Console, create a new project or use an existing one. BigQuery is free for the first 1TB of queried data per month. You'll need the Project ID later.
Navigate to Admin > Linked Products in GA4
In your GA4 property, go to Admin (bottom left) > Linked Products > BigQuery Links. Click Link BigQuery and select your project from the dropdown.
Configure export settings
Choose which data to export: Events (raw event stream) is standard, and you can pick daily export, streaming export, or both. GA4 creates a dataset named analytics_XXXXXXXXX (where XXXXXXXXX is your GA4 property ID). Daily exports complete in batches, with events typically appearing 24-48 hours after they're triggered.
// Verify the BigQuery link via the Google Analytics Admin API
const {google} = require('googleapis');

// Assumes Application Default Credentials with the analytics.readonly scope
const auth = new google.auth.GoogleAuth({
  scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
});
const adminApi = google.analyticsadmin({version: 'v1beta', auth});
const response = await adminApi.properties.bigQueryLinks.list({
  parent: 'properties/YOUR_PROPERTY_ID',
});
console.log('BigQuery links:', response.data.bigqueryLinks);
// Output includes project ID, dataset name, and export status
Query Exported GA4 Data in BigQuery
Once linked, GA4 populates tables daily. The main tables follow the events_YYYYMMDD pattern (one sharded table per day), queried together with the events_* wildcard.
Understand the events table schema
GA4's BigQuery export includes event_name, user_pseudo_id, event_timestamp, and nested fields like event_params (custom parameters) and user_properties. Each row is one event. Tables are sharded by date (events_20250101, events_20250102, and so on).
Write a basic SQL query to validate the export
Query the last 30 days of events. Use the wildcard pattern events_* with a _TABLE_SUFFIX filter to limit which daily tables are scanned. This tells you whether data is flowing correctly.
// Query GA4 BigQuery data using the BigQuery client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({projectId: 'YOUR_PROJECT_ID'});
const query = `
  SELECT
    event_name,
    COUNT(*) AS event_count,
    DATE(TIMESTAMP_MICROS(event_timestamp)) AS event_date
  FROM \`YOUR_PROJECT_ID.analytics_XXXXXXXXX.events_*\`
  WHERE _TABLE_SUFFIX BETWEEN '20250301' AND '20250331'
  GROUP BY event_name, event_date
  ORDER BY event_date DESC, event_count DESC
`;
const options = {query: query, location: 'US'};
const [rows] = await bigquery.query(options);
console.log('GA4 events by type:', rows);
Extract event parameters for custom metrics
GA4 stores custom dimensions in nested event_params. Use UNNEST to flatten them and extract specific values like purchase amount, feature usage, or custom flags.
// Extract custom event parameters from the GA4 BigQuery export
// Note: GA4 writes decimal parameter values to double_value (float_value is unused)
const query = `
  SELECT
    user_pseudo_id,
    event_timestamp,
    event_name,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'transaction_id') AS transaction_id,
    (SELECT value.double_value FROM UNNEST(event_params) WHERE key = 'purchase_amount') AS purchase_amount
  FROM \`YOUR_PROJECT_ID.analytics_XXXXXXXXX.events_*\`
  WHERE event_name = 'purchase'
    AND _TABLE_SUFFIX >= '20250301'
  LIMIT 1000
`;
const [rows] = await bigquery.query({query: query, location: 'US'});
console.log('Purchase events with amounts:', rows);
Always include _TABLE_SUFFIX in your WHERE clause to filter date ranges; it's much cheaper than scanning every daily table. BigQuery charges by bytes scanned, so filtering dates early saves money.
Automate Export Monitoring
Set up a check to ensure GA4 data is flowing into BigQuery reliably.
Create a daily validation query
Schedule a query that counts events from the previous day and alerts if the count drops below your baseline. Use BigQuery's scheduled queries feature or trigger this from a Cloud Function.
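As a rough sketch of the Cloud Function route (the function names and the alert threshold here are illustrative placeholders, not part of GA4's tooling):

```javascript
// Sketch of a daily export health check. checkYesterdayExport and the
// threshold are illustrative names, not a GA4 or BigQuery API.

// Build the _TABLE_SUFFIX for "yesterday" in YYYYMMDD form (UTC).
function yesterdaySuffix(now = new Date()) {
  const d = new Date(now);
  d.setUTCDate(d.getUTCDate() - 1);
  return d.toISOString().slice(0, 10).replace(/-/g, '');
}

// Count yesterday's events and warn when the count falls below a baseline.
async function checkYesterdayExport(projectId, datasetId, threshold) {
  const {BigQuery} = require('@google-cloud/bigquery'); // lazy require keeps the helper above usable offline
  const bigquery = new BigQuery({projectId});
  const suffix = yesterdaySuffix();
  const query = `
    SELECT COUNT(*) AS event_count
    FROM \`${projectId}.${datasetId}.events_${suffix}\`
  `;
  const [rows] = await bigquery.query({query, location: 'US'});
  const count = Number(rows[0].event_count);
  if (count < threshold) {
    console.error(`GA4 export alert: only ${count} events for ${suffix}`);
  }
  return count;
}
```

Wire checkYesterdayExport into a scheduled Cloud Function (or send the alert to Slack/PagerDuty instead of console.error) and you have a baseline monitor with a few lines of glue.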
Monitor for schema changes
GA4 occasionally adds new event types or parameters. Query the INFORMATION_SCHEMA.COLUMNS to track schema drift. If new fields appear, your downstream queries might break if they use strict column selection.
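One way to sketch that check (the helper names are ours; only INFORMATION_SCHEMA.COLUMNS itself is a BigQuery feature):

```javascript
// Sketch: snapshot the schema of all GA4 event tables so drift can be diffed.
// schemaQuery and fetchSchema are illustrative names.
function schemaQuery(projectId, datasetId) {
  return `
    SELECT table_name, column_name, data_type
    FROM \`${projectId}.${datasetId}\`.INFORMATION_SCHEMA.COLUMNS
    WHERE table_name LIKE 'events_%'
    ORDER BY table_name, ordinal_position
  `;
}

async function fetchSchema(projectId, datasetId) {
  const {BigQuery} = require('@google-cloud/bigquery'); // lazy require keeps the builder testable offline
  const bigquery = new BigQuery({projectId});
  const [rows] = await bigquery.query({
    query: schemaQuery(projectId, datasetId),
    location: 'US',
  });
  // Diff these rows against a stored snapshot to catch new or removed columns.
  return rows;
}
```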
Use the Google Analytics Data API for comparison
Query GA4's Data API alongside BigQuery to cross-check totals. Small discrepancies are normal (the Data API may apply sampling to queries over very large event volumes, while the BigQuery export is unsampled), but large gaps indicate export failure.
// Compare GA4 Data API vs BigQuery counts
const {google} = require('googleapis');

// Assumes Application Default Credentials with the analytics.readonly scope
const auth = new google.auth.GoogleAuth({
  scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
});
const analyticsdata = google.analyticsdata({version: 'v1beta', auth});
const dataApiResponse = await analyticsdata.properties.runReport({
  property: 'properties/YOUR_PROPERTY_ID',
  requestBody: {
    dateRanges: [{startDate: '30daysAgo', endDate: 'today'}],
    metrics: [{name: 'eventCount'}],
  },
});
const dataApiEventCount = dataApiResponse.data.rows[0].metricValues[0].value;
console.log('Data API event count (30d):', dataApiEventCount);
// Compare this to your BigQuery query result
Common Pitfalls
- Forgetting that BigQuery export is 24-48 hours delayed. You won't see today's events until tomorrow. Use the Data API for real-time checks.
- Using table names without the wildcard suffix (events_*). GA4 shards tables by date, so querying just events returns zero rows.
- Assuming all users have a user_id. GA4 defaults to user_pseudo_id (an anonymous identifier). Set a User-ID scope custom dimension if you need persistent IDs.
- Not setting up BigQuery cost controls. Large exports can rack up scanning costs. Use --maximum_bytes_billed in queries to cap spend.
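In the Node client, the same cap can be set per query via the maximumBytesBilled job option (a sketch; the gibToBytes and cappedQuery names are ours):

```javascript
// Sketch: cap bytes scanned per query. maximumBytesBilled is the Node client's
// counterpart to the CLI's --maximum_bytes_billed flag.
function gibToBytes(gib) {
  return String(gib * 1024 ** 3); // the API expects the limit as a string
}

async function cappedQuery(projectId, sql) {
  const {BigQuery} = require('@google-cloud/bigquery'); // lazy require
  const bigquery = new BigQuery({projectId});
  const [rows] = await bigquery.query({
    query: sql,
    location: 'US',
    maximumBytesBilled: gibToBytes(1), // the job fails instead of billing past 1 GiB
  });
  return rows;
}
```

With the cap in place, a runaway events_* scan errors out immediately rather than silently billing the whole dataset.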
Wrapping Up
BigQuery export unlocks custom analysis GA4's UI can't do—cohort analysis, lifetime value, churn prediction, all with SQL. You're no longer limited by GA4's sampling thresholds or pre-built reports. If you want to automate this kind of analysis across all your tools and data sources, Product Analyst can help.