
How to Set Up BigQuery Export in Google Analytics 4

By default, GA4 only lets you query data in its UI: aggregated reports, Explorations that may be sampled, and user-level data retained for at most 14 months on standard properties. BigQuery export gives you raw, unsampled access to every event your users generate, with no retention limit. This is essential if you're building custom dashboards, running cohort analyses at scale, or feeding event data into your data warehouse. Here's how to wire it up.

Create a Google Cloud Project and Enable BigQuery

You'll need a GCP project with BigQuery enabled before GA4 can start exporting. If you already have one, you can skip ahead and link it directly.

Create or select a GCP project

Go to the Google Cloud Console and click Select a project at the top. If you have an existing project you want to use, select it. Otherwise, click New Project, give it a name (like "GA4 Analytics"), and click Create. Write down your project ID; you'll need it when you link your GA4 property.

javascript
// Create a project via gcloud CLI
// gcloud projects create ga4-analytics --name="GA4 Analytics"

// Set it as the active project
// gcloud config set project ga4-analytics

// Verify the project exists
const {google} = require('googleapis');
const cloudresourcemanager = google.cloudresourcemanager({version: 'v1'});

cloudresourcemanager.projects.list({
  auth: new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform']
  })
}).then(res => {
  // projects.list returns every project your credentials can see
  (res.data.projects || []).forEach(p => console.log('Project:', p.projectId));
});
List GCP projects to verify your project was created.

Enable the BigQuery API

Search for BigQuery API in the GCP Console and click Enable. This makes BigQuery available in your project so GA4 can create datasets and write event data once the link is in place. It takes a minute to activate.

javascript
// Enable BigQuery API via gcloud
// gcloud services enable bigquery.googleapis.com

// Verify it's enabled by trying to list datasets
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({projectId: 'ga4-analytics'});

bigquery.getDatasets().then(results => {
  console.log('BigQuery is enabled. Existing datasets:', results[0].length);
}).catch(err => {
  if (err.code === 403) {
    console.error('BigQuery API not enabled. Enable it in the GCP Console.');
  }
});
Test that BigQuery API is enabled by listing datasets.
Watch out: BigQuery export is available on free (standard) GA4 properties, but the daily export is capped at 1 million events per day. GA4 360 properties get higher limits.

Link Your GA4 Property to BigQuery

Now connect your GA4 property to the GCP project you just set up. GA4 will automatically create a BigQuery dataset and start exporting raw events.

Open BigQuery Link in GA4 Admin

In GA4, click the Admin icon (gear) at the bottom left. Under Property, open BigQuery Links and click Link. GA4 will show you a list of GCP projects you have access to.

javascript
// List existing BigQuery links for your property using the Admin API
const {google} = require('googleapis');
const analyticsAdmin = google.analyticsadmin({version: 'v1alpha'}); // bigQueryLinks lives in the alpha surface

const propertyId = '123456789'; // Your GA4 property ID
const auth = new google.auth.GoogleAuth({
  keyFilename: './service-account-key.json',
  scopes: ['https://www.googleapis.com/auth/analytics.edit']
});

analyticsAdmin.properties.bigQueryLinks.list({
  parent: `properties/${propertyId}`,
  auth: auth
}).then(response => {
  console.log('BigQuery links:', response.data.bigqueryLinks || 'None yet');
});
Check if your property already has a BigQuery link.

Select your GCP project and confirm

Pick the GCP project you created from the dropdown. GA4 will ask for permissions—click Link to confirm. GA4 automatically creates a BigQuery dataset named analytics_XXXXXXXXX (where XXXXXXXXX is your property ID) and starts exporting daily event tables.

javascript
// The Admin API exposes BigQuery links read-only (the link itself is
// created in the GA4 UI), so confirm the new link by listing links again
const {google} = require('googleapis');
const analyticsAdmin = google.analyticsadmin({version: 'v1alpha'});

const propertyId = '123456789';
const auth = new google.auth.GoogleAuth({
  keyFilename: './service-account-key.json',
  scopes: ['https://www.googleapis.com/auth/analytics.readonly']
});

analyticsAdmin.properties.bigQueryLinks.list({
  parent: `properties/${propertyId}`,
  auth: auth
}).then(response => {
  const links = response.data.bigqueryLinks || [];
  if (links.length === 0) {
    console.log('No BigQuery links yet. Complete the linking flow in the GA4 UI.');
  } else {
    links.forEach(link => console.log('Linked project:', link.project));
  }
}).catch(err => {
  if (err.code === 403) {
    console.error('Service account lacks access to this property.');
  }
});
Verify the new BigQuery link with the Admin API (linking itself is UI-only).
Tip: Choose the data location (US, EU, etc.) based on where your data should be stored. You can't change this after linking without unlinking and relinking the property.

Query Raw Events and Verify Data Flow

After 24-48 hours, GA4 starts exporting events. Each day gets its own table in BigQuery with raw, unsampled event data.

Check the BigQuery dataset and schema

Open your GCP project's BigQuery workspace. You'll see the analytics_XXXXXXXXX dataset. Inside, you'll find tables named events_YYYYMMDD (one per day). Click into an events_ table and view the Schema tab. Each row is a single event with event_name, event_timestamp, user_id, custom parameters, and user properties.

javascript
// List all tables in the GA4 dataset
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({projectId: 'ga4-analytics'});

const datasetId = 'analytics_123456789';
const dataset = bigquery.dataset(datasetId);

dataset.getTables().then(results => {
  results[0].forEach(table => {
    console.log('Table:', table.id);
  });
}).catch(err => console.error('Error:', err));
List all event tables in your GA4 BigQuery dataset.
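Custom parameters don't get their own columns: event_params is a repeated key/value record, so extracting one parameter takes an UNNEST subquery. Here's a sketch that builds such a query; the project, dataset, event name, and parameter key are placeholder assumptions following the earlier examples:

```javascript
// Build a query that flattens event_params to pull out one parameter.
// Project/dataset IDs and the page_view/page_location names are examples.
function buildParamQuery(projectId, datasetId, eventName, paramKey) {
  return `
    SELECT
      event_timestamp,
      (SELECT value.string_value
       FROM UNNEST(event_params)
       WHERE key = '${paramKey}') AS ${paramKey}
    FROM \`${projectId}.${datasetId}.events_*\`
    WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
      AND event_name = '${eventName}'
    LIMIT 100`;
}

const query = buildParamQuery(
  'ga4-analytics', 'analytics_123456789', 'page_view', 'page_location');

// With credentials in place, run it through the BigQuery client:
// const {BigQuery} = require('@google-cloud/bigquery');
// new BigQuery({projectId: 'ga4-analytics'}).query({query})
//   .then(([rows]) => rows.forEach(r => console.log(r.page_location)));
```

Note that parameter values are split across value.string_value, value.int_value, value.float_value, and value.double_value depending on type, so pick the field that matches your parameter.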

Query events and build a unified view

Each day's events live in a separate table. To query across all days without manually unioning tables, create a view that uses BigQuery's table wildcard syntax. This lets you query the past 90 days with one simple SELECT.

javascript
// Create a view that unions the past 90 days of events
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({projectId: 'ga4-analytics'});

const datasetId = 'analytics_123456789';
const viewQuery = `
  SELECT
    event_timestamp,
    event_name,
    user_id,
    user_pseudo_id,
    event_params,
    user_properties,
    geo,
    device
  FROM \`ga4-analytics.${datasetId}.events_*\`
  WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY))
`;

const options = {
  view: {
    query: viewQuery,
    useLegacySql: false // use GoogleSQL (standard SQL)
  }
};

bigquery.dataset(datasetId).createTable('events_90d', options)
  .then(() => console.log('View created: events_90d'))
  .catch(err => console.error('Error:', err));
Create a view for efficient querying across the past 90 days of events.
Watch out: BigQuery on-demand pricing charges $6.25 per TiB of data scanned. For a high-traffic property, scanning a month of unfiltered events gets expensive fast. Always filter by _TABLE_SUFFIX and select only the columns you need.
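Before running an expensive query, you can dry-run it: BigQuery validates the SQL and reports the bytes it would scan without executing or billing anything. A minimal sketch, assuming on-demand pricing of $6.25 per TiB and the placeholder project and dataset from earlier:

```javascript
// Rough cost estimate for a dry-run result, assuming $6.25 per TiB scanned.
function estimateCostUSD(bytesProcessed) {
  return (bytesProcessed / 1024 ** 4) * 6.25;
}

const query = `
  SELECT event_name, COUNT(*) AS events
  FROM \`ga4-analytics.analytics_123456789.events_*\`
  WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  GROUP BY event_name`;

// With credentials in place, a dry run reports bytes scanned without billing:
// const {BigQuery} = require('@google-cloud/bigquery');
// new BigQuery({projectId: 'ga4-analytics'})
//   .createQueryJob({query, dryRun: true})
//   .then(([job]) => {
//     const bytes = Number(job.metadata.statistics.totalBytesProcessed);
//     console.log(`Would scan ${bytes} bytes (~$${estimateCostUSD(bytes).toFixed(4)})`);
//   });
```

Because the query filters on _TABLE_SUFFIX and aggregates a single column, the dry run should show far fewer bytes than a SELECT * over the full table range.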

Common Pitfalls

  • Expecting real-time data—the daily export lands once per day, so data takes 24-48 hours to appear. Use GA4's real-time reports, or enable streaming export (which writes intraday tables), for immediate feedback.
  • Running queries without date filters on the full events table—each row is one event, so for a high-traffic property even a week's worth can run to terabytes. Costs and query time explode fast.
  • Changing the BigQuery dataset location after linking—you can't. You have to unlink and relink the property, which restarts the export from scratch.
  • Assuming historical data will be exported—BigQuery export only captures events from the day you create the link. Events before that date never appear in BigQuery.
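A quick way to confirm the export is flowing is to check for yesterday's daily table, since table names follow the events_YYYYMMDD pattern. A sketch, assuming the placeholder project and dataset from earlier:

```javascript
// Build the table ID for yesterday's daily export (events_YYYYMMDD).
function yesterdayTableId() {
  const d = new Date(Date.now() - 24 * 60 * 60 * 1000);
  const pad = n => String(n).padStart(2, '0');
  return `events_${d.getUTCFullYear()}${pad(d.getUTCMonth() + 1)}${pad(d.getUTCDate())}`;
}

// With credentials in place, check whether the table has landed:
// const {BigQuery} = require('@google-cloud/bigquery');
// new BigQuery({projectId: 'ga4-analytics'})
//   .dataset('analytics_123456789')
//   .table(yesterdayTableId())
//   .exists()
//   .then(([exists]) => console.log(exists ? 'Export landed.' : 'Not exported yet.'));
```

If yesterday's table is missing more than 48 hours after linking, recheck the link status in GA4 Admin before assuming a query problem.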

Wrapping Up

You now have raw, unsampled event data in BigQuery with no time limits and no sampling bias. You can build custom funnels, user cohorts, and retention analyses that would be impossible in GA4's UI. If you're running complex analyses across multiple properties or feeding event data into a larger analytics system, Product Analyst can help you automate that workflow.

Track these metrics automatically

Product Analyst connects to your stack and surfaces the insights that matter.

Try Product Analyst — Free