CSV Import
Import historical event data from any analytics platform and get insights immediately
Overview
CSV import lets you bring historical event data into Product Analyst AI from any tool that can export to CSV. Upload a file, map your columns, and the agent can start answering questions about your data immediately — no live integration required.
This is the fastest way to evaluate Product Analyst AI with real data. Export a few months of events from your existing analytics tool, upload the CSV, and start asking questions.
Compatible tools
Any tool that exports event-level data to CSV works. Here are the most common sources:
| Platform | How to export |
|---|---|
| PostHog | Events → select date range → Export CSV. Columns distinct_id, event, timestamp are auto-detected. |
| Amplitude | Microscope or Chart → Export CSV. Map User Id, Event Type, Event Time during upload. |
| Mixpanel | Insights or Users → Export → CSV. Map Distinct ID, Event Name, Time during upload. |
| Google Analytics 4 | Explorations → Free Form → export CSV. Or use BigQuery export for raw events. Map user_pseudo_id, event_name, event_timestamp. |
| Heap | Define chart → Export data (CSV). Map User ID, Event, Time during upload. |
| Pendo | Data Explorer → Export CSV. Map Visitor ID, Event Name, Timestamp. |
| Looker | Run an Explore on your events table → Download → CSV. Column names depend on your model — map them during upload. |
| Segment | Warehouse query or Personas export → CSV. Map user_id, event, timestamp. |
| Custom / data warehouse | Run a SQL query against your events table, export as CSV. Any column names work — you map them during upload. |
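For the custom / data warehouse route, the export itself is straightforward: select the three columns and write them to CSV with a header row. A minimal sketch using SQLite as a stand-in for your warehouse (the table and column names here are assumptions — swap in your own driver, DSN, and query):

```python
import csv
import sqlite3

# Stand-in warehouse: an in-memory SQLite table of events.
# In practice, replace this with your own database connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, timestamp TEXT)")
conn.execute("INSERT INTO events VALUES ('user-42', 'signup', '2026-01-15T10:00:00Z')")

rows = conn.execute("SELECT user_id, event, timestamp FROM events ORDER BY timestamp")
with open("events_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user_id", "event", "timestamp"])  # header row used for mapping
    writer.writerows(rows)
```

Any column names work in the exported file — you map them during upload.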
CSV format
Your CSV needs three columns (the names can be anything — you map them during upload):
- User ID — a unique identifier for the user (email, UUID, internal ID)
- Event name — what happened (signup, purchase, page_viewed)
- Timestamp — when it happened (ISO 8601 format preferred, e.g. 2026-01-15T10:00:00Z)
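Before uploading, you can sanity-check a file locally. A minimal sketch that verifies the required columns exist and the timestamps parse — it assumes the standard user_id / event / timestamp names, so adjust REQUIRED to match your export:

```python
import csv
import io
from datetime import datetime

REQUIRED = {"user_id", "event", "timestamp"}  # rename to match your export

def check_csv(text: str) -> list:
    """Return a list of problems: missing columns or unparseable timestamps."""
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        return [f"missing columns: {sorted(missing)}"]
    problems = []
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        try:
            # fromisoformat on older Pythons rejects a trailing "Z"
            datetime.fromisoformat(row["timestamp"].replace("Z", "+00:00"))
        except ValueError:
            problems.append(f"row {i}: bad timestamp {row['timestamp']!r}")
    return problems
```

Running this before upload catches the two most common import failures — a mis-named column and a non-ISO timestamp format — without a round trip through the uploader.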
Any additional columns are ignored. Here's an example:
user_id,event,timestamp
user-42,signup,2026-01-15T10:00:00Z
user-42,onboarding_completed,2026-01-15T10:05:00Z
user-42,invite_sent,2026-01-16T14:30:00Z
user-99,signup,2026-01-15T11:00:00Z
user-99,upgrade_clicked,2026-01-17T09:00:00Z
How to upload
Step 1 — Export from your analytics tool
Export event-level data as a CSV from your analytics platform (see the table above for platform-specific instructions). Include at least a few weeks of data for meaningful insights — a few months is ideal.
Step 2 — Upload and map columns
Go to Settings → Integrations and click Upload CSV. Drop your file (up to 10 MB) and the uploader will auto-detect standard column names like user_id, event, and timestamp.
If your columns have different names (e.g. Distinct ID, Event Type, Event Time), use the dropdowns to assign each column to the right field. A preview of your first few rows helps you verify the mapping is correct.
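If you'd rather normalize the header before uploading than map columns in the UI, a small script can rewrite the file to the auto-detected names. A sketch assuming Amplitude-style column names (COLUMN_MAP is an assumption — adjust it to your export):

```python
import csv
import io

# Hypothetical mapping from an Amplitude-style export to the auto-detected names.
COLUMN_MAP = {"User Id": "user_id", "Event Type": "event", "Event Time": "timestamp"}

def remap_columns(text: str) -> str:
    """Rewrite a CSV so its header uses the standard column names."""
    reader = csv.DictReader(io.StringIO(text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(COLUMN_MAP.values()))
    writer.writeheader()
    for row in reader:
        # Keep only the mapped columns; extras are ignored by the importer anyway.
        writer.writerow({new: row[old] for old, new in COLUMN_MAP.items()})
    return out.getvalue()
```

This is optional — the upload dropdowns do the same job — but a normalized header lets auto-detection handle repeat uploads with no manual mapping.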
Step 3 — Import
Click Import events. The importer shows you how many events were added and how many duplicates were skipped (if you're re-uploading).
Step 4 — Ask a question
Your data is immediately available. Try asking the agent something like: "What are the most common events in the last 30 days?" or "Show me a signup-to-upgrade funnel."
Deduplication
Duplicate events are automatically detected and skipped — it's safe to re-upload the same CSV or overlapping date ranges. Deduplication uses a hash of the user ID, event name, and timestamp, so identical rows are never inserted twice.
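Re-uploads are therefore idempotent. If you want to pre-deduplicate locally before uploading, the same idea looks roughly like this — a sketch of the hashing approach, not the importer's actual code:

```python
import csv
import hashlib
import io

def dedupe(text: str) -> list:
    """Keep the first occurrence of each (user ID, event name, timestamp) row."""
    seen, kept = set(), []
    for row in csv.DictReader(io.StringIO(text)):
        key = hashlib.sha256(
            "|".join([row["user_id"], row["event"], row["timestamp"]]).encode()
        ).hexdigest()
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept
```

Note that two genuinely distinct events with the same user, name, and timestamp would collapse into one — if your source can emit those, give the timestamps sub-second precision before exporting.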
Limits
- File size: 10 MB per upload
- No row limit: files up to 10 MB typically contain 50,000–200,000 events depending on column count
- Multiple uploads: upload as many files as you want — they accumulate, duplicates are skipped
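If an export exceeds the 10 MB limit, split it into multiple uploads and let deduplication absorb any overlap. A sketch that splits a CSV into parts under a size budget, repeating the header in each part:

```python
MAX_BYTES = 10 * 1024 * 1024  # the 10 MB per-upload limit

def split_csv(text: str, max_bytes: int = MAX_BYTES) -> list:
    """Split a CSV into parts under max_bytes each, repeating the header row."""
    lines = text.splitlines(keepends=True)
    header, rows = lines[0], lines[1:]
    parts, current, size = [], [header], len(header.encode())
    for row in rows:
        row_size = len(row.encode())
        if size + row_size > max_bytes and len(current) > 1:
            parts.append("".join(current))
            current, size = [header], len(header.encode())
        current.append(row)
        size += row_size
    parts.append("".join(current))
    return parts
```

Each part is a valid standalone CSV, so the uploader's column auto-detection works the same on every chunk.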
Combining with live data
CSV import works alongside all other data sources. A common pattern:
- Upload a CSV with 3–6 months of historical data to bootstrap the agent
- Connect PostHog or the PAI SDK for live event streaming going forward
The agent sees all events together regardless of source, so historical context from your CSV enriches real-time queries from day one.
Other data sources
For live event streaming (no CSV export needed), connect PostHog via webhook or add the PAI JavaScript SDK to your site.