When you ship a feature or change a user flow, you need to know if it's actually working. Impact Analysis in Mixpanel lets you compare user behavior before and after a change, isolating the effect of what you shipped. We'll walk through instrumenting your events, segmenting users, and running the actual analysis.
Instrument Your Core Events
Impact Analysis needs event data to work with. You'll want to track the key actions users take on the feature you're measuring.
Track user actions on the feature
Use mixpanel.track() to capture the main interactions with your feature. Include relevant properties like feature_name and anything that helps you segment the data later.
// Track when user interacts with new feature
mixpanel.track('Feature Interaction', {
feature_name: 'checkout_redesign',
interaction_type: 'form_viewed',
timestamp: new Date().toISOString()
});
// Track the completion event
mixpanel.track('Purchase Completed', {
feature_name: 'checkout_redesign',
revenue: 49.99,
user_segment: 'new_customer'
});
Set user properties for segmentation
Use mixpanel.identify() and mixpanel.people.set() to tag users with properties that indicate which group they're in. This is how you'll segment control vs. treatment groups in your analysis.
// Identify user and set cohort property
mixpanel.identify(user.id);
mixpanel.people.set({
'Feature Cohort': 'treatment_group',
'Feature Release Date': '2026-03-20',
'Signup Date': user.created_at
});
// For users in control group
mixpanel.people.set({
'Feature Cohort': 'control_group'
});
Create Segments for Your Analysis
Impact Analysis works by comparing metrics across segments. You need a clear control group and treatment group.
Define your segments in Mixpanel
Go to Segmentation and create saved segments for your control and treatment groups. Filter by the user property you set during instrumentation. You can also compare before and after a specific date if you're doing a time-based analysis.
// Query raw event data via Mixpanel's raw export API.
// Note: a GET request can't carry a body, so the parameters go in the
// query string. The export endpoint streams newline-delimited JSON.
const params = new URLSearchParams({
from_date: '2026-03-20',
to_date: '2026-03-26',
event: JSON.stringify(['Feature Interaction'])
});
const response = await fetch('https://data.mixpanel.com/api/2.0/export?' + params, {
method: 'GET',
headers: {
// Basic auth: project API secret as username, empty password
'Authorization': 'Basic ' + Buffer.from(process.env.MIXPANEL_API_SECRET + ':').toString('base64')
}
});
const text = await response.text();
const events = text.trim().split('\n').map(line => JSON.parse(line));
const treatmentEvents = events.filter(e => e.properties['Feature Cohort'] === 'treatment_group');
const controlEvents = events.filter(e => e.properties['Feature Cohort'] === 'control_group');
Check segment sizes
In Insights, filter each segment and check the user count. They don't need to be identical, but if one is 10x larger, your comparison might be skewed.
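If you'd rather sanity-check segment sizes programmatically from exported events, a minimal sketch (the event shape here, with distinct_id and a 'Feature Cohort' property, is an assumption about your export):

```javascript
// Count distinct users per cohort and flag a lopsided split.
// Assumed event shape: { distinct_id, properties: { 'Feature Cohort': ... } }
function cohortSizes(events) {
  const users = { treatment_group: new Set(), control_group: new Set() };
  for (const e of events) {
    const cohort = e.properties['Feature Cohort'];
    if (users[cohort]) users[cohort].add(e.distinct_id);
  }
  return { treatment: users.treatment_group.size, control: users.control_group.size };
}

// Warn when one segment is more than 10x the other
function isSkewed({ treatment, control }, maxRatio = 10) {
  if (treatment === 0 || control === 0) return true;
  return Math.max(treatment, control) / Math.min(treatment, control) > maxRatio;
}
```

A 10x threshold is just a rule of thumb; pick whatever ratio you consider acceptable for your analysis.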
Measure Impact with Mixpanel Tools
Now you'll compare key metrics between control and treatment groups.
Compare conversion funnels
Use Funnels to measure drop-off rates. Create a funnel with your feature interaction and conversion events, then segment by your cohort property.
// Track both events in your funnel
mixpanel.track('Feature Interaction', {
feature_name: 'checkout_redesign',
funnel_step: 'step1'
});
mixpanel.track('Purchase Completed', {
feature_name: 'checkout_redesign',
funnel_step: 'step2'
});
// In Mixpanel UI: Funnels > New Funnel > select both events > Segment by 'Feature Cohort'
Check retention patterns
Use Retention to see if treatment group users come back more often than control. This reveals whether your change improves long-term engagement.
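The Retention report handles this in the UI; as a rough programmatic cross-check over exported events, you can compute a simple return rate per cohort. This sketch assumes each event carries a unix time (in seconds) and the cohort property, which may differ from your actual export shape:

```javascript
// Fraction of a cohort's users who fire any event again within
// windowDays of their first event. Assumed event shape:
// { distinct_id, properties: { time: <unix seconds>, 'Feature Cohort': ... } }
function returnRate(events, cohort, windowDays = 7) {
  const firstSeen = new Map();
  const returned = new Set();
  const sorted = [...events]
    .filter(e => e.properties['Feature Cohort'] === cohort)
    .sort((a, b) => a.properties.time - b.properties.time);
  for (const e of sorted) {
    const id = e.distinct_id;
    if (!firstSeen.has(id)) {
      firstSeen.set(id, e.properties.time);
    } else if (e.properties.time - firstSeen.get(id) <= windowDays * 86400) {
      returned.add(id);
    }
  }
  return firstSeen.size === 0 ? 0 : returned.size / firstSeen.size;
}
```

Compare returnRate(events, 'treatment_group') against returnRate(events, 'control_group') over the same date range.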
Look for side effects
Create Insights reports for other events (e.g., 'Support Ticket Created'). Segment by cohort to catch unexpected consequences. A feature might increase purchases but also increase support load.
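The same check works on exported data: tally a secondary event per cohort and eyeball the difference. A minimal sketch (the event shape is again an assumption about your export):

```javascript
// Count occurrences of a secondary event per cohort to surface side effects.
// Assumed event shape: { event, properties: { 'Feature Cohort': ... } }
function countByCohort(events, eventName) {
  const counts = {};
  for (const e of events) {
    if (e.event !== eventName) continue;
    const cohort = e.properties['Feature Cohort'] || 'unknown';
    counts[cohort] = (counts[cohort] || 0) + 1;
  }
  return counts;
}
```

For a fair comparison, normalize the counts by each cohort's user count rather than comparing raw totals.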
Common Pitfalls
- Forgetting to set cohort properties before launch — you can't retroactively segment users you never tagged.
- Comparing too soon — with only 1-2 days of data per segment, random variation can look like impact. Wait 5-7 days minimum.
- Measuring only one metric — a feature might increase purchases but decrease daily active users. Check secondary metrics for side effects.
- Inconsistent event names — if you rename 'Purchase Completed' to 'order_confirmed' mid-analysis, your comparison breaks.
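On the "comparing too soon" point, a quick two-proportion z-test tells you whether an observed conversion gap is larger than noise would explain. This is a back-of-the-envelope sketch, not a substitute for a proper experimentation setup:

```javascript
// Two-proportion z-test on conversion counts: conversions and totals
// for treatment (A) and control (B).
function conversionZScore(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / se;
}

// |z| > 1.96 roughly corresponds to p < 0.05 (two-tailed)
const z = conversionZScore(120, 1000, 90, 1000);
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```

If |z| stays under ~1.96 after a full week of data, treat the difference as noise until you have more volume.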
Wrapping Up
You now have the setup to measure whether your product changes actually work. Track consistently, segment clearly, and compare metrics systematically. The more analyses you run, the better your intuition gets for what signals matter. If you're tracking impact across multiple tools and want a unified view, Product Analyst can consolidate your data and flag unexpected patterns.