GA4 → AI search attribution: tracking ChatGPT and Perplexity referrals correctly
A practical guide to tracking inbound traffic from ChatGPT, Perplexity, Claude, and Google AI Overviews in GA4. The referrer values, the channel-grouping fixes, the gotchas in 2026, and the dashboard view that actually informs decisions.
— TL;DR
AI search referrals show up in GA4 under specific hostnames (chatgpt.com, perplexity.ai, gemini.google.com) but get bucketed as 'Direct' or 'Referral' by default, hiding the trend. The fix is a custom channel grouping that buckets AI search into its own channel. Setup takes 20 minutes; the dashboard becomes the most useful AEO measurement view.
If you've added AEO content to your B2B SaaS site and want to know whether it's working, GA4 is the default attribution surface. The catch: GA4's defaults hide AI search traffic in noise. Out of the box, you'll see a "Referral" channel with chatgpt.com mixed in among forum links, broken backlinks, and partner sites. The trend is invisible.
This piece walks through the 20-minute fix that turns GA4 into a useful AEO measurement tool: the right referrer values, the custom channel grouping, the dashboard view, and the gotchas that catch teams in 2026.
#Why GA4 hides AI search by default
GA4's default channel groupings were finalized in 2022, before AI search was a meaningful traffic source. The buckets are: Direct, Organic Search, Paid Search, Social, Email, Referral, Display, Affiliate, and a handful of others. None of them maps cleanly to AI search.
What happens to AI-referral traffic in the default setup:
- chatgpt.com referrals → bucketed as "Referral" (lumped with every other site that links to you)
- perplexity.ai referrals → bucketed as "Referral"
- claude.ai referrals → most often show as "Direct" because Claude strips the referrer in many configurations
- gemini.google.com → bucketed as "Referral" or sometimes "Organic Search" depending on the path
- Google AI Overviews → indistinguishable from organic Google search; the referrer is google.com/search regardless
The result: AI search shows up as a fragment of Referral and Direct channels with no way to see the trend. Most B2B SaaS teams running AEO content in 2026 have no measurable signal on whether the work is paying off.
#The 20-minute fix
The fix is a custom channel grouping in GA4 that explicitly buckets AI-search hostnames into their own channel.
#Step 1. List the hostnames you care about
The core 2026 set:
| Engine | Referrer hostname | Notes |
|---|---|---|
| ChatGPT | chatgpt.com | Primary. Some legacy traffic still on chat.openai.com |
| Perplexity | perplexity.ai | Reliable referrer |
| Claude | claude.ai | Often strips referrer; appears as Direct |
| Gemini | gemini.google.com | Reliable referrer |
| Microsoft Copilot | copilot.microsoft.com | Reliable referrer |
| You.com | you.com | Niche but consistent |
| Phind | phind.com | Developer-heavy audience |
| Mistral Le Chat | chat.mistral.ai | Growing; mostly EU |
Add the engines your audience actually uses. For most B2B SaaS, the top 4 (ChatGPT, Perplexity, Gemini, Copilot) cover 90%+ of measurable AI-search traffic.
#Step 2. Create the custom channel grouping
In GA4: Admin → Data Display → Channel Groups → Create new channel group.
Name it "Channel grouping (AI-aware)" or similar. Create a new channel called AI Search with the rule:
Source matches regex: ^(chatgpt|chat\.openai|perplexity|gemini|copilot|claude|you|phind|chat\.mistral)\.
Or, if you prefer named-source matching:
Source matches one of: chatgpt.com, chat.openai.com, perplexity.ai, gemini.google.com,
copilot.microsoft.com, claude.ai, you.com, phind.com, chat.mistral.ai
Place the AI Search channel above Referral and Organic Search in the priority order. GA4 evaluates channels top-down; if AI Search isn't first, traffic will fall into Referral instead.
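It's worth sanity-checking the rule before trusting it with real traffic. A quick test of the step-2 regex (extended here with chat\.openai so legacy chat.openai.com referrals also match):

```python
import re

# The AI Search channel rule from step 2, with chat\.openai added
# so legacy ChatGPT referrals match as well.
AI_SOURCE_RE = re.compile(
    r"^(chatgpt|chat\.openai|perplexity|gemini|copilot|claude|you|phind|chat\.mistral)\."
)

hostnames = [
    "chatgpt.com", "chat.openai.com", "perplexity.ai", "gemini.google.com",
    "copilot.microsoft.com", "claude.ai", "you.com", "phind.com", "chat.mistral.ai",
]
assert all(AI_SOURCE_RE.match(h) for h in hostnames)

# Non-AI sources must NOT match, or they'd be mis-bucketed:
assert not AI_SOURCE_RE.match("www.google.com")
assert not AI_SOURCE_RE.match("youtube.com")  # "you\." requires a dot after "you"
```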
#Step 3. Apply the channel grouping to reports
The custom grouping needs to be selected in any report you want to use it in. In GA4 standard reports: click the pencil icon on a report, change the dimension to "Custom channel group," select your new grouping. The change persists per-report; do it for the Acquisition Overview, the Traffic Acquisition report, and any custom dashboards.
#Step 4. Verify with the realtime report
Go to Reports → Realtime and visit your site from chatgpt.com (search for your brand in ChatGPT, click your citation). Within 60 seconds the visit should appear under your "AI Search" channel. If it appears under "Referral," the channel grouping isn't applied to that report yet.
That's the 20-minute fix.
#Gotchas in 2026
Five things that catch teams setting this up.
#1. Claude strips referrers most of the time
Anthropic's Claude often serves citation links with rel="noreferrer" or strips the referrer header entirely. The result: Claude-driven traffic appears as Direct in GA4 with no way to attribute it. The usual workaround would be UTM parameters, but Claude doesn't add UTMs to citation links, so the traffic remains unattributable.
The honest answer: Claude attribution is a known gap in 2026. Trust the trend on engines that pass referrers (ChatGPT, Perplexity, Gemini), and assume Claude contributes proportionally.
#2. Google AI Overviews is indistinguishable from organic
When a user clicks a citation inside Google's AI Overview, the referrer is google.com/search, exactly like a normal organic click. The URL parameters sometimes include udm=14 (the historical AI Overviews flag), but Google has changed this behavior multiple times; relying on it is fragile.
The realistic approach: treat Google AI Overviews citations as a sub-segment of Organic Search you can't isolate. Track total Organic Search trend; if you see a lift in CTR-but-not-position metrics in Google Search Console, that's likely AI Overviews driving the change.
#3. ChatGPT mobile and ChatGPT desktop look the same
ChatGPT iOS/Android apps and chatgpt.com both pass the same referrer. You can segment by device in GA4 to estimate the split, but you can't separate "the ChatGPT app on iOS" from "chatgpt.com on Safari" via referrer alone.
For most teams this doesn't matter. If you specifically need mobile-app vs. web split, you'd need server-side analytics with deeper request-header inspection.
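The device-based estimate can be sketched against hypothetical GA4 rows (the data below is made up for illustration; field names are ours):

```python
from collections import Counter

# Hypothetical session rows: (source, deviceCategory) as GA4 would report them.
sessions = [
    ("chatgpt.com", "mobile"),
    ("chatgpt.com", "desktop"),
    ("chatgpt.com", "mobile"),
    ("perplexity.ai", "desktop"),
]

# Estimate the mobile/desktop split for ChatGPT referrals. This separates
# device classes, NOT the ChatGPT iOS app from chatgpt.com in mobile Safari --
# both arrive with the same referrer.
chatgpt_devices = Counter(dev for src, dev in sessions if src == "chatgpt.com")
assert chatgpt_devices == {"mobile": 2, "desktop": 1}
```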
#4. Self-referrals from ChatGPT's preview
When ChatGPT renders a citation, it sometimes prefetches the page. The prefetch often appears as a chatgpt.com referral with zero session duration and no engagement. Filter these out in your engagement metrics; treat them as a noise floor on the AI Search channel.
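A minimal sketch of the filter, assuming session rows with duration and engagement fields (the field names are ours, not GA4's exact API names):

```python
# Hypothetical session rows; field names loosely mirror GA4's session metrics.
sessions = [
    {"source": "chatgpt.com", "duration_s": 0, "engaged": False},  # likely prefetch
    {"source": "chatgpt.com", "duration_s": 74, "engaged": True},  # real visit
]

def real_ai_sessions(rows):
    """Drop zero-duration, zero-engagement hits -- the prefetch noise floor."""
    return [r for r in rows if r["duration_s"] > 0 or r["engaged"]]

assert len(real_ai_sessions(sessions)) == 1
```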
#5. The propagation delay on new custom channel groupings
When you create a new custom channel grouping, GA4 retroactively applies it to historical data, but the application can take 24 to 72 hours to fully propagate. Don't panic if the AI Search channel shows zero traffic in the first day after setup. Wait three days, then validate.
#What to put on the dashboard
Once the channel grouping is live, the AEO dashboard view that's actually useful:
- AI Search sessions, weekly trend (line chart, 12 weeks). The single most useful AEO metric. Shows whether your AEO work is moving the needle.
- AI Search sessions by source (bar chart). Which engines drive the most traffic. Informs which engines to optimize for.
- AI Search sessions by landing page (table). Which of your pages are getting cited most. Informs which pages to deepen and which to add to.
- AI Search conversion rate vs Organic Search conversion rate (comparison table). Validates the "AI traffic converts at 3 to 5x" claim for your specific site. If it doesn't hold, your landing pages are mismatched with the answer the AI engine summarized.
- Referrer hostname breakdown over time (stacked area chart). Shows engine-share shifts. ChatGPT vs. Perplexity vs. Gemini growth rates.
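The conversion-rate comparison in the fourth view is simple arithmetic. A sketch with made-up numbers, just to show the shape of the check:

```python
# Hypothetical per-channel totals; the numbers are invented for illustration.
totals = {
    "AI Search":      {"sessions": 400,  "conversions": 28},
    "Organic Search": {"sessions": 9000, "conversions": 135},
}

def cvr(channel: str) -> float:
    """Conversion rate = conversions / sessions for a channel."""
    t = totals[channel]
    return t["conversions"] / t["sessions"]

# 28/400 = 7.0% vs 135/9000 = 1.5% -> roughly 4.7x, inside the claimed 3-5x band
ratio = cvr("AI Search") / cvr("Organic Search")
assert 3 <= ratio <= 5
```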
Build these once in GA4's Explore module, save as an Exploration, share with the team. This is the AEO measurement view that informs whether to keep investing.
#Cross-referencing with citation tracking
GA4 measures the trailing indicator (clicks). Citation tracking measures the leading indicator (whether you're cited at all). Both are needed for a complete picture.
- Citations rising, GA4 traffic flat: your AEO content is being cited but the citation snippets aren't compelling enough to drive clicks. Improve the cited page's preview metadata (title, description, OG image).
- Citations flat, GA4 traffic rising: AI engines are driving more clicks per existing citation, possibly because the engines themselves are getting more traffic. Ride the wave; expect citation share to matter more than absolute volume going forward.
- Both rising: working as intended. Compound the strategy.
- Both flat or falling: AEO work isn't moving. Audit the technical baseline (schema, llms.txt, robots.txt) and the content depth (answer blocks, FAQ pairs, inline citations). For the technical audit checklist, see AEO vs SEO: what changed in 2026.
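The four combinations above reduce to a small decision function (a sketch; each diagnosis string paraphrases the corresponding bullet):

```python
def aeo_diagnosis(citations_rising: bool, traffic_rising: bool) -> str:
    """Map the four citation/traffic combinations to the readings above."""
    if citations_rising and traffic_rising:
        return "working as intended; compound the strategy"
    if citations_rising:
        return "cited but not clicked; improve preview metadata"
    if traffic_rising:
        return "more clicks per citation; ride the wave"
    return "not moving; audit technical baseline and content depth"

assert aeo_diagnosis(True, False) == "cited but not clicked; improve preview metadata"
```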
For citation tracking tools and the manual baseline approach, see How to track LLM citations.
#What about server-side and edge tracking?
GA4 client-side tracking is fine for the AEO use case. A subset of teams in 2026 have moved to server-side GTM or edge analytics (Vercel Web Analytics, Plausible) for performance and privacy reasons. The same channel-grouping logic applies; the implementation moves from GA4 UI to your server-side configuration.
For most B2B SaaS teams in 2026, GA4 is the right call. The custom channel grouping fix is the same regardless of whether you eventually move to server-side.
#What we ship for clients
For our AEO Retainer engagements, the default GA4 setup we ship in week 1:
- Custom channel grouping with the 8 AI-search hostnames bucketed as "AI Search"
- Saved Exploration with the 5 dashboard views above, shared with the client team
- Cross-channel report linking citation tracking (Otterly or Athena) with GA4 traffic data
- Monthly review where we walk through citation deltas and traffic deltas in the same conversation
That setup makes the AEO work measurable in dollars rather than vibes. Founders consistently underinvest in AEO because they can't see the return; the dashboard is what makes the return visible.
#Bottom line
GA4 hides AI search traffic by default in 2026, but the 20-minute custom channel grouping fix makes it visible. Once visible, the dashboard becomes the single most useful AEO measurement view: weekly AI Search sessions, by engine, by landing page, with conversion rate vs. classic organic.
Pair the GA4 view with citation tracking (leading indicator) and you have the full measurement picture. Without GA4 attribution, AEO investment looks like a content-marketing rabbit hole. With it, the trend is clear and the decisions are easier.
If you want this configured and the dashboards built as part of a broader AEO baseline, that's part of what our AEO Audit covers. Or implement the 20-minute fix yourself this afternoon; the playbook is intentionally written to be self-serve.
— Want this for your SaaS?
AEO and SEO for SaaS, done properly ↗
The schema, llms.txt, pillar content, and technical AEO infrastructure that gets your SaaS cited in ChatGPT, Perplexity, and Google AI Overviews. Not just ranked in classic search.