Case 8: Traffic Quality Persistence: Analyzing Multi-Step Conversion

After finding campaign winners at step 1 (landing → achieve_screen), check whether their quality persists through the later funnel steps.

From the filtered winner segment:

| Step | Event                 | Persons       | Conversion | Drop-off      | Time |
| ---- | --------------------- | ------------- | ---------- | ------------- | ---- |
| 1    | achieve_screen_loaded | 7,855 (100%)  | -          | -             | -    |
| 2    | achieve_screen_viewed | 4,996 (63.6%) | 63.6%      | 2,859 (36.4%) | 9s   |
| 3    | sex_screen_loaded     | 4,517 (57.5%) | 90.4%      | 479 (9.6%)    | 14s  |
  • Total conversion rate: 30.85% (much better than the ~4-10% baseline)
  • This is the “mio moi” filtered view showing good traffic
  • achieve_screen_loaded (page loaded in the browser): 7,855 people
  • achieve_screen_viewed (user actually engaged with the page): 4,996 people (63.6%)
  • 36.4% drop-off between the page loading and the user viewing it
  • This is a “micro-bounce” within the same page: the browser loaded the page, but the user never engaged (the page or tab was closed immediately, or they bounced before any interaction)
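To make the arithmetic explicit, here is a minimal Python sketch that recomputes the table’s metrics from the raw person counts; only the event names and counts come from the table above, the rest is plain arithmetic. The 30.85% total conversion quoted above is presumably measured from the original landing step, which precedes this table.

```python
# Recompute the funnel metrics from raw person counts (taken from the table above).
funnel = [
    ("achieve_screen_loaded", 7855),
    ("achieve_screen_viewed", 4996),
    ("sex_screen_loaded", 4517),
]

top = funnel[0][1]   # everyone who reached step 1
prev = top
for step, (event, persons) in enumerate(funnel, start=1):
    step_conv = persons / prev   # conversion from the previous step
    overall = persons / top      # share of step-1 users still in the funnel
    print(f"{step}. {event}: {persons:,} "
          f"({overall:.1%} of step 1, {step_conv:.1%} step conversion, "
          f"-{prev - persons:,} dropped)")
    prev = persons
```

Running this reproduces the 63.6% and 90.4% step conversions and the 2,859 and 479 drop-off counts shown in the table.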

The Critical Insight: Same Page, Different Traffic Sources

  • Everyone at achieve_screen sees the EXACT SAME screen
  • The screen doesn’t change based on which landing page they came from
  • Yet conversion rates differ based on original traffic source

When you filter by good campaigns (like campaign 1202363… + affid 1000):

  • Loaded → Viewed conversion is likely HIGHER than 63.6%
  • These users engage more with the content

When you filter by bad campaigns:

  • Loaded → Viewed conversion is likely LOWER than 63.6%
  • These users bounce even from the second screen

Since the screen is identical for everyone, the difference must be:

  • Audience quality: Who you’re targeting
  • User intent: How motivated they are
  • Expectation match: Whether your ad promised what the page delivers
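A hedged sketch of this comparison: given a flat event log, compute the loaded → viewed rate per traffic source. The field names (person_id, event, campaign_id, affid) are illustrative assumptions, not any specific tool’s schema.

```python
from collections import defaultdict

def loaded_to_viewed_by_source(events):
    """Loaded -> viewed conversion per (campaign_id, affid) segment."""
    loaded = defaultdict(set)   # segment -> persons who loaded the screen
    viewed = defaultdict(set)   # segment -> persons who viewed it
    for e in events:
        segment = (e["campaign_id"], e["affid"])
        if e["event"] == "achieve_screen_loaded":
            loaded[segment].add(e["person_id"])
        elif e["event"] == "achieve_screen_viewed":
            viewed[segment].add(e["person_id"])
    # Rate = people who both loaded and viewed / people who loaded
    return {
        segment: len(viewed[segment] & persons) / len(persons)
        for segment, persons in loaded.items()
    }
```

Segments whose rate sits well above the blended 63.6% are sending engaged users; segments well below it are micro-bouncing.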

The Conclusion: It’s an Audience Problem

  1. Different campaigns perform differently on the same landing page (5% vs 44%)
  2. Good traffic outperforms at step 1 AND step 2 (not just one step)
  3. The same page shows different conversion rates depending on traffic source
  4. Bad traffic performs badly on ALL landing pages

Diagnosis: the primary issue is audience quality and targeting, NOT page design.
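As a toy decision rule (not the author’s method), the diagnosis can be encoded directly: hold the page constant and look at the spread of conversion rates across sources. The 10-point threshold is an arbitrary illustrative assumption.

```python
def diagnose(rates_by_source, spread_threshold=0.10):
    """Toy heuristic: big spread across sources on the SAME page -> audience problem."""
    lo, hi = min(rates_by_source.values()), max(rates_by_source.values())
    if hi - lo > spread_threshold:
        return "audience/targeting problem (same page, divergent rates)"
    return "page problem more likely (all sources convert similarly)"

# Echoing the 5% vs 44% spread from this case:
print(diagnose({"winner_campaign": 0.44, "loser_campaign": 0.05}))
```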

If you had assumed “low conversion = bad landing page”

  • You’d spend weeks redesigning pages
  • Testing new layouts, copy, and colors
  • Optimizing load times
  • Result: minimal improvement, because the traffic is bad

Now you know “low conversion = bad traffic”

  • Fix targeting on campaigns
  • Adjust ad creative to attract right users
  • Pause underperforming campaigns
  • Scale winning campaigns
  • Result: 4-5x improvement immediately

Priority 1: Traffic Quality (Primary)

  1. Audit all campaigns for targeting quality
  2. Compare ad creative between winners (44%) and losers (5%)
  3. Analyze audience characteristics of high-converters
  4. Replicate winning campaign targeting
  5. Pause or retarget low-quality traffic sources
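A sketch of steps 4-5 as a triage rule; the campaign names and the 30%/10% thresholds are hypothetical, chosen only to echo the 44% vs 5% spread from this case.

```python
def triage_campaigns(rates, scale_at=0.30, pause_below=0.10):
    """Bucket campaigns by landing conversion: scale winners, pause losers."""
    actions = {}
    for campaign, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
        if rate >= scale_at:
            actions[campaign] = "scale"
        elif rate < pause_below:
            actions[campaign] = "pause or retarget"
        else:
            actions[campaign] = "hold; compare creative against winners"
    return actions

print(triage_campaigns({"campaign_A": 0.44, "campaign_B": 0.05}))
# {'campaign_A': 'scale', 'campaign_B': 'pause or retarget'}
```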

Priority 2: Landing Page Optimization (Secondary)

  1. After fixing traffic, THEN optimize pages
  2. Even bad pages convert at 40%+ with good traffic
  3. But good pages can push good traffic from 44% → 60%+

Good traffic + mediocre page > Bad traffic + perfect page

Always fix traffic quality first. Page optimization is the multiplier, not the foundation.
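Back-of-envelope arithmetic for why the order matters. The 44% and 60% figures come from this section; treating page optimization as a ~1.36x multiplier on any traffic is a simplifying assumption.

```python
good_traffic, bad_traffic = 0.44, 0.05      # conversion with a mediocre page
page_lift = 0.60 / 0.44                     # "good page" multiplier, ~1.36x

print(f"good traffic + mediocre page: {good_traffic:.0%}")            # 44%
print(f"bad traffic  + perfect page:  {bad_traffic * page_lift:.0%}") # ~7%
```

Even a perfect page cannot multiply bad traffic into a good funnel.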

Techniques demonstrated:

  • Advanced Isolation Method ✓
  • Multi-Step Validation ✓
  • Hold the page constant, vary the traffic → find a massive difference ✓
  • Check subsequent steps → confirms the traffic quality issue ✓

Tool capabilities this analysis relied on:

  • Multi-step funnel with “loaded” vs “viewed” event tracking
  • Ability to filter the entire funnel by campaign/affid
  • Conversion rates recalculated for filtered segments
  • Step-by-step drop-off visualization
  • Time-between-steps tracking (9s from loaded to viewed)
  • Person counts at each step (not just percentages)
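For the time-between-steps capability, a minimal sketch of the per-person calculation, assuming each event row carries a person_id, an event name, and a numeric timestamp ts in seconds (an illustrative schema, not a specific tool’s):

```python
from statistics import median

def median_step_time(events, from_event, to_event):
    """Median seconds between a person's first from_event and first to_event."""
    first = {}   # (person_id, event) -> earliest timestamp
    for e in sorted(events, key=lambda e: e["ts"]):
        first.setdefault((e["person_id"], e["event"]), e["ts"])
    deltas = [
        first[(pid, to_event)] - ts
        for (pid, ev), ts in first.items()
        if ev == from_event
        and (pid, to_event) in first
        and first[(pid, to_event)] >= ts
    ]
    return median(deltas) if deltas else None

# e.g. median_step_time(events, "achieve_screen_loaded", "achieve_screen_viewed")
# would return ~9 for the segment analyzed above
```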