
How a Broken Tracking Stack Wasted $60K Across Platforms for a SaaS Company — And the Full-Stack Fix That Restored 4x ROAS

2026-03-15 · 11 min read · Multi-Platform, SaaS, Tracking, GA4, UTM Parameters, Cross-Domain Tracking

Metrics Comparison

| Metric | Before | After | Change |
|--------|--------|-------|--------|
| ROAS   | 1.2x   | 4.0x  | +233%  |
| CTR    | 2.5%   | 2.8%  | +12%   |
| CPC    | $2.00  | $1.80 | -10%   |
| CPA    | $85    | $40   | -53%   |

Timeline

Campaign Launch → Problem Detected: 75 days

Root Cause

UTM parameters stripped by redirect chain, GA4 misconfigured with wrong data stream, duplicate events inflating conversions by 3x — total attribution chaos

Fix Applied

Fixed redirect chain preserving UTMs, GA4 reconfiguration with correct data stream, cross-domain tracking implementation, event deduplication logic

Outcome

ROAS corrected from a reported 1.2x (actually 0.6x once inflated conversions were removed) to a genuine 4.0x within 28 days; the real CPA, revealed at $195, was reduced to $40 over the same period


Opening Hook

The SaaS company's marketing dashboard told a story everyone wanted to believe. Google Ads was reporting 340 trial sign-ups last month. Meta claimed 280. LinkedIn showed 195. The total from paid channels: 815 trial sign-ups. Internal CRM records showed 310 total trial sign-ups from all sources combined — paid, organic, referral, and direct. The paid channel reports were not just wrong; they were collectively reporting 2.6x more conversions than actually existed across the entire business. Budgets were being allocated, bids were being set, and strategy was being directed based on numbers that had no connection to reality. Sixty thousand dollars was spent over 75 days against metrics that existed only in the dashboards of misconfigured tracking tools.

The Setup

The client was a B2B SaaS company offering a project management platform with a freemium-to-paid model. Their funnel was: ad click, landing page, trial sign-up, activation (completing first project), and conversion to paid plan. Average contract value was $4,800/year, sales cycle was 21 days from trial to paid, and trial-to-paid conversion rate was 18%.

The company ran paid acquisition across three platforms: Google Search ($25K/month), Meta ($20K/month), and LinkedIn ($15K/month). They had recently migrated from Universal Analytics to GA4, set up a new landing page on a subdomain (try.company.com), and implemented a marketing automation platform that inserted redirect links for UTM tracking.

The marketing operations team was lean — one growth marketer managing all three platforms plus analytics. The GA4 migration had been completed by an external contractor six months prior. No post-migration audit had been conducted. The tracking infrastructure was a tangle of tools, redirects, and configurations that had accumulated organically over two years.

What Went Wrong

The problems were not visible in any single platform's reporting. Each platform showed positive-looking numbers. The crisis only became apparent when the VP of Marketing asked a simple question during a board preparation meeting: "What is our blended CPA across all channels?" When the growth marketer pulled GA4 data, the numbers did not match any individual platform report. When they pulled CRM data, the numbers did not match GA4. There were now three different versions of reality, and none of them agreed.

A forensic investigation over the next week revealed a cascade of interconnected tracking failures:

The UTM Stripping Problem. The marketing automation platform inserted redirect links (e.g., track.company.com/r/abc123) that resolved to the landing page (try.company.com/signup?utm_source=google&utm_medium=cpc...). However, the redirect chain went through three hops: track.company.com -> www.company.com -> try.company.com. The second hop (www.company.com) had a server-side redirect rule (implemented by a developer for a different purpose) that stripped all query parameters from the URL before redirecting to the subdomain. Every UTM parameter was silently destroyed. As a result, GA4 could not attribute any traffic from these redirected links to its correct source. All redirected traffic appeared as "direct/none" in GA4.
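The failure mode is easy to reproduce in a few lines: each redirect hop is a function of the URL, and a single hop that rebuilds the URL without its query string is enough to destroy attribution for the whole chain. The hop logic below is a simplified stand-in for the real server rules, not the client's actual code:

```python
from urllib.parse import urlsplit, urlunsplit

def hop_track(url):
    # track.company.com/r/abc123 -> www.company.com, query string preserved
    parts = urlsplit(url)
    return urlunsplit(("https", "www.company.com", "/signup", parts.query, ""))

def hop_www(url):
    # The rule added for an unrelated purpose: redirects to the subdomain
    # but rebuilds the URL with an empty query string, dropping every UTM.
    return urlunsplit(("https", "try.company.com", "/signup", "", ""))

def follow(url, hops):
    # Walk the redirect chain hop by hop and return the final landing URL.
    for hop in hops:
        url = hop(url)
    return url

start = "https://track.company.com/r/abc123?utm_source=google&utm_medium=cpc"
final = follow(start, [hop_track, hop_www])
print(final)  # UTMs are gone: https://try.company.com/signup
```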

The GA4 Misconfiguration. The external contractor who migrated from Universal Analytics had created the GA4 property correctly but had connected the wrong data stream. The property had two data streams: one for www.company.com and one for try.company.com. The contractor had installed the wrong stream's Measurement ID (G-XXXXXXXX) on try.company.com. This meant GA4 was receiving events from the landing page under the wrong data stream, causing the events to be associated with the marketing website's reporting view rather than the conversion-focused subdomain view. The growth marketer had been reviewing a data stream that was not receiving any conversion events.
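A mismatch like this is cheap to catch mechanically: fetch the page HTML and compare every embedded Measurement ID against the one the data stream is supposed to use. A minimal audit sketch, with made-up placeholder IDs:

```python
import re

# GA4 Measurement IDs look like "G-" followed by an alphanumeric code.
GA4_ID = re.compile(r"G-[A-Z0-9]{6,12}")

def audit_measurement_ids(html, expected_id):
    # Collect every distinct Measurement ID present in the page source and
    # flag whether it is exactly the one expected for this domain's stream.
    found = sorted(set(GA4_ID.findall(html)))
    return {"found": found, "ok": found == [expected_id]}

landing_page = """
<script async src="https://www.googletagmanager.com/gtag/js?id=G-WWWSTREAM1"></script>
<script>gtag('config', 'G-WWWSTREAM1');</script>
"""
# The trial subdomain should carry its own stream's ID, not the www stream's.
print(audit_measurement_ids(landing_page, "G-TRYSTREAM1"))
```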

The Duplicate Event Problem. The GA4 implementation included a Google Tag Manager (GTM) container, a direct GA4 snippet in the page header, and an event pushed from the marketing automation platform. All three were firing the "sign_up" event. This created triple-counted conversions in GA4. The growth marketer saw 930 sign-ups in GA4 for a period where only 310 actually occurred. When this inflated number was passed to Google Ads as imported conversions, the bidding algorithm received false-positive signals, which artificially lowered the apparent CPA and prevented the algorithm from recognizing that its targeting was underperforming.
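Conceptually, the cure for triple counting is deduplication on a shared event identifier. The sketch below assumes every tag attaches the same per-action `event_id`, which is the convention the later fix standardized in GTM; the records are illustrative:

```python
def dedupe(events):
    # Keep only the first event seen for each (client, event name, event id)
    # combination; later copies from other tags are discarded.
    seen = set()
    unique = []
    for e in events:
        key = (e["client_id"], e["name"], e["event_id"])
        if key not in seen:
            seen.add(key)
            unique.append(e)
    return unique

# One real sign-up, reported three times: GTM, inline snippet, automation push.
fired = [
    {"client_id": "c1", "name": "sign_up", "event_id": "evt-001", "src": "gtm"},
    {"client_id": "c1", "name": "sign_up", "event_id": "evt-001", "src": "inline"},
    {"client_id": "c1", "name": "sign_up", "event_id": "evt-001", "src": "automation"},
]
print(len(dedupe(fired)))  # 1
```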

The Cross-Domain Tracking Gap. The user journey crossed three domains: ads linked to www.company.com, which linked to try.company.com for sign-up, which linked to app.company.com for the product. GA4 cross-domain tracking was not configured. Each domain transition started a new session, creating phantom users and breaking the attribution chain. A single real user appeared as three separate users in GA4, further inflating metrics.
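What cross-domain linking does can be illustrated with a small sketch: outbound links to sibling domains get decorated with a parameter carrying the client ID, so the destination page continues the same session instead of starting a new one. Real GA4 encodes this in an opaque `_gl` value generated by the tag; the plain `cid` parameter below is a readability assumption, not the actual wire format:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# The three domains in the user journey that should share one session.
LINKED_DOMAINS = {"www.company.com", "try.company.com", "app.company.com"}

def decorate(url, client_id):
    # Append the client ID to links between linked domains; leave
    # external links untouched.
    parts = urlsplit(url)
    if parts.hostname not in LINKED_DOMAINS:
        return url
    query = parse_qsl(parts.query) + [("cid", client_id)]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(decorate("https://try.company.com/signup?plan=pro", "c-123"))
```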

Root Cause Analysis

The root cause was systemic rather than any single technical error:

No Single Source of Truth. The company operated with four independent measurement systems (Google Ads, Meta, LinkedIn, GA4) and no reconciliation process. Each system counted conversions differently, used different attribution models, and had different data collection mechanisms. Without a central source of truth (typically GA4 or a CRM-based system), contradictions were invisible.

Post-Migration Audit Failure. The GA4 migration was treated as a one-time project that ended when the contractor invoiced. No validation was performed: no test conversions, no real-time event verification, no comparison between GA4 data and known internal metrics. A $2,000 post-migration audit would have caught every issue within two hours.

Infrastructure Sprawl Without Documentation. The redirect chain, the dual data streams, the triple event firing — each was created by a different person at a different time for a different reason. No one had a complete map of the data pipeline from ad click to conversion. Without documentation, no one could identify the failure points because no one understood the system end-to-end.

Vanity Metric Bias. The growth marketer monitored platform-specific dashboards (Google Ads, Meta Ads Manager, LinkedIn Campaign Manager) rather than reconciled, validated data. Platform dashboards are designed to report optimistically: each claims full credit for any conversion its ads touched, and none makes any attempt to deduplicate across platforms. When the numbers looked good, there was no incentive to investigate.

The Fix

The overhaul was systematic, addressing each layer of the tracking stack:

  1. Redirect Chain Audit and Fix (Days 1-5). Mapped the complete redirect chain for every ad link. Identified the www.company.com redirect as the UTM-stripping culprit. Modified the server-side redirect to preserve query parameters using a proper 302 redirect with parameter forwarding. Verified with curl commands that UTMs survived the complete chain from ad click to landing page load. Eliminated one unnecessary redirect hop entirely, reducing the chain from three hops to two.

  2. GA4 Data Stream Correction (Days 3-7). Identified the wrong Measurement ID installed on try.company.com. Replaced it with the correct data stream ID. Verified real-time event flow in GA4's DebugView by triggering test conversions. Confirmed that events now appeared in the correct data stream and property view.

  3. Event Deduplication (Days 5-10). Audited all event sources firing into GA4. Found three sources for the "sign_up" event: GTM trigger, inline GA4 snippet, and marketing automation push. Removed the inline GA4 snippet (redundant with GTM) and the marketing automation event push (should feed CRM, not GA4). Established GTM as the single event management layer. All GA4 events now fire exclusively through GTM, creating a single controllable pipeline.

  4. Cross-Domain Tracking Configuration (Days 7-14). Configured GA4 cross-domain tracking across www.company.com, try.company.com, and app.company.com. Implemented the linker parameter in GTM to decorate URLs with the _gl parameter when users navigate between domains. Verified session continuity by tracking a single user journey across all three domains in GA4's DebugView.

  5. Conversion Import Reconciliation (Days 10-18). Replaced the GA4-to-Google-Ads conversion import (which had been importing triplicated events) with a clean pipeline: GA4 now reports deduplicated sign-ups, and Google Ads imports only from the corrected GA4 property. For Meta and LinkedIn, implemented server-side conversion uploads using the CRM as the source of truth rather than relying on platform pixels alone.

  6. Measurement Validation Framework (Days 14-28). Built a weekly reconciliation dashboard that compares four data sources: Google Ads reported conversions, Meta reported conversions, GA4 reported conversions, and CRM actual sign-ups. Defined acceptable variance thresholds (less than 10% difference between GA4 and CRM, less than 20% between platform reports and CRM). Automated alerts when variance exceeds thresholds. This is the permanent early-warning system that prevents tracking drift from going undetected.

  7. Platform-Specific Attribution Resets (Days 18-28). After the clean data pipeline was established, each platform's bidding algorithm needed time to re-learn with accurate conversion data. Google Ads campaigns were reset with Target CPA bidding using the corrected conversion action. Meta campaigns were reset with CAPI sending server-side events from the CRM. LinkedIn campaigns were switched to manual CPC temporarily while the conversion data accumulated, then transitioned to Target CPA after 14 days.
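The server-side uploads in step 5 can be sketched as follows. Platforms such as Meta's CAPI expect match keys like email as normalized SHA-256 hashes; the payload shape below follows that general pattern, but the field names are illustrative rather than the literal API schema, and no real endpoint is called:

```python
import hashlib
import json
import time

def hash_email(email):
    # Normalize (trim, lowercase) before hashing so the same person always
    # produces the same match key, regardless of how the CRM stored the email.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def build_conversion(email, event_name, event_time=None):
    # Build a CAPI-style server-side event from a CRM record. Field names
    # are an assumption for illustration.
    return {
        "event_name": event_name,
        "event_time": event_time or int(time.time()),
        "user_data": {"em": hash_email(email)},
        "action_source": "website",
    }

payload = build_conversion("Jane.Doe@Example.com", "StartTrial", event_time=1700000000)
print(json.dumps(payload, indent=2))
```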
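The reconciliation in step 6 reduces to a variance check per source against CRM truth. A sketch using the thresholds defined above (10% for GA4, 20% for ad platforms), with illustrative counts rather than the client's actual weekly numbers:

```python
# Acceptable variance per reporting source, relative to CRM records.
THRESHOLDS = {"ga4": 0.10, "google_ads": 0.20, "meta": 0.20, "linkedin": 0.20}

def reconcile(crm_actual, reported):
    # Return an alert string for every source whose reported count drifts
    # beyond its threshold from the CRM-verified count.
    alerts = []
    for source, count in reported.items():
        truth = crm_actual[source]
        variance = abs(count - truth) / truth
        if variance > THRESHOLDS[source]:
            alerts.append(f"{source}: {variance:.0%} variance vs CRM")
    return alerts

# CRM truth: total sign-ups plus per-platform attribution from closed records.
crm_actual = {"ga4": 310, "google_ads": 150, "meta": 95, "linkedin": 65}
reported = {"ga4": 330, "google_ads": 340, "meta": 280, "linkedin": 195}
print(reconcile(crm_actual, reported))  # flags the three ad platforms
```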

Results

The outcome had two dimensions: measurement accuracy and genuine performance improvement.

| Metric | Before (Inflated) | Before (Real) | After (Real) | Change (Real) |
|--------|-------------------|---------------|--------------|---------------|
| ROAS | 1.2x (reported) | 0.6x (actual) | 4.0x | +567% |
| CTR | 2.5% | 2.5% | 2.8% | +12% |
| CPC | $2.00 | $2.00 | $1.80 | -10% |
| CPA | $85 (reported) | $195 (actual) | $40 | -79% |
| GA4-CRM Variance | 200%+ | N/A | 6% | Fixed |
| Cross-Platform Duplication | 2.6x | N/A | 1.08x | Fixed |
| Sign-up Attribution Rate | 12% | N/A | 91% | Fixed |

The most important insight was that the "before" metrics had two values: what was reported and what was real. The reported 1.2x ROAS and $85 CPA masked an actual 0.6x ROAS and $195 CPA. The team had been unknowingly scaling a money-losing campaign because every dashboard showed green numbers.

After the fix, the actual ROAS reached 4.0x and CPA dropped to $40. The improvement came from two sources: algorithms optimizing against accurate data (the largest contributor) and budget reallocation from underperforming channels to outperforming ones, which was only possible with accurate attribution.

Google Ads, which had appeared roughly equal in performance to Meta and LinkedIn, was revealed to be delivering 2.3x the actual ROAS. Budget was reallocated accordingly, shifting 30% of LinkedIn budget to Google Search, which alone accounted for 40% of the total improvement.

Key Takeaways

  • If your platforms report more conversions than your business actually generates, your tracking is broken — full stop. This is the simplest, most reliable diagnostic. Run a monthly reconciliation: sum all platform-reported conversions, compare to CRM records, and investigate any variance above 15%.

  • GA4 migrations require a post-migration audit. Every GA4 migration should include a validation phase: fire test events, verify them in DebugView, compare against known baselines, and confirm cross-domain tracking. The cost of an audit is trivial compared to the cost of operating on corrupted data.

  • One event management layer, one source of truth. Events should fire from exactly one system (typically GTM). If you have GTM, a direct GA4 snippet, and a marketing automation platform all firing the same event, you have 3x the conversions and zero clarity. Audit, deduplicate, centralize.

  • Redirect chains are UTM killers. Every redirect hop is an opportunity for parameters to be stripped. Map your complete redirect chain from ad click to final landing page, and verify that UTMs survive every hop. Use curl with the -L flag to trace the full chain programmatically.

  • Platform dashboards are not analytics — they are sales tools. Each platform's reporting is designed to make that platform look good. Cross-platform truth only exists in your own measurement infrastructure: GA4, your CRM, or a dedicated attribution tool.

Prevention Checklist

Before launching any multi-platform SaaS campaign:

  • [ ] GA4 data stream verified: correct Measurement ID installed on all domains
  • [ ] Cross-domain tracking configured across all properties in the user journey
  • [ ] Events fire from exactly one source (GTM recommended) — no duplicate event sources
  • [ ] Redirect chain mapped and verified: UTM parameters survive all hops (test with curl -L)
  • [ ] Conversion import pipeline tested: GA4 conversions match what appears in each ad platform
  • [ ] Weekly reconciliation process established: platform conversions vs. GA4 vs. CRM
  • [ ] Variance alert configured: notification when GA4-to-CRM gap exceeds 10%
  • [ ] Server-side conversion tracking implemented for Meta (CAPI) and Google (Enhanced Conversions)
  • [ ] Attribution windows aligned across platforms to match actual sales cycle length
  • [ ] Data documentation maintained: complete map of tracking infrastructure, event sources, and data flows
  • [ ] Post-migration or post-change validation SOP: test events, DebugView verification, baseline comparison
  • [ ] No changes to redirect rules, domain routing, or GTM containers without tracking impact assessment
