It's been five-plus years since iOS14 dropped App Tracking Transparency on the industry, and some brands are still in crisis mode about their attribution data. Others have swung to the opposite extreme — dismissing Meta data entirely, over-engineering their measurement stack with tools that cost more than they save, and making worse decisions than they were making before.
Both responses are wrong. The right answer is more boring than either: understand what changed, understand what it means for the data you have, and build a decision framework that works with imperfect signal instead of waiting for perfect signal that's never coming back.
What Actually Changed (And What It Means Now)
iOS14 introduced App Tracking Transparency — the prompt, enforced from iOS 14.5 in April 2021, that asks users whether they want to be tracked across apps and websites. Most users said no. Industry-wide opt-in rates settled in the 25–45% range, meaning the majority of iOS users opted out of cross-app tracking.
What this actually broke: Meta's ability to match your site's pixel events back to specific users who saw or clicked your ads. For iOS users who opted out, Meta can no longer track what happens after someone leaves the app. No purchase events tied to individual users. No view-through attribution. Significantly degraded retargeting audience quality — because building a "people who visited your product page" audience requires knowing who visited your product page.
iOS15 added Mail Privacy Protection, which blocked email open tracking at scale — less relevant for paid media but important for understanding the broader privacy direction. iOS16 and iOS17 kept tightening: iOS17's Link Tracking Protection, for example, strips known tracking parameters from links opened in Safari private browsing, Mail, and Messages, and cross-site cookie restrictions got stricter still. None of these were as impactful as iOS14, but together they reinforced that the pre-2021 data environment is permanently gone.
The brands still waiting for attribution to "get fixed" are making decisions in a vacuum. The data isn't coming back. The question is what you do with what you have.
Here's what this means in 2026: Meta's Ads Manager reports are built on a combination of deterministic data (what it can directly match) and modeled data (statistical inference for the gaps). When you see a conversion reported in Meta, some fraction of it is real attribution and some fraction is a model's best guess. Meta doesn't tell you which is which, and the mix varies by account, audience, and market.
Signal vs. Precision: The Right Mental Model
The instinct when attribution degrades is to say "the data is broken." That's a seductive framing because it excuses you from having to trust anything. But it's the wrong call.
The data didn't become broken — it became less precise. There's a meaningful difference. You still have signal. You just can't treat that signal as a precise measure of reality the way you could before iOS14.
Think of it like weather forecasting. A meteorologist doesn't refuse to give a forecast because the future is uncertain. They give you a probability — "70% chance of rain" — and they're useful even when they're wrong because the directional signal is still there. Your Meta data is the same. A campaign that's trending better than another campaign is probably genuinely performing better. An ad with a 3x reported ROAS might actually be a 1.8x — but if another ad shows 1.2x, you can still make the allocation call.
What you can't do is treat absolute ROAS numbers as gospel. You can't build business models on platform-reported revenue figures. And you can't use platform attribution as your only input into channel-level budget decisions.
The Triangulation Framework
The approach that actually works is triangulation: combining three imperfect data sources into a picture that's more reliable than any single source alone.
The Three-Source Attribution Stack
Platform data (Meta Ads Manager): Best for relative comparisons — creative A vs. B, campaign trends over time, audience performance signals. Do not use absolute ROAS for cross-channel budget decisions.
Post-purchase surveys: Ask customers "How did you first hear about us?" directly on the thank-you page. Captures true self-reported channel attribution, especially for top-of-funnel touchpoints that click-based attribution misses entirely.
MER (Marketing Efficiency Ratio): Total revenue ÷ total ad spend across all channels. Platform-agnostic, unaffected by attribution windows or pixel tracking. Your top-level health metric.
Each source has its job. Platform data informs tactical decisions within a channel. Post-purchase surveys inform channel mix and brand awareness effectiveness. MER tells you whether the aggregate system is working or not. When all three point in the same direction, you have conviction. When they diverge, you have a question worth investigating.
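If you want to operationalize that agreement check, a few lines of Python are enough. This is a hypothetical sketch — the function names, noise band, and verdicts are illustrative, not any tool's API:

```python
# Hypothetical sketch: check whether three imperfect signals agree on direction.
# Thresholds and names are illustrative assumptions, not a real tool's API.

def direction(delta, noise_band=0.02):
    """Classify a week-over-week change as up, down, or flat."""
    if delta > noise_band:
        return "up"
    if delta < -noise_band:
        return "down"
    return "flat"

def triangulate(platform_delta, survey_delta, mer_delta):
    """Return 'conviction' when all three sources move the same way,
    otherwise flag the divergence as a question worth investigating."""
    signals = {
        "platform": direction(platform_delta),
        "survey": direction(survey_delta),
        "mer": direction(mer_delta),
    }
    if len(set(signals.values())) == 1:
        return "conviction", signals
    return "investigate", signals

verdict, detail = triangulate(0.10, 0.06, 0.04)
print(verdict, detail)  # all three trending up
```

The point isn't the code — it's the discipline of never acting on one source's move until you've checked the other two.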
Using Platform Data Correctly
The primary use case for Meta Ads Manager data in a post-iOS world is relative performance comparison within the platform. Is creative A working better than creative B? Is campaign structure X outperforming structure Y? Is this audience segment showing better efficiency trends than that one?
These are questions Meta can still answer reasonably well. Because the signal degradation affects all your campaigns equally (or proportionally), the relative comparison holds even when the absolute numbers are off. If creative A shows 2.8x ROAS and creative B shows 1.9x in the same timeframe, creative A is probably genuinely better — even if the real ROAS of both is lower than reported.
What Meta data cannot reliably tell you: absolute revenue attribution, true CAC at the channel level, or whether your investment in Meta versus TikTok versus Google is correctly balanced. For those questions, you need MER and survey data.
Post-Purchase Surveys: The Underutilized Signal
A single-question survey on your order confirmation page ("How did you first hear about us?") is the highest-signal, lowest-cost attribution improvement you can make. It takes an afternoon to implement and pays dividends for as long as you run it.
Why it matters: click-based attribution systematically misses view-through impact. A customer who saw your Meta video ten times over three weeks but never clicked — then Googled your brand name and purchased — shows as organic search in your attribution. The survey captures the truth: they found you on Instagram.
Post-purchase surveys consistently show click-based attribution undercounting Meta's contribution by 20–40% for brands with strong top-of-funnel investment. They also frequently reveal that channels you thought were underperforming are actually doing significant awareness work that never gets credited. Use a tool like Fairing, Triple Whale's survey feature, or simply a custom Klaviyo flow on the confirmation page. Segment the responses by new vs. returning customer and by order value — the patterns are usually illuminating.
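As a rough illustration of that segmentation step, here's a minimal Python sketch. The survey rows, channel labels, and order values are all invented for the example:

```python
# Illustrative sketch: segment post-purchase survey responses by customer type.
# All data here is made up for demonstration purposes.
from collections import Counter, defaultdict

# Hypothetical survey rows: (channel answer, new/returning, order value)
responses = [
    ("Instagram", "new", 85.0),
    ("Google search", "returning", 120.0),
    ("Instagram", "new", 64.0),
    ("Friend/family", "new", 45.0),
    ("Instagram", "returning", 210.0),
    ("TikTok", "new", 72.0),
]

by_segment = defaultdict(Counter)
for channel, segment, _value in responses:
    by_segment[segment][channel] += 1

for segment, counts in by_segment.items():
    total = sum(counts.values())
    shares = {ch: round(n / total, 2) for ch, n in counts.items()}
    print(segment, shares)
```

In practice you'd pull the same rows from your survey tool's export and watch how channel share differs between new and returning customers over time.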
MER: The North Star
MER is the metric that cuts through all the attribution noise. If you're spending $500K/month total across channels and generating $2M in revenue, your MER is 4x. This number tells you the aggregate efficiency of your paid media investment regardless of how individual platforms report it.
The value of MER compounds as you scale and add channels. When Google, Meta, TikTok, and Pinterest are all claiming credit for the same conversions (multi-touch attribution at its messiest), MER ignores the politics entirely and tells you a simple truth: for every dollar in, you're getting four dollars out.
Track MER weekly. Set a floor (the minimum MER at which the business is viable given your contribution margins). Use it as the guardrail for total spend decisions. If MER is above floor and trending up, you can be aggressive with spend. If MER is at or below floor, pull back regardless of what individual channels are reporting.
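The MER math and the floor guardrail above fit in a few lines. A minimal sketch — the floor value here is illustrative, and yours should come from your contribution margins:

```python
# Sketch of a weekly MER guardrail, using the $2M / $500K example from above.
# The 2.5x floor is an illustrative assumption, not a recommendation.

def mer(total_revenue, total_ad_spend):
    """Marketing Efficiency Ratio: blended revenue per ad dollar, all channels."""
    return total_revenue / total_ad_spend

def spend_guidance(current_mer, mer_floor=2.5, trend="up"):
    """Guardrail: scale when above floor and trending up, else hold or pull back."""
    if current_mer > mer_floor and trend == "up":
        return "scale"
    if current_mer > mer_floor:
        return "hold"
    return "pull back"

weekly = mer(2_000_000, 500_000)   # $2M revenue on $500K total spend -> 4.0
print(weekly, spend_guidance(weekly))  # 4.0 scale
```

Note that the guardrail acts on total spend, not any one channel — that's the whole point of a platform-agnostic metric.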
What Still Works vs. What Doesn't
To be concrete about where platform data is and isn't reliable:
Still reliable:
- Relative creative performance comparison (within the same campaign/timeframe)
- Audience-level trend analysis (is this segment's CPC trending up or down?)
- Directional budget reallocation signals (campaign A is clearly outperforming campaign B)
- Frequency and reach metrics (largely unaffected by iOS)
- Android-heavy audience performance (Android attribution remains accurate)
No longer reliable as absolute measures:
- Absolute ROAS as a business-level truth
- View-through attribution (was already inflated, now even less reliable)
- Retargeting audience precision (your "30-day site visitor" audience has significant noise)
- Cross-channel attribution comparisons using platform data
- LTV modeling based on platform attribution alone
How DTCo Makes Budget Decisions Today
Here's the actual decision framework: we lead with MER as the top-level health check. If MER is healthy and we have room to scale, we look at platform trends to decide where to allocate incremental spend. We use post-purchase survey data to calibrate channel-level budget weight — particularly for awareness-stage Meta investment that platform data systematically undercounts.
For creative decisions within Meta, we use platform data with high confidence — because creative A/B testing is exactly the use case where relative comparison works well. We're not asking "what's the true ROI of this ad in absolute terms" — we're asking "which ad is working better than the other" and platform data answers that question reliably.
Perfect attribution was always a fiction. iOS14 just made it obvious. The brands that thrive now are the ones that got comfortable making decisions with directional data.
For channel mix decisions — whether to invest more in Meta vs. Google vs. TikTok — we weight post-purchase survey responses heavily. We look at where new customers say they found us, we cross-reference with which channels are showing healthy MER contribution (using simple regression against spend changes), and we allocate accordingly. This isn't as precise as multi-touch attribution was supposed to be — but it's more accurate than multi-touch attribution actually was.
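The "simple regression against spend changes" mentioned above can be as basic as a one-variable least-squares fit. This sketch uses invented weekly numbers and deliberately ignores seasonality and other channels — it's a directional read, not a media mix model:

```python
# Hedged sketch: ordinary least squares of weekly revenue against weekly Meta
# spend, as a crude read on marginal channel contribution. Data is made up;
# a real analysis needs controls for seasonality and the other channels.

def linear_fit(xs, ys):
    """Fit y = a + b*x by least squares; returns (intercept, slope)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return mean_y - slope * mean_x, slope

weekly_spend = [100, 120, 90, 150, 130, 160]      # $K, hypothetical
weekly_revenue = [420, 470, 390, 560, 510, 600]   # $K, hypothetical

intercept, slope = linear_fit(weekly_spend, weekly_revenue)
print(round(slope, 2))  # incremental revenue per extra $1 of spend
```

A slope comfortably above 1.0 (after margin) suggests the marginal dollar is still working; a slope drifting toward zero as spend rises is the diminishing-returns signal to watch for.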
What Most Brands Got Wrong in Their Response
The over-engineered response to iOS was to bolt on third-party attribution tools — Northbeam, Triple Whale, Rockerbox — and spend significant monthly fees getting a new set of numbers to stare at. Some of these tools add genuine value. Many don't. The question to ask is: is this tool changing the decisions I make, or is it just giving me more data to look at?
If you can describe a specific decision you made differently because of your attribution tool, keep it. If you can't, evaluate whether the cost is justified. The fundamental problem of iOS-driven signal loss isn't solved by any third-party tool — they're working with the same degraded input data as Meta Ads Manager, just presenting it differently.
The simpler response — MER tracking, post-purchase surveys, and platform data for relative comparisons — is more actionable for most DTC brands and costs less than a new attribution SaaS contract. Start there. Add complexity only when you can articulate what decision the added complexity enables.
The Honest Truth About 2026
We're not getting pre-iOS14 attribution back. Apple has continued tightening restrictions, and regulatory pressure in Europe and elsewhere is pushing other platforms and browsers in the same direction. The direction of travel is toward less tracking, not more.
The competitive advantage in this environment goes to brands that got comfortable making good decisions with imperfect data faster than their competitors. That means building a three-source measurement stack (platform + survey + MER), making creative decisions with speed rather than waiting for "enough data," and trusting business outcomes (MER, CAC, LTV) over platform-reported metrics.
The brands that are scaling past $1M+/month in Meta spend today aren't doing it with better attribution tools. They're doing it with better creative, better offers, and a decision-making culture that doesn't require certainty to move fast.
Frequently Asked Questions
How did iOS affect Meta advertising?
iOS14's App Tracking Transparency prompt caused the majority of users to opt out of cross-app tracking, breaking Meta's ability to track post-click behavior for those users. This degraded attribution accuracy, reduced retargeting audience quality, and shortened effective attribution windows. The result: Meta's reported ROAS is now a combination of real attribution and statistical modeling for the untracked portion.
Can you still trust Meta ROAS after iOS?
Meta ROAS is still useful as a relative signal for comparing creative performance and identifying trends within the platform. It's not reliable as an absolute business-level truth. Always triangulate with post-purchase surveys and MER before making channel-level budget decisions based on platform ROAS alone.
What's the best attribution approach for DTC in 2026?
Triangulation: platform data for relative creative and campaign comparisons, post-purchase surveys for channel-level attribution reality checks, and MER as your top-level business health metric. No single source tells the complete truth — but three imperfect sources together tell a much better story than any one source alone.
How do post-purchase surveys improve attribution?
Post-purchase surveys ask customers directly how they found you — and the responses frequently don't match platform attribution. They capture view-through impact that click-based tracking misses entirely. Brands with strong Meta top-of-funnel investment consistently see Meta undercounted by 20–40% in click-based attribution vs. self-reported survey data.
What is MER and why does it matter after iOS?
MER (Marketing Efficiency Ratio) is total revenue divided by total ad spend. It's unaffected by iOS tracking changes, attribution windows, or platform-specific counting. When iOS degraded platform attribution, MER became the most reliable top-level gauge of whether advertising is working in aggregate. It's the north star metric for brands navigating imperfect attribution environments.
Scaling a DTC brand spending $150K+/month on paid?
We built this system for brands at your level. Tell us about your brand and we'll show you what this looks like for your specific situation.
Tell us about your brand →