Most DTC brands have reporting. They have dashboards, they have weekly decks, they have channel-level breakdowns. And most of that reporting is completely useless for making decisions.

Here's the tell: if your team leaves the weekly review without at least two clear actions for the following week, the report failed. Reporting that doesn't produce decisions is just documentation. And documentation is what lawyers and accountants need — not growth teams.

The brands that compound their paid efficiency quarter over quarter have a different relationship with reporting. They've simplified to the metrics that actually matter. They've built decision rules into the review process. And they've committed to a consistent cadence that makes the data comparable week-over-week rather than a different analysis every Monday.

Here's how to build that system.

The Problem With Most DTC Reporting

The typical DTC reporting failure mode looks like this: the marketing team pulls every metric available, organizes it by channel, and presents a 30-slide deck every week. The team spends an hour discussing what happened — why CPC went up on Tuesday, what caused the conversion rate dip on mobile, whether the new creative test has "enough data yet" — and then the meeting ends with no decisions made and nothing changes.

Three specific problems drive this:

Backward-looking without forward-looking context. Reporting what happened without a clear framework for what it means about what to do next. "Meta CPA was $47 this week" is a fact. "Meta CPA was $47 this week against a $42 target, meaning we're 12% over on acquisition cost and need to either improve creative or reduce spend" is a decision prompt.

Too granular too early. Looking at ad set-level data before you've assessed account-level trends is the equivalent of examining a tree's bark before you've looked at the forest. You need to establish the macro picture before diving into micro diagnostics.

No decision framework. Without predefined thresholds that trigger specific actions, every data point becomes a discussion topic rather than a decision input. You end up in analysis paralysis — everyone sees the same numbers and no one knows what they mean for the week ahead.

The Five Metrics That Matter Weekly

Your weekly report should be anchored on five metrics. Not fifteen. Not thirty. Five.

1. MER (Marketing Efficiency Ratio)

Total revenue divided by total ad spend, across all channels. This is your macro health check. MER rises and falls with creative performance, seasonality, channel mix, and landing page conversion. It's not a perfect metric, but it's the best single number for understanding whether your paid marketing engine is getting more or less efficient at the blended level.

Track it weekly and set a target. If you're at a 3.2x blended MER and your target is 3.0x, you have room to scale. If you're at 2.6x against a 3.0x target, something in the system is broken and needs investigation before you spend another dollar.
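As a minimal sketch (the revenue and spend figures here are hypothetical), the weekly MER check is a one-line calculation plus a comparison to target:

```python
def mer(total_revenue: float, total_ad_spend: float) -> float:
    """Marketing Efficiency Ratio: blended revenue per dollar of ad spend."""
    return total_revenue / total_ad_spend

# Hypothetical week: $320K revenue on $100K of blended ad spend.
weekly_mer = mer(320_000, 100_000)  # 3.2
target = 3.0

if weekly_mer >= target:
    status = "room to scale"
else:
    status = "investigate before scaling"
```

The point isn't the arithmetic; it's that the comparison to target is computed in the report itself, so the meeting starts from "we have room to scale" rather than from a raw number.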

2. New Customer CAC by Channel

Not blended CAC. New customer CAC on your primary acquisition channels — Meta, Google, TikTok. Returning customers skew your blended numbers and hide acquisition problems. New customer CAC is the number that tells you whether you're actually growing the customer base efficiently or just re-monetizing the base you already have.

3. Top Creative Performance

Which 3–5 creatives drove the most efficient new customer acquisition this week? What were their hook rates? What is their performance trajectory (improving, stable, declining)? Creative performance is often the leading indicator of account performance changes — when your best creative starts fatiguing, account performance follows within 2–3 weeks.

4. Blended Conversion Rate by Traffic Source

The weekly conversion rate sanity check. If your CVR drops materially on Meta traffic but not Google traffic, the problem is likely a creative/landing page alignment issue, not a site-wide problem. If CVR drops across all sources simultaneously, you might have a site issue, a pricing issue, or a macro demand shift.

5. Week-Over-Week Revenue vs. Plan

Simple, but essential. Are you pacing ahead, on track, or behind your monthly/quarterly revenue target? This gives context to every other metric. If you're 15% behind plan, an MER that's slightly below target becomes a critical problem. If you're 10% ahead of plan, the same MER signal might be acceptable given the volume.
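One way to sketch the pacing check, assuming a simple linear month-to-date plan (the helper name and figures are illustrative):

```python
def pacing_vs_plan(mtd_revenue: float, monthly_target: float,
                   day_of_month: int, days_in_month: int) -> float:
    """Fraction ahead (+) or behind (-) of a linear month-to-date plan."""
    expected = monthly_target * day_of_month / days_in_month
    return (mtd_revenue - expected) / expected

# Hypothetical: $42.5K booked by day 15 against a $100K monthly target.
gap = pacing_vs_plan(42_500, 100_000, 15, 30)  # -0.15, i.e. 15% behind plan
```

Real plans are rarely perfectly linear (promo calendars and seasonality skew them), but even a linear baseline gives every other metric in the report its context.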

The Decision Tree

What each metric triggers

MER >10% above target: Scale spend 15–20%. Push volume while efficiency is high.

MER within 10% of target: Hold spend. Optimize creatives.

MER >10% below target: Cut lowest-performing ad sets. Launch new creative tests. Do not scale.

New customer CAC rising 2+ weeks consecutively: Creative fatigue or audience saturation signal. Prioritize new creative immediately.

CVR drop >15% week-over-week: Pull all traffic sources and isolate. Check for site issues, offer changes, and creative-to-landing page alignment.
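Those thresholds can be encoded directly, so the report computes the action instead of leaving it open for debate. A sketch under the rules above (function names and return strings are illustrative):

```python
def mer_action(mer: float, target: float) -> str:
    """Map weekly MER vs. target to an action using the thresholds above."""
    ratio = mer / target
    if ratio > 1.10:
        return "scale spend 15-20%"
    if ratio >= 0.90:
        return "hold spend; optimize creative"
    return "cut weakest ad sets; launch creative tests; do not scale"

def cac_fatigue(weekly_cac: list) -> bool:
    """True when new-customer CAC rose two-plus consecutive weeks (latest last)."""
    return len(weekly_cac) >= 3 and weekly_cac[-1] > weekly_cac[-2] > weekly_cac[-3]

def cvr_alarm(this_week: float, last_week: float) -> bool:
    """True when blended CVR dropped more than 15% week-over-week."""
    return this_week < 0.85 * last_week
```

When every trigger is written down like this, the weekly review only has to discuss the cases where the rule fires, not every number on the page.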

How to Separate Signal from Noise in Weekly Data

Weekly data is inherently noisy. A single day of bad weather in your primary market, a viral moment that drove organic traffic, a site bug that lasted two hours — all of these can move weekly metrics in ways that look meaningful but aren't.

The tool for separating signal from noise is rolling comparison. Never compare this week to last week in isolation. Compare this week's metrics to the 4-week rolling average and to the same week in the prior year (when you have the data). A metric that's meaningfully below its 4-week rolling average is signal. A metric that moved vs. last week but is in line with the rolling average is probably noise.
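A sketch of that rule (the ±10% band here is an illustrative threshold, not a fixed standard; tune it to your metric's normal variance):

```python
def is_signal(history: list, latest: float, band: float = 0.10) -> bool:
    """Flag `latest` as signal when it deviates more than +/-band
    from the rolling average of the last four weeks in `history`."""
    window = history[-4:]
    rolling = sum(window) / len(window)
    return abs(latest - rolling) / rolling > band

# Hypothetical four weeks of MER, then this week's reading.
assert is_signal([3.0, 3.1, 2.9, 3.0], 2.6)      # ~13% below the average: signal
assert not is_signal([3.0, 3.1, 2.9, 3.0], 2.9)  # within the band: likely noise
```

The same comparison applies to any of the five metrics; what changes per metric is how wide the band should be.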

"The single biggest mistake in weekly reporting is over-indexing on week-over-week changes. Last week might have been anomalously good or bad. Compare to the trend, not just the prior week."

Also: be consistent with your definition of "week." Always use the same day-of-week boundaries (Monday–Sunday, or Sunday–Saturday). Mixed week definitions produce false comparisons that send teams chasing non-existent problems.

Creative Performance Review: The Most Important Part

If your weekly review spends the most time on channel-level budget allocation and the least time on creative, you have it backwards. At 8–9 figure DTC brands running primarily Meta and TikTok, creative is the primary variable that moves performance. Budget allocation optimizes around it — but creative is the fuel.

Your weekly creative review should answer four questions:

  1. Which creatives are in the top quartile of performance and should receive additional budget or creative iterations?
  2. Which creatives are in decline (rising CPA, falling CTR, falling CVR) and should be paused?
  3. What do the top performers have in common? (Hook style, format, message angle, talent type)
  4. What new creative tests need to launch this week based on what we're seeing?

This is where your creative strategy evolves week over week. Brands that skip this consistent creative review eventually find themselves with an aging creative library and no new winners in the pipeline, wondering why account performance has been drifting down for three months.

Running the Weekly Review in 30 Minutes

The reason most reporting reviews run too long is that they're structured as presentations rather than decision sessions. The person who built the report is presenting data; everyone else is reacting. This is the wrong format.

The right format: the report goes out 24 hours before the review. Everyone comes having already read it. The review itself covers only three things:

  1. Exceptions (10 minutes): What is outside the normal bands? What moved significantly vs. the rolling average? What triggered a decision threshold?
  2. Creative review (12 minutes): Top performers, bottom performers, what to pause, what to launch, what patterns we're seeing.
  3. Next week's actions (8 minutes): Explicitly list 3–5 concrete actions. Who owns each. What the success criterion is.

That's it. No history lessons. No context setting. No re-explaining last week's events. The team already read the report. Use the meeting time for decisions.

"A weekly review that ends without a written list of specific actions for the following week accomplishes nothing. The test of a good reporting cadence isn't how much data it contains — it's how clearly it drives the next move."

Building the Report Template

Your weekly report template should be the same every week. Same structure, same metrics, same comparisons. Consistency is what makes weekly data meaningful — when you've been tracking the same metrics in the same format for 12 months, you develop genuine intuition for what normal looks like and what signals real change.

The template we use at DTCo is the same across every account: four pages, thirty minutes to produce, thirty minutes to review. Every decision documented. Every action owned. This is what a functioning growth reporting cadence looks like.


Frequently Asked Questions

What metrics should DTC brands track weekly?

The five metrics that matter most weekly are: (1) MER (Marketing Efficiency Ratio — total revenue divided by total ad spend), (2) new customer CAC by channel, (3) top creative performance (hook rate and hold rate), (4) blended conversion rate by traffic source, and (5) week-over-week revenue vs. plan. Everything else is either a monthly metric or a diagnostic you pull when something looks wrong.

How do you build a DTC marketing dashboard?

A good DTC marketing dashboard has three layers: a one-page executive summary with 4–5 core metrics and clear trend indicators, a channel-level breakdown showing efficiency and volume by platform, and a creative performance table showing top and bottom performers by hook rate and conversion rate. Build it in Looker Studio, Notion, or even a well-formatted spreadsheet — the tool matters less than the consistency and clarity.

What should be in a weekly paid media report?

A weekly paid media report should include: total spend and revenue by channel, MER vs. prior week and vs. target, new customer CAC vs. target, top 3 and bottom 3 creatives by performance, one clear highlight (what worked this week), one clear concern (what needs attention), and 2–3 specific actions for next week. Keep it to one page. If it takes more than 30 minutes to review, it's too long.

How do you make paid media reporting actionable?

Build a decision tree into your reporting template. For each metric, define thresholds: if MER is above target by 10%+, scale spend; if MER is within 10% of target, hold; if MER is below target by more than 10%, cut spend and investigate. When every metric in your report has a predefined response, reporting becomes decision-making, not documentation.

What KPIs should DTC brands focus on?

At the executive level: MER, new customer CAC, and contribution margin per acquired customer. At the channel level: efficiency metrics (ROAS or CPA vs. targets) and volume metrics (impressions, reach, new customer acquisition rate). At the creative level: hook rate, hold rate, click-through rate, and creative-level CPA. Avoid tracking vanity metrics like total impressions or engagement rate as primary KPIs — connect everything to revenue impact.

Scaling a DTC brand spending $150K+/month on paid?

We built this system for brands at your level. Tell us about your brand and we'll show you what this looks like for your specific situation.

Tell us about your brand →