The 5 Metrics Every UA Manager Should Check Weekly
- Fátima Castro Franco
- Mar 11
- 4 min read
User acquisition management is not about launching campaigns and hoping performance holds. It is about consistent monitoring, disciplined evaluation, and early detection of changes in cohort behavior.
Performance rarely collapses overnight. More often, it slowly drifts. CPIs rise gradually. Retention weakens slightly across new cohorts. Creative performance declines without obvious warning. If these signals are not monitored consistently, profitability erodes before anyone notices.
For UA managers working in mobile games, reviewing the right metrics weekly is essential. Not every dashboard number deserves equal attention. The following five metrics provide the clearest view of traffic quality, scalability, and long-term profitability.
1. Cost Per Retained User
While cost per install (CPI) is the most visible metric in user acquisition, it does not reflect traffic quality. A campaign with a low CPI can still be unprofitable if users churn quickly.
Cost per retained user offers a clearer picture. It measures how much you are paying for users who remain active after a meaningful period, typically Day 7.
The formula is simple:
Cost per retained user = Total spend ÷ Number of users active on Day 7
Tracking this weekly allows UA managers to see whether lower CPIs are actually translating into sustainable cohorts. If CPI decreases but cost per retained user increases, it indicates that traffic quality has declined. Conversely, a slightly higher CPI paired with stable retention may still represent stronger long-term value.
This metric connects acquisition cost directly to product performance, making it far more reliable than CPI alone.
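The formula above can be sketched in a few lines. The numbers below are purely illustrative, not benchmarks:

```python
# Hypothetical weekly campaign figures (illustrative only).
total_spend = 12_000.00    # USD spent on the campaign this week
installs = 8_000           # installs acquired
d7_active_users = 1_600    # users from this cohort still active on Day 7

cpi = total_spend / installs
cost_per_retained = total_spend / d7_active_users

print(f"CPI: ${cpi:.2f}")                                      # $1.50
print(f"Cost per D7 retained user: ${cost_per_retained:.2f}")  # $7.50
```

Comparing the two outputs week over week shows whether cheaper installs are actually cheaper retained users.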
2. D7 Retention Trends Across Cohorts
Daily retention metrics can fluctuate, but weekly trend analysis reveals deeper patterns. Day 1 retention primarily reflects onboarding quality. Day 7 retention reflects the strength of the core gameplay loop and early progression design.
Rather than reviewing a single percentage, UA managers should compare multiple weekly cohorts and observe directional changes. Are newer cohorts performing better than older ones? Has recent scaling affected stickiness? Are certain geographies showing instability?
Even a small decline in D7 retention can significantly affect lifetime value projections. Monitoring trends — not isolated values — provides early warning before monetization performance is impacted.
Retention stability is often the strongest indicator of whether scaling can continue safely.
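A trend check across cohorts can be as simple as comparing consecutive weekly D7 values. The cohort labels and percentages here are made-up examples:

```python
# Hypothetical D7 retention per weekly cohort (illustrative values).
cohorts = {
    "W01": 0.21,
    "W02": 0.20,
    "W03": 0.185,
    "W04": 0.17,
}

values = list(cohorts.values())
# Week-over-week change in percentage points
deltas = [round((b - a) * 100, 1) for a, b in zip(values, values[1:])]
print(deltas)  # [-1.0, -1.5, -1.5]

if all(d < 0 for d in deltas):
    print("Warning: D7 retention declining across consecutive cohorts")
```

Three small negative deltas in a row is exactly the slow drift that a single-week snapshot hides.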
3. ROAS Progression Over Time
Return on ad spend (ROAS) should not be evaluated as a static snapshot. What matters most is the progression curve.
Weekly review should include:
Day 3 ROAS
Day 14 trajectory
Expected Day 30 projection
If the ROAS curve is flattening earlier than expected, it suggests changes in traffic intent, monetization timing, or player behavior. If the curve is accelerating at a healthy pace, scaling may be justified even if early numbers appear modest.
The shape of the ROAS curve reveals more than the current value. A campaign at 18% D7 ROAS with a steep upward trend can outperform a campaign at 22% that has already plateaued.
Understanding progression dynamics is essential for confident budget decisions.
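One simple way to compare curve shapes is to look at the ratio between the latest ROAS checkpoint and the previous one. The campaigns and values below are hypothetical:

```python
# Hypothetical ROAS checkpoints (day -> fraction of spend recovered).
campaign_a = {3: 0.08, 7: 0.18, 14: 0.30}  # modest start, steep upward trend
campaign_b = {3: 0.15, 7: 0.22, 14: 0.24}  # stronger start, early plateau

def momentum(curve):
    """Ratio of the latest ROAS checkpoint to the previous one."""
    days = sorted(curve)
    return curve[days[-1]] / curve[days[-2]]

print(f"Campaign A momentum: {momentum(campaign_a):.2f}x")  # 1.67x
print(f"Campaign B momentum: {momentum(campaign_b):.2f}x")  # 1.09x
```

Despite lower absolute ROAS at every checkpoint, campaign A's trajectory suggests it is the stronger scaling candidate.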
4. Creative Performance and Fatigue Indicators
Creative testing remains one of the most influential factors in UA performance. However, creative fatigue often develops gradually and can be overlooked without weekly review.
UA managers should monitor:
Click-through rate (CTR) trends
Install rate shifts
Cost per click changes
Frequency exposure levels
If CTR declines while targeting remains unchanged, creative fatigue is likely. Similarly, if install rates drop while CTR remains stable, the issue may lie in store conversion or messaging alignment.
Weekly creative analysis prevents performance cliffs. By identifying fatigue early, teams can refresh assets before acquisition costs increase significantly.
Creative strategy is no longer secondary to targeting — it is central to performance stability.
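A minimal fatigue check compares current CTR against the creative's peak. The 15% drop threshold and the weekly readings are illustrative assumptions, not recommendations:

```python
# Hypothetical weekly CTR readings for one creative (illustrative).
weekly_ctr = [0.042, 0.040, 0.036, 0.031]

# Flag possible fatigue if CTR has fallen more than 15% from its peak,
# assuming targeting has not changed over the same period.
peak = max(weekly_ctr)
drop = (peak - weekly_ctr[-1]) / peak
if drop > 0.15:
    print(f"Possible creative fatigue: CTR down {drop:.0%} from peak")
```

Running a check like this per creative each week surfaces fatigue before it shows up in acquisition costs.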
5. Spend Distribution Across Channels
Channel diversification reduces risk. Weekly analysis of budget allocation helps prevent overexposure to a single traffic source.
UA managers should review:
Percentage of total spend per channel
Performance consistency across platforms
Revenue contribution by source
Volatility signals in specific channels
If the majority of spend is concentrated in one platform, fluctuations in auction dynamics or seasonality can disrupt overall performance. A balanced acquisition ecosystem provides greater stability and negotiating power when scaling.
This metric is less about optimization and more about strategic risk management.
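The channel names, spend figures, and 50% overexposure threshold below are all hypothetical, but the share calculation itself is straightforward:

```python
# Hypothetical weekly spend by channel (illustrative figures).
spend = {"Meta": 45_000, "Google": 20_000, "AppLovin": 25_000, "Unity": 10_000}

total = sum(spend.values())
shares = {channel: amount / total for channel, amount in spend.items()}
for channel, share in shares.items():
    print(f"{channel}: {share:.0%}")

# Simple concentration check: flag if one channel exceeds half of total spend
top_channel, top_share = max(shares.items(), key=lambda kv: kv[1])
if top_share > 0.50:
    print(f"Overexposed to {top_channel} ({top_share:.0%})")
```

In this example the largest channel holds 45% of spend, just under the flag threshold; the same report run weekly makes creeping concentration visible.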
How These Metrics Work Together
Each of these metrics provides value individually, but their combined interpretation is where strategic insight emerges.
For example:
Declining CPI combined with declining D7 retention signals low-quality scale.
Stable retention paired with rising CPI may still support profitable growth.
Strong retention but flattening ROAS progression may indicate monetization timing issues.
Healthy ROAS alongside declining CTR often suggests impending creative fatigue.
UA performance should always be evaluated in context. Single metrics rarely tell the full story.
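The combined readings above can be expressed as simple rules. The thresholds and the rule ordering here are illustrative assumptions, not a definitive decision framework:

```python
# Minimal sketch of combined-signal interpretation (thresholds are assumptions).
def read_signals(cpi_trend, d7_trend, roas_momentum, ctr_trend):
    """Trends are week-over-week changes; roas_momentum is latest/previous ROAS."""
    if cpi_trend < 0 and d7_trend < 0:
        return "low-quality scale"
    if ctr_trend < 0 and roas_momentum >= 1.2:
        return "healthy ROAS but creative fatigue approaching"
    if d7_trend >= 0 and roas_momentum < 1.1:
        return "possible monetization timing issue"
    return "no combined warning"

# Falling CPI plus falling D7 retention: cheap installs, weak cohorts.
print(read_signals(cpi_trend=-0.10, d7_trend=-0.01,
                   roas_momentum=1.3, ctr_trend=0.0))  # low-quality scale
```

Encoding the logic this way forces the team to make its interpretation rules explicit rather than relying on ad-hoc dashboard reading.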
Why Weekly Discipline Matters
User acquisition is a compounding system. Small inefficiencies accumulate quickly at scale. A two-percent retention decline may appear minor in one week but can materially reduce lifetime value across thousands of users.
By reviewing these five metrics consistently, UA managers shift from reactive decision-making to proactive optimization. Scaling becomes deliberate rather than speculative.
Successful growth in 2026 depends less on aggressive expansion and more on controlled, data-driven progression.
FAQ
What is the most important metric for UA managers?
Cost per retained user and long-term ROAS provide a clearer view of traffic quality than CPI alone.
How often should UA performance be reviewed?
Core performance metrics should be evaluated weekly, with deeper cohort analysis conducted monthly.
Is D1 retention enough to judge campaign quality?
No. D7 and beyond provide a more reliable measure of long-term engagement and monetization potential.
Why is creative performance reviewed weekly?
Because fatigue develops gradually, and early detection prevents performance deterioration at scale.
Ready to transform your game's outreach?
Unleash the potential of an AI-powered platform featuring a user-friendly dashboard to effortlessly enhance your user acquisition efforts. With this intuitive dashboard, you have complete control over your budget and a wide array of targeting options, making Gamelight, the AI-driven advertising platform, the smart choice for expanding your game's audience.
Explore Gamelight: The Magic of AI in Mobile Marketing. With an AI-powered advertising platform, CPI rates, and no creative work required, you can initiate campaigns in just 5 minutes. It's all about ease and effectiveness.
If you need assistance, please fill out THIS FORM, and one of our team members will get in touch with you within 24 hours.



