Portfolio Analysis Process
A structured approach to analyzing game performance data. This guide explains each step, what data your team needs to provide, and how the analysis produces actionable insights.
Track 1: Product Review
Method: Mathematical modeling of revenue growth patterns over time
Output: Product quality assessment, release impact analysis
Track 2: Marketing Report
Method: Extrapolation based on observed revenue curves and industry benchmarks
Output: Kill/scale decisions, budget allocation
Why Two Tracks?
Product quality and marketing efficiency are different questions requiring different data. Mixing all channels masks whether a ROAS change is due to product improvements or just a shift in ad network mix. By isolating a single geo and network for Product Review, we see pure product signal. The Marketing Report then uses the full dataset to answer "where should we scale?"
The Five Phases
Marketing Data (from MMP)
Your attribution platform (AppsFlyer, Adjust, Singular, etc.) tracks where users came from and how they monetize (a loading sketch follows this list):
- Cohort Export with these dimensions:
- Install Date — Groups users into cohorts
- Cohort Day — Days since install (D0, D1, D7, D30, D60, D90...)
- Revenue — Cumulative revenue per cohort day
- Cost/Spend — UA spend attributed to that cohort
- Media Source — Ad network (for channel analysis)
- Country — Geo for regional analysis
- Platform — iOS vs Android
- Version History: App release dates with version numbers (App Store Connect / Google Play Console).
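To make the export requirements concrete, here is a minimal pandas sketch that loads a cohort export and checks for the dimensions listed above. The file name and column names (`install_date`, `cohort_day`, `revenue`, `cost`, `media_source`, `country`, `platform`) are assumptions, not your MMP's actual field names; map them to your platform's schema.

```python
import pandas as pd

# Assumed column names; real MMP exports label these fields differently.
REQUIRED_COLUMNS = [
    "install_date", "cohort_day", "revenue",
    "cost", "media_source", "country", "platform",
]

def load_cohort_export(path: str) -> pd.DataFrame:
    """Load a cohort export CSV and check that the required dimensions exist."""
    df = pd.read_csv(path, parse_dates=["install_date"])
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Cohort export is missing columns: {missing}")
    # Cumulative ROAS per cohort day, assuming `cost` holds total cohort spend on each row.
    df["roas"] = df["revenue"] / df["cost"].where(df["cost"] > 0)
    return df

# Example: average ROAS curve across all cohorts.
cohorts = load_cohort_export("cohort_export.csv")
print(cohorts.groupby("cohort_day")["roas"].mean())
```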
Game Data (from your analytics)
Internal game telemetry tells us what users do in the game (a retention sketch follows this list):
- Retention curves — D1, D7, D30 retention by cohort
- Progression data — Where users get stuck or churn
- Session metrics — Sessions per day, session length
- Economy data — Currency sinks/sources, IAP conversion
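As an illustration of the retention piece, here is a sketch that computes D1/D7/D30 retention from a raw session log. The `sessions` table and its columns (`user_id`, `install_date`, `session_date`) are hypothetical stand-ins; substitute your analytics schema.

```python
import pandas as pd

# Hypothetical session log: one row per user per active day.
sessions = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "install_date": pd.to_datetime(
        ["2024-01-01", "2024-01-01", "2024-01-01",
         "2024-01-01", "2024-01-01", "2024-01-02"]),
    "session_date": pd.to_datetime(
        ["2024-01-01", "2024-01-02", "2024-01-08",
         "2024-01-01", "2024-01-31", "2024-01-02"]),
})
sessions["day_n"] = (sessions["session_date"] - sessions["install_date"]).dt.days

def retention(day: int) -> pd.Series:
    """Share of each install cohort active exactly `day` days after install."""
    cohort_size = sessions.groupby("install_date")["user_id"].nunique()
    active = (sessions[sessions["day_n"] == day]
              .groupby("install_date")["user_id"].nunique())
    return (active / cohort_size).fillna(0).rename(f"D{day}")

# Retention curve by cohort: D1, D7, D30.
print(pd.concat([retention(1), retention(7), retention(30)], axis=1))
```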
Signal Isolation
Raw data mixes product quality, network performance, geo economics, and platform differences. We isolate signals through the filters below (a filtering sketch follows this list):
- Geo Isolation: Single geography removes currency/regional effects
- Network Isolation: Single ad network removes algorithm differences
- TTR Filtering: Exclude buggy versions hotfixed within days
- Maturity Filters: Exclude cohorts too young to project reliably
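A minimal sketch of these filters applied to a cohort frame like the one loaded above, assuming the same column names plus an `app_version` dimension for the TTR filter (if your export lacks it, filter on the dates the affected versions were live). The defaults (US geo, a placeholder network name, a 30-day maturity floor) are illustrative, not prescribed values.

```python
import pandas as pd

def isolate_product_signal(df: pd.DataFrame,
                           geo: str = "US",
                           network: str = "example_network",
                           hotfixed_versions=(),
                           min_cohort_age_days: int = 30) -> pd.DataFrame:
    """Apply the Product Review filters: one geo, one ad network, no hotfixed
    versions, and only cohorts old enough to project. Thresholds are illustrative."""
    out = df[(df["country"] == geo) & (df["media_source"] == network)]
    if len(hotfixed_versions) > 0:
        # TTR filter: drop cohorts acquired on versions that were hotfixed within days.
        out = out[~out["app_version"].isin(hotfixed_versions)]
    # Maturity filter: exclude cohorts too young to project reliably.
    cohort_age = (pd.Timestamp.today().normalize() - out["install_date"]).dt.days
    return out[cohort_age >= min_cohort_age_days]

# Example (using the `cohorts` frame from the loading sketch above):
# product_slice = isolate_product_signal(cohorts, hotfixed_versions={"2.3.0"})
```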
Regime Detection
Performance moves in "regimes": periods separated by structural breaks caused by game updates, economy rebalances, or creative refreshes. We analyze these periods separately rather than averaging across them.
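The guide doesn't prescribe a detection algorithm, so the sketch below takes the simplest route: split cohorts into regimes at known break dates (major releases, economy rebalances) taken from the version history, then summarize each period on its own. The break dates and the D30 summary metric are placeholders.

```python
import pandas as pd

def summarize_by_regime(df: pd.DataFrame, break_dates) -> pd.Series:
    """Label each cohort with the regime its install date falls into, given known
    break dates, and return mean D30 ROAS per regime (averaging within, never across)."""
    edges = ([pd.Timestamp.min]
             + [pd.Timestamp(d) for d in sorted(break_dates)]
             + [pd.Timestamp.max])
    labels = [f"regime_{i}" for i in range(len(edges) - 1)]
    df = df.copy()
    df["regime"] = pd.cut(df["install_date"], bins=edges, labels=labels)
    d30 = df[df["cohort_day"] == 30]
    return d30.groupby("regime", observed=True)["roas"].mean()

# Example (with the `cohorts` frame from the loading sketch; dates are placeholders):
# print(summarize_by_regime(cohorts, break_dates=["2024-03-15", "2024-06-01"]))
```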
Long-Term Projection
We project from observed data (D7, D30, D60) out to D365 to estimate lifetime value (a curve-fit sketch follows this list):
- D7-D30: High confidence—many mature cohorts
- D60-D90: Moderate confidence—extrapolation begins
- D180-D365: Projection zone—confidence depends on mature data available
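The functional form of the projection isn't specified in this guide; a common choice for cumulative revenue curves is a power law in cohort age, fit on the observed days and extrapolated to day 365. Below is a sketch under that assumption, with made-up revenue and spend numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_curve(day, a, b):
    """Assumed model: cumulative revenue(day) = a * day**b."""
    return a * np.power(day, b)

def project_d365_roas(days, cumulative_revenue, spend):
    """Fit the observed cumulative revenue curve and extrapolate ROAS to D365."""
    days = np.asarray(days, dtype=float)
    revenue = np.asarray(cumulative_revenue, dtype=float)
    (a, b), _ = curve_fit(power_curve, days, revenue, p0=[revenue[0], 0.5])
    return power_curve(365.0, a, b) / spend

# One cohort's observed D7/D30/D60 cumulative revenue against $10,000 of spend (made up).
print(project_d365_roas([7, 30, 60], [2500.0, 5200.0, 7100.0], spend=10_000.0))
```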
Product Review
"Is the product improving?" ROAS trends, regime-by-regime analysis, version impact. Use for evaluating whether updates move metrics correctly.
Marketing Report
"Where should we spend?" Performance by geo/network/platform with kill/scale recommendations. Use for budget allocation.
What's Included
- Charts: ROAS curves over time
- Projections: D365 ROAS with confidence ranges (one way to compute a range is sketched below)
- Recommendations: Clear next steps
- Methodology: Data used, filters applied, and why
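The report format doesn't say how its confidence ranges are computed; one simple way to produce such a range is to bootstrap over per-cohort D365 projections (for example, outputs of a function like `project_d365_roas` above) and report percentiles. The 80% level, resample count, and example values below are illustrative.

```python
import numpy as np

def bootstrap_range(per_cohort_d365_roas, n_resamples=2000, level=0.80, seed=0):
    """Bootstrap a confidence range for projected D365 ROAS from per-cohort estimates."""
    rng = np.random.default_rng(seed)
    estimates = np.asarray(per_cohort_d365_roas, dtype=float)
    means = [rng.choice(estimates, size=estimates.size, replace=True).mean()
             for _ in range(n_resamples)]
    lo_pct, hi_pct = (1 - level) / 2 * 100, (1 + level) / 2 * 100
    low, high = np.percentile(means, [lo_pct, hi_pct])
    return low, high

# Hypothetical per-cohort projections.
print(bootstrap_range([1.4, 1.7, 1.2, 1.9, 1.5, 1.6]))
```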
Document Findings
Each analysis generates insights: what we learned, what surprised us, hypotheses confirmed or rejected.
Version Attribution
When ROAS changes, we identify which version(s) caused it. A regime might span multiple versions—usually 1-2 specific releases are responsible.
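A sketch of the lookup step: given a regime's start date, list the releases that went live in a window just before it, using the version history export. The column names and the 14-day window are assumptions.

```python
import pandas as pd

def candidate_versions(regime_start: str,
                       version_history: pd.DataFrame,
                       lookback_days: int = 14) -> pd.DataFrame:
    """Return releases that went live shortly before a regime started.
    Expects `version` and `release_date` columns; the 14-day lookback is illustrative."""
    start = pd.Timestamp(regime_start)
    in_window = version_history["release_date"].between(
        start - pd.Timedelta(days=lookback_days), start)
    return version_history.loc[in_window].sort_values("release_date")

# Toy version history; in practice this comes from App Store Connect / Google Play Console.
versions = pd.DataFrame({
    "version": ["2.3.0", "2.3.1", "2.4.0"],
    "release_date": pd.to_datetime(["2024-02-20", "2024-03-05", "2024-03-12"]),
})
print(candidate_versions("2024-03-15", versions))
```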
Hypothesis Updating
Patterns emerge over time: which features improve monetization, which markets respond to which creatives, optimal update cadence. This compounds into strategic advantage.
Founder Checklist
Before your first analysis, ensure your team has:
- MMP access with a cohort export covering install date, cohort day, revenue, spend, media source, country, and platform
- Version history with release dates from App Store Connect / Google Play Console
- Game analytics covering retention, progression, session, and economy metrics
Ready to Start?
Contact your Transcend partner to schedule your first analysis. We'll walk through the data requirements together and help you set up the export process from your MMP.
