The Challenge: From Install to Insight
Users install our games from various ad channels, but where do they drop off? To improve retention and monetization, we must understand the entire player journey, from ad click to in-app action. Our goal is a pragmatic, scalable plan that avoids analysis paralysis.
Data Ownership: 100% of raw event and attribution data is retained in a warehouse we control.
The Proposed Analytics Stack
A curated set of best-in-class SaaS services that together form a cohesive data pipeline, prioritizing data ownership, scalability, and minimal operational overhead.
Singular
Attribution (MMP)
Attributes installs to the ad campaigns that drove them and unifies ad spend data. Hourly raw-data exports to BigQuery are required to prevent vendor lock-in.
PostHog Cloud
Product Analytics
Captures all in-game user behavior via a simple API. Provides funnels, retention analysis, and future-proofs us for A/B testing.
BigQuery
Warehouse / Source of Truth
The central, serverless repository for all raw data. Enables custom joins and ensures we own our historical data indefinitely.
Metabase
Business Intelligence (BI)
Sits on top of BigQuery to create readable dashboards and alerts for the entire team, from marketing to the C-suite.
Supabase
Identity & Auth
Provides a stable, non-PII user ID post-login, allowing us to stitch anonymous activity to a known user for deeper analysis.
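As an illustration, the stitch can be a single PostHog `$identify` event sent once after Supabase login. This is a minimal sketch assuming PostHog's public /capture ingestion endpoint and the Supabase Auth UUID as the stable ID; the host and key are placeholders.

```typescript
// Sketch: merge the device's anonymous id into the stable Supabase user id.
// "$identify" with "$anon_distinct_id" tells PostHog to attach the
// pre-login history to the stable id. Host and key are placeholders.
const POSTHOG_HOST = "https://us.i.posthog.com"; // region/self-host specific
const POSTHOG_KEY = "phc_...";                   // project write key

async function stitchIdentity(anonDeviceId: string, supabaseUserId: string): Promise<void> {
  await fetch(`${POSTHOG_HOST}/capture/`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      api_key: POSTHOG_KEY,
      event: "$identify",
      distinct_id: supabaseUserId,                     // stable, non-PII UUID
      properties: { $anon_distinct_id: anonDeviceId }, // pre-login device id
    }),
  });
}
```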
Cloudflare Workers
Backend & Telemetry
Lightweight endpoints for server-side tasks like IAP verification, ensuring authoritative revenue data lands in the warehouse.
The Data Flow: Device to Dashboard
A high-level view of how user data travels from the game client to our analytics dashboards, with BigQuery as the central source of truth.
Device
Godot Client with Analytics Wrapper & MMP Plugin
PostHog & Singular
Capture Gameplay & Attribution Events
BigQuery
Central Warehouse (Source of Truth)
Metabase
BI Dashboards & Alerts
Guiding Principles & Trade-offs
This approach is optimized for specific constraints, requiring deliberate trade-offs between short-term convenience and long-term control.
Core Constraints
- Godot 4 Engine: Requires a custom, reusable analytics wrapper.
- Avoid Vendor Lock-in: Mandates raw data exports to our warehouse.
- Small Team: Favors managed SaaS solutions to reduce ops burden.
- Privacy First: Anonymous identifiers are used; no PII in analytics.
Explicit Sacrifices vs. Gains
What We Gain:
- Full Data Ownership: Warehouse is the source of truth, eliminating vendor lock-in.
- Long-term Flexibility: Ability to join any data source and switch vendors without losing history.
- Deeper Insights: Unrestricted querying capabilities beyond what vendor UIs offer.
What We Sacrifice (For Now):
- Vendor UI Convenience: We will replicate essential views ourselves rather than rely on vendor dashboards.
- Advanced "Black-Box" Features: We're opting out of predictive LTV or managed fraud models initially.
- Initial Setup Time: Requires a small upfront effort to build core BI views in Metabase.
The "V0" Foothold: Path to a Unified Pipeline
The initial goal is not to track everything, but to prove the end-to-end flow by wiring together the core services. This path establishes the foundation for all future analysis.
Instrument the Client (Device → PostHog/Singular)
First, establish the data collection layer. The Godot client, via the Rust wrapper, will send core gameplay events directly to PostHog. Simultaneously, the native MMP plugin will capture install sources and send attribution data to Singular.
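For illustration, the wrapper's core send reduces to one HTTP POST against PostHog's public /capture endpoint. The event name, property shape, and ingestion host below are assumptions rather than a fixed schema.

```typescript
// Sketch of the wrapper's core call: one gameplay event posted to PostHog.
async function capture(
  event: string,
  distinctId: string, // anonymous device id until login stitches identities
  properties: Record<string, unknown> = {},
): Promise<void> {
  await fetch("https://us.i.posthog.com/capture/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      api_key: "phc_...", // project write key (placeholder)
      event,
      distinct_id: distinctId,
      properties,
      timestamp: new Date().toISOString(),
    }),
  });
}

// Example: capture("level_complete", deviceId, { level: 3, duration_s: 41 });
```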
Centralize in Warehouse (SaaS → BigQuery)
Next, we unify the data streams. We will configure and enable the hourly "Data Destinations" from both Singular (for attribution/spend) and PostHog (for gameplay events) to pipe all raw data into our BigQuery project, establishing it as the single source of truth.
Verify & Enrich (Server → BigQuery)
To ensure data integrity, a Cloudflare Worker will be deployed for server-side IAP receipt validation. This provides an authoritative revenue stream that writes directly to a separate table in BigQuery, which can be joined against client-side events.
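A minimal sketch of that Worker, under two stated assumptions: verifyWithStore() stands in for the real Apple/Google receipt check, and a short-lived service-account OAuth token for BigQuery's tabledata.insertAll REST call is provisioned out of band. Dataset and table names are illustrative.

```typescript
// Sketch: verify a client-submitted receipt, then stream the authoritative
// revenue row into a dedicated BigQuery table for later joins.
export interface Env {
  BQ_PROJECT: string; // e.g. "our-analytics-project" (placeholder)
  BQ_TOKEN: string;   // short-lived OAuth token for BigQuery (assumption)
}

export default {
  async fetch(req: Request, env: Env): Promise<Response> {
    const { receipt, userId, productId } = (await req.json()) as {
      receipt: string; userId: string; productId: string;
    };

    const verified = await verifyWithStore(receipt); // placeholder store call
    if (!verified.ok) return new Response("invalid receipt", { status: 400 });

    // tabledata.insertAll: streaming insert into revenue.verified_purchases.
    await fetch(
      `https://bigquery.googleapis.com/bigquery/v2/projects/${env.BQ_PROJECT}` +
        `/datasets/revenue/tables/verified_purchases/insertAll`,
      {
        method: "POST",
        headers: {
          Authorization: `Bearer ${env.BQ_TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          rows: [{ json: {
            user_id: userId,
            product_id: productId,
            usd_cents: verified.usdCents,
            purchased_at: new Date().toISOString(),
          } }],
        }),
      },
    );
    return new Response("ok");
  },
};

// Placeholder: real implementations call Apple/Google receipt endpoints.
async function verifyWithStore(receipt: string): Promise<{ ok: boolean; usdCents: number }> {
  return { ok: receipt.length > 0, usdCents: 0 };
}
```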
Visualize Insights (BigQuery → Metabase)
Finally, with all data centralized and verified, we connect Metabase to BigQuery. The first deliverable is a single, crucial dashboard visualizing D1 Retention and 7-day ROAS, proving the entire pipeline is functional and delivering value.
Enabling Key KPIs Across the Stack
Our critical KPIs are not measured by a single tool, but are products of the entire integrated stack. Here's how each component contributes to the metrics that matter.
Acquisition KPIs (ROAS, CPI)
- Singular: Captures install source and ad cost.
- PostHog: Provides each cohort's engagement and revenue behavior.
- BigQuery: Joins cost data with revenue data.
- Metabase: Visualizes Return on Ad Spend (join sketched below).
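A sketch of that join as it might run in BigQuery (and be saved as a Metabase question). Every dataset and column name here is an assumption; the real Singular and PostHog export schemas will differ in detail.

```typescript
// Hypothetical 7-day ROAS per campaign: spend from the Singular export,
// revenue from the server-verified purchases table.
const roas7dSql = `
  WITH spend AS (
    SELECT campaign_id, SUM(cost_usd) AS cost_usd
    FROM \`our_project.singular.campaign_spend\`      -- assumed export table
    GROUP BY campaign_id
  ),
  revenue AS (
    SELECT a.campaign_id, SUM(p.usd_cents) / 100 AS revenue_usd
    FROM \`our_project.singular.attributions\` a      -- install -> campaign map
    JOIN \`our_project.revenue.verified_purchases\` p
      ON p.user_id = a.user_id
     AND p.purchased_at <= TIMESTAMP_ADD(a.install_time, INTERVAL 7 DAY)
    GROUP BY a.campaign_id
  )
  SELECT s.campaign_id,
         SAFE_DIVIDE(r.revenue_usd, s.cost_usd) AS roas_7d
  FROM spend s
  LEFT JOIN revenue r USING (campaign_id)
`;

// Runnable anywhere with @google-cloud/bigquery:
//   const [rows] = await new BigQuery().query({ query: roas7dSql });
```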
Engagement KPIs (DAU, Session Length)
- PostHog: Fires `session_start` and `session_end` events.
- Supabase: Provides the stable `distinct_id` for accurate user counts.
- BigQuery: Aggregates sessions and calculates durations.
- Metabase: Builds DAU/MAU trends and session histograms (query sketched below).
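One way to derive session length in BigQuery, assuming the PostHog export lands in an events table whose properties column carries a session_id (all names are illustrative):

```typescript
// Hypothetical session-duration query: pair session_start/session_end
// per session and take the elapsed time between first and last event.
const sessionLengthSql = `
  SELECT
    distinct_id,
    JSON_VALUE(properties, '$.session_id') AS session_id,
    TIMESTAMP_DIFF(MAX(timestamp), MIN(timestamp), SECOND) AS session_seconds
  FROM \`our_project.posthog.events\`   -- assumed export table
  WHERE event IN ('session_start', 'session_end')
  GROUP BY distinct_id, session_id
`;
```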
Monetization KPIs (ARPDAU, LTV)
- PostHog: Captures client-side `iap_purchase` event context.
- Cloudflare: Server-verifies IAP receipts for authoritative revenue.
- BigQuery: Joins verified revenue with user activity.
- Metabase: Tracks revenue per user over time (ARPDAU query sketched below).
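A sketch of the ARPDAU calculation over the same assumed tables: server-verified revenue per day divided by that day's distinct active users.

```typescript
// Hypothetical ARPDAU query; table names match the earlier sketches.
const arpdauSql = `
  WITH dau AS (
    SELECT DATE(timestamp) AS day, COUNT(DISTINCT distinct_id) AS dau
    FROM \`our_project.posthog.events\`
    GROUP BY day
  ),
  rev AS (
    SELECT DATE(purchased_at) AS day, SUM(usd_cents) / 100 AS revenue_usd
    FROM \`our_project.revenue.verified_purchases\`
    GROUP BY day
  )
  SELECT d.day, IFNULL(r.revenue_usd, 0) / d.dau AS arpdau
  FROM dau d
  LEFT JOIN rev r USING (day)
  ORDER BY d.day
`;
```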
Retention KPIs (D1, D7, Churn)
- PostHog: Provides the stream of `session_start` events.
- Supabase: Ensures returning users are correctly identified.
- BigQuery: Cohorts users by their install date.
- Metabase: Visualizes the classic retention curves (cohort query sketched below).
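And a sketch of the D1 retention cohort query over the same assumed tables. "Retained" here means any `session_start` exactly one day after install, and it presumes attribution user IDs and PostHog distinct IDs have been stitched to the same stable ID.

```typescript
// Hypothetical D1 retention by install-date cohort.
const d1RetentionSql = `
  WITH installs AS (
    SELECT user_id, DATE(install_time) AS cohort_day
    FROM \`our_project.singular.attributions\`
  ),
  activity AS (
    SELECT DISTINCT distinct_id AS user_id, DATE(timestamp) AS active_day
    FROM \`our_project.posthog.events\`
    WHERE event = 'session_start'
  )
  SELECT
    i.cohort_day,
    COUNT(*) AS installs,
    COUNTIF(a.user_id IS NOT NULL) AS retained_d1,
    SAFE_DIVIDE(COUNTIF(a.user_id IS NOT NULL), COUNT(*)) AS d1_retention
  FROM installs i
  LEFT JOIN activity a
    ON a.user_id = i.user_id
   AND a.active_day = DATE_ADD(i.cohort_day, INTERVAL 1 DAY)
  GROUP BY i.cohort_day
  ORDER BY i.cohort_day
`;
```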
Appendix: Technical Details
A detailed breakdown of the implementation plan and data architecture.
3-Cycle Implementation Plan
Cycle 1: Identity, Sessions & Core Funnel
- Identity & Plumbing: Implement user identification so that pre-login events tied to a persistent device ID can be associated with post-login user events in PostHog.
- Session Tracking: Establish session tracking to measure user engagement time and identify potential app crashes or instability.
- Core Events: Track the First-Time User Experience (FTUE) funnel to understand where new players are dropping off.
- Data Validation: Continuously validate that event data is accurate and user identities are merging correctly (a sample check is sketched after this list).
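As one example of such a check, a scheduled query can flag days with identity gaps. The table name matches the assumed PostHog export used in the KPI sketches.

```typescript
// Hypothetical daily sanity check: events missing a distinct_id, and the
// volume of "$identify" merges, over the trailing week.
const validationSql = `
  SELECT
    DATE(timestamp) AS day,
    COUNTIF(distinct_id IS NULL OR distinct_id = '') AS missing_identity,
    COUNTIF(event = '$identify') AS identity_merges,
    COUNT(*) AS total_events
  FROM \`our_project.posthog.events\`
  WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  GROUP BY day
  ORDER BY day
`;
```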
Cycle 2: Core Loop, Monetization & Warehouse Setup
- Core Loop Events: Instrument the core gameplay loop to analyze player behavior, strategies, and match outcomes.
- Monetization Events: Implement initial client-side tracking for in-app purchases to begin analyzing the monetization funnel.
- Warehouse Connection: Establish the data pipeline from our SaaS tools into BigQuery, centralizing all raw data in our warehouse.
Cycle 3: Server Verification & First Dashboard
- Server IAP Verification: Deploy a server-side process to create an authoritative record of revenue, protecting against client-side spoofing.
- BI Setup: Connect our Business Intelligence tool to the warehouse and model the raw data for analysis.
- Deliverable: Deliver the first executive dashboard, proving the end-to-end pipeline is functional and providing actionable insights.