UX KPIs and engagement metrics
2) Why UX metrics, and where their boundaries lie
UX metrics translate interface decisions into the language of numbers: speed, clarity, absence of friction, how quickly patterns become habitual. They do not replace business metrics (revenue, GGR/NGR, ARPPU), but they explain the "why" behind changes in conversion. A good metrics stack should:
- Be linked to screen goals (one goal, one primary KPI).
- Separate behavior (what users do) from quality (how easy and understandable it is).
- Support A/B experiments and before/after comparisons.
2) UX KPI map (levels)
Global (end-to-end): Activation, Retention, Engagement, Satisfaction (CSAT/NPS/SUS).
Page-level: FMC, TTV, Success Rate, Error Rate, Scroll Depth, Rage/Dead Clicks.
Component-level: adoption/usage of specific features, Time on Task, Backtrack Rate.
3) Basic behavioral metrics
DAU/WAU/MAU - active audience by time window.
Stickiness = DAU/MAU. Interpretation: 0.2-0.6 for utilitarian products; >0.5 for high-frequency scenarios.
Sessions per User/Day - frequency of visits.
Avg Session Duration - average session length (caution: time spent is not the same as value).
Events per Session - depth of interaction (interpret together with target events).
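As a minimal sketch, here is how DAU/MAU and Stickiness can be computed from a raw event log; the column names (user_id, ts), the toy data, and averaging DAU over the window are assumptions, not a prescribed pipeline:

```python
# Sketch: DAU, MAU and stickiness from a raw event log (illustrative data).
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 1, 2],
    "ts": pd.to_datetime([
        "2024-05-01 10:00", "2024-05-01 18:00", "2024-05-02 09:00",
        "2024-05-15 12:00", "2024-05-20 08:00", "2024-05-20 21:00",
    ]),
})

events["date"] = events["ts"].dt.date
dau = events.groupby("date")["user_id"].nunique()  # unique users per active day
mau = events["user_id"].nunique()                  # unique users in the whole window
stickiness = dau.mean() / mau                      # average DAU / MAU

print(f"avg DAU = {dau.mean():.2f}, MAU = {mau}, stickiness = {stickiness:.2f}")
```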
4) Activation and speed to value
Activation Rate = First Value Users/New Users.
Examples of "first value" in iGaming: launching the first game, successful deposit, joining the tournament.
TTV (Time to Value) - time from entry to key value (median/quantiles).
FMC (First Meaningful Click) - the proportion of users who performed the target action within ≤N seconds of page load.
Success Rate (tasks) - % of users who completed a scenario (for example, a deposit).
Step Conversion - conversion across flow steps (onboarding, KYC, cashier).
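A sketch of FMC and TTV under the definitions above; the per-user latencies are illustrative, and None marks users who never reached the target action:

```python
# Sketch: FMC and TTV from per-user time-to-first-target-action (seconds).
import statistics

latencies = [2.1, 4.8, None, 7.5, 12.0, 3.3, None, 5.9]  # None = never converted

N = 8  # FMC window, seconds from page load
fmc = sum(1 for t in latencies if t is not None and t <= N) / len(latencies)

reached = [t for t in latencies if t is not None]
ttv_p50 = statistics.median(reached)  # median among users who reached the value

print(f"FMC(<= {N}s) = {fmc:.0%}, TTV p50 = {ttv_p50:.1f}s")
```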
5) Interaction quality metrics
Error Rate = error sessions/all sessions (split into UI/validation vs network/HTTP errors).
Backtrack Rate = returns to the previous step/all transitions within the flow.
Rage Click Rate = sessions with ≥3 quick clicks at one point/all sessions.
Dead Click Share = clicks without effect/all clicks.
Scroll Depth p50/p90 - viewing depth (important for landing pages/promos).
Mis-Click Distance - the average distance from a click to the nearest active target (a proxy for "false affordance").
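Rage clicks are usually detected with a burst rule over the click stream. A sketch with illustrative thresholds (≥3 clicks within a 1-second window inside a 24 px radius), not a standardized definition:

```python
# Sketch: rage-click detection within one session's click stream.
from math import hypot

def has_rage_click(clicks, radius_px=24, window_s=1.0, min_clicks=3):
    """clicks: list of (t_seconds, x, y), sorted by time."""
    for i in range(len(clicks)):
        t0, x0, y0 = clicks[i]
        burst = 1  # the anchor click itself
        for t, x, y in clicks[i + 1:]:
            if t - t0 > window_s:
                break
            if hypot(x - x0, y - y0) <= radius_px:
                burst += 1
                if burst >= min_clicks:
                    return True
    return False

session = [(0.0, 100, 200), (0.3, 102, 198), (0.5, 101, 201), (4.0, 400, 300)]
print(has_rage_click(session))  # True: three fast clicks at roughly one spot
```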
6) Engagement metrics
Feature Adoption Rate = users who used the feature/eligible base.
Repeat Usage = percentage of users who returned to the feature ≥N times in the period.
Session Depth = target actions per session (game launches, favorites, etc.).
Time in Feature - total active time with a specific module (not to be confused with a tab left hanging open in the background).
Share of Attention - share of time/clicks on P1 zones vs P2/P3.
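A sketch of Feature Adoption and Repeat Usage; the usage counts are illustrative, and taking adopters as the Repeat Usage base is one convention among several:

```python
# Sketch: Feature Adoption and Repeat Usage for one period (illustrative data).
usage = {"u1": 5, "u2": 1, "u3": 0, "u4": 2}   # user -> feature-use count
eligible = {"u1", "u2", "u3", "u4", "u5"}      # users the feature applies to

adopters = {u for u, n in usage.items() if n >= 1 and u in eligible}
adoption_rate = len(adopters) / len(eligible)

N = 2  # repeat threshold
repeaters = {u for u, n in usage.items() if n >= N and u in eligible}
repeat_usage = len(repeaters) / len(adopters) if adopters else 0.0  # base: adopters

print(f"Adoption = {adoption_rate:.0%}, Repeat (>= {N}x) = {repeat_usage:.0%}")
```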
7) Retention and returns
N-day Retention (D1/D7/D30) - the proportion of users returning on day N (classic cohorts).
Rolling Retention N - returned on any day ≥N (softer and easier to read).
Churn Rate = users who churned/active users at the start of the period.
Reactivation Rate - the share of previously inactive users who "woke up" during the period.
Survival Curve/Hazard - cumulative retention and the probability of dropping off at a given moment.
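A sketch of classic N-day retention on toy data, assuming a signup table and an activity log with illustrative field names:

```python
# Sketch: classic N-day retention for one signup cohort.
import pandas as pd

signups = pd.DataFrame({"user_id": [1, 2, 3],
                        "signup": pd.to_datetime(["2024-05-01"] * 3)})
activity = pd.DataFrame({"user_id": [1, 2, 1, 3],
                         "ts": pd.to_datetime(["2024-05-02", "2024-05-02",
                                               "2024-05-08", "2024-05-31"])})

df = activity.merge(signups, on="user_id")
df["day_n"] = (df["ts"].dt.normalize() - df["signup"]).dt.days  # days since signup

def retention(df, cohort_size, n):
    # share of the cohort active exactly on day N
    return df.loc[df["day_n"] == n, "user_id"].nunique() / cohort_size

for n in (1, 7, 30):
    print(f"D{n} retention = {retention(df, len(signups), n):.0%}")
```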
8) Subjective perceptual metrics
CSAT = satisfaction (scale 1-5).
CES (Customer Effort Score) - effort to complete the task (1-7).
NPS - willingness to recommend (−100 to +100).
SUS (System Usability Scale) - perceived convenience (0-100).
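Worked scoring for two of these on illustrative responses; NPS uses the standard 0-10 scale, and CSAT here follows the common top-2-box convention (share of 4s and 5s on the 1-5 scale):

```python
# Sketch: NPS and CSAT scoring from raw survey answers.
nps_scores = [10, 9, 8, 7, 6, 10, 3, 9]            # 0-10 scale

promoters = sum(1 for s in nps_scores if s >= 9)   # 9-10
detractors = sum(1 for s in nps_scores if s <= 6)  # 0-6
nps = (promoters - detractors) / len(nps_scores) * 100

csat_answers = [5, 4, 5, 3, 4]                     # 1-5 scale
csat = sum(1 for s in csat_answers if s >= 4) / len(csat_answers)  # top-2 box

print(f"NPS = {nps:+.0f}, CSAT = {csat:.0%}")
```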
9) Metrics of "interface quality" (Web Vitals and accessibility)
INP/LCP/CLS - responsiveness, loading speed of the main content, layout stability.
A11y metrics: proportion of screens with visible focus styles, hit areas ≥44×44 px, AA/AAA contrast on critical paths.
10) Linking UX to business (iGaming context)
Cashier Conversion = completed a deposit/opened the cashier.
Net Depositing Users Rate (NDU) = deposited/active.
Journey to First Deposit: TTV to first deposit + % drop-off at each step.
Bonus/Promo Clarity: CTR on "Join" + Error/Backtrack rates on rules/terms.
Game Discovery Efficiency: FMC on game launch, Success Rate of search/filters, TTV to first launch.
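A sketch of step conversion through a cashier flow; the step names and user sets are illustrative:

```python
# Sketch: step and cumulative conversion through a (hypothetical) cashier flow.
steps = [
    ("opened_cashier",  {"u1", "u2", "u3", "u4", "u5"}),
    ("chose_method",    {"u1", "u2", "u3", "u4"}),
    ("entered_amount",  {"u1", "u2", "u4"}),
    ("deposit_success", {"u1", "u4"}),
]

start = len(steps[0][1])
for (name, users), (prev_name, prev_users) in zip(steps[1:], steps[:-1]):
    step_cr = len(users) / len(prev_users)  # conversion from the previous step
    overall = len(users) / start            # conversion from the flow start
    print(f"{prev_name} -> {name}: {step_cr:.0%} (cumulative {overall:.0%})")
```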
11) Formulas (short reference)
FMC = users with target click ≤N sec/all screen users.
TTV = median(t(value) − t(entry)).
Success Rate (flow) = users who completed step N/users who started step 1.
Error Rate (UI) = error events/target input events.
Feature Adoption = users who used the feature/eligible base.
Stickiness = DAU / MAU.
Rolling Retention D7 = users returning on any day ≥7/D0 cohort.
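A sketch of Rolling Retention D7 under the "any day ≥7" definition, on toy data:

```python
# Sketch: Rolling Retention D7 for one cohort day.
import pandas as pd

cohort = {"u1", "u2", "u3", "u4"}          # users whose D0 is 2024-05-01
d0 = pd.Timestamp("2024-05-01")
returns = pd.DataFrame({"user_id": ["u1", "u2", "u2", "u3"],
                        "ts": pd.to_datetime(["2024-05-03", "2024-05-09",
                                              "2024-05-20", "2024-05-02"])})

returns["day_n"] = (returns["ts"].dt.normalize() - d0).dt.days
returned = set(returns.loc[returns["day_n"] >= 7, "user_id"]) & cohort
rolling_d7 = len(returned) / len(cohort)
print(f"Rolling Retention D7 = {rolling_d7:.0%}")  # only u2 qualifies -> 25%
```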
12) Instrumentation: what to log
Single data layer (minimum):
session_id, user_bucket (A/B), device, page
ui_click(zone, component_id, outcome)
ui_error(type, code, field, step)
ui_state_change(component_id, state)
route_change(from, to)
visibility(zone, time_in_view)
experiment_variant, cohort (signup_date)
Stable selectors: 'data-ux-zone', 'data-component-id' - do not bind to CSS classes.
Hygiene: field masking, absence of PII in events, consent/opt-in.
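A sketch of a minimal typed event envelope matching this data layer; field names mirror the list above, and the actual transport (queue or HTTP sink) is out of scope:

```python
# Sketch: minimal UX event envelope with basic PII hygiene.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class UiEvent:
    event: str                    # e.g. "ui_click", "ui_error", "route_change"
    session_id: str
    user_bucket: str              # A/B bucket, never a raw user identity
    device: str
    page: str
    payload: dict = field(default_factory=dict)  # zone, component_id, outcome...
    ts: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def emit(ev: UiEvent) -> None:
    # Hygiene: drop free-text fields before the event leaves the client.
    ev.payload.pop("free_text", None)
    print(json.dumps(asdict(ev)))  # stand-in for the real analytics sink

emit(UiEvent("ui_click", "s-123", "B", "mobile", "/cashier",
             {"zone": "P1", "component_id": "deposit-cta", "outcome": "success"}))
```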
13) Dashboards (skeletons)
A. Main UX dashboard
FMC and TTV on key screens (home, catalog, game, cashier).
Success/Step Conversion in critical flow.
Rage/Dead Clicks and Error Rate (7/28 day trends).
Scroll Depth vs CTR of key CTAs.
Web Vitals (INP/LCP/CLS) by device.
B. Engagement and Retention
Stickiness, DAU/WAU/MAU, Sessions per User.
Feature Adoption/Repeat Usage by module (search, favorites, tournaments).
Retention D1/D7/D30 by cohort, Survival curve.
C. Cashier and monetization (UX slice)
Cashier Conversion by step (with errors).
TTV to first deposit, Abandonment @ Step.
Validation/network errors, Backtrack Rate.
14) Analytics and methods
Cohort analysis: group by registration/first deposit/first game start.
Power analysis for A/B: estimate required traffic and effect size in advance (so you don't shoot in the dark); a power-analysis sketch follows this list.
Causality: use before/after experiments and control (holdout) screens.
Segmentation: new vs returning, mobile vs desktop, traffic channels, VIP clusters.
Triangulation: metrics + heatmaps + session records + tickets/support.
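A sketch of the power-analysis step using statsmodels; the baseline 78% and target 82% Success Rate are illustrative assumptions:

```python
# Sketch: required sample size per variant for a proportion metric.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

h = proportion_effectsize(0.82, 0.78)        # Cohen's h for the expected lift
n_per_arm = NormalIndPower().solve_power(
    effect_size=h, alpha=0.05, power=0.8, alternative="two-sided")
print(f"~{n_per_arm:.0f} users per variant")  # traffic needed before launch
```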
15) Target thresholds (benchmarks, adapt to product)
FMC (hero CTA): ≥35-50% in the first 5-8 seconds.
TTV (starting the first game): P50 ≤ 30-60 seconds.
Success Rate (deposit flow): ≥75-85%, given clear limits/fees.
Rage Click Rate: <1–2%.
Dead Click Share: <8-12% on key screens.
Stickiness: 0.25-0.45 (frequent scenarios closer to the upper limit).
16) OKR examples (how to formulate goals)
KR1: reduce TTV to first game launch from 75s → 50s (median).
KR2: increase FMC on the main CTA from 38% → 50% within the first 8 seconds.
KR3: reduce Rage Click Rate in the cashier from 2.3% → <1.2%.
KR4: raise deposit Success Rate from 78% → 86%.
KR5: increase Feature Adoption of the new search to 35% of eligible users.
17) Implementation procedure (team ritual)
1. Define the screen's goal and its primary KPI.
2. Audit the hierarchy (a single P1, contrast, hit areas).
3. Hypothesize → prioritize (P1/P2/P3).
4. Run A/B or before/after release with logging.
5. Measure FMC/TTV/Success/Errors/Scroll and business deltas.
6. Codify the decisions in the design system and guides.
7. Repeat iterations (weekly/sprint cycles).
18) Anti-patterns
"Vanity Metrics": Average session and "site time" without being tied to goals.
Mixing mobile and desktop data in one output.
Interpretation of metrics without statistics (no confidence intervals).
Conclusions on heat maps without outcome metrics (Dead/Rage/Success).
Experiments without power analysis and without predetermined success criteria.
Lack of PII masking and user consent.
19) Acceptance Criteria for UX tasks with KPI
The main screen KPI and target threshold are defined.
Events are added to the data layer and verified on staging.
A dashboard widget is built (real-time/daily).
An A/B test or before/after window is planned, with a sample-size check.
Go/No-Go criteria are defined (for example, FMC +8 pp, TTV −20%); see the sketch after this list.
The results are documented and entered into the design system.
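The Go/No-Go check itself can be automated against pre-registered criteria. A sketch mirroring the example above (FMC +8 pp, TTV −20%); the measured numbers are illustrative:

```python
# Sketch: automated Go/No-Go check against pre-registered criteria.
baseline = {"fmc": 0.38, "ttv_p50": 75.0}  # pre-launch values (illustrative)
variant  = {"fmc": 0.50, "ttv_p50": 52.0}  # post-test values (illustrative)

go = (variant["fmc"] - baseline["fmc"] >= 0.08                 # FMC +8 pp
      and variant["ttv_p50"] <= baseline["ttv_p50"] * 0.80)    # TTV -20%
print("GO" if go else "NO-GO")
```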
20) TL;DR
Pick one primary KPI per screen (FMC, TTV, Success Rate...), measure it consistently, tie it to retention and the cashier, and confirm changes with A/B tests. Avoid vanity metrics, segment your traffic, and lock improvements into the design system. UX metrics are a decision-making discipline, not a set of pretty numbers.