
Risk Heatmap

1) Purpose and value

The Risk Heatmap is a visual tool for ranking and communicating risk across the Probability × Impact matrix, linked to controls, metrics, and action plans.

Objectives:
  • a common prioritization language across business, technical, and legal functions;
  • transparent CAPA/investment decisions;
  • progress tracking (before/after remediation) in an audit-ready form.

2) Taxonomy and coverage area

Recommended domains:
  • Regulatory/Licenses, Privacy/Data, Information Security/Technical Processes, Payments/AML/KYC, Operations/Availability, Marketing/Responsible Advertising, Suppliers/VRM.
Dimensions:
  • Jurisdictions/Markets, Business Lines/Products, Services/Platforms, Critical Providers.

3) Probability and impact scales

3.1 Probability (example of a 5-level scale)

1. Rare (less than once in 3 years / p < 5%)

2. Low (once every 1-3 years)

3. Medium (annually)

4. High (quarterly)

5. Very high (monthly or more often)

3.2 Impact (multi-criteria)

Score each risk by the maximum across the criteria:
  • Finance: direct losses/penalties/chargebacks.
  • Licenses/Legal implications: suspensions, bans, investigations.
  • Privacy/Data: PII scope, notifications, supervisory actions.
  • Operations/Uptime: MTTR, SLO, disrupted releases, RTO/RPO.
  • Reputation: media, social networks, partner sanctions.
  • Scale: 1-5 with clear thresholds (e.g., 1: < €10k; 5: > €1m).

4) Scoring and risk levels

Individual risk: Score = Likelihood × Impact (range 1-25); a scoring sketch follows the category list below.

Categories:
  • 20-25 - Critical (red)
  • 12-19 - High (orange)
  • 6-11 - Medium (yellow)
  • 1-5 - Low (green)

Residual risk: the score after accounting for current controls (effectiveness confirmed via ToD/ToE/CCM).
Target risk: the score after planned measures, with a committed achievement date.
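
A minimal Python sketch of this scoring and banding rule (function name illustrative):

def risk_band(likelihood: int, impact: int) -> str:
    """Map a 1-5 likelihood and a 1-5 impact to a heatmap band (section 4 thresholds)."""
    score = likelihood * impact  # 1-25
    if score >= 20:
        return "Critical (red)"
    if score >= 12:
        return "High (orange)"
    if score >= 6:
        return "Medium (yellow)"
    return "Low (green)"

# Example: quarterly likelihood (4) x serious impact (4) = 16 -> High (orange)
assert risk_band(4, 4) == "High (orange)"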

5) Data sources and linkage to controls

GRC register: risk descriptions, owners, current/target assessments.
JMA/metrics: pass-rate of control rules, incidents, KRI.
Vendors/VRM: certificates, SLAs, incidents, changes in data locations.
Finance/Payments: fines, chargeback ratio, fraud loss %.
All values affecting the scales must carry evidence links (logs/reports) and timestamps.

6) Aggregation and consolidation

Bottom-up: from services/jurisdictions to domains and company.
Aggregation rules: maximum for Impact; a percentile for Likelihood; or a weighted median (by business volume); see the sketch below.
Separate layers: Inherent (without controls), Residual (with controls), Target (after CAPA).
Keep correlated risks (e.g., a shared infrastructure vulnerability) separate from independent ones.
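
A sketch of the bottom-up roll-up under the rules above, assuming each risk record carries a domain and residual likelihood/impact (the record shape is an assumption):

import math
from collections import defaultdict

def aggregate_by_domain(risks, p=0.9):
    """Roll risks up to domain level: Impact = maximum,
    Likelihood = p-th percentile (nearest-rank method)."""
    by_domain = defaultdict(list)
    for r in risks:  # e.g., {"domain": "Privacy/Data", "likelihood": 3, "impact": 4}
        by_domain[r["domain"]].append(r)
    rollup = {}
    for domain, items in by_domain.items():
        ls = sorted(x["likelihood"] for x in items)
        k = max(0, math.ceil(p * len(ls)) - 1)  # nearest-rank index
        rollup[domain] = {"likelihood": ls[k],
                          "impact": max(x["impact"] for x in items)}
    return rollup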

7) Visualization

Color-coded 5 × 5 matrix; interactive risk points with pop-up cards (description, owner, controls, CAPA).
Layer switches: Inherent/Residual/Target.
Filters: jurisdiction, product, domain, provider, period.
Trends before/after remediation and drift over 30-90 days.
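
A minimal matplotlib sketch of the color-coded 5 × 5 matrix, using the band thresholds from section 4 (layout assumptions: likelihood on x, impact on y):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap, BoundaryNorm

# Score for every cell: impact (rows) x likelihood (columns)
scores = np.arange(1, 6)[:, None] * np.arange(1, 6)[None, :]
cmap = ListedColormap(["green", "gold", "orange", "red"])
norm = BoundaryNorm([1, 6, 12, 20, 26], cmap.N)  # Low/Medium/High/Critical bands

fig, ax = plt.subplots()
ax.imshow(scores, cmap=cmap, norm=norm, origin="lower")
ax.set_xticks(range(5), labels=range(1, 6))
ax.set_yticks(range(5), labels=range(1, 6))
ax.set_xlabel("Likelihood")
ax.set_ylabel("Impact")
plt.show()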

8) Roles and RACI

Activity | R | A | C | I
Method and scales | Risk Office / Compliance Eng | Head of Risk | Legal/DPO, Finance | Internal Audit
Updating assessments | Risk Owners | Head of Function | Control Owners | Committee
Control/KRI linkage | Compliance Eng | Head of Compliance | SecOps/Data | Internal Audit
Publishing dashboards | Compliance Analytics | Head of Compliance | BI/Data Platform | Exec/Board
Review and decisions | Risk & Compliance Committee | Executive Sponsor | All Domains | Board

9) KRI and escalation thresholds

Examples of KRIs (linked to risks on the map):
  • Privacy: dsar_response_p95, deletion TTL, complaints/ombudsman cases.
  • Security: p95 TTR for vulnerabilities, share of red critical CCM rules, SoD violations.
  • Payments: chargeback ratio, fraud loss %, appeal win-rate.
  • Operations: SLO breach rate, P1/P2 incidents, RTO/RPO tests.
  • Escalation: Amber when a warning threshold is breached; Red means mandatory CAPA and "stop-the-line" for critical areas (see the sketch below).
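
A sketch of the amber/red logic; the threshold values below are hypothetical, not taken from the policy:

def escalation_status(value: float, warning: float, critical: float) -> str:
    """Classify a KRI reading where higher values are worse."""
    if value >= critical:
        return "Red: mandatory CAPA, stop-the-line for critical areas"
    if value >= warning:
        return "Amber: warning threshold breached"
    return "Green"

# Hypothetical thresholds for chargeback ratio (%): warning 0.65, critical 1.0
print(escalation_status(0.8, warning=0.65, critical=1.0))  # -> Amber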

10) Decision making and communication with CAPA

For each "red" point, an action plan is required: corrective/preventive actions, owner, deadline, budget, and success KPIs.

Threshold rules (example; a deadline sketch follows this list):
  • Critical: CAPA ≤ 30 days; re-audit in 60-90 days; committee review weekly.
  • High: CAPA ≤ 60 days; follow-up in 90 days.
  • Medium/Low: included in the quarterly/half-year plan.
  • If reduction is not feasible: a waiver with an expiration date and compensating controls.
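
A sketch computing the due dates implied by the threshold rules (90 days is chosen here for the Critical re-audit from the 60-90 day window):

from datetime import date, timedelta

CAPA_SLA_DAYS = {"Critical": 30, "High": 60}   # from the threshold rules above
FOLLOW_UP_DAYS = {"Critical": 90, "High": 90}  # re-audit/follow-up horizon

def capa_deadlines(severity: str, opened: date) -> dict:
    """Due dates for Critical/High; Medium/Low go to the planning cycle."""
    if severity not in CAPA_SLA_DAYS:
        return {"plan": "quarterly/half-year planning cycle"}
    return {"capa_due": opened + timedelta(days=CAPA_SLA_DAYS[severity]),
            "follow_up": opened + timedelta(days=FOLLOW_UP_DAYS[severity])}

print(capa_deadlines("Critical", date(2025, 1, 15)))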

11) Dashboards (minimum)

Heatmap View: current matrix + Residual/Target layers.
Risk Trend: Before/After CAPA.
Controls Linkage: CCM pass-rate by risk, red gates.
Regulatory Exposure: Risks by Jurisdiction and License.
Vendor Risk: heat map of critical providers (certificates, SLA, incidents).
Audit-Readiness: evidence completeness and hash receipts per risk.

12) Performance metrics

Risk Reduction Index: quarter-over-quarter Δ of the weighted-average risk score (see the sketch below).
On-time CAPA: % of actions completed on time (by severity).
Repeat Findings (12 months): share of recurrences among related risks.
Evidence Completeness: % of risks with a full evidence package.
Drift After Fix: cases of return to the "red" zone within 30-90 days.
Coverage: share of business assets/jurisdictions reflected on the map.
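
A sketch of the Risk Reduction Index as the quarter-over-quarter change in the volume-weighted average score (the entry shape is an assumption):

def risk_reduction_index(prev_quarter, curr_quarter):
    """Δ of the weighted-average risk score between two quarters.
    Each entry: (score 1-25, business-volume weight). Negative = improvement."""
    def weighted_avg(rows):
        total = sum(w for _, w in rows)
        return sum(score * w for score, w in rows) / total
    return weighted_avg(curr_quarter) - weighted_avg(prev_quarter)

# Example: weighted average fell from 10.5 to 9.0 -> RRI = -1.5
print(risk_reduction_index([(12, 2), (9, 2)], [(10, 2), (8, 2)]))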

13) SOP (standard procedures)

SOP-1: Procedure initialization

Define scales and thresholds → approve in the Committee → record in the repository (versioned).

SOP-2: Quarterly cycle

Collect input data/KRIs → recalculate scores → owner review → committee decisions → publish dashboards → export the "audit pack."

SOP-3: Incident trigger

After a Critical/High incident: an unscheduled map update, CAPA linkage, and a re-audit plan.

SOP-4: Vendor loop

VRM survey/certificates → vendor risk update → confirmation in the vendor mirror.

SOP-5: Archive and evidence

Heatmap snapshots (PDF/PNG/CSV) + hash receipts → WORM archive → links in GRC (see the hashing sketch below).
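
A sketch of producing a hash receipt for a snapshot before it enters the WORM archive (the file name is hypothetical):

import hashlib
from datetime import datetime, timezone
from pathlib import Path

def hash_receipt(snapshot: Path) -> dict:
    """SHA-256 receipt with a UTC timestamp for a PDF/PNG/CSV snapshot."""
    return {
        "file": snapshot.name,
        "sha256": hashlib.sha256(snapshot.read_bytes()).hexdigest(),
        "archived_at": datetime.now(timezone.utc).isoformat(),
    }

# Example (hypothetical file): the receipt is stored alongside the WORM object
# hash_receipt(Path("heatmap_2025Q1.pdf"))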

14) Artifact patterns

14.1 Risk card (fragment; example below)

  • ID/Name, Owner, Domain/Jurisdictions
  • Likelihood/Impact: Inherent/Residual/Target
  • Controls (IDs, metrics, CCM rules)
  • KRIs and actual values
  • CAPA/waivers: dates, budget, KPIs
  • Evidence links and hash receipts
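
A hypothetical risk card as a Python dict; all field names and values are illustrative, adapt them to your GRC schema:

risk_card = {
    "id": "R-042",  # hypothetical identifier
    "name": "PII retained beyond deletion TTL",
    "owner": "DPO",
    "domain": "Privacy/Data",
    "jurisdictions": ["MT", "NL"],
    "inherent": {"likelihood": 4, "impact": 4},              # 16 -> High
    "residual": {"likelihood": 3, "impact": 4},              # 12 -> High
    "target": {"likelihood": 2, "impact": 4, "due": "2025-12-31"},
    "controls": ["CTL-110", "CCM-RULE-77"],                  # IDs illustrative
    "kri": {"dsar_response_p95_days": 18},
    "capa": {"id": "CAPA-9", "due": "2025-09-30", "budget_eur": 25000},
    "evidence": ["worm://receipts/R-042/2025Q2.json"],       # link scheme assumed
}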

14.2 Scales policy (excerpt)

Likelihood:
1: p<5% / >3y
3: annual
5: monthly+
Impact (finance):
1: <€10k
3: €100k–€300k
5: >€1m
Escalation:
Critical: CAPA≤30d; Committee weekly
High: CAPA≤60d; Committee bi-weekly

14.3 Before/After report

Heatmap screenshots (Residual vs Target)

Δ table: changes by risk

Completed CAPAs and post-fix stability metrics

15) Antipatterns

"Beautiful picture" without reference to controls/KRI and CAPA.
Unclear scales → manipulation of estimates.
No versioning/evidence of score change.
Summarize disparate risks without aggregation rules.
Rare updates → map does not reflect reality.
Waivers without deadlines and compensatory measures.

16) Maturity model (M0-M4)

M0 Ad-hoc: one-off snapshot, no methodology/metrics.
M1 Planned: agreed scales, quarterly updates.
M2 Managed: link with controls/KRI, CAPA, dashboards, WORM archive.
M3 Integrated: automatic recalculation (CCM), policy-/assurance-as-code, slices by jurisdiction/vendor.
M4 Continuous Assurance: predictive KRIs, scenario modeling, what-if, priority recommendations.

17) Related wiki articles

Risk-Based Audit (RBA)

KPIs and compliance metrics

Continuous Compliance Monitoring (CCM)

Remediation Plans (CAPAs)

Re-audits and follow-up

Policy and compliance repository

Compliance Roadmap

Partner/VRM Compliance Guide

Summary

The risk heatmap is not a report but a management mechanism: uniform scales, linkage to controls and KRIs, regular updates, defensible decisions, and checks that risk stays down after remediation. This approach makes prioritization objective, speeds up committee decisions, and maintains continuous audit readiness.
