GH GambleHub

KPIs and compliance metrics

1) Why compliance metrics

Metrics translate requirements and risks into manageable goals. A good KPI/KRI system:
  • makes compliance status transparent and comparable over time;
  • links compliance to business results (fewer losses, fines, and release delays);
  • lets you manage priorities and resources based on facts, not feelings;
  • simplifies audits: formulas, sources, and immutable artifacts (evidence) are traceable.
Terms:
  • KPI - key performance indicators (process efficiency).
  • KRI - key risk indicators (probability/impact of events).
  • SLO/SLA - target service levels / contractual commitments.
  • Leading vs Lagging - indicators that predict deviations vs record them after the fact.

2) Metrics map by domain (reference matrix)

Domain | KPI/KRI | Type | Formula (brief) | Target (example)
Policies/Training | Assessment coverage | KPI | completed_courses / must_complete | ≥ 95% per quarter
Policies/Training | Policy MTTU | KPI | t_publication − t_trigger | ≤ 30 days
Access/IAM | Access Hygiene | KPI | obsolete_rights / all_rights | ≤ 2%
Access/IAM | SoD Violations | KRI | number of toxic combinations | 0 (critical)
Data/Privacy | DSAR SLA on time | KPI | on_time / total | ≥ 98%
Data/Privacy | TTL Violations | KRI | objects_over_TTL | down to zero
Infra/Cloud/IaC | Drift Rate | KPI | drifts / month | downward trend
Infra/Cloud/IaC | Encryption Coverage | KPI | encrypted_resources / all | 100%
DevSecOps/Code | Secrets in Repos | KRI | secret_leaks / month | 0 critical
DevSecOps/Code | License Compliance | KPI | packages_with_noncompliant_license | 0
AML/Transactions | STR/SAR Timeliness | KPI | on_time / total | ≥ 99%
AML/Transactions | AML False Positive Rate | KPI | false_alerts / all_alerts | ≤ 10% (with context)
Incidents/Audits | Time-to-Remediate Findings | KPI | median t_closure | ≤ 30 days (High)
Incidents/Audits | Repeat Findings | KRI | % repeats within 12 months | ≤ 5%

3) Compliance North Star

1. Audit-ready in N hours (all evidence collected automatically).
2. Zero Critical Violations.
3. ≥ 90% Coverage with automated controls (policy-as-code + CCM).

4) Taxonomy of metrics

4.1 Coverage

Control Coverage: controlled systems/all critical systems.
Evidence Coverage: artifacts collected/by audit checklist.
Policy Adoption: processes with requirements implemented / all target processes.

4.2 Effectiveness (efficacy of controls)

Pass Rate of control tests: passed/total period tests.
FPR/TPR (false/true positive rates) for detective rules.
Incidents Prevented: cases prevented by preventive controls.

4.3 Efficiency (cost/speed)

MTTD/MTTR for violations: time to detect / time to remediate.
Cost per Case (AML/DSAR): hours × rate + infrastructure costs.
Automation Ratio: automated resolutions / all resolutions.

4.4 Timeliness

Execution SLA (DSAR/STR/training): on time/total.
Policy Lead Time: from trigger to publication.
Change Lead Time (DevSecOps gates): from PR to release for compliance checks.

4.5 Quality (data/process quality)

Evidence Integrity: % of artifacts in WORM storage with checksums.
Data Defects: errors in regulatory reporting/reports.
Training Score: average test score; % passing on the first attempt.

4.6 Risk Impact

Risk Reduction Index: ∆ of total risk rate after remediation.
Regulatory Exposure: Open Critical Gaps vs License/Certification Requirements.
$ Avoided Losses (estimated): Penalties/losses averted by closing gaps.

5) Formulas and examples of calculations

5.1 DSAR SLA

`DSAR_SLA = (requests closed ≤ 30 days) / (total requests)`

Target: ≥ 98%; yellow 95–97.9%; red < 95%.
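The formula and its color zones can be sketched in Python (a minimal illustration; the function names and the zero-requests convention are assumptions, not part of any spec):

```python
def dsar_sla(closed_on_time: int, total: int) -> float:
    """DSAR_SLA = requests closed within 30 days / total requests."""
    if total == 0:
        return 1.0  # assumption: no requests in the period counts as compliant
    return closed_on_time / total

def dsar_zone(rate: float) -> str:
    """Map the SLA rate to a traffic-light zone using this document's thresholds."""
    if rate >= 0.98:
        return "green"
    if rate >= 0.95:
        return "yellow"
    return "red"

print(dsar_zone(dsar_sla(96, 100)))  # 0.96 -> "yellow"
```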

5.2 Access Hygiene

`AH = obsolete_rights (no owner / expired) / all_rights`

Threshold: ≤ 2% (red zone > 5%).

5.3 Drift Rate (IaC/Cloud)

`DR = drifts (IaC↔actual mismatches) / month`

Trend: steady decline for 3 consecutive months.

5.4 Time-to-Remediate (by severity)

High: median ≤ 30 days; Critical: ≤ 7 days. Overdue → auto-escalation.
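A hedged sketch of the median-by-severity check with the targets above (the `findings` records and the `overdue_medians` helper are hypothetical):

```python
from statistics import median

# Hypothetical finding records: (severity, days from detection to closure)
findings = [
    ("critical", 5), ("critical", 12),
    ("high", 12), ("high", 25), ("high", 40),
]

TARGET_DAYS = {"critical": 7, "high": 30}  # thresholds from this section

def overdue_medians(records):
    """Return severities whose median time-to-remediate exceeds its target."""
    by_sev = {}
    for sev, days in records:
        by_sev.setdefault(sev, []).append(days)
    return {sev: median(d) for sev, d in by_sev.items()
            if median(d) > TARGET_DAYS.get(sev, float("inf"))}

# critical median is 8.5 days > 7 -> triggers auto-escalation
print(overdue_medians(findings))  # {'critical': 8.5}
```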

5.5 AML FPR

`FPR = false_positive_alerts / all_alerts`

Balance against TPR and case-handling cost.

5.6 Evidence Coverage (audit)

`EC = collected_artifacts / required_by_checklist`

Target: 100% by audit D-day; operational target ≥ 95% continuously.

6) Data and evidence sources (evidence)

Compliance DWH data mart: DSAR, Legal Hold, TTL, audit logs, alerts.
IAM/IGA: roles, owners, attestation campaigns.
CI/CD/DevSecOps: SAST/DAST/SCA, secret scan, licenses, gates.
Cloud/IaC: snapshots of configs, drift reports, KMS/HSM logs.
SIEM/SOAR/DLP/EDR: correlations, playbooks, blocking actions.
GRC: register of requirements, controls, waivers and audits.
WORM/Object Lock: immutable artifact archive + checksums.

7) Dashboards (minimum set)

1. Compliance Heatmap - Systems × regulations × status.
2. SLA Center - DSAR/STR/training: deadlines, delinquencies, forecast.
3. Access & SoD - toxic roles, orphan accounts, progress of attestation.
4. Retention & Deletion - TTL violations, Legal Hold locks, trends.
5. Infra/Cloud Drift - IaC inconsistencies, encryption, segmentation.
6. Findings Pipeline - open/overdue/closed by owner and severity.
7. Audit Readiness - evidence coverage and time to "one-button" readiness.

Color zones (example):
  • Green - target met/stable.
  • Yellow - risk of deviation, plan required.
  • Red - critical deviation, immediate escalation.

8) OKR link (example quarter)

Objective: Reduce regulatory and operational risk without slowing down releases.

KR1: Increase Coverage of automated controls from 72% → 88%.
KR2: Reduce Access Hygiene from 4.5% → ≤ 2%.
KR3: 99% DSAR on time; median response ≤ 10 days.
KR4: Cloud Drift Rate −40% QoQ.
KR5: Time-to-Audit-Ready ≤ 8 hours (dry-run).

9) RACI for metrics

Role | Area of responsibility
Head of Compliance / DPO (A) | Target KPI/KRI selection, thresholds, reporting updates
Compliance Analytics (R) | Models, formulas, data marts, dashboards
Data Platform (R) | Pipelines, data quality, WORM evidence archive
SecOps/Cloud Sec (C) | Drift, encryption, SOAR playbooks
IAM/IGA (C) | Attestations, SoD, access owners
Product/DevSecOps (C) | Gates, vulnerabilities, secret scanning
GRC (R/C) | Register of requirements/controls, waivers
Internal Audit (I) | Verification of calculations and sources

10) Measurement frequency and procedures

Daily: CCM alerts, drift, secrets, critical incidents.
Weekly: DSAR/STR SLAs, DevSecOps gates, Access Hygiene.
Monthly: pass rate controls, repeated findings, Evidence Coverage.
Quarterly: OKR-summary, Risk Reduction Index, audit-rehearsal (dry-run).

Threshold review procedure: analyze trends, cost, and risk; threshold changes go through the Board.

11) Quality of metrics: rules

Unified semantics: dictionary of terms and SQL templates.
Formula versioning: "metric as code" (repository + review).
Reproducibility check: reperform scripts for auditors.
Immutability of artifacts: WORM + hash chains.
Privacy: minimization, masking, access control for KPI data marts.
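The "WORM + hash chains" rule can be illustrated with a minimal Python sketch (illustrative only; the genesis value and artifact names are assumptions):

```python
import hashlib

def chain_digest(prev_hash: str, artifact_bytes: bytes) -> str:
    """Next link in an evidence hash chain: SHA-256(prev_hash || artifact)."""
    h = hashlib.sha256()
    h.update(prev_hash.encode())
    h.update(artifact_bytes)
    return h.hexdigest()

# Chain three hypothetical evidence artifacts; storing each link's digest
# alongside the artifact in the WORM archive lets auditors re-verify the
# whole sequence (tampering with any artifact breaks every later digest).
digest = "0" * 64  # genesis value (an assumption for this sketch)
for artifact in [b"dsar_report_q1.pdf", b"iam_attestation.csv", b"drift_scan.json"]:
    digest = chain_digest(digest, artifact)
print(digest)
```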

12) Query examples (SQL/pseudo)

12.1 DSAR SLA (30 days):

```sql
-- BigQuery dialect: COUNTIF counts rows matching the condition
SELECT
  COUNTIF(closed_at <= created_at + INTERVAL 30 DAY) / COUNT(*) AS dsar_sla_rate
FROM dsar_requests
WHERE created_at BETWEEN @from AND @to;
```

12.2 Access Hygiene:

```sql
SELECT
  SUM(CASE WHEN owner IS NULL OR expires_at < CURRENT_DATE THEN 1 ELSE 0 END)
    / COUNT(*) AS access_hygiene
FROM iam_entitlements
WHERE system_critical = TRUE;
```

12.3 Drift (Terraform vs actual):

```sql
SELECT COUNT(*) AS drifts
FROM drift_detections
WHERE detected_at BETWEEN @from AND @to
  AND severity IN ('high', 'critical');
```

13) Thresholds (reference examples, adapt)

Metric | Green | Yellow | Red
DSAR SLA | ≥ 98% | 95–97.9% | < 95%
Access Hygiene | ≤ 2% | 2.01–5% | > 5%
Drift Rate (high/crit) | ≤ 5/month | 6–15/month | > 15/month
Evidence Coverage | 100% | 95–99.9% | < 95%
Pass Rate Controls | ≥ 97% | 90–96.9% | < 90%
Time-to-Audit-Ready | ≤ 8 h | 8–24 h | > 24 h
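One way to encode the reference thresholds as a reusable classifier (a sketch: the `THRESHOLDS` dictionary and the direction convention are illustrative, and percentages are expressed as fractions):

```python
# (green bound, red bound, direction) per metric.
# direction "min": higher is better; "max": lower is better.
THRESHOLDS = {
    "dsar_sla":          (0.98, 0.95, "min"),
    "access_hygiene":    (0.02, 0.05, "max"),
    "evidence_coverage": (1.00, 0.95, "min"),
}

def zone(metric: str, value: float) -> str:
    """Classify a metric value into green/yellow/red per the table above."""
    green, red, direction = THRESHOLDS[metric]
    if direction == "min":          # e.g. DSAR SLA: higher is better
        if value >= green:
            return "green"
        return "yellow" if value >= red else "red"
    else:                           # e.g. Access Hygiene: lower is better
        if value <= green:
            return "green"
        return "yellow" if value <= red else "red"

print(zone("access_hygiene", 0.03))  # -> "yellow"
```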

14) Antipatterns

Metrics "for the report" with no owner or action plan.
Mixing formula versions → incomparable trends.
Coverage without effectiveness: high Coverage but high drift and repeat findings.
Ignoring the cost of false positives (FPR) in AML/CCM.
Metrics without risk context (no link to KRIs and license requirements).

15) Checklists

KPI system startup

  • Metrics dictionary and single "metrics as code" repository.
  • Assigned owners (RACI) and refresh rates.
  • Sources and the Compliance data mart connected.
  • Dashboards, color zones, SLO/SLAs, and escalations configured.
  • WORM archive and report checksums in place.
  • Audit dry-run with reperformance.

Before quarterly report

  • Verify formulas; check for anomalies.
  • Update thresholds that are close to regulatory limits.
  • Cost/benefit analysis of FPR vs TPR.
  • Improvement plan for red zones.

16) Metrics maturity model (M0-M4)

M0 Manual accounting: Excel spreadsheets, irregular reports.
M1 Catalogue: single data mart, basic SLAs and trends.
M2 Automated: real-time dashboards, escalation.
M3 Orchestrated: policy-as-code, CCM, auto-evidence, reperform.
M4 Continuous Assurance: "audit-ready by button," predictive (ML) risk metrics.

17) Related wiki articles

Continuous Compliance Monitoring (CCM)

Compliance and reporting automation

Risk-based audit

Policies and Procedures Lifecycle

Legal Hold and Data Freeze

DSAR: user requests for data

Data Retention and Deletion Schedules

Summary

Strong compliance KPIs mean clear formulas, reliable sources, owners and thresholds, an automated data mart, and defined actions on deviation. This makes compliance a predictable service with a measurable impact on business risk and delivery speed.
