Provider Capability Matrix
The provider capability matrix is a single catalog of normalized characteristics of external suppliers (gaming RGS/studios, PSPs, KYC/AML, fraud, communications). It lets you quickly answer: what is supported, where it is available, how reliable it is, what the risks are, and how much integration and operation cost.
The matrix serves product, architecture, compliance, and procurement teams for informed vendor selection, migration planning, and SLO control.
1) Scope
RGS/game providers: game types, jackpots, RTP/volatility, betting limits, responsible gaming features, bonus mechanics.
PSP/payments: methods, 3DS/SDK, routing, retries, currencies, fees, chargebacks.
KYC/AML: verification levels, data sources, SLA, accuracy, sanctions/PEP lists, price per check.
Fraud/risk: signals, real-time API/batch, explainability, A/B releases, regional restrictions.
Communications: email/SMS/push, templates, limits, deliverability, signatures.
2) Matrix dimensions (what we record)
1. Features and coverage
Feature categories (for example, for RGS: free spins, buy feature, jackpots, tournaments).
Bonus/wagering support, responsible gaming hooks (reality check, session limit).
For PSP: tokenization, PCI scope, recurring, payouts, split, reconciliation.
2. Protocols and integration
Transport: REST/gRPC/WebSocket, webhooks, format (JSON/Proto).
`Idempotency-Key`, ordering (by key), signatures (HMAC, mTLS).
Events: list and schemas, delivery guarantees, retries.
3. Reliability and performance
SLO/SLA (uptime, p95, p99), RPS/burst limits, queues, backoff, circuit breaker.
Quotas and rate limits per tenant, `Retry-After`.
4. Regionality and licences
Geography/jurisdiction, data residency, certification (GLI/eCOGRA/PCI/KYC-provider attestations).
Localization (languages/currencies/taxes/restrictions).
5. Safety and compliance
Encryption, keys/certificates, OAuth2/HMAC, audit log.
PII/card data: masking, tokenization, retention periods, GDPR/local laws.
6. Economics and TCO
Pricing model: fixed fee / per-transaction / revshare, minimum commitments, fees, free tier.
Integration cost estimate: time, team capacity, certification requirements.
7. Evolution and stability
Frequency of breaking changes, versioning policy, sandboxes/canaries, incident response time.
Roadmap compatibility with your goals.
8. Risks
Vendor lock-in, traffic concentration, dependence on a single region, legal risks.
Incident history, DLQ-rate/timeout-rate under your loads.
3) Unified rating scale
For comparability, use 0-3 scores and flags:
- 0 - not supported / not acceptable.
- 1 - basic support, significant limitations.
- 2 - advanced support; meets requirements with no headroom.
- 3 - excellent implementation, additional advantages.
Additional fields: `risk: low|medium|high`, `regions_allowed[]`, `notes`, `evidence` (link to the document/certificate in your internal database).
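As a minimal sketch, the scale and flags can be enforced with a small validator; the field names `scores` and `risk` are illustrative, not a fixed schema:

```python
ALLOWED_SCORES = {0, 1, 2, 3}
ALLOWED_RISK = {"low", "medium", "high"}

def validate_entry(entry: dict) -> list:
    """Return a list of validation errors for one matrix record."""
    errors = []
    for key, score in entry.get("scores", {}).items():
        if score not in ALLOWED_SCORES:
            errors.append(f"{key}: score {score} outside 0-3")
    if entry.get("risk") not in ALLOWED_RISK:
        errors.append(f"risk level {entry.get('risk')!r} is invalid")
    return errors

print(validate_entry({"scores": {"free_spins": 5}, "risk": "low"}))
# ['free_spins: score 5 outside 0-3']
```

Running such a check on every write keeps the catalog comparable, which is the whole point of the unified scale.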
4) Data schema (recommended)
```yaml
provider_id: "acme_rgs"
type: "RGS"          # RGS | PSP | KYC | FRAUD | COMMS
name: "Acme Gaming"
versions:
  api: ["v2", "v3"]
regions: ["eu", "uk", "ca", "latam"]
capabilities:
  rgs:
    games:
      slots: 3
      live_casino: 2
      table_games: 2
    features:
      free_spins: 3
      jackpots: { score: 2, type: ["network", "local"] }
      bonus_hooks: { score: 3, events: ["stake", "win", "session"] }
    rg_hooks:
      reality_check: 2
      session_limit: 2
protocols:
  transport: ["REST", "WebSocket"]
  webhooks: { score: 3, retry: "at-least-once", signature: "HMAC" }
  idempotency: { score: 3, header: "Idempotency-Key" }
reliability:
  sla_uptime_pct: 99.9
  p95_ms: 180
  rate_limit_rps: 500
security:
  mTLS: true
  oauth2: false
  pii_redaction: true
compliance:
  certifications: ["GLI-19"]
  data_residency: ["eu-central", "uk-south"]
pricing:
  model: "revshare"
  notes: "min monthly guarantee applies"
risk:
  vendor_lock: "medium"
  incident_history: { last12m: 2, major: 0 }
```
5) Relational model (minimal)
providers(id, type, name, status, created_at, updated_at)
provider_regions(provider_id, region, residency, allowed)
capability_groups(id, provider_id, group, key, score, meta_jsonb)
slas(provider_id, sla_name, target, unit)
security(provider_id, control, value)
pricing(provider_id, model, unit_cost, notes)
risks(provider_id, category, level, notes)
evidence(provider_id, kind, doc_ref, valid_until)
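The minimal model above can be sketched in SQLite for experimentation (types are simplified, and `group` must be quoted because it is a reserved word):

```python
import sqlite3

# Subset of the relational model; remaining tables follow the same pattern.
ddl = """
CREATE TABLE providers (
    id TEXT PRIMARY KEY, type TEXT, name TEXT, status TEXT,
    created_at TEXT, updated_at TEXT
);
CREATE TABLE provider_regions (
    provider_id TEXT REFERENCES providers(id),
    region TEXT, residency TEXT, allowed INTEGER
);
CREATE TABLE capability_groups (
    id INTEGER PRIMARY KEY,
    provider_id TEXT REFERENCES providers(id),
    "group" TEXT, key TEXT, score INTEGER, meta_jsonb TEXT
);
CREATE TABLE risks (
    provider_id TEXT REFERENCES providers(id),
    category TEXT, level TEXT, notes TEXT
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.execute(
    "INSERT INTO providers VALUES "
    "('acme_rgs','RGS','Acme Gaming','active','2024-01-01','2024-01-01')"
)
row = conn.execute("SELECT name FROM providers WHERE id='acme_rgs'").fetchone()
print(row[0])  # Acme Gaming
```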
6) Reports/slices that are really needed
Provider selection for a market: filter by `region`, `data_residency`, `license`.
Technical compatibility: only providers with webhooks + idempotency + HMAC/mTLS.
Performance: `p95 ≤ X`, `rate_limit ≥ Y`, version stability.
RGS bonus mechanics: presence of `free_spins`, `jackpots`, `bonus_hooks`.
Payments: methods `PIX`, `PayID`, cards, crypto; payouts ≤ N hours.
Risks: `risk.level != high`, `incident_history.last12m <= 3`.
Economics: `revshare ∈ [X; Y]` or `CPT ≤ Z`, available discounts.
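A sketch of such a slice over in-memory records; the field names and thresholds are illustrative, not a fixed API:

```python
providers = [
    {"id": "acme_rgs", "regions": ["eu", "uk"], "webhooks": 3, "idempotency": 3,
     "p95_ms": 180, "risk": "medium", "incidents_12m": 2},
    {"id": "beta_psp", "regions": ["latam"], "webhooks": 1, "idempotency": 0,
     "p95_ms": 420, "risk": "high", "incidents_12m": 5},
]

def technically_compatible(p: dict, max_p95: int = 300) -> bool:
    """Slice: webhooks and idempotency scored >= 2, p95 under threshold, risk not high."""
    return (p["webhooks"] >= 2 and p["idempotency"] >= 2
            and p["p95_ms"] <= max_p95 and p["risk"] != "high")

shortlist = [p["id"] for p in providers if technically_compatible(p)]
print(shortlist)  # ['acme_rgs']
```

In production the same predicates would run as SQL over the relational model; the logic is identical.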
7) Capability tests (automatic validation)
The idea: every capability is backed by a test case and/or a sandbox "trial run."
Examples:
- Idempotency: two identical requests with one `Idempotency-Key` → one effect.
- Webhooks: delivery of duplicates / out-of-order events → the adapter deduplicates and preserves per-key order.
- Rate limit: withstand a burst and observe `Retry-After`.
- RGS features: free spins → correct `stake`/`win` events; the RTP window fits the contract.
- PSP payouts: SLA met on time, reconciliation is correct.
Store the test results next to the provider record: `last_run_at`, `passed`, `failures[]`.
8) Implementation and Upgrade Process
1. Source collection: documentation, certification checklists, sandboxes, contact persons.
2. Normalization: map vendor terms to the internal dictionary (via an ACL).
3. Scoring: fill in the matrix, run the capability tests.
4. Decision: provider selection via the weight model (see below).
5. Integration: feature flags, canary rollout by tenants/markets, SLA threshold alerts.
6. Operation: metrics, incident reports, quarterly score review.
7. Exit/migration: offboarding criteria, traffic migration plan.
9) Selection weight model (example)
```yaml
weights:
  capabilities.features: 0.25
  protocols.reliability: 0.20
  security.compliance: 0.15
  region_coverage: 0.15
  economics.tco: 0.15
  vendor_risk: 0.10
decision:
  score: "Σ(weight_i * normalized_score_i)"
  thresholds:
    adopt: "score >= 0.75"
    pilot: "0.60 <= score < 0.75"
    monitor: "0.45 <= score < 0.60"
    reject: "score < 0.45"
```
Normalize both the 0-3 scores and the numeric metrics (min-max or z-score) before weighting.
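The weight model can be sketched as follows, normalizing 0-3 scores to [0, 1]; the weights and thresholds are taken from the example above:

```python
WEIGHTS = {
    "capabilities.features": 0.25,
    "protocols.reliability": 0.20,
    "security.compliance": 0.15,
    "region_coverage": 0.15,
    "economics.tco": 0.15,
    "vendor_risk": 0.10,
}

def decide(scores_0_3: dict) -> tuple:
    """Weighted sum of normalized scores, mapped to a decision bucket."""
    total = sum(WEIGHTS[k] * (scores_0_3.get(k, 0) / 3.0) for k in WEIGHTS)
    if total >= 0.75:
        verdict = "adopt"
    elif total >= 0.60:
        verdict = "pilot"
    elif total >= 0.45:
        verdict = "monitor"
    else:
        verdict = "reject"
    return round(total, 3), verdict

print(decide({k: 3 for k in WEIGHTS}))  # (1.0, 'adopt')
```

Missing dimensions default to 0 here, which penalizes incomplete records; an alternative is to reject entries with missing scores outright.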
10) UI/directory: what should be in the interface
Filters: type, region, SLA, features, security, price/model.
Side-by-side comparison of 2-4 providers in a table, with differences highlighted.
Risk badges: High/Medium/Low, with explanations.
Changelog, certificate expiry dates, date of the last capability test.
"Export" button (CSV/JSON) and "create integration" (linked to the task tracker).
11) Observability in the product (feed the matrix with facts)
Tech metrics: successes/errors by class, p95/p99, DLQ rate, redrive success, circuit-breaker openings.
Business metrics: deposit/payout conversion, limit rejections, KYC pass time.
Incidents: MTTR/MTBF per provider, root cause, follow-ups.
Synchronization: auto-load facts into the matrix (daily), recalculate scores.
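One way to feed facts back into the matrix is a mapping from observed production metrics to a 0-3 score; the thresholds below are illustrative and should be tuned to your SLOs:

```python
def reliability_score(p95_ms: float, dlq_rate: float) -> int:
    """Map observed production facts to a 0-3 reliability score.
    Thresholds are examples, not a standard."""
    if p95_ms <= 200 and dlq_rate <= 0.001:
        return 3
    if p95_ms <= 400 and dlq_rate <= 0.01:
        return 2
    if p95_ms <= 800:
        return 1
    return 0

print(reliability_score(180, 0.0005))  # 3
```

A daily job would run this over the metrics store and write the result into `capability_groups`, keeping the matrix honest against reality.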
12) Versioning and Change Management
Each entry has `schema_version`, `capabilities_version`, `reviewed_at`, `reviewer`.
A breaking change creates a draft vNext; compare vCurrent vs vNext.
Use canary flags and soft SLO thresholds until the update is fully rolled out.
Expiring certificates/keys → alerts 30/7/1 days ahead.
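The 30/7/1-day expiry alerts can be sketched against `evidence` records (record structure assumed from the relational model in section 5):

```python
from datetime import date, timedelta

ALERT_DAYS = (30, 7, 1)  # alert horizons

def expiry_alerts(evidence: list, today: date) -> list:
    """Return (provider_id, doc_ref, days_left) for items inside an alert horizon."""
    out = []
    for item in evidence:
        days_left = (item["valid_until"] - today).days
        if 0 <= days_left <= max(ALERT_DAYS):
            out.append((item["provider_id"], item["doc_ref"], days_left))
    return out

today = date(2025, 1, 1)
evidence = [
    {"provider_id": "acme_rgs", "doc_ref": "GLI-19-cert",
     "valid_until": today + timedelta(days=14)},
    {"provider_id": "beta_psp", "doc_ref": "PCI-AOC",
     "valid_until": today + timedelta(days=90)},
]
print(expiry_alerts(evidence, today))  # [('acme_rgs', 'GLI-19-cert', 14)]
```

Already-expired documents (negative `days_left`) would be escalated separately rather than alerted on this path.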
13) Security and access
RLS: access to the matrix by role (architecture, compliance, product, procurement).
Audit log: who changed the scores/risks/evidence.
Do not store PII/secrets; keep references to Vault/KMS instead.
14) Typical errors
Comparing "by marketing" rather than by contracts and tests.
No term normalization → comparison is impossible.
No weights or thresholds → decisions are emotional.
The matrix is static → it ignores real p95/DLQ in production.
Ignoring regional restrictions and residency.
The same limits for all tenants → a "noisy" client breaks SLO.
15) Playbooks
Provider fails a capability test: record the gap, open a ticket with the provider, set status to `pilot`/`reject`.
Timeout/5xx growth: activate throttling, open the circuit breaker, switch traffic to a backup via the matrix.
Commercial changes (tariffs): update `pricing`, recalculate TCO, revisit the "economics" weights.
Regulatory change: update `regions`/`licensing`, block markets by flag, launch migrations.
16) Checklist before matrix start-up
- Glossary of terms and the 0-3 scale approved.
- Key dimensions completed (features, protocols, SLAs, security, regions, pricing, risk).
- Capability tests and daily sync of production metrics configured.
- Weights and `adopt/pilot/monitor/reject` thresholds defined.
- Change auditing and RLS access enabled.
- Exports and dashboards for comparing 2-4 providers available.
- Alerts for certificate expiry and SLO degradation configured.
- Review process documented (quarterly/per incident).
Conclusion
The provider capability matrix turns vendor selection and management into engineering practice rather than guesswork. Normalize the language, capture facts, automate validations, and rely on production metrics so that decisions are fast, comparable, and transparent to product, architecture, and compliance.