
Implement Consistent Hashing

Build a hash ring that distributes keys across nodes uniformly and minimizes key migration when the cluster topology changes. Naive modulo hashing remaps almost every key on each change — consistent hashing limits disruption to only 1/N of keys.
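The modulo problem is easy to demonstrate. A quick sketch (key names and node counts are illustrative):

```python
import hashlib

def owner_mod(key: str, n_nodes: int) -> int:
    # Naive placement: hash the key, then take it modulo the node count.
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % n_nodes

keys = [f"key-{i}" for i in range(10_000)]
before = {k: owner_mod(k, 10) for k in keys}
after = {k: owner_mod(k, 11) for k in keys}   # one node added
moved = sum(before[k] != after[k] for k in keys)
print(f"{moved / len(keys):.0%} of keys remapped")
```

Going from 10 to 11 nodes remaps roughly 10/11 of all keys (a key stays put only when its hash is congruent modulo both counts), versus ~1/N for a hash ring.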

How It Works

1. The Ring. Hash both nodes and keys onto a circle of positions (0 to 2³² − 1). Each key walks clockwise to find its owning node.

2. Virtual Nodes. Each physical node occupies 100–200 positions (e.g. node-A#42) to smooth distribution.

3. Minimal Disruption. Adding/removing a node only moves ~1/N of keys. Modulo hashing remaps almost everything.
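The three ideas above can be sketched with a sorted list and binary search. Node names, the MD5 hash, and the 150-vnode count are illustrative choices, not requirements:

```python
import bisect
import hashlib

def ring_hash(s: str) -> int:
    # Position on the 0 .. 2**32 - 1 circle.
    return int(hashlib.md5(s.encode()).hexdigest(), 16) % (1 << 32)

# Three physical nodes, 150 virtual nodes each.
ring = sorted(
    (ring_hash(f"{node}#{v}"), node)
    for node in ("node-A", "node-B", "node-C")
    for v in range(150)
)
positions = [pos for pos, _ in ring]

def clockwise_owner(key: str) -> str:
    # First virtual node at or after the key's position, wrapping past 2**32.
    i = bisect.bisect_left(positions, ring_hash(key)) % len(ring)
    return ring[i][1]

print(clockwise_owner("user:1234"))
```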

Your API

add_node(node_id)

Add a node by inserting its virtual-node positions into the ring. Idempotent: re-adding an existing node is a no-op.

remove_node(node_id)

Remove a node. Only the keys it owned move, each to the next node clockwise. No-op for unknown nodes.

get_node(key) → str|None

Return the node responsible for key, or None if the ring is empty. Deterministic for a given topology.

Scoring

Correctness: 30%
Distribution: 25%
Rebalancing: 25%
Scalability: 10%
Performance: 10%

Correctness is a gate: a score below 50% skips all other phases. Rebalancing is the hardest phase: fewer than 1% unnecessary remaps scores 100/100.
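The remap fraction is easy to measure: build a ring before and after a membership change and count owner changes. A sketch, with the hash and vnode count as assumptions:

```python
import bisect
import hashlib

def _h(s: str) -> int:
    return int(hashlib.md5(s.encode()).hexdigest(), 16) % (1 << 32)

def build_ring(nodes, vnodes=150):
    return sorted((_h(f"{n}#{v}"), n) for n in nodes for v in range(vnodes))

def owner(ring, key):
    i = bisect.bisect_left(ring, (_h(key), "")) % len(ring)
    return ring[i][1]

keys = [f"key-{i}" for i in range(10_000)]
nodes = [f"node-{i}" for i in range(10)]
before = build_ring(nodes)
after = build_ring(nodes + ["node-10"])     # add an 11th node
moved = sum(owner(before, k) != owner(after, k) for k in keys)
print(f"{moved / len(keys):.1%} of keys moved")
```

Only keys landing on the new node's arcs should move, so the fraction sits near 1/11 ≈ 9%; any key that moves between two surviving nodes is an unnecessary remap.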



Correctness Tests

10 functional checks: CRUD, edge cases, determinism, and error handling.


Key Distribution

10,000 keys distributed across 10 nodes. A lower coefficient of variation (CV) means a more uniform distribution.
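The uniformity metric is the coefficient of variation of per-node key counts. A sketch with hypothetical counts:

```python
import statistics

def coefficient_of_variation(counts):
    # CV = population stddev / mean; 0 means perfectly uniform shards.
    return statistics.pstdev(counts) / statistics.fmean(counts)

# Hypothetical per-node counts for 10,000 keys across 10 nodes.
counts = [1012, 987, 1048, 953, 1001, 996, 1034, 968, 1023, 978]
print(f"CV = {coefficient_of_variation(counts):.3f}")  # → CV = 0.028
```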


Rebalancing Efficiency

The hardest phase — how many keys move unnecessarily when the ring changes?


Scalability

50 nodes, 20K lookups — does the ring hold up at scale?

Performance

3,000 get_node calls benchmarked with microsecond precision.
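Percentile latencies can be taken with time.perf_counter; the stand-in get_node below is illustrative (swap in a real implementation):

```python
import bisect
import hashlib
import time

def _h(s: str) -> int:
    return int(hashlib.md5(s.encode()).hexdigest(), 16) % (1 << 32)

# Stand-in ring: 50 nodes, 150 virtual nodes each.
ring = sorted((_h(f"node-{n}#{v}"), f"node-{n}")
              for n in range(50) for v in range(150))

def get_node(key: str) -> str:
    return ring[bisect.bisect_left(ring, (_h(key), "")) % len(ring)][1]

samples = []
for i in range(3_000):
    t0 = time.perf_counter()
    get_node(f"key-{i}")
    samples.append((time.perf_counter() - t0) * 1e6)  # microseconds

samples.sort()
for p in (50, 95, 99):
    print(f"p{p}: {samples[int(len(samples) * p / 100) - 1]:.1f} µs")
```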
