Self-Balancing Trees
Self-balancing trees are a class of binary search trees (BSTs) that automatically maintain their height close to optimal, ensuring efficient operations like search, insert, and delete.
1. Prerequisites
Before understanding self-balancing trees, you must grasp these foundational concepts:
- Binary Search Trees (BSTs): A hierarchical data structure where each node follows the rule: left subtree contains smaller values, right subtree contains larger values.
- Tree Height: The longest path from root to a leaf, crucial for efficiency.
- Time Complexity: In unbalanced BSTs, operations can degrade to O(n), but self-balancing trees ensure O(log n) complexity.
- Rotation Operations: A key mechanism used in balancing, involving left and right rotations to maintain structure.
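To make the height point concrete: a plain (unbalanced) BST fed keys in sorted order degenerates into a linked list, so its height grows linearly with n. A standalone sketch (the `Node`/`bst_insert`/`height` helpers here are defined just for this illustration):

```python
# A plain BST degrades to a linked list on sorted input: one level per key.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def height(node):
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

root = None
for k in range(1, 101):   # sorted input: worst case for a plain BST
    root = bst_insert(root, k)

print(height(root))   # 100 -- a balanced tree of 100 keys needs height only 7
```

This is exactly the degradation self-balancing trees exist to prevent.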
2. What is a Self-Balancing Tree?
A self-balancing tree dynamically maintains a near-optimal height by applying balance operations during insertion and deletion. This prevents performance degradation due to skewed structures.
Popular self-balancing trees include:
- AVL Tree: Ensures balance using height difference (balance factor) and performs rotations to restore balance.
- Red-Black Tree: Uses color properties and rotation rules to balance while ensuring O(log n) complexity.
- B-Trees: A generalization used in databases to handle large amounts of data efficiently.
- Splay Trees: Uses access-based self-adjustment to improve access to frequently used nodes.
3. Why Does This Algorithm Exist?
Self-balancing trees are used where fast and consistent performance is required:
- Database Indexing: B-Trees are heavily used in databases to ensure quick searches and updates.
- File Systems: Many modern file systems (like ext4) use balanced trees for efficient storage lookup.
- Networking & Routing: Red-Black Trees help implement routing tables for consistent performance.
- Compiler Optimization: Symbol tables in compilers use AVL Trees or Red-Black Trees for fast lookups.
- Memory Allocation: Used in dynamic memory allocation (e.g., Linux kernel's memory management).
4. When Should You Use a Self-Balancing Tree?
Use self-balancing trees when:
- Frequent Insertions and Deletions: Unlike static arrays or simple BSTs, self-balancing trees handle dynamic updates efficiently.
- Consistent O(log n) Time Complexity is Needed: Ideal when worst-case guarantees matter.
- Ordered Data Structure is Required: Suitable for applications that require in-order traversal.
- Memory Constraints Are Important: Unlike hash tables, trees allocate exactly one node per key, with no over-provisioned bucket array or costly rehashing, while still keeping keys ordered.
5. How Does It Compare to Alternatives?
5.1 Strengths
- Guaranteed O(log n) operations (unlike unbalanced BSTs which can degrade to O(n)).
- Sorted Order Maintenance: Unlike hash tables, trees allow in-order traversal.
- Efficient Range Queries: Faster than hash tables for ordered queries.
- Scalability: Used in large-scale databases and file systems.
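To illustrate the range-query advantage: ordered storage lets you binary-search for the boundaries of [lo, hi] and then walk the keys in between, which a hash table cannot do without a full scan. A stand-in sketch using Python's `bisect` on a sorted list (mimicking a tree's in-order sequence; `range_query` is a name chosen for this example):

```python
import bisect

# Stands in for the in-order key sequence of a balanced tree.
keys = sorted([15, 3, 42, 27, 8, 31, 50])

def range_query(sorted_keys, lo, hi):
    """Return all keys in [lo, hi]: O(log n) to locate, O(k) to collect."""
    i = bisect.bisect_left(sorted_keys, lo)    # first key >= lo
    j = bisect.bisect_right(sorted_keys, hi)   # first key > hi
    return sorted_keys[i:j]

print(range_query(keys, 10, 35))   # [15, 27, 31]
```

A balanced BST answers the same query in O(log n + k) while also supporting O(log n) inserts and deletes, which a flat sorted list does not.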
5.2 Weaknesses
- Higher Overhead: Requires rotations and balance maintenance, making insertion/deletion slower than unbalanced BSTs in some cases.
- More Complex Implementation: Rotations and color changes in Red-Black Trees add complexity.
- Suboptimal for Some Use Cases: Hash tables provide O(1) lookup time for exact matches but lack order guarantees.
6. AVL Tree Implementation (Python)
An AVL Tree is a self-balancing binary search tree where the height difference between the left and right subtrees (balance factor) is at most 1. Rotations (left, right, left-right, right-left) maintain balance.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1  # New node is initially at height 1

class AVLTree:
    # Get height of a node
    def get_height(self, node):
        return node.height if node else 0

    # Get balance factor
    def get_balance(self, node):
        return self.get_height(node.left) - self.get_height(node.right) if node else 0

    # Right rotate
    def right_rotate(self, y):
        x = y.left
        T2 = x.right
        # Perform rotation
        x.right = y
        y.left = T2
        # Update heights
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        return x  # New root

    # Left rotate
    def left_rotate(self, x):
        y = x.right
        T2 = y.left
        # Perform rotation
        y.left = x
        x.right = T2
        # Update heights
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        return y  # New root

    # Insert a node
    def insert(self, root, key):
        # Step 1: Perform standard BST insert
        if not root:
            return Node(key)
        elif key < root.key:
            root.left = self.insert(root.left, key)
        else:
            root.right = self.insert(root.right, key)
        # Step 2: Update height
        root.height = max(self.get_height(root.left), self.get_height(root.right)) + 1
        # Step 3: Get balance factor and rebalance if needed
        balance = self.get_balance(root)
        # Left Heavy (Right Rotation)
        if balance > 1 and key < root.left.key:
            return self.right_rotate(root)
        # Right Heavy (Left Rotation)
        if balance < -1 and key > root.right.key:
            return self.left_rotate(root)
        # Left-Right Case (Left Rotate then Right Rotate)
        if balance > 1 and key > root.left.key:
            root.left = self.left_rotate(root.left)
            return self.right_rotate(root)
        # Right-Left Case (Right Rotate then Left Rotate)
        if balance < -1 and key < root.right.key:
            root.right = self.right_rotate(root.right)
            return self.left_rotate(root)
        return root  # Return unchanged node
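To see the rotation mechanics in isolation, here is a minimal free-function sketch of a right rotation applied to a left-skewed three-node chain (the `h` and `right_rotate` names here are standalone re-definitions for this example, not the class methods above):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def h(node):
    return node.height if node else 0

def right_rotate(y):
    x = y.left
    t2 = x.right
    x.right = y          # pivot: y becomes x's right child
    y.left = t2          # x's old right subtree moves under y
    y.height = max(h(y.left), h(y.right)) + 1
    x.height = max(h(x.left), h(x.right)) + 1
    return x             # x is the new subtree root

# Build the left-skewed chain 30 -> 20 -> 10
root = Node(30)
root.left = Node(20)
root.left.left = Node(10)
root.left.height = 2
root.height = 3

root = right_rotate(root)
print(root.key, root.left.key, root.right.key)   # 20 10 30
print(root.height)                               # 2
```

One O(1) pointer rearrangement shrinks the subtree height from 3 to 2 while preserving in-order order (10, 20, 30).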
7. Dry Run on Sample Input
Insert sequence: 30, 20, 40, 10, 25, 35, 50
Step-by-step execution:
7.1 Insert 30
- Tree is empty, 30 becomes root.
7.2 Insert 20
- 20 < 30, insert as left child.
- Balance factor at 30 = 1 (left-heavy but within the AVL limit).
7.3 Insert 40
- 40 > 30, insert as right child.
- Balance factor at 30 = 0 (perfectly balanced).
7.4 Insert 10
- 10 < 20 < 30, insert as left child of 20.
- Balance factor at 30 = 2 - 1 = 1; still within the limit, so no rotation is needed.
7.5 Insert 25
- 25 > 20 but < 30, insert as right child of 20.
- Balance factor at 30 = 1 (balanced, no rotation).
7.6 Insert 35
- 35 > 30 but < 40, insert as left child of 40.
- Balance factor at 30 = 0 (balanced).
7.7 Insert 50
- 50 > 40, insert as right child of 40.
- Balance factor at 30 = 0 (balanced).
Final AVL Tree (this particular sequence produces a perfect tree, so no rotations are ever triggered):
        30
       /  \
     20    40
    / \    / \
  10  25  35  50
To actually see a rotation with this tree, continue by inserting 5 and then 3: 3 becomes the left child of 5, the balance factor at node 10 reaches 2 with 3 < 5 (a Left-Left case), and a right rotation at 10 restores balance:
      10            5
     /             / \
    5      →      3   10
   /
  3
8. Variable Tracking During Execution
Step | Node Inserted | Root | Balance Factor at Root | Rotation |
---|---|---|---|---|
1 | 30 | 30 | 0 | None |
2 | 20 | 30 | 1 | None |
3 | 40 | 30 | 0 | None |
4 | 10 | 30 | 1 | None |
5 | 25 | 30 | 1 | None |
6 | 35 | 30 | 0 | None |
7 | 50 | 30 | 0 | None |
No step in this sequence pushes any balance factor beyond ±1, so the Rotation column stays None throughout.
9. Quick Recap
- AVL Trees maintain O(log n) operations by ensuring a balanced structure.
- Rotations (left, right, left-right, right-left) keep balance intact.
- Real-time balance tracking ensures worst-case O(log n) insertions and deletions.
10. Time & Space Complexity Analysis
Self-balancing trees ensure that the height remains logarithmic, maintaining efficient operations. Let n be the number of nodes in the tree.
11. Time Complexity Analysis
11.1 Search Complexity
In a Binary Search Tree (BST), search complexity depends on tree height. For an unbalanced BST, it can degrade to O(n), but in a self-balancing tree:
- Best Case (O(1)): When the root is the searched key.
- Average Case (O(log n)): Height is maintained at O(log n), so search follows logarithmic time.
- Worst Case (O(log n)): Even in the worst scenario, height remains O(log n).
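The worst-case bound follows from a counting argument: an AVL tree of height h must contain at least N(h) = N(h-1) + N(h-2) + 1 nodes (the root plus a minimal subtree of each allowed height), a Fibonacci-style recurrence that forces h ≈ 1.44 log2 n at worst. A quick numerical check (helper names chosen for this sketch):

```python
import math

# Minimum number of nodes in an AVL tree of height h:
# N(0) = 0, N(1) = 1, N(h) = N(h-1) + N(h-2) + 1.
def min_nodes(h):
    a, b = 0, 1          # N(0), N(1)
    for _ in range(h - 1):
        a, b = b, a + b + 1
    return b if h >= 1 else 0

# Largest height an AVL tree with one million nodes can have:
n = 1_000_000
worst_height = 0
while min_nodes(worst_height + 1) <= n:
    worst_height += 1

print(worst_height)                   # 28 -- worst-case AVL height
print(math.ceil(math.log2(n + 1)))   # 20 -- height of a perfectly balanced tree
```

Even in the worst case, one million keys sit at height 28, so searches touch at most a few dozen nodes.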
11.2 Insertion Complexity
Insertion consists of BST insertion (O(log n)) and rebalancing operations.
- Best Case (O(log n)): No rotations needed.
- Average Case (O(log n)): Insertions require at most one or two rotations.
- Worst Case (O(log n)): Height remains O(log n), and at most one rebalancing (a single or double rotation) is required.
11.3 Deletion Complexity
Deletion consists of three parts:
- Find the node (O(log n))
- Remove the node (O(log n))
- Rebalance the tree (O(log n))
Thus, the total deletion complexity remains O(log n).
11.4 Rotation Complexity
Each rotation (left, right, left-right, right-left) is a constant-time O(1) pointer update. An AVL insertion triggers at most one rebalancing (a single or double rotation), while a deletion may require rotations at up to O(log n) ancestors on the path back to the root.
11.5 Overall Complexity Summary
Operation | Best Case | Average Case | Worst Case |
---|---|---|---|
Search | O(1) | O(log n) | O(log n) |
Insert | O(log n) | O(log n) | O(log n) |
Delete | O(log n) | O(log n) | O(log n) |
Rotation | O(1) | O(1) | O(1) |
12. Space Complexity Analysis
Space complexity depends on the number of nodes stored and auxiliary space for balancing.
12.1 Space Used by Nodes
- Each node stores key, left pointer, right pointer, and height (O(1) space per node).
- Total space = O(n) for n nodes.
12.2 Auxiliary Space for Recursive Calls
- Insertion and deletion use recursion.
- Each recursive call adds O(1) space.
- Max recursion depth = O(log n), leading to an auxiliary space complexity of O(log n).
12.3 Overall Space Complexity Summary
Component | Space Complexity |
---|---|
Tree Storage | O(n) |
Recursive Call Stack | O(log n) |
Total | O(n) |
13. Trade-offs and Practical Considerations
13.1 Strengths
- Maintains O(log n) operations, ensuring efficiency.
- Useful in databases, memory management, and real-time applications.
- Better than unbalanced BSTs in worst-case performance.
13.2 Weaknesses
- Insertion and deletion are slower than simple BSTs due to rotation overhead.
- Extra space for height/balance tracking increases memory usage.
- More complex to implement compared to simple BSTs or hash tables.
13.3 When to Use?
- When ordered data access is required.
- When worst-case guarantees are essential.
- When insertion/deletion frequency is high.
14. Optimizations & Variants
Self-balancing trees ensure O(log n) operations by maintaining height balance through rotations. Optimizations improve efficiency, and different tree variants exist based on application needs.
15. Common Optimizations
15.1 Optimizing Rotations
- Delayed Rotations: Instead of rebalancing immediately after every insertion, defer until imbalance exceeds a threshold.
- Bulk Rebalancing: Instead of rebalancing for each insert, process multiple insertions and perform batch rotations.
- Minimal Rotations: Choose the best rotation sequence to minimize tree restructuring.
15.2 Optimizing Height Updates
- Avoid Unnecessary Height Recomputations: Store height differences instead of recalculating on each update.
- Lazy Height Updates: Update heights only when necessary, reducing constant overhead.
15.3 Efficient Memory Usage
- Use pointer compression techniques to reduce node size in memory.
- Maintain threaded nodes to avoid recursion overhead.
15.4 Cache-Aware Implementations
- Optimize node access patterns to improve CPU cache efficiency.
- Use cache-friendly data structures like B-Trees for disk-based storage.
16. Different Variants of Self-Balancing Trees
Several self-balancing tree variants exist, each optimized for specific use cases.
16.1 AVL Tree
- Balance Factor: Height difference of left and right subtrees is at most 1.
- Rotations: Requires rotations after insertions/deletions.
- Use Case: Applications needing fast lookups (e.g., compilers, database indexing).
16.2 Red-Black Tree
- Balance Property: Uses color-based balancing (red/black nodes) instead of height.
- Rotations: Fewer rotations than AVL trees.
- Use Case: Widely used in STL (C++ map/set, Java TreeMap), database indexing, and Linux kernel.
16.3 B-Trees
- Multi-Way Tree: Each node stores multiple keys and has multiple children.
- Use Case: Databases, File Systems (e.g., NTFS, PostgreSQL).
16.4 Splay Tree
- Self-adjusting: Recently accessed elements move to the root.
- Use Case: Caches and workloads with skewed access patterns, where frequently accessed keys should be cheap to reach.
16.5 Treap (Tree + Heap)
- Combines BST & Heap: Each node has a random priority, ensuring balance.
- Use Case: Randomized balancing, fast updates.
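A minimal treap insertion sketch (assumed, simplified implementation for illustration): keys obey BST order, random priorities obey max-heap order, and the same rotations from earlier sections restore the heap property after each insert.

```python
import random

class TreapNode:
    def __init__(self, key):
        self.key = key
        self.priority = random.random()   # random priority gives expected O(log n) height
        self.left = None
        self.right = None

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def treap_insert(root, key):
    if root is None:
        return TreapNode(key)
    if key < root.key:
        root.left = treap_insert(root.left, key)
        if root.left.priority > root.priority:    # heap property violated
            root = rotate_right(root)
    else:
        root.right = treap_insert(root.right, key)
        if root.right.priority > root.priority:
            root = rotate_left(root)
    return root

def inorder(root):
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []

random.seed(42)   # reproducible priorities (illustration only)
root = None
for k in [5, 2, 8, 1, 9, 3]:
    root = treap_insert(root, k)
print(inorder(root))   # [1, 2, 3, 5, 8, 9] -- BST order holds regardless of priorities
```

Unlike AVL trees, no heights or balance factors are stored; balance is probabilistic rather than guaranteed, which keeps the code short.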
16.6 Weight-Balanced Tree
- Balances Based on Node Counts, not height.
- Use Case: Useful in concurrent environments where rebalancing is costly.
17. Iterative vs. Recursive Implementations
17.1 Recursive Implementation
Most AVL and Red-Black tree implementations use recursion for insertions and deletions.
- Pros: Code is cleaner and easier to implement.
- Cons: Uses extra space for recursive call stack (O(log n) auxiliary space).
17.2 Iterative Implementation
Instead of recursion, an explicit stack or parent pointers handle traversal.
- Pros: Avoids recursion overhead, reducing memory usage.
- Cons: More complex to implement due to manual stack handling.
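As a small example of the iterative style: search needs no stack at all, since a loop simply descends from the root in O(1) extra space (insertion is the harder case, because rebalancing must walk back up via parent pointers or an explicit stack). A standalone sketch with a minimal `Node` defined for this example:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def search_iterative(root, key):
    """Descend from the root: O(height) time, O(1) extra space, no call stack."""
    node = root
    while node is not None and node.key != key:
        node = node.left if key < node.key else node.right
    return node

# A small balanced tree:   20
#                         /  \
#                       10    30
root = Node(20, Node(10), Node(30))
print(search_iterative(root, 30).key)   # 30
print(search_iterative(root, 99))       # None
```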
17.3 Performance Comparison
Implementation | Time Complexity | Space Complexity | Ease of Use |
---|---|---|---|
Recursive | O(log n) | O(log n) (call stack) | Simpler |
Iterative | O(log n) | O(1) with parent pointers, O(log n) with an explicit stack | More complex |
17.4 When to Use Which?
- Recursive: When memory constraints aren’t tight, and clean code is preferred.
- Iterative: When performance is critical, and recursion depth is a concern (e.g., system-level applications).
18. Key Takeaways
- Use AVL Trees for fast lookups, Red-Black Trees for dynamic insertions, and B-Trees for large datasets.
- Optimize rotations, height updates, and cache efficiency for better performance.
- Choose between recursive and iterative implementations based on memory constraints and efficiency needs.
19. Common Edge Cases & Pitfalls
19.1 Edge Cases in Insertion
- Inserting Duplicate Keys: Self-balancing trees typically do not support duplicates. Solutions:
- Reject duplicate insertions.
- Modify structure to store frequencies.
- Insertion in Skewed Order: Inserting values in strictly increasing/decreasing order should still maintain balance.
- Rotations Not Triggering When Needed: Ensure balance checks occur after every insertion.
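The frequency-count option above can be sketched by giving each node a count field: re-inserting an existing key increments the counter and leaves the tree's shape (and therefore its balance) untouched. For brevity this sketch uses a plain BST insert, omitting the rebalancing rotations:

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.count = 1        # how many times this key was inserted
        self.left = None
        self.right = None

def insert(root, key):
    if root is None:
        return Node(key)
    if key == root.key:
        root.count += 1       # duplicate: no structural change, no rebalancing
    elif key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

root = None
for k in [10, 5, 10, 10, 7]:
    root = insert(root, k)
print(root.count)       # 3 -- key 10 was inserted three times
print(root.left.key)    # 5
```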
19.2 Edge Cases in Deletion
- Deleting a Leaf Node: No rebalancing is required.
- Deleting a Node with One Child: Child should replace the node properly.
- Deleting a Node with Two Children: Inorder successor/predecessor must be used.
- Rebalancing Failure: Improper height updates after deletion can cause incorrect balancing.
19.3 Rotations & Height Updates
- Missed Rotation Scenarios: Ensure that Left-Right and Right-Left cases are correctly handled.
- Incorrect Height Updates: Always update node height after structural modifications.
19.4 Memory & Performance Issues
- Stack Overflow (Recursive Implementations): Large trees may exceed recursion depth.
- Excessive Rotations: AVL trees may perform unnecessary balancing when batch insertions occur.
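Several of these pitfalls (missed rotations, stale heights) can be caught with a validator that recomputes subtree heights from scratch and checks both the BST ordering and every balance factor. The checker works on any node object exposing key/left/right attributes; the `Node` class below is a minimal stand-in for demonstration:

```python
def check_avl(node, lo=float("-inf"), hi=float("inf")):
    """Return the subtree height if `node` is a valid AVL tree, else raise."""
    if node is None:
        return 0
    if not (lo < node.key < hi):
        raise AssertionError(f"BST order violated at {node.key}")
    lh = check_avl(node.left, lo, node.key)    # left keys must fall in (lo, key)
    rh = check_avl(node.right, node.key, hi)   # right keys must fall in (key, hi)
    if abs(lh - rh) > 1:
        raise AssertionError(f"balance factor {lh - rh} at {node.key}")
    return max(lh, rh) + 1

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

balanced = Node(20, Node(10), Node(30, Node(25)))
print(check_avl(balanced))   # 3 -- valid AVL tree of height 3

skewed = Node(10, None, Node(20, None, Node(30)))   # right chain: balance -2 at 10
try:
    check_avl(skewed)
except AssertionError as e:
    print(e)   # balance factor -2 at 10
```

Running such a checker after every operation in a debug build turns silent balance bugs into immediate failures.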
20. Test Cases to Verify Correctness
To ensure correctness, the following test cases should be executed.
20.1 Basic Functionality Tests
- Insert single element, check root.
- Insert elements in random order, verify structure.
- Delete root when it has no children, one child, or two children.
- Search for elements present and absent.
20.2 Edge Case Tests
Handle extreme scenarios.
- Insert elements in sorted order and verify balance.
- Insert duplicate values, check rejection or frequency count.
- Delete a deep node and ensure proper rebalancing.
- Delete a node involved in rotation, verify correct reordering.
20.3 Stress Tests
Simulate large-scale operations.
- Insert 1 million elements, measure time.
- Delete half the elements, check rebalancing.
- Perform bulk insertions & deletions, ensure consistent height.
20.4 Automated Test Cases (Python)
import unittest

class TestAVLTree(unittest.TestCase):
    def setUp(self):
        self.tree = AVLTree()
        self.root = None

    def test_insert_single(self):
        self.root = self.tree.insert(self.root, 10)
        self.assertEqual(self.root.key, 10)

    def test_insert_sorted_order(self):
        for val in [10, 20, 30, 40, 50]:
            self.root = self.tree.insert(self.root, val)
        self.assertEqual(self.root.key, 20)  # Rotations balance the tree: 20 with subtrees 10 and 40 (30, 50)

    # Note: the deletion tests below assume AVLTree also provides a delete()
    # method, which the implementation in Section 6 does not include.
    def test_delete_leaf(self):
        self.root = self.tree.insert(self.root, 20)
        self.root = self.tree.insert(self.root, 10)
        self.root = self.tree.insert(self.root, 30)
        self.root = self.tree.delete(self.root, 10)
        self.assertIsNone(self.root.left)

    def test_delete_node_with_two_children(self):
        self.root = self.tree.insert(self.root, 50)
        self.root = self.tree.insert(self.root, 30)
        self.root = self.tree.insert(self.root, 70)
        self.root = self.tree.insert(self.root, 60)
        self.root = self.tree.insert(self.root, 80)
        self.root = self.tree.delete(self.root, 70)
        self.assertEqual(self.root.right.key, 80)  # 70 replaced by its in-order successor

if __name__ == "__main__":
    unittest.main()
21. Real-World Failure Scenarios
21.1 Database Index Corruption
Many databases use B-Trees and Red-Black Trees for indexing. Failure to rebalance after deletion leads to inefficient queries.
21.2 File System Errors
File systems that index with balanced trees (such as ext4's directory HTree) suffer degraded lookups, and in the worst case metadata inconsistencies, when bugs violate the balance invariants.
21.3 Network Routing Failures
Routing tables use balanced trees for quick lookups. If balance maintenance fails, lookup speeds degrade significantly.
21.4 Memory Leaks in Long-Lived Applications
Improper memory management in tree-based caches (e.g., LRU caches using AVL Trees) can cause memory leaks.
22. Quick Recap
- Handle duplicate keys, edge insertions, and extreme deletions.
- Write comprehensive test cases to validate operations.
- Watch for real-world failure cases in databases, OS, and networking.
23. Real-World Applications of Self-Balancing Trees
23.1 Database Indexing (B-Trees, AVL Trees, Red-Black Trees)
- Use Case: Indexing for efficient search, insert, delete operations.
- Examples: MySQL, PostgreSQL, and Oracle databases use B-Trees and B+ Trees for indexing.
23.2 File Systems (B-Trees, Red-Black Trees)
- Use Case: Directory lookups, metadata storage, fast file retrieval.
- Examples: Linux ext4 and XFS file systems use self-balancing trees for quick access.
23.3 Networking & Routing Tables (Red-Black Trees, AVL Trees)
- Use Case: Efficient IP routing, load balancing, DNS caching.
- Examples: The Linux kernel uses Red-Black Trees across several networking subsystems for predictable lookup performance.
23.4 Memory Allocation (Red-Black Trees, AVL Trees)
- Use Case: Efficient allocation & deallocation of memory blocks.
- Examples: The Linux kernel has used Red-Black Trees to track virtual memory areas (VMAs) for fast address lookups.
23.5 Compiler Optimization & Symbol Tables (AVL Trees, Splay Trees)
- Use Case: Fast lookups in symbol tables and syntax trees.
- Examples: GCC and LLVM compilers use self-balancing trees for variable/function lookups.
23.6 Artificial Intelligence & Machine Learning (KD-Trees, Ball Trees)
- Use Case: Efficient nearest-neighbor search, decision trees.
- Examples: Scikit-learn uses KD-Trees for fast nearest-neighbor searches.
24. Open-Source Implementations of Self-Balancing Trees
24.1 Linux Kernel (Red-Black Trees)
The Linux Kernel uses a Red-Black Tree for managing process scheduling.
Code reference: Linux RB-Tree Source Code
24.2 SQLite (B-Trees for Indexing)
SQLite uses a B-Tree structure to optimize indexing and storage efficiency.
Code reference: SQLite Source Code
24.3 C++ STL (Red-Black Tree in std::map & std::set)
The C++ Standard Library implements Red-Black Trees for ordered maps and sets.
Code reference: GCC STL Documentation
24.4 PostgreSQL (B-Trees for Query Optimization)
PostgreSQL databases rely on B-Trees for fast indexing and query execution.
Code reference: PostgreSQL Source Code
25. Practical Project: Implementing a Caching System with an AVL Tree
We will implement a simple in-memory caching system using an AVL Tree. This cache will support:
- Insertions: Add a key-value pair.
- Searches: Retrieve a value in O(log n).
- Eviction Policy: Delete the least recently accessed node (the sketch below implements insert and search; eviction is left as an extension).
25.1 AVL Tree Cache Implementation (Python)
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.left = None
        self.right = None
        self.height = 1  # New node starts at height 1

class AVLTreeCache:
    def get_height(self, node):
        return node.height if node else 0

    def get_balance(self, node):
        return self.get_height(node.left) - self.get_height(node.right) if node else 0

    def right_rotate(self, y):
        x = y.left
        T2 = x.right
        x.right = y
        y.left = T2
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        return x

    def left_rotate(self, x):
        y = x.right
        T2 = y.left
        y.left = x
        x.right = T2
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        return y

    def insert(self, root, key, value):
        if not root:
            return Node(key, value)
        elif key < root.key:
            root.left = self.insert(root.left, key, value)
        elif key > root.key:
            root.right = self.insert(root.right, key, value)
        else:
            root.value = value  # Existing key: update the value in place
            return root
        root.height = max(self.get_height(root.left), self.get_height(root.right)) + 1
        balance = self.get_balance(root)
        if balance > 1 and key < root.left.key:
            return self.right_rotate(root)
        if balance < -1 and key > root.right.key:
            return self.left_rotate(root)
        if balance > 1 and key > root.left.key:
            root.left = self.left_rotate(root.left)
            return self.right_rotate(root)
        if balance < -1 and key < root.right.key:
            root.right = self.right_rotate(root.right)
            return self.left_rotate(root)
        return root

    def search(self, root, key):
        if not root or root.key == key:
            return root
        elif key < root.key:
            return self.search(root.left, key)
        else:
            return self.search(root.right, key)

# Usage Example
cache = AVLTreeCache()
root = None
root = cache.insert(root, "user:100", "John Doe")
root = cache.insert(root, "user:200", "Alice Smith")

# Search for user:100
result = cache.search(root, "user:100")
print(result.value if result else "Not found")  # Output: John Doe
25.2 Project Use Case
This AVL Tree Cache can be integrated into:
- Web applications: Store frequently accessed user data.
- API rate limiters: Store request timestamps efficiently.
- Session management: Track active sessions dynamically.
26. Quick Recap
- Self-balancing trees power databases, file systems, and networking applications.
- Open-source projects like Linux, PostgreSQL, and SQLite use these trees extensively.
- An AVL Tree Cache is a practical example of using self-balancing trees in real-world applications.
27. Competitive Programming Assignments
Self-balancing trees are widely used in competitive programming and system design due to their guaranteed O(log n) operations. Solve the following problems using AVL Trees, Red-Black Trees, or Treaps.
27.1 Basic Problems (Warm-Up)
- Problem 1: Insert N elements into an AVL Tree and perform in-order traversal.
- Problem 2: Implement a self-balancing tree map that stores key-value pairs.
- Problem 3: Given a series of insert and delete operations, maintain a balanced BST.
27.2 Intermediate Problems (Algorithmic Challenges)
- Problem 4: Find the k-th smallest element in a dynamic self-balancing BST.
- Problem 5: Implement an order statistics tree to answer range queries efficiently.
- Problem 6: Design a persistent AVL Tree that keeps track of previous versions.
27.3 Advanced Problems (Competitive Programming)
- Problem 7: Solve the "Count of Smaller Numbers After Self" (LeetCode).
- Problem 8: Implement an interval tree using a self-balancing BST.
- Problem 9: Design a dynamic range sum query using an AVL Tree.
- Problem 10: Given a dynamic array, use an AVL Tree to efficiently support insert, delete, and sum queries.
27.4 Resources for Problem Solving
- LeetCode (Search for AVL Tree problems)
- GeeksforGeeks (Red-Black Tree, Treap problems)
- Codeforces (Advanced self-balancing tree problems)
28. System Design Assignment - Use Case Integration
Apply self-balancing trees in real-world system design problems.
28.1 System Design Problem: Implement a Scalable Rate Limiter
Design a distributed rate limiter for an API using self-balancing trees.
28.2 Problem Statement:
You are designing a rate limiter for an API with millions of requests per second. Each request has a unique user ID, and you must allow at most 100 requests per minute per user.
28.3 Solution Using Self-Balancing Trees:
- Store each user request timestamp in an AVL Tree.
- When a request arrives:
- Insert the current timestamp.
- Remove timestamps older than 60 seconds.
- If the tree contains more than 100 entries, block the request.
28.4 Code Implementation (Python)
import time

class Node:
    def __init__(self, timestamp):
        self.key = timestamp
        self.left = None
        self.right = None
        self.height = 1

class RateLimiter:
    def __init__(self):
        self.roots = {}          # One AVL tree root per user ID
        self.time_window = 60    # Time window in seconds
        self.max_requests = 100

    def get_height(self, node):
        return node.height if node else 0

    def get_balance(self, node):
        return self.get_height(node.left) - self.get_height(node.right) if node else 0

    def left_rotate(self, x):
        y = x.right
        T2 = y.left
        y.left = x
        x.right = T2
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        return y

    def right_rotate(self, y):
        x = y.left
        T2 = x.right
        x.right = y
        y.left = T2
        y.height = max(self.get_height(y.left), self.get_height(y.right)) + 1
        x.height = max(self.get_height(x.left), self.get_height(x.right)) + 1
        return x

    def insert(self, root, key):
        if not root:
            return Node(key)
        elif key < root.key:
            root.left = self.insert(root.left, key)
        else:
            root.right = self.insert(root.right, key)
        root.height = max(self.get_height(root.left), self.get_height(root.right)) + 1
        balance = self.get_balance(root)
        if balance > 1 and key < root.left.key:
            return self.right_rotate(root)
        if balance < -1 and key > root.right.key:
            return self.left_rotate(root)
        if balance > 1 and key > root.left.key:
            root.left = self.left_rotate(root.left)
            return self.right_rotate(root)
        if balance < -1 and key < root.right.key:
            root.right = self.right_rotate(root.right)
            return self.left_rotate(root)
        return root

    def inorder(self, root):
        if not root:
            return []
        return self.inorder(root.left) + [root.key] + self.inorder(root.right)

    def enforce_rate_limit(self, user_id):
        current_time = time.time()
        self.roots[user_id] = self.insert(self.roots.get(user_id), current_time)
        # Count timestamps inside the window. For simplicity, expired nodes are
        # filtered out here rather than deleted from the tree; a production
        # version would also delete them to bound the tree's size.
        valid_requests = [t for t in self.inorder(self.roots[user_id])
                          if t > current_time - self.time_window]
        if len(valid_requests) > self.max_requests:
            print("Rate limit exceeded for user:", user_id)
            return False
        print("Request allowed for user:", user_id)
        return True

# Example Usage
rate_limiter = RateLimiter()
for _ in range(105):  # Exceed the limit
    rate_limiter.enforce_rate_limit("user123")
    time.sleep(0.5)   # Simulate incoming requests
28.5 Real-World Application:
- API Gateways: AWS API Gateway, Cloudflare use rate limiters.
- DDoS Protection: Cloudflare and Akamai use tree-based rate limiters.
- Spam Detection: Limit login attempts in authentication systems.
29. Time-Constrained Implementation Challenges
To master self-balancing trees under competitive conditions:
29.1 Timed Assignments
- Implement an AVL Tree from scratch in 20 minutes.
- Solve a problem using a self-balancing tree within 10 minutes.
- Optimize an unbalanced BST to an AVL Tree within 30 minutes.
29.2 Competitive Coding Practice
- Participate in LeetCode Contests with self-balancing tree problems.
- Speed-run Red-Black Tree implementations in less than an hour.
- Optimize heap-based solutions using AVL Trees.
30. Key Takeaway
- Practice problem-solving with at least 10 coding assignments.
- Integrate self-balancing trees into system design problems.
- Train under time constraints to improve efficiency in competitive programming.