Clear LRU Cache in Python
In this article, we will learn how to clear an LRU cache implemented in Python. LRU Cache (Least Recently Used Cache) is a data structure that improves application performance by storing frequently-used data and removing the least recently used items when the cache becomes full.
The LRU Cache is particularly useful in applications with high-cost data retrieval operations, such as disk I/O or network access. By caching frequently-used data in memory, applications can significantly reduce expensive operations and improve performance.
Understanding LRU Cache in Python
Python's functools module provides the @lru_cache decorator to implement LRU caching. This decorator can be applied to functions that require frequent computations with the same arguments, dramatically improving execution time.
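As a minimal sketch of how the decorator works (the function name and the print-based tracing here are illustrative, not part of functools), note that the wrapped function body only runs on a cache miss; repeated calls with the same argument are served from the cache:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def square(n):
    """Stand-in for an expensive computation."""
    print(f"Computing square({n})")  # printed only on a cache miss
    return n * n

square(4)   # cache miss: the body runs and the message prints
square(4)   # cache hit: returned from the cache, no message
print(square.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```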
Basic Syntax for Clearing LRU Cache
The cache_clear() method removes all entries from the LRU cache:
from functools import lru_cache

@lru_cache(maxsize=128)
def some_function(arg):
    # function implementation
    return arg * 2

# Clear the cache
some_function.cache_clear()
Example: Fibonacci with Cache Clearing
Here's a practical example using the Fibonacci sequence to demonstrate cache clearing:
from functools import lru_cache

@lru_cache(maxsize=128)
def fibonacci(n):
    """Return the nth Fibonacci number."""
    if n < 2:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

# Call the function to populate the cache
print("First calls:")
print(fibonacci(10))
print(fibonacci(15))
print("Cache info:", fibonacci.cache_info())

# Clear the cache
fibonacci.cache_clear()
print("Cache cleared!")

# Call the function again - it's recomputed
print("After clearing:")
print(fibonacci(10))
print("Cache info:", fibonacci.cache_info())
First calls:
55
610
Cache info: CacheInfo(hits=14, misses=16, maxsize=128, currsize=16)
Cache cleared!
After clearing:
55
Cache info: CacheInfo(hits=8, misses=11, maxsize=128, currsize=11)
Using cache_info() to Monitor Cache Performance
The cache_info() method provides valuable statistics about cache performance, including hits, misses, and current size. This helps you understand when cache clearing might be beneficial.
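The statistics returned by cache_info() can drive a clearing policy. The sketch below (the function, call pattern, and 50% threshold are illustrative choices, not a functools recommendation) computes the hit rate from the CacheInfo named tuple and clears the cache only when it is full and performing poorly:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def double(n):
    return n * 2

# Populate the cache with a mix of repeated and new arguments
for n in [1, 2, 1, 3, 1]:
    double(n)

info = double.cache_info()  # CacheInfo(hits=2, misses=3, maxsize=128, currsize=3)
hit_rate = info.hits / (info.hits + info.misses)
print(f"Hit rate: {hit_rate:.0%}")  # Hit rate: 40%

# Illustrative policy: clear only when the cache is full and the hit
# rate is low, so stale entries do not crowd out useful new ones
if info.currsize == info.maxsize and hit_rate < 0.5:
    double.cache_clear()
```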
Example: Edit Distance with Cache Management
Here's another example using edit distance calculation:
from functools import lru_cache

@lru_cache(maxsize=128)
def edit_distance(s1, s2):
    """Compute the edit distance between two strings."""
    if not s1:
        return len(s2)
    elif not s2:
        return len(s1)
    elif s1[0] == s2[0]:
        return edit_distance(s1[1:], s2[1:])
    else:
        d1 = edit_distance(s1[1:], s2) + 1      # deletion
        d2 = edit_distance(s1, s2[1:]) + 1      # insertion
        d3 = edit_distance(s1[1:], s2[1:]) + 1  # substitution
        return min(d1, d2, d3)

# Calculate edit distances
print("Edit distances:")
print(edit_distance("kitten", "sitting"))
print(edit_distance("abcde", "vwxyz"))
print("Before clear:", edit_distance.cache_info())

# Clear the cache
edit_distance.cache_clear()
print("After clear:", edit_distance.cache_info())

# Recalculate - will be recomputed
print("Recalculated:", edit_distance("kitten", "sitting"))
Edit distances:
3
5
Before clear: CacheInfo(hits=21, misses=34, maxsize=128, currsize=34)
After clear: CacheInfo(hits=0, misses=0, maxsize=128, currsize=0)
Recalculated: 3
Key Points
- cache_clear() removes all cached entries completely
- Use cache_info() to monitor cache performance and decide when to clear
- Clearing the cache frees memory but forces recomputation on next calls
- Each decorated function has its own independent cache
- The maxsize parameter controls the maximum cache size
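The effect of maxsize can be seen with a deliberately tiny cache (the function below is illustrative; setting maxsize=None instead would make the cache unbounded, with no eviction):

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # deliberately tiny to show eviction
def identity(n):
    return n

identity(1)  # miss, cache holds {1}
identity(2)  # miss, cache holds {1, 2}
identity(3)  # miss, evicts 1 (least recently used), cache holds {2, 3}
identity(1)  # miss again: 1 was evicted, so it must be recomputed
print(identity.cache_info())  # CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)
```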
Conclusion
Use cache_clear() to remove all LRU cache entries when you need to free memory or ensure fresh computations. Monitor cache performance with cache_info() to optimize when to clear the cache for better application performance.
