Can you please explain Python dictionary memory usage?

Python dictionaries use a sophisticated hash table structure that affects memory usage. Understanding how dictionaries store data helps optimize memory usage in your applications.

Dictionary Internal Structure

Each dictionary consists of multiple buckets, where each bucket contains three fields:

  • The hash code of the key (cached so it need not be recomputed during lookups and resizes)
  • A pointer to the key object
  • A pointer to the value object

This structure requires at least 12 bytes per bucket on 32-bit systems and 24 bytes on 64-bit systems.
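As a rough illustration (not CPython's exact internal struct layout), the per-bucket cost can be estimated from the platform's pointer width, since each bucket holds three pointer-sized fields:

```python
import struct

# Pointer width in bytes on this interpreter: 4 on 32-bit, 8 on 64-bit builds.
pointer_size = struct.calcsize("P")

# Three pointer-sized fields per bucket: cached hash, key pointer, value pointer.
# This is an estimate of the minimum, not CPython's exact struct layout.
bucket_size = 3 * pointer_size

print(f"Pointer size: {pointer_size} bytes")
print(f"Estimated bytes per bucket: {bucket_size}")
```

Running this on a 64-bit build prints an estimate of 24 bytes per bucket, matching the figure above.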

Example: Basic Dictionary Memory

import sys

# Empty dictionary
empty_dict = {}
print(f"Empty dictionary: {sys.getsizeof(empty_dict)} bytes")

# Small dictionary
small_dict = {'a': 1, 'b': 2, 'c': 3}
print(f"3-item dictionary: {sys.getsizeof(small_dict)} bytes")

# Larger dictionary
large_dict = {f'key{i}': i for i in range(100)}
print(f"100-item dictionary: {sys.getsizeof(large_dict)} bytes")

Output:

Empty dictionary: 64 bytes
3-item dictionary: 232 bytes
100-item dictionary: 4704 bytes
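Note that sys.getsizeof reports only the dictionary's own hash table, not the keys and values it references. A rough deep total, assuming flat (non-nested) keys and values, can be computed by also summing the contents:

```python
import sys

def dict_total_size(d):
    """Approximate total memory: the dict's table plus each key and value.
    Assumes flat, non-nested contents; shared or interned objects are
    counted once per reference, so this can over-count."""
    total = sys.getsizeof(d)
    for key, value in d.items():
        total += sys.getsizeof(key) + sys.getsizeof(value)
    return total

small_dict = {'a': 1, 'b': 2, 'c': 3}
print(f"Table only: {sys.getsizeof(small_dict)} bytes")
print(f"Deep total: {dict_total_size(small_dict)} bytes")
```

The gap between the two numbers grows with the size of the stored objects, which is why getsizeof alone can be misleading for dictionaries of strings or nested containers.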

Dictionary Resizing Behavior

Dictionaries start with 8 buckets and resize once the table becomes about two-thirds full (the exact growth factor and thresholds vary between CPython versions), which produces stepwise memory jumps as items are added:

import sys

# Track memory growth as items are added
data = {}
sizes = []

for i in range(20):
    data[f'item{i}'] = i
    size = sys.getsizeof(data)
    sizes.append((i + 1, size))

# Show memory jumps during resizing
for count, size in sizes:
    print(f"{count:2d} items: {size:4d} bytes")

Output:

 1 items:  232 bytes
 2 items:  232 bytes
 3 items:  232 bytes
 4 items:  232 bytes
 5 items:  232 bytes
 6 items:  360 bytes
 7 items:  360 bytes
 8 items:  360 bytes
 9 items:  640 bytes
10 items:  640 bytes
11 items:  640 bytes
12 items:  640 bytes
13 items:  640 bytes
14 items:  640 bytes
15 items:  640 bytes
16 items:  640 bytes
17 items: 1184 bytes
18 items: 1184 bytes
19 items: 1184 bytes
20 items: 1184 bytes
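Resizing happens only on insertion: in CPython, deleting keys leaves the table at its current capacity. Rebuilding the dictionary, for example with dict(d), produces a compact copy; the exact byte counts depend on the interpreter version:

```python
import sys

data = {f'item{i}': i for i in range(100)}
before = sys.getsizeof(data)

# Delete 90 of the 100 entries; the table keeps its allocated capacity.
for key in list(data)[:90]:
    del data[key]
after = sys.getsizeof(data)

# Copying rebuilds the hash table sized for the 10 surviving entries.
compact = sys.getsizeof(dict(data))

print(f"Before deletions: {before} bytes")
print(f"After deletions:  {after} bytes")
print(f"Rebuilt copy:     {compact} bytes")
```

This is worth knowing when a long-lived dictionary briefly holds many entries: copying it after the bulk deletions reclaims the over-allocated table.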

Memory Optimization Tips

One effective way to reduce dictionary memory usage is to avoid per-instance __dict__ storage entirely by declaring __slots__ on your classes:

import sys

# Using __slots__ in classes
class RegularClass:
    def __init__(self, x, y):
        self.x = x
        self.y = y

class SlottedClass:
    __slots__ = ['x', 'y']
    def __init__(self, x, y):
        self.x = x
        self.y = y

# Compare memory usage: the regular instance carries a per-instance
# __dict__, while the slotted instance stores x and y inline
regular = RegularClass(1, 2)
slotted = SlottedClass(1, 2)

print(f"Regular class instance: {sys.getsizeof(regular.__dict__)} bytes")
print(f"Slotted class instance: {sys.getsizeof(slotted)} bytes")

Output:

Regular class instance: 232 bytes
Slotted class instance: 48 bytes
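The single-instance numbers above understate the effect at scale. One hedged way to compare aggregate cost, using the standard library's tracemalloc (exact byte counts vary by Python version), is to allocate many instances of each class and measure the traced memory:

```python
import tracemalloc

class RegularPoint:
    def __init__(self, x, y):
        self.x = x
        self.y = y

class SlottedPoint:
    __slots__ = ('x', 'y')
    def __init__(self, x, y):
        self.x = x
        self.y = y

def traced_allocation(cls, n=10_000):
    """Allocate n instances and return the bytes tracemalloc attributes to them."""
    tracemalloc.start()
    instances = [cls(i, i) for i in range(n)]
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del instances
    return current

regular_bytes = traced_allocation(RegularPoint)
slotted_bytes = traced_allocation(SlottedPoint)
print(f"10,000 regular instances: {regular_bytes} bytes")
print(f"10,000 slotted instances: {slotted_bytes} bytes")
```

The measurement includes the backing list in both cases, so the absolute numbers are approximate, but the per-instance __dict__ overhead makes the regular version consistently larger.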

Memory Usage Summary

System   Bytes per Bucket   Initial Buckets   Resize Trigger
32-bit   12 bytes           8                 Grows when ~2/3 full
64-bit   24 bytes           8                 Grows when ~2/3 full

Conclusion

Python dictionaries use hash tables with significant memory overhead per item. They start with 8 buckets and grow once the table is about two-thirds full, which explains the stepwise memory jumps as dictionaries grow.

Updated on: 2026-03-24T20:31:38+05:30
