Part 106 of 365

📘 Memory Profiling: Memory Management

Master memory profiling and memory management in Python with practical examples, best practices, and real-world applications 🚀

💎 Advanced
25 min read

Prerequisites

  • Basic understanding of programming concepts 📝
  • Python installation (3.8+) 🐍
  • VS Code or preferred IDE 💻

What you'll learn

  • Understand the concept fundamentals 🎯
  • Apply the concept in real projects 🏗️
  • Debug common issues 🐛
  • Write clean, Pythonic code ✨

🎯 Introduction

Welcome to this tutorial on memory profiling and management in Python! 🎉 In this guide, we'll explore how to track, analyze, and optimize your Python application's memory usage like a pro.

Ever wondered why your Python app is eating up all your RAM? 🤔 Or why it gets slower over time? You're about to discover the secrets of memory management that will turn you into a memory optimization wizard! 🧙‍♂️

By the end of this tutorial, you'll feel confident identifying memory leaks, optimizing memory usage, and building applications that run smoothly even with limited resources. Let's dive in! 🏊‍♂️

📚 Understanding Memory Profiling

🤔 What is Memory Profiling?

Memory profiling is like being a detective 🕵️ for your code's memory usage. Think of it as having X-ray vision 👁️ that lets you see exactly how your program uses RAM: where objects are stored, how much space they take, and, most importantly, what isn't being cleaned up.

In Python terms, memory profiling helps you:

  • ✨ Track memory allocation and deallocation
  • 🚀 Identify memory leaks and bottlenecks
  • 🛡️ Optimize memory usage for better performance

💡 Why Use Memory Profiling?

Here's why developers love memory profiling:

  1. Performance Optimization 🚀: Find and fix memory bottlenecks
  2. Resource Management 💻: Use system resources efficiently
  3. Scalability 📈: Ensure your app can handle growth
  4. Cost Savings 💰: Less memory = cheaper cloud hosting

Real-world example: Imagine building a photo editing app 📸. Without proper memory management, loading just a few high-res images could crash your application! With memory profiling, you can ensure smooth operation even with hundreds of images.

🔧 Basic Syntax and Usage

📝 Simple Memory Tracking

Let's start with basic memory monitoring:

import sys
import os
import psutil  # third-party: pip install psutil

# 👋 Hello, Memory Profiling!
def get_memory_usage():
    """Get current memory usage in MB"""
    process = psutil.Process(os.getpid())
    return process.memory_info().rss / 1024 / 1024  # 📊 Convert bytes to MB

# 🎨 Track object sizes
def get_object_size(obj):
    """Get the shallow size of an object in bytes"""
    return sys.getsizeof(obj)

# 📈 Monitor memory growth
print(f"Initial memory: {get_memory_usage():.2f} MB")

# Create some data
big_list = [i for i in range(1000000)]  # 🎯 1 million integers
print(f"After creating list: {get_memory_usage():.2f} MB")
print(f"List size: {get_object_size(big_list) / 1024 / 1024:.2f} MB")

💡 Explanation: We're using psutil to track the process's overall memory usage (RSS, resident set size) and sys.getsizeof() to measure individual objects.
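One caveat: sys.getsizeof() reports only the shallow size of a container - for a list, that's the internal array of pointers, not the objects it points to. A small recursive helper gives a fuller picture (deep_getsizeof below is our own illustrative name, not a standard-library function):

```python
import sys

def deep_getsizeof(obj, seen=None):
    """Recursively sum the size of an object and everything it references."""
    if seen is None:
        seen = set()
    if id(obj) in seen:  # don't count shared objects twice
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    return size

nested = {"scores": [1, 2, 3], "name": "demo"}
print(sys.getsizeof(nested), deep_getsizeof(nested))  # shallow vs. deep size
```

The deep size is always at least the shallow size, and strictly larger whenever the container actually holds items.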

🎯 Memory Profiling Tools

Here are the essential tools you'll use:

# 🏗️ Tool 1: memory_profiler (third-party: pip install memory-profiler)
from memory_profiler import profile

@profile
def memory_hungry_function():
    # 🎨 Create a large list
    data = [i ** 2 for i in range(1000000)]
    
    # 🔄 Process the data
    filtered = [x for x in data if x % 2 == 0]
    
    # 💫 Create a dictionary
    data_dict = {i: i ** 2 for i in range(100000)}
    
    return len(filtered)

# 🚀 Tool 2: tracemalloc (built-in!)
import tracemalloc

tracemalloc.start()

# Your code here
data = [i for i in range(1000000)]

current, peak = tracemalloc.get_traced_memory()
print(f"Current memory usage: {current / 1024 / 1024:.2f} MB")
print(f"Peak memory usage: {peak / 1024 / 1024:.2f} MB")

tracemalloc.stop()
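tracemalloc can also diff two snapshots to show which source lines allocated memory - handy when the single number from get_traced_memory() isn't enough. A minimal sketch:

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

chunks = [bytearray(1024) for _ in range(1000)]  # ~1 MB of allocations

after = tracemalloc.take_snapshot()
top_stats = after.compare_to(before, 'lineno')  # diffs grouped by source line

for stat in top_stats[:3]:  # biggest changes first
    print(stat)

tracemalloc.stop()
```

Each entry shows the file, line number, and how much the allocation at that line grew between the two snapshots - the bytearray line above should dominate the list.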

💡 Practical Examples

🛒 Example 1: E-Commerce Memory Manager

Let's build a memory-efficient product catalog:

import sys
import weakref
import gc

# 🛍️ Product class with memory optimization
class Product:
    _instances = weakref.WeakValueDictionary()  # 🎯 Track instances without keeping them alive
    
    def __init__(self, id, name, price, image_data=None):
        self.id = id
        self.name = name
        self.price = price
        self._image_data = image_data  # 📸 Could be large!
        
        # Register instance
        Product._instances[id] = self
    
    @classmethod
    def get_memory_usage(cls):
        """Calculate total memory used by all products"""
        total = 0
        for product in cls._instances.values():
            total += sys.getsizeof(product)
            if product._image_data:
                total += sys.getsizeof(product._image_data)
        return total / 1024 / 1024  # Return in MB
    
    def __del__(self):
        print(f"🗑️ Cleaning up product: {self.name}")

# 🛒 Memory-efficient shopping cart
class ShoppingCart:
    def __init__(self):
        self.items = []  # Store product IDs, not objects
        self._cache = weakref.WeakValueDictionary()
    
    def add_item(self, product_id):
        """Add item by ID to save memory"""
        self.items.append(product_id)
        print(f"✅ Added product {product_id} to cart!")
    
    def get_product(self, product_id):
        """Lazy-load products as needed"""
        # Check cache first
        if product_id in self._cache:
            return self._cache[product_id]
        
        # Load from "database" (simulated)
        product = Product._instances.get(product_id)
        if product:
            self._cache[product_id] = product
        return product
    
    def checkout(self):
        """Process cart with minimal memory footprint"""
        total = 0
        print("🛒 Checking out:")
        
        for product_id in self.items:
            product = self.get_product(product_id)
            if product:
                total += product.price
                print(f"  📦 {product.name} - ${product.price}")
        
        print(f"💰 Total: ${total:.2f}")
        print(f"📊 Memory used: {Product.get_memory_usage():.2f} MB")

# 🎮 Let's use it!
# Create products
p1 = Product(1, "Gaming Laptop", 999.99, b"x" * 1000000)  # ~1MB image
p2 = Product(2, "Wireless Mouse", 29.99, b"x" * 100000)   # ~100KB image
p3 = Product(3, "Mechanical Keyboard", 149.99, b"x" * 200000)  # ~200KB image

print(f"Products created. Memory: {Product.get_memory_usage():.2f} MB")

# Use the cart
cart = ShoppingCart()
cart.add_item(1)
cart.add_item(2)
cart.add_item(3)
cart.checkout()

# Force cleanup - with only weak references left, p1 can be reclaimed
del p1
gc.collect()
print(f"After cleanup. Memory: {Product.get_memory_usage():.2f} MB")

🎯 Try it yourself: Add a feature to compress product images when memory usage exceeds a threshold!

🎮 Example 2: Game Memory Optimizer

Let's optimize a game's memory usage:

import sys
import tracemalloc
from collections import defaultdict
from datetime import datetime

# 🏆 Memory-efficient game asset manager
class AssetManager:
    def __init__(self, max_cache_size_mb=100):
        self._cache = {}
        self._usage_count = defaultdict(int)
        self._last_used = {}
        self.max_cache_size = max_cache_size_mb * 1024 * 1024  # Convert to bytes
        
        # Start memory tracking
        tracemalloc.start()
    
    def load_asset(self, asset_name, asset_data):
        """Load asset with memory management"""
        # Check if we need to free memory
        self._check_memory_limit()
        
        # Store asset
        self._cache[asset_name] = asset_data
        self._usage_count[asset_name] += 1
        self._last_used[asset_name] = datetime.now()
        
        print(f"✨ Loaded {asset_name} ({sys.getsizeof(asset_data) / 1024:.1f} KB)")
    
    def get_asset(self, asset_name):
        """Get asset and update usage stats"""
        if asset_name in self._cache:
            self._usage_count[asset_name] += 1
            self._last_used[asset_name] = datetime.now()
            return self._cache[asset_name]
        return None
    
    def _check_memory_limit(self):
        """Free memory if exceeding limit"""
        current, _ = tracemalloc.get_traced_memory()
        
        if current > self.max_cache_size:
            print(f"⚠️ Memory limit exceeded! Current: {current / 1024 / 1024:.1f} MB")
            self._free_least_used_assets()
    
    def _free_least_used_assets(self):
        """Remove least recently used assets"""
        # Sort by last-used time, oldest first
        sorted_assets = sorted(
            self._last_used.items(),
            key=lambda x: x[1]
        )
        
        # Remove the oldest 20% of assets
        remove_count = max(1, len(sorted_assets) // 5)
        
        for asset_name, _ in sorted_assets[:remove_count]:
            if asset_name in self._cache:
                size = sys.getsizeof(self._cache[asset_name])
                del self._cache[asset_name]
                del self._usage_count[asset_name]
                del self._last_used[asset_name]
                print(f"🗑️ Freed {asset_name} ({size / 1024:.1f} KB)")
    
    def get_stats(self):
        """Get memory usage statistics"""
        current, peak = tracemalloc.get_traced_memory()
        
        print("📊 Memory Stats:")
        print(f"  📈 Current usage: {current / 1024 / 1024:.1f} MB")
        print(f"  🏔️ Peak usage: {peak / 1024 / 1024:.1f} MB")
        print(f"  📦 Cached assets: {len(self._cache)}")
        print(f"  🎯 Cache limit: {self.max_cache_size / 1024 / 1024:.1f} MB")

# 🎮 Game simulation
class Game:
    def __init__(self):
        self.asset_manager = AssetManager(max_cache_size_mb=50)
        self.player_score = 0
    
    def load_level(self, level_num):
        """Load level assets"""
        print(f"\n🎮 Loading Level {level_num}...")
        
        # Simulate loading various assets
        assets = [
            (f"texture_ground_{level_num}", b"x" * 5000000),   # ~5MB
            (f"texture_sky_{level_num}", b"x" * 3000000),      # ~3MB
            (f"model_player_{level_num}", b"x" * 2000000),     # ~2MB
            (f"sound_music_{level_num}", b"x" * 10000000),     # ~10MB
            (f"model_enemies_{level_num}", b"x" * 4000000),    # ~4MB
        ]
        
        for name, data in assets:
            self.asset_manager.load_asset(name, data)
        
        self.asset_manager.get_stats()
    
    def play_level(self, level_num):
        """Simulate playing a level"""
        print(f"\n🎯 Playing Level {level_num}...")
        
        # Access assets multiple times
        for i in range(5):
            self.asset_manager.get_asset(f"texture_ground_{level_num}")
            self.asset_manager.get_asset(f"model_player_{level_num}")
            
            # Simulate scoring
            self.player_score += 100
            
        print(f"🏆 Score: {self.player_score}")

# 🚀 Let's play!
game = Game()

# Play through multiple levels
for level in range(1, 6):
    game.load_level(level)
    game.play_level(level)
    
    # Show memory stats after each level
    print(f"\n📊 After Level {level}:")
    game.asset_manager.get_stats()

🚀 Advanced Concepts

🧙‍♂️ Advanced Topic 1: Memory Leak Detection

When you're ready to level up, try this advanced pattern:

import gc
from collections import defaultdict
from datetime import datetime

# 🎯 Advanced memory leak detector
class MemoryLeakDetector:
    def __init__(self):
        self.snapshots = []
        self.growth_history = defaultdict(list)
    
    def take_snapshot(self, label=""):
        """Take a memory snapshot"""
        gc.collect()  # Force garbage collection
        
        snapshot = {
            'label': label,
            'timestamp': datetime.now(),
            'objects': {}
        }
        
        # Count objects by type
        for obj in gc.get_objects():
            obj_type = type(obj).__name__
            snapshot['objects'][obj_type] = snapshot['objects'].get(obj_type, 0) + 1
        
        self.snapshots.append(snapshot)
        print(f"📸 Snapshot taken: {label}")
    
    def analyze_growth(self):
        """Analyze memory growth between snapshots"""
        if len(self.snapshots) < 2:
            print("⚠️ Need at least 2 snapshots to analyze!")
            return
        
        print("\n🔍 Memory Growth Analysis:")
        
        for i in range(1, len(self.snapshots)):
            prev = self.snapshots[i-1]
            curr = self.snapshots[i]
            
            print(f"\n📊 {prev['label']} → {curr['label']}:")
            
            # Find growing object types
            growth = {}
            for obj_type, count in curr['objects'].items():
                prev_count = prev['objects'].get(obj_type, 0)
                if count > prev_count:
                    growth[obj_type] = count - prev_count
            
            # Show top 5 growing types
            sorted_growth = sorted(growth.items(), key=lambda x: x[1], reverse=True)
            for obj_type, increase in sorted_growth[:5]:
                if increase > 0:
                    print(f"  🚀 {obj_type}: +{increase} objects")
                    self.growth_history[obj_type].append(increase)
    
    def detect_leaks(self):
        """Detect potential memory leaks"""
        print("\n🚨 Potential Memory Leaks:")
        
        for obj_type, growth_list in self.growth_history.items():
            if len(growth_list) >= 3:  # Need at least 3 data points
                # Check if constantly growing
                if all(g > 0 for g in growth_list[-3:]):
                    avg_growth = sum(growth_list[-3:]) / 3
                    print(f"  💥 {obj_type}: Average growth of {avg_growth:.1f} objects/snapshot")

# 🪄 Usage example
detector = MemoryLeakDetector()

# Simulate a memory leak
leaky_list = []

detector.take_snapshot("Initial")

for i in range(5):
    # Appending to leaky_list keeps every node tree alive - a classic leak!
    class LeakyNode:
        def __init__(self, value):
            self.value = value
            self.children = []
            self.parent = None
    
    # Create circular references
    root = LeakyNode(f"root_{i}")
    for j in range(10):
        child = LeakyNode(f"child_{i}_{j}")
        child.parent = root
        root.children.append(child)
    
    leaky_list.append(root)
    detector.take_snapshot(f"Iteration {i+1}")

detector.analyze_growth()
detector.detect_leaks()

๐Ÿ—๏ธ Advanced Topic 2: Memory Optimization Patterns

For the brave developers:

# ๐Ÿš€ Advanced memory optimization patterns
import functools
import pickle
import zlib

class MemoryOptimizer:
    """Advanced memory optimization techniques"""
    
    @staticmethod
    def compress_data(data):
        """Compress data to save memory"""
        pickled = pickle.dumps(data)
        compressed = zlib.compress(pickled)
        
        original_size = sys.getsizeof(data)
        compressed_size = sys.getsizeof(compressed)
        ratio = (1 - compressed_size / original_size) * 100
        
        print(f"๐Ÿ’ซ Compression: {original_size} โ†’ {compressed_size} bytes ({ratio:.1f}% saved)")
        return compressed
    
    @staticmethod
    def decompress_data(compressed_data):
        """Decompress data when needed"""
        decompressed = zlib.decompress(compressed_data)
        return pickle.loads(decompressed)
    
    @staticmethod
    @functools.lru_cache(maxsize=128)
    def expensive_computation(n):
        """Cache expensive computations"""
        print(f"๐Ÿงฎ Computing for {n}...")
        return sum(i ** 2 for i in range(n))
    
    @staticmethod
    def lazy_property(func):
        """Decorator for lazy-loaded properties"""
        attr_name = f'_lazy_{func.__name__}'
        
        @property
        @functools.wraps(func)
        def wrapper(self):
            if not hasattr(self, attr_name):
                setattr(self, attr_name, func(self))
                print(f"๐ŸŽฏ Lazy-loaded: {func.__name__}")
            return getattr(self, attr_name)
        
        return wrapper

# Example usage
class DataProcessor:
    def __init__(self, data_size):
        self.data_size = data_size
        self._raw_data = None
    
    @MemoryOptimizer.lazy_property
    def processed_data(self):
        """Only compute when accessed"""
        return [i ** 2 for i in range(self.data_size)]
    
    @MemoryOptimizer.lazy_property
    def statistics(self):
        """Compute statistics on demand"""
        data = self.processed_data
        return {
            'mean': sum(data) / len(data),
            'max': max(data),
            'min': min(data)
        }

# Test it out
processor = DataProcessor(1000000)
print("๐ŸŽฏ Processor created (data not loaded yet)")

# Access triggers computation
print(f"๐Ÿ“Š Stats: {processor.statistics}")
print(f"๐Ÿ“Š Stats again (cached): {processor.statistics}")

โš ๏ธ Common Pitfalls and Solutions

๐Ÿ˜ฑ Pitfall 1: Circular References

# โŒ Wrong way - creates memory leak!
class Node:
    def __init__(self, value):
        self.value = value
        self.parent = None
        self.children = []
    
    def add_child(self, child):
        self.children.append(child)
        child.parent = self  # ๐Ÿ’ฅ Circular reference!

# Creating a leak
root = Node("root")
child = Node("child")
root.add_child(child)
# Even after del root, memory isn't freed!

# โœ… Correct way - use weak references!
import weakref

class Node:
    def __init__(self, value):
        self.value = value
        self._parent = None  # Will use weakref
        self.children = []
    
    @property
    def parent(self):
        return self._parent() if self._parent else None
    
    @parent.setter
    def parent(self, node):
        self._parent = weakref.ref(node) if node else None
    
    def add_child(self, child):
        self.children.append(child)
        child.parent = self  # โœ… Now using weakref!

# No more leaks!
root = Node("root")
child = Node("child")
root.add_child(child)
del root  # Memory properly freed!
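For completeness: CPython's cycle detector can eventually reclaim such cycles on its own - the weakref version just frees objects immediately and deterministically via reference counting instead of waiting. You can watch the collector do its job:

```python
import gc

class Node:
    def __init__(self, value):
        self.value = value
        self.parent = None
        self.children = []

gc.disable()  # pause automatic collection so we can observe it

# Build a parent <-> child cycle, then drop all external references
root = Node("root")
child = Node("child")
root.children.append(child)
child.parent = root
del root, child

collected = gc.collect()  # returns the number of unreachable objects found
print(f"Cycle collector reclaimed {collected} objects")
gc.enable()
```

Because the nodes only reference each other, their refcounts never reach zero on their own - gc.collect() reports a nonzero count when it breaks the cycle.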

🤯 Pitfall 2: Holding Large Objects in Memory

# ❌ Dangerous - loads the entire file into memory!
def process_large_file(filename):
    with open(filename, 'r') as f:
        data = f.read()  # 💥 Could be gigabytes!
        lines = data.split('\n')
        return [line.upper() for line in lines]

# ✅ Safer - process line by line!
def process_large_file_safely(filename):
    processed_lines = []
    with open(filename, 'r') as f:
        for line in f:  # ✅ One line at a time
            processed_lines.append(line.strip().upper())
    return processed_lines

# Even better - use a generator!
def process_large_file_generator(filename):
    with open(filename, 'r') as f:
        for line in f:
            yield line.strip().upper()  # 🚀 Constant memory, no matter the file size!

🛠️ Best Practices

  1. 🎯 Profile Before Optimizing: Don't guess - measure first!
  2. 📝 Use Appropriate Data Structures: Sets for membership tests, deque for queues
  3. 🛡️ Implement __slots__: For classes with many instances
  4. 🎨 Use Generators: For large datasets and streams
  5. ✨ Clean Up Explicitly: Use context managers and del when needed
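Best practice #3 deserves a quick demonstration. Declaring __slots__ removes the per-instance __dict__, which adds up fast when you create millions of small objects (the Point classes below are purely illustrative):

```python
import sys

class PointDict:
    """Regular class - every instance carries a __dict__."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

class PointSlots:
    """Slotted class - fixed attributes, no per-instance __dict__."""
    __slots__ = ('x', 'y')

    def __init__(self, x, y):
        self.x = x
        self.y = y

regular = PointDict(1, 2)
slotted = PointSlots(1, 2)

# Count the instance plus its attribute dict for a fair comparison
regular_cost = sys.getsizeof(regular) + sys.getsizeof(regular.__dict__)
slotted_cost = sys.getsizeof(slotted)
print(f"dict-based: {regular_cost} bytes, slotted: {slotted_cost} bytes")
```

The trade-off: slotted instances can't grow new attributes at runtime, so reserve __slots__ for simple, high-volume value objects.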

🧪 Hands-On Exercise

Create a memory-efficient image gallery system:

📋 Requirements:

  • ✅ Load and display images without exceeding a memory limit
  • 🏷️ Support thumbnail generation and caching
  • 👤 Track image metadata (size, date, tags)
  • 📅 Implement an LRU (Least Recently Used) cache
  • 🎨 Add image compression when memory is low!

🚀 Bonus Points:

  • Implement lazy loading for thumbnails
  • Add a memory usage monitoring dashboard
  • Create automatic cache size adjustment

💡 Solution
import zlib
import weakref
from collections import OrderedDict
from datetime import datetime

# 🎯 Memory-efficient image gallery!
class Image:
    def __init__(self, path, data=None):
        self.path = path
        self.metadata = {
            'size': len(data) if data else 0,
            'created': datetime.now(),
            'tags': set(),
            'access_count': 0
        }
        self._data = data
        self._thumbnail = None
        self._compressed = None
    
    @property
    def data(self):
        self.metadata['access_count'] += 1
        self.metadata['last_accessed'] = datetime.now()
        return self._data
    
    def get_thumbnail(self, size=(100, 100)):
        if not self._thumbnail:
            # Simulate thumbnail generation
            self._thumbnail = b"thumb_" + (self._data[:1000] if self._data else b"")
            print(f"🖼️ Generated thumbnail for {self.path}")
        return self._thumbnail
    
    def compress(self):
        if not self._compressed and self._data:
            # Simulate compression
            self._compressed = zlib.compress(self._data)
            original_size = len(self._data)
            compressed_size = len(self._compressed)
            print(f"💫 Compressed {self.path}: {original_size} → {compressed_size} bytes")
            
            # Free original data
            self._data = None
    
    def decompress(self):
        if self._compressed and not self._data:
            self._data = zlib.decompress(self._compressed)
            print(f"📤 Decompressed {self.path}")

class ImageGallery:
    def __init__(self, max_memory_mb=100):
        self.max_memory = max_memory_mb * 1024 * 1024
        self._images = OrderedDict()  # LRU ordering
        self._cache_size = 0
        self._compressed_images = weakref.WeakValueDictionary()
    
    def add_image(self, path, data):
        """Add image with memory management"""
        image = Image(path, data)
        image_size = len(data)
        
        # Check if we need to free memory
        while self._cache_size + image_size > self.max_memory and self._images:
            self._evict_lru_image()
        
        # Add to gallery
        self._images[path] = image
        self._cache_size += image_size
        
        print(f"✅ Added {path} ({image_size / 1024:.1f} KB)")
        self._show_stats()
    
    def get_image(self, path):
        """Get image with LRU update"""
        if path in self._images:
            # Move to end (most recently used)
            image = self._images.pop(path)
            self._images[path] = image
            
            # Decompress if needed
            if image._compressed and not image._data:
                image.decompress()
            
            return image
        
        # Check compressed storage
        if path in self._compressed_images:
            image = self._compressed_images[path]
            image.decompress()
            self._images[path] = image
            return image
        
        return None
    
    def _evict_lru_image(self):
        """Remove least recently used image"""
        if not self._images:
            return
        
        # Get oldest image (first in OrderedDict)
        path, image = next(iter(self._images.items()))
        
        # Try compression first
        if image._data and not image._compressed:
            raw_size = len(image._data)
            image.compress()
            self._compressed_images[path] = image
            # Account for the freed raw bytes; the compressed copy still counts
            self._cache_size -= raw_size - len(image._compressed)
            image.metadata['size'] = len(image._compressed)
            print(f"🗜️ Compressed {path} to save memory")
        else:
            # Remove completely
            self._images.pop(path)
            self._cache_size -= image.metadata['size']
            print(f"🗑️ Evicted {path} from cache")
    
    def _show_stats(self):
        """Display memory statistics"""
        print(f"📊 Gallery Stats: {len(self._images)} images, "
              f"{self._cache_size / 1024 / 1024:.1f}/{self.max_memory / 1024 / 1024:.1f} MB")
    
    def get_thumbnails(self):
        """Get all thumbnails efficiently"""
        thumbnails = {}
        for path, image in self._images.items():
            thumbnails[path] = image.get_thumbnail()
        return thumbnails
    
    def search_by_tag(self, tag):
        """Search images by tag"""
        results = []
        for path, image in self._images.items():
            if tag in image.metadata['tags']:
                results.append(path)
        return results
    
    def optimize_memory(self):
        """Optimize memory usage"""
        print("\n🔧 Optimizing memory...")
        
        # Sort by access count
        sorted_images = sorted(
            self._images.items(),
            key=lambda x: x[1].metadata.get('access_count', 0)
        )
        
        # Compress the least accessed third of the images
        for path, image in sorted_images[:len(sorted_images)//3]:
            if image._data and not image._compressed:
                image.compress()
        
        self._show_stats()

# 🎮 Test it out!
gallery = ImageGallery(max_memory_mb=10)

# Add some images
for i in range(20):
    # Simulate different sized images
    size = 1024 * 1024 * (i % 3 + 1)  # 1-3 MB
    data = b"x" * size
    gallery.add_image(f"image_{i}.jpg", data)

# Access some images
print("\n🔍 Accessing images...")
for i in [0, 5, 10, 15]:
    img = gallery.get_image(f"image_{i}.jpg")
    if img:
        img.metadata['tags'].add("favorite")

# Optimize memory
gallery.optimize_memory()

# Get thumbnails
print("\n🖼️ Getting thumbnails...")
thumbs = gallery.get_thumbnails()
print(f"Generated {len(thumbs)} thumbnails")

🎓 Key Takeaways

You've learned a lot! Here's what you can now do:

  • ✅ Profile memory usage with confidence 💪
  • ✅ Detect memory leaks before they cause problems 🛡️
  • ✅ Optimize memory usage in real applications 🎯
  • ✅ Implement caching strategies like a pro 🛒
  • ✅ Build memory-efficient applications that scale! 🚀

Remember: Memory management is about being smart with resources, not just using less memory! 🤝

🤝 Next Steps

Congratulations! 🎉 You've mastered memory profiling and management in Python!

Here's what to do next:

  1. 💻 Practice with the exercises above
  2. 🏗️ Profile your existing projects for memory issues
  3. 📚 Move on to our next tutorial: Advanced Profiling Techniques
  4. 🌟 Share your memory optimization wins with others!

Remember: Every Python expert was once a beginner. Keep profiling, keep optimizing, and most importantly, have fun! 🚀


Happy coding! 🎉🚀✨