lru_cache - 6.4.2 | Chapter 6: Functional Programming Tools in Python | Python Advance

6.4.2 - lru_cache


Interactive Audio Lesson

Listen to a student-teacher conversation explaining the topic in a relatable way.

Introduction to lru_cache

Teacher: Today, we're going to learn about `lru_cache`, a wonderful decorator in Python. Can someone guess what caching means?

Student 1: Is it about storing things to use later? Like saving images?

Teacher: Exactly! Just like images can be cached, `lru_cache` saves the results of function calls. Why do you think this might be useful?

Student 2: Because it can save time if you're calling the same function repeatedly!

Teacher: Right! This is especially handy with recursive functions like calculating Fibonacci numbers. Have you all heard of this sequence?

Student 3: Yes! Each number is the sum of the two preceding ones!

Teacher: Correct! Let's look at how we can use `lru_cache` while calculating Fibonacci numbers.

Student 4: Does it help with the speed a lot?

Teacher: It does! Let's explore an example next time. Just remember, LRU stands for Least Recently Used, which is how the cache decides what to keep.

Using lru_cache with an Example

Teacher: Here's a code snippet where we use `lru_cache`. Can someone read this code?

Student 1: Sure! It defines a function `fib` that recursively calculates Fibonacci numbers.

Teacher: Great! Notice how `@lru_cache(maxsize=None)` is placed before the function. Can someone explain what happens if we call `fib(30)`?

Student 2: It calculates the 30th Fibonacci number but uses previously cached results, so it runs faster, right?

Teacher: Exactly! By reusing results from the cache, we avoid recalculating for the same inputs. What's the output if you run this?

Student 3: I think it should be 832040!

Teacher: Correct! Who wants to try modifying the maxsize?

Student 4: What happens if we set it to a small number?

Teacher: Good question! If the cache exceeds that size, the least recently used entries are discarded to make room. Let's see that in action next session!
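The eviction behavior the teacher describes can be sketched with a small `maxsize`; the `square` function here is a hypothetical example, not part of the lesson:

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # keep only the 2 most recently used results
def square(n):
    return n * n

square(1)   # miss: cache holds results for 1
square(2)   # miss: cache holds results for 1 and 2
square(3)   # miss: cache is full, so the entry for 1 is evicted
square(3)   # hit: still cached
square(1)   # miss: was evicted as least recently used
print(square.cache_info())  # hits=1, misses=4, currsize=2
```

The `cache_info()` method reports hits, misses, and the current cache size, which makes the eviction visible.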

Introduction & Overview

Read summaries of the section's main ideas at different levels of detail.

Quick Overview

The `lru_cache` is a decorator in Python's `functools` module that enhances function performance by caching results for reuse.

Standard

In this section, we learn about the `lru_cache` decorator from the `functools` module, which caches function results. By storing the results of expensive function calls, subsequent calls with the same arguments reuse the cached result, improving performance significantly. The `maxsize` parameter controls the cache size.

Detailed

Detailed Summary of lru_cache

The `lru_cache` decorator from Python's `functools` module is a powerful tool for optimization, especially for functions that are computationally expensive. The term LRU stands for Least Recently Used, which describes the cache's eviction policy: when the cache reaches its maximum size, the least recently accessed items are purged to make room for new entries. This makes it especially useful for recursive functions such as Fibonacci, where many calls with the same arguments occur.

Key Features of lru_cache:

  • Caching Capability: It stores results of expensive function calls and returns the cached result when the same inputs occur again.
  • Performance Improvement: Highly beneficial in scenarios of repeated function calls, leading to reduced computation time.
  • Customizable Cache Size: You can specify maxsize to control how many results to store in the cache. Setting maxsize=None allows unlimited storage but may increase memory usage.

Usage Example:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040

In this example, the fib function computes Fibonacci numbers efficiently using lru_cache, showcasing the decorator's utility in handling recursion and improving performance.
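One way to see the speedup concretely (a sketch, not part of the original lesson) is to time a plain recursive version against the cached one; the function names `fib_plain` and `fib_cached` are illustrative:

```python
from functools import lru_cache
import time

def fib_plain(n):
    # naive recursion: an exponential number of repeated calls
    if n < 2:
        return n
    return fib_plain(n - 1) + fib_plain(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    # each distinct n is computed only once, then served from the cache
    if n < 2:
        return n
    return fib_cached(n - 1) + fib_cached(n - 2)

start = time.perf_counter()
plain_result = fib_plain(30)
plain_seconds = time.perf_counter() - start

start = time.perf_counter()
cached_result = fib_cached(30)
cached_seconds = time.perf_counter() - start

assert plain_result == cached_result == 832040
print(f"uncached: {plain_seconds:.4f}s, cached: {cached_seconds:.6f}s")
```

The exact timings vary by machine, but the cached version is typically orders of magnitude faster because it makes only one real computation per distinct input.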

Audio Book

Dive deep into the subject with an immersive audiobook experience.

Introduction to lru_cache

Chapter 1 of 2


Chapter Content

Caches the results of a function to improve performance.

Detailed Explanation

The lru_cache is a tool provided by the functools module in Python. It is used to store the results of expensive function calls, so that if the function is called again with the same arguments, the stored result can be returned immediately instead of recalculating it. This can greatly enhance performance, especially in recursive functions or functions that are called frequently with the same parameters.
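As a quick sketch of this idea, a call counter shows that the function body runs only once per distinct argument; the `slow_double` function and the counter are illustrative, not from the lesson:

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def slow_double(n):
    global calls
    calls += 1          # count how often the body actually runs
    return n * 2

print(slow_double(21))  # 42 -- computed and cached
print(slow_double(21))  # 42 -- returned straight from the cache
print(calls)            # 1  -- the body ran only once
```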

Examples & Analogies

Think of lru_cache like a library where once you borrow a book, the next time you want to read it, instead of ordering it from the publisher again (which takes time), the librarian quickly hands you the same book from the shelf. This saves time and resources, just as lru_cache saves computation time by reusing previous results.

Using lru_cache with a Function

Chapter 2 of 2


Chapter Content

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040

Detailed Explanation

Here, we see a practical application of lru_cache with a function that calculates Fibonacci numbers. The @lru_cache(maxsize=None) decorator is placed above the fib function. This means whenever fib() is called with a specific value of n, if that value has been calculated before, the cached value is returned instead of computing it again. Therefore, for fib(30), lru_cache speeds up the process using previously computed results, making it much faster than a naive implementation.
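`cache_info()` makes this concrete: with the cache in place, `fib(30)` computes each value of n from 0 to 30 exactly once (31 misses), and every other lookup is answered from the cache. A sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))               # 832040
print(fib.cache_info().misses)  # 31: each n from 0 to 30 computed exactly once
```

Without the cache, the same call would trigger over a million recursive calls.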

Examples & Analogies

Imagine if you were solving a complex math problem for the second time. Instead of recalculating everything from scratch, you could simply refer to the solution you found earlier. This is what lru_cache does: it keeps track of previously computed solutions to save on time and effort.

Key Concepts

  • lru_cache: A decorator that caches results of function calls to improve performance.

  • maxsize: A parameter of lru_cache that controls how many results can be stored; maxsize=None means the cache is unbounded.

  • Fibonacci Function: A common example used to illustrate the benefits of caching.

Examples & Applications

Using lru_cache to speed up Fibonacci calculations: Writing a recursive Fibonacci function with lru_cache decorator significantly reduces execution time.

Caching expensive function results: An example of determining prime numbers where previously computed results are reused.
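A sketch of the prime-number idea mentioned above; this `is_prime` implementation (simple trial division) is an assumption for illustration, not code from the original text:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def is_prime(n):
    # simple trial division; the cache makes repeated queries for the
    # same n essentially free
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

print([p for p in range(20) if is_prime(p)])  # [2, 3, 5, 7, 11, 13, 17, 19]
```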

Memory Aids

Interactive tools to help you remember key concepts

🎵

Rhymes

Cache it fast, don’t let it go; lru_cache makes functions flow.

📖

Stories

Imagine a wise old man who remembers every visitor. He only forgets the ones who haven't visited in the longest time. This is how lru_cache remembers function calls!

🧠

Memory Tools

R a P: Remember and Perform - think about how lru_cache remembers calls to perform faster.

🎯

Acronyms

LRU

Least Recently Used - a way to remember how lru_cache decides which results to keep.


Glossary

Decorator

A special type of function that modifies the behavior of another function.

Caching

The process of storing computed results for reuse to improve performance.

LRU

Least Recently Used; a cache replacement policy that removes the items that have gone longest without being accessed when storing new ones.
