
Timeit Decorator


Overview

timeit_decorator is a flexible Python library for benchmarking function execution. It supports repeated runs, parallel execution with threads or processes, detailed timing statistics, and native support for both sync and async functions.

Features

  • Multiple Runs and Workers: Run functions multiple times with configurable concurrency.
  • Sync, Async, and Auto-Detecting: Use @timeit_sync for synchronous functions, @timeit_async for coroutines, or @timeit to auto-detect at decoration time.
  • Per-Task Timeout Handling: Enforce or log timeouts individually for each execution.
  • Multiprocessing and Threading: Choose concurrency model for CPU- or I/O-bound workloads.
  • Detailed Statistics: Enable detailed=True to log a full table with average, median, min/max, stddev, and total time.
  • Auto-Scaled Time Units: Output automatically uses µs, ms, or s for readability.
  • Instance, Class, and Static Method Support: Fully supports method decorators (with limitations for multiprocessing).
  • Structured Logging Only: All output is logged using Python’s logging module.
Use Cases
  • Performance Analysis: Use the detailed parameter to get a comprehensive overview of the function's performance across multiple runs.
  • Debugging: The detailed statistics can help identify inconsistencies or anomalies in function execution, aiding in debugging efforts.

Note that enabling detailed output increases log verbosity, especially for functions executed many times. Use it judiciously, based on your performance-analysis or debugging needs.

Flexible Logging

All output is handled exclusively through Python’s logging module. The timeit_decorator automatically configures a default logger if none exists. You can customize verbosity using the log_level parameter (default: logging.INFO).
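Because output goes through the standard logging module, you can also control it from outside the decorator. A minimal sketch, assuming the library logs under the "timeit.decorator" logger name mentioned in the async note below:

```python
import logging

# Attach a handler and raise verbosity on the decorator's logger,
# without touching the root logger or other libraries.
logger = logging.getLogger("timeit.decorator")
handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s [%(levelname)s] (%(name)s) %(message)s")
)
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)
```

This is ordinary logging configuration; any handler, filter, or formatter that works with the logging module applies here as well.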

Installation

To install timeit_decorator, run the following command:

pip install timeit-decorator

Usage

Example Available

You can find a runnable example in examples/main.py.
The corresponding output is written to examples/example_output.log.

Basic Usage

Here's how to use the timeit_sync decorator:

import logging
from timeit_decorator import timeit_sync

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s [%(levelname)s] (%(name)s) %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S"
)


@timeit_sync(runs=5, workers=2, log_level=logging.INFO)
def sample_function():
    # Function implementation
    pass


# Call the decorated function
sample_function()

Auto-Detecting Decorator

@timeit inspects the decorated function at decoration time and automatically forwards to @timeit_sync or @timeit_async. It accepts all the same parameters:

from timeit_decorator import timeit


@timeit(runs=5, workers=2)
def fetch_data(url: str) -> dict:
    # sync implementation
    pass


@timeit(runs=3, workers=3, detailed=True)
async def call_api(endpoint: str) -> dict:
    # async implementation
    pass

Note: use_multiprocessing=True is silently ignored when decorating an async function — a warning is logged via the "timeit.decorator" logger.

Efficient Execution for Single Run/Worker

When runs=1 and workers=1 (the defaults), the decorator calls the function directly instead of dispatching to a worker pool:

import logging
from timeit_decorator import timeit_sync

# Configure logging
logging.basicConfig(level=logging.INFO)


# Default parameters
# @timeit_sync(
#       runs=1,
#       workers=1,
#       log_level=logging.INFO,
#       use_multiprocessing=False,
#       detailed=False,
#       timeout=None,
#       enforce_timeout=False
# )
@timeit_sync()
def quick_function():
    # Function implementation for a quick task
    pass


# Call the decorated function
quick_function()

Using Multiprocessing

For CPU-bound tasks, you can enable multiprocessing:

import logging
from timeit_decorator import timeit_sync

# Configure logging
logging.basicConfig(level=logging.DEBUG)


@timeit_sync(runs=10, workers=4, use_multiprocessing=True, log_level=logging.DEBUG)
def cpu_intensive_function():
    # CPU-bound function implementation
    pass


# Call the decorated function
cpu_intensive_function()

Using Threading (Default)

For I/O-bound tasks, the default threading model is usually the more efficient choice:

import logging
from timeit_decorator import timeit_sync

# Configure logging
logging.basicConfig(level=logging.INFO)


@timeit_sync(runs=5, workers=2)
def io_bound_function():
    # I/O-bound function implementation
    pass


# Call the decorated function
io_bound_function()

Detailed Output Option

The timeit decorator includes an optional detailed parameter that provides more extensive statistics about the function execution time when set to True. This feature is particularly useful for in-depth performance analysis and debugging, as it gives users a broader view of how the function behaves under different conditions.

Usage of the detailed Parameter

Purpose: When set to True, the timeit decorator provides a detailed tabulated output including average, median, minimum, and maximum execution times, standard deviation, and total execution time for all runs.

Example
@timeit_sync(runs=5, workers=2, detailed=True)
def sample_function(a, b, c="some value"):
    # Function implementation
    pass


sample_function("arg1", "arg2", c="value overwrite")

This will output a detailed tabulated summary after the function execution, similar to the following:

Function       mymodule.sample_function
Args           ('arg1', 'arg2')
Kwargs         {'c': 'value overwrite'}
Runs           5
Workers        2
Average Time   200.00ms
Median Time    190.00ms
Min Time       180.00ms
Max Time       220.00ms
Std Deviation  15.00ms
Total Time     1.000s
Timed Out      False

Timeout Handling

You can specify a timeout (in seconds) to monitor execution duration for each run. The enforce_timeout parameter controls how timeouts are handled:

  • enforce_timeout=False (default): Logs a warning if a run exceeds the timeout but allows it to complete.
  • enforce_timeout=True: Cancels the execution if the timeout is reached (only supported with threading and async).

Example (Non-Enforced Timeout)

import logging
from timeit_decorator import timeit_sync

logging.basicConfig(level=logging.INFO)


@timeit_sync(timeout=0.1)
def slow_function():
    import time
    time.sleep(0.2)


slow_function()

Example (Enforced Timeout with Cancellation)

@timeit_sync(timeout=0.1, enforce_timeout=True)
def fast_abort():
    import time
    time.sleep(0.2)


fast_abort()

Behavior Summary

  • If execution completes before timeout -> normal result
  • If enforce_timeout=False and timeout is exceeded -> logs warning, allows completion
  • If enforce_timeout=True and timeout is exceeded -> run is cancelled and marked as timed out

Note: Enforced timeout is not supported with use_multiprocessing=True due to Python’s process model.

Async Support

timeit_decorator fully supports asynchronous functions via the @timeit_async decorator.

You can configure it with the same options as the sync version, except use_multiprocessing (async execution uses asyncio.Semaphore for concurrency and does not support multiprocessing):

  • runs, workers
  • timeout, enforce_timeout
  • detailed, log_level
import asyncio
from timeit_decorator import timeit_async


@timeit_async(runs=3, workers=2, timeout=0.1, enforce_timeout=False)
async def async_task():
    await asyncio.sleep(0.2)
    return "done"


asyncio.run(async_task())

If enforce_timeout=False, a warning is logged if the timeout is exceeded, but the coroutine runs to completion. If enforce_timeout=True, the task is cancelled via asyncio.wait_for() if it exceeds the timeout limit.
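The enforcement mechanism can be illustrated with plain asyncio.wait_for, which is what the description above says the decorator uses internally. A stdlib-only sketch (the function names are illustrative, not part of the library):

```python
import asyncio


async def slow_task() -> str:
    # Simulates work that exceeds the timeout budget
    await asyncio.sleep(0.2)
    return "done"


async def main() -> str:
    try:
        # wait_for cancels the wrapped coroutine once the timeout elapses
        return await asyncio.wait_for(slow_task(), timeout=0.05)
    except asyncio.TimeoutError:
        return "timed out"


result = asyncio.run(main())
print(result)  # → timed out
```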

Limitations

While timeit_decorator is designed to be highly flexible, a few constraints exist due to Python's concurrency model:

Incompatibility with Static Methods and Multiprocessing

  • Static Methods and Multiprocessing: The timeit decorator does not support multiprocessing (use_multiprocessing=True) on @staticmethod. Combining the two can lead to unexpected behavior or errors, typically a PicklingError.

Reason for the Limitation: This issue arises because Python's multiprocessing module requires objects to be serialized (pickled) for transfer between processes. However, static methods pose a challenge for Python's pickling mechanism due to the way they are referenced internally. This can result in a PicklingError stating that the static method is not the same object as expected.

Example of the issue:

# This will raise a PicklingError when executed
class ExampleClass:
    @staticmethod
    @timeit_sync(use_multiprocessing=True, runs=2)
    def example_static_method():
        # method implementation
        pass

Example exception:

_pickle.PicklingError: Can't pickle <function ExampleClass.example_static_method at 0x...>: it's not the same object as __main__.ExampleClass.example_static_method

Recommended Workaround: To avoid this issue, consider using instance methods or regular functions, which are not subject to the same serialization constraints as static methods. Alternatively, refrain from using use_multiprocessing=True with static methods.
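To see why module-level functions avoid the problem, note that pickle serializes functions by reference (module plus qualified name), so a plain function round-trips cleanly while a decorated static method does not resolve back to the same object. A stdlib-only sketch, independent of timeit_decorator:

```python
import pickle


def module_level_task():
    # Module-level functions are pickled by qualified name and
    # looked up again on the receiving side, so they transfer
    # cleanly to worker processes.
    pass


# Round-tripping through pickle yields the very same object:
restored = pickle.loads(pickle.dumps(module_level_task))
assert restored is module_level_task
```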

This limitation stems from inherent characteristics of Python's multiprocessing and pickling mechanisms; structuring your code as described above avoids it. If you encounter other issues or limitations, please report them in the project's issue tracker.

Requirements

timeit_decorator requires Python 3.9+

Contributing

Contributions to timeit_decorator are welcome! Please read our contributing guidelines for more details.

License

timeit_decorator is released under the GPL-3.0 License.

Changelog

See CHANGELOG.md for a full list of changes, fixes, and new features introduced in each release.