tessl/pypi-fastcore

Python supercharged for fastai development

docs/extended.md

Extended Functionality

Advanced utilities that extend fastcore's core functionality with specialized tools for complex development scenarios. The xtras module covers file system operations, caching systems, async helpers, data serialization, system execution, and data transformation.

Capabilities

File System Operations

Comprehensive file system utilities including directory walking, glob operations, and path manipulation with advanced filtering capabilities.

def walk(path, symlinks=True, keep_file=ret_true, keep_folder=ret_true, skip_folder=ret_false, func=os.path.join, ret_folders=False):
    """
    Generator version of os.walk with functional filtering.
    
    Enhanced directory traversal with customizable file and folder filtering
    using predicate functions. Provides more control than standard os.walk.
    
    Parameters:
    - path: Path|str, starting directory path
    - symlinks: bool, follow symbolic links (default: True)
    - keep_file: callable, function returning True for wanted files (default: ret_true)
    - keep_folder: callable, function returning True for folders to enter (default: ret_true)
    - skip_folder: callable, function returning True for folders to skip (default: ret_false)
    - func: callable, function to apply to each matched file (default: os.path.join)
    - ret_folders: bool, yield folders in addition to files (default: False)
    
    Yields:
    Results of func(root, name) for each matching file/folder
    """

def globtastic(path, recursive=True, symlinks=True, file_glob=None, file_re=None, 
               folder_re=None, skip_file_glob=None, skip_file_re=None, 
               skip_folder_re=None, func=os.path.join, ret_folders=False):
    """
    Enhanced glob with regex matches, symlink handling, and skip parameters.
    
    More powerful alternative to glob.glob with support for regex patterns,
    exclusion rules, and advanced filtering options.
    
    Parameters:
    - path: Path|str, starting directory path
    - recursive: bool, search subdirectories (default: True)
    - symlinks: bool, follow symbolic links (default: True)
    - file_glob: str, glob pattern for file inclusion
    - file_re: str, regex pattern for file inclusion
    - folder_re: str, regex pattern for folder inclusion
    - skip_file_glob: str, glob pattern for file exclusion
    - skip_file_re: str, regex pattern for file exclusion
    - skip_folder_re: str, regex pattern for folder exclusion
    - func: callable, function to apply to matches
    - ret_folders: bool, include folders in results
    
    Returns:
    L: List of paths matching the criteria
    """

def mkdir(path, exist_ok=False, parents=False, overwrite=False, **kwargs):
    """
    Create directory with enhanced options.
    
    Creates directory with Path support and optional overwrite capability.
    More flexible than standard os.makedirs.
    
    Parameters:
    - path: Path|str, directory to create
    - exist_ok: bool, don't raise error if directory exists
    - parents: bool, create parent directories as needed
    - overwrite: bool, remove existing directory first
    - **kwargs: additional arguments passed to Path.mkdir
    
    Returns:
    Path: The created directory path
    """

@contextmanager
def maybe_open(f, mode='r', **kwargs):
    """
    Context manager: open file if path, otherwise return as-is.
    
    Flexible file handling that works with both file paths and file objects.
    Automatically handles opening and closing for paths while passing through
    already opened file objects.
    
    Parameters:
    - f: str|PathLike|file-like, file path or file object
    - mode: str, file opening mode (default: 'r')
    - **kwargs: additional arguments for open()
    
    Yields:
    File object ready for reading/writing
    """
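The path-or-file behavior above can be sketched with the stdlib alone. This is a minimal approximation, not fastcore's actual implementation; the `_maybe_open` name is illustrative:

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def _maybe_open(f, mode='r', **kwargs):
    # Open f if it looks like a path; pass through existing file objects
    if isinstance(f, (str, bytes)) or hasattr(f, '__fspath__'):
        with open(f, mode, **kwargs) as fh:
            yield fh
    else:
        yield f

# Behaves the same whether given a path or an already-open file
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, 'w') as fh:
    fh.write('hello')

with _maybe_open(path) as fh:
    from_path = fh.read()
with open(path) as fh, _maybe_open(fh) as same:
    from_obj = same.read()
os.remove(path)
```

The key design point is that the caller never needs to branch on input type; cleanup only happens for files the helper itself opened.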

Data Serialization and Loading

Functions for handling various data formats including JSON, pickle, and compressed data.

def loads(s, **kwargs):
    """
    Load JSON with enhanced error handling.
    
    JSON loading with better error messages and support for Path objects.
    Handles common JSON parsing errors gracefully.
    
    Parameters:
    - s: str|Path, JSON string or file path
    - **kwargs: additional arguments for json.loads
    
    Returns:
    Parsed JSON object
    
    Raises:
    Informative errors for malformed JSON
    """

def loads_multi(s, **kwargs):
    """
    Lazily decode multiple JSON objects from a string or file.
    
    Handles JSONL (JSON Lines) format and multiple JSON objects
    separated by whitespace.
    
    Parameters:
    - s: str|Path, JSON string or file path containing multiple objects
    - **kwargs: additional arguments for json.loads
    
    Yields:
    Parsed JSON objects, one per decoded document
    """

def dumps(obj, **kwargs):
    """
    Dump object to JSON with Path handling.
    
    Enhanced JSON serialization with support for Path objects
    and sensible defaults for common use cases.
    
    Parameters:
    - obj: Object to serialize
    - **kwargs: additional arguments for json.dumps
    
    Returns:
    str: JSON string representation
    """
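Multi-document decoding of this kind can be built on the stdlib's `json.JSONDecoder.raw_decode`. A hedged sketch of `loads_multi`-style parsing (the `_loads_multi` name is illustrative, not the library's code):

```python
import json

def _loads_multi(s):
    # Decode successive JSON documents from one string via raw_decode,
    # which returns (object, index-just-past-the-document)
    dec, idx, out = json.JSONDecoder(), 0, []
    while idx < len(s):
        if s[idx].isspace():  # skip whitespace between documents
            idx += 1
            continue
        obj, idx = dec.raw_decode(s, idx)
        out.append(obj)
    return out

objs = _loads_multi('{"a": 1}\n{"b": 2} {"c": 3}')
```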

def save_pickle(fn, o):
    """
    Save object to pickle file.
    
    Parameters:
    - fn: Path|str, file path for pickle
    - o: Object to pickle
    """

def load_pickle(fn):
    """
    Load object from pickle file.
    
    Parameters:
    - fn: Path|str, pickle file path
    
    Returns:
    Unpickled object
    """

def bunzip(fn):
    """
    Decompress bzip2 data from file or bytes.
    
    Parameters:
    - fn: Path|str|bytes, bzip2 compressed data or file path
    
    Returns:
    bytes: Decompressed data
    """
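`save_pickle` and `load_pickle` are presumably thin convenience wrappers over a stdlib pickle round-trip; the equivalent stdlib steps are:

```python
import os
import pickle
import tempfile

# Round-trip an object through a pickle file
data = {'model': 'resnet50', 'lr': 0.01}
fd, fn = tempfile.mkstemp()
os.close(fd)
with open(fn, 'wb') as f:
    pickle.dump(data, f)       # roughly what save_pickle(fn, o) does
with open(fn, 'rb') as f:
    restored = pickle.load(f)  # roughly what load_pickle(fn) does
os.remove(fn)
```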

System Execution and Environment

Utilities for running system commands and managing execution environment.

def run(cmd, *rest, same_in_win=False, ignore_ex=False, as_bytes=False, stderr=False):
    """
    Run system command with enhanced error handling.
    
    Execute shell commands with flexible argument handling and
    comprehensive error reporting. Supports both string and list
    command specifications.
    
    Parameters:
    - cmd: str|list, command to execute
    - *rest: additional command arguments
    - same_in_win: bool, don't modify command for Windows compatibility
    - ignore_ex: bool, return None instead of raising on errors
    - as_bytes: bool, return output as bytes instead of string
    - stderr: bool, include stderr in output
    
    Returns:
    str|bytes|None: Command output or None if error and ignore_ex=True
    
    Raises:
    IOError: If command fails and ignore_ex=False
    """

def exec_eval(code, glob=None, loc=None):
    """
    Execute code and return result of last expression.
    
    Enhanced exec that can return the value of the last expression
    in the code block, combining exec and eval functionality.
    
    Parameters:
    - code: str, Python code to execute
    - glob: dict, global namespace (default: None)
    - loc: dict, local namespace (default: None)
    
    Returns:
    Result of last expression in code, or None if no expression
    """

@contextmanager
def modified_env(*delete, **replace):
    """
    Context manager to temporarily modify environment variables.
    
    Parameters:
    - *delete: str, names of environment variables to unset
    - **replace: environment variables to set
    
    Usage:
    with modified_env('TMPDIR', PATH="/new/path", DEBUG="1"):
        # Environment modified here
        subprocess.run(...)
    # Environment restored here
    """

def set_num_threads(n):
    """
    Set number of threads for various libraries.
    
    Sets thread counts for NumPy, OpenMP, and other numerical libraries
    to optimize performance on multi-core systems.
    
    Parameters:
    - n: int, number of threads to use
    """

Caching and Performance

Advanced caching utilities including flexible caching policies and cached iterators.

def flexicache(*funcs, maxsize=128):
    """
    Flexible caching decorator with multiple policies.
    
    LRU-style cache whose entries are invalidated by policy functions,
    such as time-based expiration (time_policy) or file modification
    checks (mtime_policy).
    
    Parameters:
    - *funcs: callables, policies returning truthy when an entry is stale
    - maxsize: int, maximum number of cached entries (default: 128)
    
    Returns:
    Decorator function for caching
    """

def time_policy(seconds):
    """
    Cache policy based on time expiration.
    
    Parameters:
    - seconds: int|float, cache duration in seconds
    
    Returns:
    Policy function for time-based cache expiration
    """

def mtime_policy(filepath):
    """
    Cache policy based on file modification time.
    
    Invalidates cache when the watched file is modified.
    
    Parameters:
    - filepath: Path|str, file whose modification time is checked
    
    Returns:
    Policy function for mtime-based cache invalidation
    """

def timed_cache(seconds=60, maxsize=None):
    """
    Simple time-based cache decorator.
    
    Parameters:
    - seconds: int, cache duration in seconds (default: 60)
    - maxsize: int|None, maximum number of cached entries
    
    Returns:
    Caching decorator
    """

class CachedIter:
    """
    Iterator that caches results for repeated iteration.
    
    Allows multiple iterations over the same data without
    recomputing expensive operations.
    
    Parameters:
    - func: callable, function that returns an iterator
    - *args: arguments for func
    - **kwargs: keyword arguments for func
    """
    
    def __init__(self, func, *args, **kwargs): ...
    def __iter__(self): ...

class CachedAwaitable:
    """
    Cache results of async operations.
    
    Provides caching for coroutines and async functions to avoid
    redundant async operations.
    
    Parameters:
    - coro: coroutine to cache
    """
    
    def __init__(self, coro): ...
    def __await__(self): ...

def reawaitable(f):
    """
    Decorator to make async functions re-awaitable.
    
    Allows the same coroutine result to be awaited multiple times
    without re-execution.
    
    Parameters:
    - f: async function to make re-awaitable
    
    Returns:
    Decorated async function
    """
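The re-awaitable pattern can be demonstrated with a minimal stdlib sketch. The `_CachedAwaitable`/`_reawaitable` names are illustrative stand-ins, not fastcore's implementation:

```python
import asyncio

class _CachedAwaitable:
    # Run the wrapped coroutine once; replay its result on later awaits
    def __init__(self, coro):
        self.coro, self.done = coro, False
    def __await__(self):
        if not self.done:
            self.result = yield from self.coro.__await__()
            self.done = True
        return self.result

def _reawaitable(f):
    def wrapper(*args, **kwargs):
        return _CachedAwaitable(f(*args, **kwargs))
    return wrapper

calls = 0

@_reawaitable
async def fetch():
    global calls
    calls += 1
    return 42

async def main():
    r = fetch()
    return await r, await r  # second await reuses the cached result

a, b = asyncio.run(main())
```

Note that a plain coroutine raises `RuntimeError` when awaited twice; caching the result in `__await__` is what makes repeat awaits safe.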

Data Transformation and Utilities

Specialized classes and functions for data manipulation and object transformation.

class IterLen:
    """
    Iterator wrapper that supports len().
    
    Wraps iterators to provide length information when the
    underlying iterable has a __len__ method.
    
    Parameters:
    - items: iterable to wrap
    """
    
    def __init__(self, items): ...
    def __iter__(self): ...
    def __len__(self): ...
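A minimal sketch of the idea behind IterLen, assuming the wrapped iterable defines `__len__` (illustrative, not the library's code):

```python
class _IterLen:
    # Wrap an iterable so the resulting iterator still reports len()
    def __init__(self, items):
        self.items = items
        self.it = iter(items)
    def __iter__(self): return self
    def __next__(self): return next(self.it)
    def __len__(self): return len(self.items)

wrapped = _IterLen([1, 2, 3])
n = len(wrapped)       # length available before consuming
total = sum(wrapped)   # still iterates normally
```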

class ReindexCollection:
    """
    Collection that supports reindexing operations.
    
    Provides advanced indexing capabilities with support for
    reordering and subset selection.
    
    Parameters:
    - items: initial collection items
    """
    
    def __init__(self, items): ...
    def reindex(self, idxs): ...
    def __getitem__(self, idx): ...

class SaveReturn:
    """
    Context manager to save and return iteration results.
    
    Captures all yielded values from an iterator for later access
    while still allowing normal iteration.
    
    Parameters:
    - it: iterator to wrap
    """
    
    def __init__(self, it): ...
    def __enter__(self): ...
    def __exit__(self, *args): ...
    def __iter__(self): ...

def dict2obj(d, **kwargs):
    """
    Convert dictionary to object with attribute access.
    
    Parameters:
    - d: dict, dictionary to convert
    - **kwargs: additional attributes to set
    
    Returns:
    Object with dictionary keys as attributes
    """

def obj2dict(obj, **kwargs):
    """
    Convert object to dictionary.
    
    Parameters:
    - obj: object to convert
    - **kwargs: additional dictionary entries
    
    Returns:
    dict: Object attributes as dictionary
    """

def is_listy(x):
    """
    Check if object is list-like but not string/dict.
    
    Parameters:
    - x: object to check
    
    Returns:
    bool: True if list-like (iterable but not string/dict)
    """

def mapped(f):
    """
    Create a mapped version of an iterable.
    
    Parameters:
    - f: function to map over items
    
    Returns:
    Function that applies f to iterables
    """
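A hedged stdlib sketch of the two helpers above; the exact type checks and the `mapped` signature in fastcore may differ (here the sketch takes the iterable as a second argument):

```python
from collections.abc import Generator

def _is_listy(x):
    # List-like means tuple/list/slice/generator, but never str or dict
    return isinstance(x, (tuple, list, slice, Generator))

def _mapped(f, it):
    # Map f over list-like inputs; apply f directly to scalars
    return [f(o) for o in it] if _is_listy(it) else f(it)

a = _is_listy([1, 2])
b = _is_listy("abc")
c = _mapped(lambda x: x * 2, [1, 2, 3])
d = _mapped(lambda x: x * 2, 5)
```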

String and Formatting Utilities

Text processing and formatting functions for display and analysis.

def truncstr(s, maxlen=50, suf='…'):
    """
    Truncate string with suffix if too long.
    
    Parameters:
    - s: str, string to truncate
    - maxlen: int, maximum length (default: 50)
    - suf: str, suffix for truncated strings (default: '…')
    
    Returns:
    str: Truncated string with suffix if needed
    """

def sparkline(data, mn=None, mx=None):
    """
    Create Unicode sparkline from numeric data.
    
    Parameters:
    - data: iterable of numbers
    - mn: float, minimum value (auto-detected if None)
    - mx: float, maximum value (auto-detected if None)
    
    Returns:
    str: Unicode sparkline representation
    """

class PartialFormatter(string.Formatter):
    """
    String formatter that handles missing keys gracefully.
    
    Allows partial string formatting where some placeholders
    can remain unfilled without raising KeyError.
    """
    
    def get_value(self, key, args, kwargs): ...

def partial_format(s, **kwargs):
    """
    Partially format string, leaving unfilled placeholders.
    
    Parameters:
    - s: str, format string with placeholders
    - **kwargs: values for available placeholders
    
    Returns:
    str: Partially formatted string
    """

def stringfmt_names(s):
    """
    Extract placeholder names from format string.
    
    Parameters:
    - s: str, format string
    
    Returns:
    set: Set of placeholder names
    """
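Placeholder extraction like this can be built on the stdlib's `string.Formatter` parser; a minimal sketch (the `_stringfmt_names` name is illustrative):

```python
import string

def _stringfmt_names(s):
    # Formatter().parse yields (literal, field_name, spec, conversion);
    # field_name is None for trailing literal text, so filter it out
    return {name for _, name, _, _ in string.Formatter().parse(s) if name}

names = _stringfmt_names("Hello {name}, your score is {score}")
```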

Development and Debugging Tools

Utilities for development workflow including timing, tracing, and code analysis.

class EventTimer:
    """
    Timer for measuring event durations with context manager support.
    
    Provides precise timing measurements with automatic formatting
    and optional storage of timing results.
    
    Parameters:
    - store: bool, store timing results (default: False)
    - run: bool, start timing immediately (default: True)
    """
    
    def __init__(self, store=False, run=True): ...
    def __enter__(self): ...
    def __exit__(self, *args): ...
    def start(self): ...
    def stop(self): ...

def trace(f):
    """
    Decorator to trace function calls with arguments and results.
    
    Prints function name, arguments, and return values for debugging.
    Useful for understanding program flow and data transformations.
    
    Parameters:
    - f: function to trace
    
    Returns:
    Wrapped function with tracing
    """

def modify_exception(e, msg_func, **kwargs):
    """
    Modify exception message with additional context.
    
    Parameters:
    - e: Exception to modify
    - msg_func: function to transform exception message
    - **kwargs: additional context for message
    
    Returns:
    Modified exception
    """

def round_multiple(x, mult=1):
    """
    Round number to nearest multiple.
    
    Parameters:
    - x: number to round
    - mult: multiple to round to (default: 1)
    
    Returns:
    Number rounded to nearest multiple
    """
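The rounding described above reduces to one line of arithmetic; a minimal sketch (fastcore's version may accept extra options such as a rounding direction):

```python
def _round_multiple(x, mult=1):
    # Divide, round to the nearest integer, scale back up
    return round(x / mult) * mult

a = _round_multiple(37, 10)
b = _round_multiple(5.2, 0.5)
```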

class ContextManagers:
    """
    Manage multiple context managers as a single unit.
    
    Allows entering and exiting multiple context managers together,
    with proper cleanup even if some fail.
    
    Parameters:
    - *cms: context managers to manage
    """
    
    def __init__(self, *cms): ...
    def __enter__(self): ...
    def __exit__(self, *args): ...

Usage Examples

Advanced File Operations

from fastcore.xtras import walk, globtastic, mkdir

# Advanced directory walking with filtering
python_files = list(walk(
    "src", 
    keep_file=lambda root, name: name.endswith('.py'),
    skip_folder=lambda root, name: name.startswith('.')
))

# Powerful glob with regex and exclusions
source_files = globtastic(
    "project",
    file_re=r".*\.(py|js|ts)$",
    skip_folder_re=r"(node_modules|__pycache__|\.git)",
    recursive=True
)

# Create directory with overwrite
build_dir = mkdir("build", parents=True, overwrite=True)

Caching and Performance

from fastcore.xtras import flexicache, time_policy, timed_cache, CachedIter

# Flexible caching with a time-based invalidation policy
@flexicache(time_policy(600))
def expensive_computation(data):
    # Time-intensive processing
    return process_data(data)

# Simple time-based caching
@timed_cache(seconds=300)  # 5 minute cache
def api_call(endpoint):
    return requests.get(endpoint).json()

# Cache iterator results
def data_generator():
    for i in range(1000000):
        yield expensive_transform(i)

cached_data = CachedIter(data_generator)
# First iteration computes and caches
for item in cached_data: process(item)
# Second iteration uses cached results
for item in cached_data: analyze(item)

System Execution and Environment

from fastcore.xtras import run, modified_env, exec_eval

# Enhanced command execution
result = run(["python", "script.py", "--flag"], 
             stderr=True, ignore_ex=True)

# Temporary environment modification
with modified_env(DEBUG="1", LOG_LEVEL="verbose"):
    run("my_app --config dev.json")

# Execute code and get result
result = exec_eval("""
x = 10
y = 20
x + y  # This value is returned
""")
print(result)  # 30

Data Transformation

from fastcore.xtras import dict2obj, ReindexCollection, SaveReturn

# Dictionary to object conversion
config = dict2obj({
    "model": "resnet50", 
    "lr": 0.01,
    "batch_size": 32
})
print(config.model)  # "resnet50"

# Reindexing collections
data = ReindexCollection([10, 20, 30, 40, 50])
subset = data.reindex([0, 2, 4])  # [10, 30, 50]

# Save iteration results
def number_generator():
    for i in range(5):
        yield i * i

with SaveReturn(number_generator()) as saver:
    for val in saver:
        print(f"Processing {val}")

results = saver.values  # [0, 1, 4, 9, 16]

String Processing and Formatting

from fastcore.xtras import truncstr, sparkline, partial_format

# String truncation
long_text = "This is a very long string that needs truncation"
short = truncstr(long_text, maxlen=20)  # "This is a very lo…"

# Unicode sparklines
data = [1, 3, 7, 4, 2, 8, 5, 3, 2, 1]
chart = sparkline(data)  # "▁▃▇▄▂█▅▃▂▁"

# Partial string formatting
template = "Hello {name}, your score is {score}"
partial = partial_format(template, name="Alice")
# "Hello Alice, your score is {score}"

Development Tools

from fastcore.xtras import EventTimer, trace, ContextManagers

# Precise timing
with EventTimer() as timer:
    expensive_operation()
print(f"Operation took {timer.elapsed:.3f} seconds")

# Function tracing
@trace
def fibonacci(n):
    if n <= 1: return n
    return fibonacci(n-1) + fibonacci(n-2)

# Managing multiple contexts
import tempfile
from contextlib import contextmanager

@contextmanager
def temp_file():
    f = tempfile.NamedTemporaryFile()  # create temp file
    try: yield f
    finally: f.close()                 # cleanup

with ContextManagers(open('file1.txt'), temp_file(), EventTimer()):
    # All contexts entered together
    process_files()
# All contexts properly exited

Install with Tessl CLI

npx tessl i tessl/pypi-fastcore
