tessl/pypi-kimimaro

Skeletonize densely labeled image volumes using TEASAR-derived algorithms for neuroscience and connectomics research.

Workspace: tessl
Visibility: Public
Describes: pkg:pypi/kimimaro@5.4.x

To install, run

npx @tessl/cli install tessl/pypi-kimimaro@5.4.0


Kimimaro

Kimimaro is a Python library that rapidly skeletonizes densely labeled 2D and 3D image volumes using a TEASAR-derived algorithm. It converts volumetric images into one-dimensional skeleton representations (stick figure drawings) that preserve geometric and topological structure, providing compact representations for visualization, connectivity analysis, and morphological studies in neuroscience and connectomics research.

Package Information

  • Package Name: kimimaro
  • Language: Python (with C++ extensions via Cython)
  • Installation: pip install kimimaro
  • Optional Dependencies: pip install "kimimaro[all]" for full feature support

Core Imports

import kimimaro

For type hints and advanced functionality:

from typing import Union, Dict, List, Tuple, Sequence
import numpy as np
from osteoid import Skeleton
from kimimaro import DimensionError

Basic Usage

import kimimaro
import numpy as np

# Load a densely labeled 3D volume (integers representing different objects)
labels = np.load("segmented_volume.npy")  # shape: (Z, Y, X)

# Skeletonize all non-zero labels
skeletons = kimimaro.skeletonize(
    labels,
    teasar_params={
        "scale": 1.5,           # Rolling ball invalidation scale
        "const": 300,           # Minimum radius in physical units (nm)
        "pdrf_scale": 100000,   # Penalty distance field scale
        "pdrf_exponent": 8,     # Penalty field exponent
        "soma_detection_threshold": 750,    # Soma detection threshold (nm)
        "soma_acceptance_threshold": 3500,  # Soma acceptance threshold (nm)
        "soma_invalidation_scale": 2,       # Soma invalidation scale
        "soma_invalidation_const": 300,     # Soma invalidation constant (nm)
        "max_paths": None,               # Maximum paths to trace per object
    },
    anisotropy=(16, 16, 40),    # Physical voxel dimensions (nm)
    dust_threshold=1000,        # Skip objects < 1000 voxels
    progress=True,              # Show progress bar
    parallel=1,                 # Number of processes (1=single, <=0=all CPUs)
)

# Access individual skeletons by label ID
skeleton_for_label_5 = skeletons[5]

# Post-process a skeleton to improve quality
clean_skeleton = kimimaro.postprocess(
    skeleton_for_label_5,
    dust_threshold=1500,   # Remove disconnected components < 1500 nm
    tick_threshold=3000    # Remove small branches < 3000 nm
)

# Convert skeleton to SWC format for visualization
swc_string = clean_skeleton.to_swc()
print(swc_string)

Architecture

Kimimaro implements a TEASAR-derived skeletonization algorithm optimized for densely labeled volumes:

  • TEASAR Algorithm: Finds skeleton paths by tracing through a penalty field from root points to boundary extremes
  • Penalty Field: Combines the distance-to-boundary field (DBF) and the Euclidean distance field (EDF) to guide path selection
  • Soma Detection: Identifies cellular bodies using DBF thresholds and applies specialized processing
  • Parallel Processing: Supports multi-process skeletonization for large datasets
  • Memory Optimization: Uses shared memory arrays and efficient data structures for large volumes

The algorithm works by:

  1. Computing distance transforms (DBF) for each connected component
  2. Finding root points (typically soma centers or high DBF regions)
  3. Tracing paths via Dijkstra's algorithm through the penalty field
  4. Applying rolling-ball invalidation to mark visited regions
  5. Repeating until all significant regions are covered
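The loop above can be sketched in miniature. The toy 2D version below covers steps 1–3 only: a DBF computed by multi-source BFS, a penalty field shaped like the `pdrf_scale`/`pdrf_exponent` parameters, and Dijkstra tracing from a root to a target. It omits rolling-ball invalidation and is purely illustrative — kimimaro's actual implementation is compiled C++/Cython code.

```python
# Toy 2D sketch of steps 1-3 of the TEASAR loop (illustrative only).
import heapq
import numpy as np

def distance_to_boundary(mask):
    """Multi-source BFS distance from each foreground pixel to the boundary (DBF)."""
    H, W = mask.shape
    dist = np.full(mask.shape, -1, dtype=np.int32)
    frontier = []
    for y in range(H):
        for x in range(W):
            if mask[y, x] and any(
                not (0 <= y + dy < H and 0 <= x + dx < W and mask[y + dy, x + dx])
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
            ):
                dist[y, x] = 1
                frontier.append((y, x))
    while frontier:
        nxt = []
        for y, x in frontier:
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and dist[ny, nx] < 0:
                    dist[ny, nx] = dist[y, x] + 1
                    nxt.append((ny, nx))
        frontier = nxt
    return dist

def trace_path(mask, penalty, start, goal):
    """Step 3: Dijkstra through the penalty field."""
    H, W = mask.shape
    INF = float("inf")
    cost, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        c, node = heapq.heappop(heap)
        if node == goal:
            break
        if c > cost.get(node, INF):
            continue
        y, x = node
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < H and 0 <= nx < W and mask[ny, nx]:
                nc = c + penalty[ny, nx]
                if nc < cost.get((ny, nx), INF):
                    cost[(ny, nx)] = nc
                    prev[(ny, nx)] = node
                    heapq.heappush(heap, (nc, (ny, nx)))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# A 3x10 bar: the traced path should run along the medial (middle) row.
mask = np.ones((3, 10), dtype=bool)
dbf = distance_to_boundary(mask)
# High penalty near the boundary, low near the medial axis, mimicking
# a pdrf_scale * (1 - DBF/max(DBF)) ** pdrf_exponent shaped field.
penalty = 100000.0 * (1.0 - dbf / dbf.max()) ** 8 + 1.0
root, target = (1, 0), (1, 9)   # step 2: root point; farthest extreme
path = trace_path(mask, penalty, root, target)
print(path)  # [(1, 0), (1, 1), ..., (1, 9)] — the middle row
```

Because the penalty field is steep away from the medial axis, Dijkstra is strongly biased toward centered paths, which is the core idea behind TEASAR's `pdrf_scale` and `pdrf_exponent` parameters.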

Capabilities

Core Skeletonization

Main skeletonization functionality for converting labeled volumes into skeleton representations using the TEASAR algorithm, with parallel processing support and specialized soma handling.

def skeletonize(
    all_labels,
    teasar_params=DEFAULT_TEASAR_PARAMS,
    anisotropy=(1,1,1),
    object_ids=None,
    dust_threshold=1000,
    progress=True,
    fix_branching=True,
    in_place=False,
    fix_borders=True,
    parallel=1,
    parallel_chunk_size=100,
    extra_targets_before=[],
    extra_targets_after=[],
    fill_holes=False,
    fix_avocados=False,
    voxel_graph=None
): ...
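The effect of `dust_threshold` can be shown in plain numpy: labels whose total voxel count falls below the threshold are skipped entirely. This is an illustration of the semantics, not kimimaro's internal (compiled) filtering code.

```python
# Sketch of dust_threshold semantics: small labels are skipped.
import numpy as np

labels = np.zeros((4, 4, 4), dtype=np.uint32)
labels[:3, :3, :3] = 1      # 27 voxels -- large enough to keep
labels[3, 3, :2] = 2        # 2 voxels  -- "dust", skipped

dust_threshold = 10
ids, counts = np.unique(labels, return_counts=True)
kept = {int(i) for i, c in zip(ids, counts) if i != 0 and c >= dust_threshold}
print(kept)  # {1} -- label 2 is below the threshold
```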


Post-processing and Quality Enhancement

Skeleton post-processing functions for improving quality through component joining, dust removal, loop elimination, and branch trimming.

def postprocess(
    skeleton: Skeleton,
    dust_threshold: float = 1500.0,
    tick_threshold: float = 3000.0
) -> Skeleton: ...

def join_close_components(
    skeletons: Sequence[Skeleton],
    radius: float = float('inf'),
    restrict_by_radius: bool = False
) -> Skeleton: ...
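The "tick" removal step of `postprocess` can be sketched on a toy adjacency-list graph: terminal branches shorter than the threshold are pruned back to their branch point. This is illustrative only — the real implementation operates on `osteoid` `Skeleton` objects and measures branch length in physical units, not edge counts.

```python
# Toy sketch of tick removal: prune short terminal twigs from a skeleton graph.
# Vertex -> set of neighbours; every edge has length 1 for simplicity.
adj = {
    0: {1}, 1: {0, 2}, 2: {1, 3, 5},
    3: {2, 4}, 4: {3},
    5: {2},   # 1-edge side twig ("tick") hanging off branch point 2
}

def prune_ticks(adj, tick_threshold):
    adj = {v: set(n) for v, n in adj.items()}  # work on a copy
    changed = True
    while changed:
        changed = False
        for leaf in [v for v, n in adj.items() if len(n) == 1]:
            if leaf not in adj or len(adj[leaf]) != 1:
                continue  # already removed or changed by an earlier prune
            # walk from the leaf toward the nearest branch point
            path = [leaf]
            while len(adj[path[-1]]) <= 2:
                nxt = [n for n in adj[path[-1]] if n not in path]
                if not nxt:
                    break
                path.append(nxt[0])
            # prune only short twigs that end at a genuine branch point
            if len(path) - 1 < tick_threshold and len(adj[path[-1]]) > 2:
                for v in path[:-1]:           # drop the twig, keep the branch point
                    for n in adj.pop(v):
                        adj[n].discard(v)
                changed = True
    return adj

pruned = prune_ticks(adj, tick_threshold=2)
print(sorted(pruned))  # vertex 5 (the 1-edge twig) is removed
```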


Analysis and Utilities

Advanced analysis functions for cross-sectional measurements, oversegmentation, and skeleton extraction from binary images.

def cross_sectional_area(
    all_labels: np.ndarray,
    skeletons: Union[Dict[int, Skeleton], List[Skeleton], Skeleton],
    anisotropy: np.ndarray = np.array([1,1,1], dtype=np.float32),
    smoothing_window: int = 1,
    progress: bool = False,
    in_place: bool = False,
    fill_holes: bool = False,
    repair_contacts: bool = False,
    visualize_section_planes: bool = False,
    step: int = 1
) -> Union[Dict[int, Skeleton], List[Skeleton], Skeleton]: ...

def extract_skeleton_from_binary_image(image: np.ndarray) -> Skeleton: ...

def oversegment(
    all_labels: np.ndarray,
    skeletons: Union[Dict[int, Skeleton], List[Skeleton], Skeleton],
    anisotropy: np.ndarray = np.array([1,1,1], dtype=np.float32),
    progress: bool = False,
    fill_holes: bool = False,
    in_place: bool = False,
    downsample: int = 0
) -> Tuple[np.ndarray, Union[Dict[int, Skeleton], List[Skeleton], Skeleton]]: ...
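The idea behind `cross_sectional_area`, reduced to the axis-aligned case: for a skeleton running along Z, the area at each position is the slice's voxel count times the physical area of one voxel face, as set by `anisotropy`. (kimimaro slices perpendicular to the local skeleton direction, which is more general; this sketch only shows how anisotropy enters the computation.)

```python
# Axis-aligned sketch of the cross-sectional area computation.
import numpy as np

anisotropy = np.array([16, 16, 40], dtype=np.float32)  # (x, y, z) in nm
labels = np.zeros((8, 8, 5), dtype=np.uint32)
labels[2:6, 2:6, :] = 1                                # a 4x4-voxel column along Z

voxel_face_area = anisotropy[0] * anisotropy[1]        # nm^2 per voxel in an XY slice
areas = [(labels[:, :, z] == 1).sum() * voxel_face_area
         for z in range(labels.shape[2])]
print(areas)  # 16 voxels * 256 nm^2 = 4096.0 nm^2 per slice
```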


Point Connection and Targeting

Functions for connecting specific points and converting synapse locations to skeleton targets.

def connect_points(
    labels,
    start,
    end,
    anisotropy=(1,1,1),
    fill_holes=False,
    in_place=False,
    pdrf_scale=100000,
    pdrf_exponent=4
): ...

def synapses_to_targets(labels, synapses, progress=False): ...
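Conceptually, `connect_points` finds a path between two voxels that stays inside the start voxel's label. kimimaro traces through its penalty field (note the `pdrf_scale`/`pdrf_exponent` parameters above); the toy version below uses plain BFS on a 2D array just to show the label-restricted pathfinding idea.

```python
# Minimal label-restricted pathfinding sketch (BFS instead of a penalty field).
from collections import deque
import numpy as np

labels = np.zeros((5, 5), dtype=np.uint32)
labels[2, :] = 1                        # a 1-voxel-wide horizontal wire

def bfs_connect(labels, start, end):
    label = labels[start]
    prev = {start: None}
    q = deque([start])
    while q:
        node = q.popleft()
        if node == end:
            break
        y, x = node
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < labels.shape[0] and 0 <= nx < labels.shape[1]
                    and labels[ny, nx] == label and (ny, nx) not in prev):
                prev[(ny, nx)] = node
                q.append((ny, nx))
    path, node = [], end
    while node is not None:             # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    return path[::-1]

path = bfs_connect(labels, (2, 0), (2, 4))
print(path)  # [(2, 0), (2, 1), (2, 2), (2, 3), (2, 4)]
```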


Command Line Interface

Complete command-line interface for skeletonization, visualization, and SWC file manipulation without requiring Python programming.

# Main skeletonization command
kimimaro forge <input_file> [options]

# View/visualize files
kimimaro view <file.swc|file.npy>

# Convert between formats
kimimaro swc from <binary_image>
kimimaro swc to <swc_file> --format <npy|tiff>

# Display software license
kimimaro license


Constants and Configuration

DEFAULT_TEASAR_PARAMS = {
    "scale": 1.5,
    "const": 300,
    "pdrf_scale": 100000,
    "pdrf_exponent": 8,
    "soma_acceptance_threshold": 3500,
    "soma_detection_threshold": 750,
    "soma_invalidation_const": 300,
    "soma_invalidation_scale": 2,
    "max_paths": None
}

Exception Classes

class DimensionError(Exception):
    """Raised when input arrays have incompatible dimensions for skeletonization."""
    pass