tessl/pypi-kimimaro

Skeletonize densely labeled image volumes using TEASAR-derived algorithms for neuroscience and connectomics research.

Workspace: tessl
Visibility: Public
Describes: pkg:pypi/kimimaro@5.4.x

To install, run

npx @tessl/cli install tessl/pypi-kimimaro@5.4.0

# Kimimaro

Kimimaro is a Python library that rapidly skeletonizes densely labeled 2D and 3D image volumes using a TEASAR-derived algorithm. It converts volumetric images into one-dimensional skeleton representations (stick-figure drawings) that preserve geometric and topological structure, providing compact representations for visualization, connectivity analysis, and morphological studies in neuroscience and connectomics research.
## Package Information

- **Package Name**: kimimaro
- **Language**: Python (with C++ extensions via Cython)
- **Installation**: `pip install kimimaro`
- **Optional Dependencies**: `pip install "kimimaro[all]"` for full feature support

## Core Imports

```python
import kimimaro
```

For type hints and advanced functionality:

```python
from typing import Union, Dict, List, Tuple, Sequence
import numpy as np
from osteoid import Skeleton
from kimimaro import DimensionError
```

## Basic Usage

```python
import kimimaro
import numpy as np

# Load a densely labeled 3D volume (integers representing different objects)
labels = np.load("segmented_volume.npy")  # shape: (Z, Y, X)

# Skeletonize all non-zero labels
skeletons = kimimaro.skeletonize(
    labels,
    teasar_params={
        "scale": 1.5,                      # Rolling ball invalidation scale
        "const": 300,                      # Minimum radius in physical units (nm)
        "pdrf_scale": 100000,              # Penalty distance field scale
        "pdrf_exponent": 8,                # Penalty field exponent
        "soma_detection_threshold": 750,   # Soma detection threshold (nm)
        "soma_acceptance_threshold": 3500, # Soma acceptance threshold (nm)
        "soma_invalidation_scale": 2,      # Soma invalidation scale
        "soma_invalidation_const": 300,    # Soma invalidation constant (nm)
        "max_paths": None,                 # Maximum paths to trace per object
    },
    anisotropy=(16, 16, 40),  # Physical voxel dimensions (nm)
    dust_threshold=1000,      # Skip objects smaller than 1000 voxels
    progress=True,            # Show progress bar
    parallel=1,               # Number of processes (1 = single, <= 0 = all CPUs)
)

# Access individual skeletons by label ID
skeleton_for_label_5 = skeletons[5]

# Post-process a skeleton to improve quality
clean_skeleton = kimimaro.postprocess(
    skeleton_for_label_5,
    dust_threshold=1500,  # Remove disconnected components < 1500 nm
    tick_threshold=3000,  # Remove small branches < 3000 nm
)

# Convert skeleton to SWC format for visualization
swc_string = clean_skeleton.to_swc()
print(swc_string)
```
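A note on `anisotropy`: voxel indices are unitless, and the anisotropy tuple supplies the physical size of a voxel along each axis, which is what makes thresholds like `const` (in nm) meaningful. A minimal sketch of the conversion (a hypothetical helper for illustration, not part of kimimaro's API):

```python
# Convert a voxel index to a physical position by scaling each axis by the
# voxel size along that axis (here 16 x 16 x 40 nm, as in the example above).
def voxel_to_physical(index, anisotropy=(16, 16, 40)):
    return tuple(i * w for i, w in zip(index, anisotropy))

print(voxel_to_physical((10, 10, 5)))  # -> (160, 160, 200)
```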

## Architecture

Kimimaro implements a TEASAR-derived skeletonization algorithm optimized for densely labeled volumes:

- **TEASAR Algorithm**: Finds skeleton paths by tracing through a penalty field from root points to boundary extremes
- **Penalty Field**: Combines the distance-to-boundary field (DBF) and the euclidean distance field (EDF) to guide path selection
- **Soma Detection**: Identifies cell bodies using DBF thresholds and applies specialized processing to them
- **Parallel Processing**: Supports multi-process skeletonization for large datasets
- **Memory Optimization**: Uses shared memory arrays and efficient data structures for large volumes

The algorithm works by:

1. Computing distance transforms (DBF) for each connected component
2. Finding root points (typically soma centers or high-DBF regions)
3. Tracing paths via Dijkstra's algorithm through the penalty field
4. Applying rolling-ball invalidation to mark visited regions
5. Repeating until all significant regions are covered
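The steps above can be sketched in miniature. The following is an illustration of the idea on a 2D grid, not kimimaro's implementation (which is 3D, Cython-accelerated, and repeats with rolling-ball invalidation to trace multiple paths):

```python
import heapq
from collections import deque

# Toy 2D binary object (1 = foreground).
grid = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0, 0],
]
STEPS = ((1, 0), (-1, 0), (0, 1), (0, -1))
fg = {(y, x) for y, row in enumerate(grid) for x, v in enumerate(row) if v}

# Step 1: distance-to-boundary field (DBF) via multi-source BFS from the boundary.
dbf = {}
frontier = deque(p for p in fg
                 if any((p[0] + dy, p[1] + dx) not in fg for dy, dx in STEPS))
for p in frontier:
    dbf[p] = 1
while frontier:
    y, x = frontier.popleft()
    for dy, dx in STEPS:
        q = (y + dy, x + dx)
        if q in fg and q not in dbf:
            dbf[q] = dbf[(y, x)] + 1
            frontier.append(q)

# Step 2: root = the voxel deepest inside the object.
root = max(fg, key=lambda p: dbf[p])
max_dbf = dbf[root]

# Step 3: Dijkstra through a penalty field (mirrors the idea behind
# pdrf_scale / pdrf_exponent: expensive near the boundary, cheap centrally).
def penalty(p, scale=100.0, exponent=4):
    return scale * (1 - dbf[p] / (max_dbf + 1)) ** exponent + 1

dist, prev = {root: 0.0}, {}
heap = [(0.0, root)]
while heap:
    d, u = heapq.heappop(heap)
    if d > dist[u]:
        continue
    for dy, dx in STEPS:
        q = (u[0] + dy, u[1] + dx)
        if q in fg and d + penalty(q) < dist.get(q, float("inf")):
            dist[q] = d + penalty(q)
            prev[q] = u
            heapq.heappush(heap, (dist[q], q))

# Step 4: trace one skeleton path from the most expensive-to-reach voxel back
# to the root (real TEASAR then invalidates near the path and repeats).
target = max(dist, key=dist.get)
path = [target]
while path[-1] != root:
    path.append(prev[path[-1]])
path.reverse()
print(path)
```

The path hugs the middle row of the toy object, because the penalty function makes centrally located voxels far cheaper to traverse than boundary-adjacent ones.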

## Capabilities

### Core Skeletonization

The main skeletonization entry point converts labeled volumes into skeleton representations using the TEASAR algorithm, with parallel-processing support and specialized soma handling.

```python { .api }
def skeletonize(
    all_labels,
    teasar_params=DEFAULT_TEASAR_PARAMS,
    anisotropy=(1, 1, 1),
    object_ids=None,
    dust_threshold=1000,
    progress=True,
    fix_branching=True,
    in_place=False,
    fix_borders=True,
    parallel=1,
    parallel_chunk_size=100,
    extra_targets_before=[],
    extra_targets_after=[],
    fill_holes=False,
    fix_avocados=False,
    voxel_graph=None
): ...
```

[Core Skeletonization](./core-skeletonization.md)

### Post-processing and Quality Enhancement

Skeleton post-processing functions for improving quality through component joining, dust removal, loop elimination, and branch trimming.

```python { .api }
def postprocess(
    skeleton: Skeleton,
    dust_threshold: float = 1500.0,
    tick_threshold: float = 3000.0
) -> Skeleton: ...

def join_close_components(
    skeletons: Sequence[Skeleton],
    radius: float = float('inf'),
    restrict_by_radius: bool = False
) -> Skeleton: ...
```

[Post-processing](./post-processing.md)
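To build intuition for what a `tick_threshold` does, here is a toy pruning pass over a skeleton modeled as an edge list with physical lengths. This is an illustration of the idea only; real kimimaro skeletons are osteoid `Skeleton` objects, and this is not kimimaro's implementation:

```python
from collections import defaultdict

# Hypothetical toy skeleton: (node_u, node_v) -> edge length in nm.
edges = {
    (0, 1): 4000, (1, 2): 4000,  # main trunk
    (1, 3): 1200,                # a short "tick" branching off the trunk
    (2, 4): 5000,
}

def prune_ticks(edges, tick_threshold=3000):
    edges = dict(edges)
    changed = True
    while changed:
        changed = False
        degree = defaultdict(int)
        for u, v in edges:
            degree[u] += 1
            degree[v] += 1
        for (u, v), length in list(edges.items()):
            # A tick is a short edge that terminates in a leaf (degree-1) node.
            if length < tick_threshold and (degree[u] == 1 or degree[v] == 1):
                del edges[(u, v)]
                changed = True
                break  # recompute degrees before pruning further
    return edges

print(sorted(prune_ticks(edges)))  # -> [(0, 1), (1, 2), (2, 4)]
```

The 1200 nm branch is removed because it is both short and terminal; the 5000 nm terminal branch survives because it exceeds the threshold.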

### Analysis and Utilities

Advanced analysis functions for cross-sectional measurements, oversegmentation, and skeleton extraction from binary images.

```python { .api }
def cross_sectional_area(
    all_labels: np.ndarray,
    skeletons: Union[Dict[int, Skeleton], List[Skeleton], Skeleton],
    anisotropy: np.ndarray = np.array([1, 1, 1], dtype=np.float32),
    smoothing_window: int = 1,
    progress: bool = False,
    in_place: bool = False,
    fill_holes: bool = False,
    repair_contacts: bool = False,
    visualize_section_planes: bool = False,
    step: int = 1
) -> Union[Dict[int, Skeleton], List[Skeleton], Skeleton]: ...

def extract_skeleton_from_binary_image(image: np.ndarray) -> Skeleton: ...

def oversegment(
    all_labels: np.ndarray,
    skeletons: Union[Dict[int, Skeleton], List[Skeleton], Skeleton],
    anisotropy: np.ndarray = np.array([1, 1, 1], dtype=np.float32),
    progress: bool = False,
    fill_holes: bool = False,
    in_place: bool = False,
    downsample: int = 0
) -> Tuple[np.ndarray, Union[Dict[int, Skeleton], List[Skeleton], Skeleton]]: ...
```

[Analysis and Utilities](./analysis-utilities.md)
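The `smoothing_window` parameter suggests averaging per-vertex area measurements along the skeleton path. A minimal sketch of a centered moving average (an illustration of the idea, not kimimaro's implementation):

```python
# Centered moving average over a sequence of per-vertex measurements; the
# window shrinks at the ends of the path so every vertex keeps a value.
def smooth(values, window=3):
    half = window // 2
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(smooth([1.0, 2.0, 9.0, 2.0, 1.0], window=3))
```

Note how the outlier measurement (9.0) is damped while the overall profile of the sequence is preserved.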

### Point Connection and Targeting

Functions for connecting specific points and converting synapse locations into skeleton targets.

```python { .api }
def connect_points(
    labels,
    start,
    end,
    anisotropy=(1, 1, 1),
    fill_holes=False,
    in_place=False,
    pdrf_scale=100000,
    pdrf_exponent=4
): ...

def synapses_to_targets(labels, synapses, progress=False): ...
```

[Point Connection](./point-connection.md)
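The essence of `connect_points` is finding a path between two coordinates that never leaves the labeled object. A minimal BFS sketch of that idea on a 2D grid (illustration only; kimimaro traces through a penalty field rather than plain BFS):

```python
from collections import deque

# Toy labeled grid: the path must stay on nonzero cells.
labels = [
    [1, 1, 0],
    [0, 1, 0],
    [0, 1, 1],
]

def connect(labels, start, end):
    H, W = len(labels), len(labels[0])
    prev = {start: None}
    q = deque([start])
    while q:
        y, x = q.popleft()
        if (y, x) == end:
            break
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            n = (y + dy, x + dx)
            if 0 <= n[0] < H and 0 <= n[1] < W and labels[n[0]][n[1]] and n not in prev:
                prev[n] = (y, x)
                q.append(n)
    # Walk parent pointers back from end to start.
    path, node = [], end
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

print(connect(labels, (0, 0), (2, 2)))  # -> [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]
```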

### Command Line Interface

A complete command-line interface for skeletonization, visualization, and SWC file manipulation, with no Python programming required.

```bash { .api }
# Main skeletonization command
kimimaro forge <input_file> [options]

# View/visualize files
kimimaro view <file.swc|file.npy>

# Convert between formats
kimimaro swc from <binary_image>
kimimaro swc to <swc_file> --format <npy|tiff>

# Display software license
kimimaro license
```

[CLI Interface](./cli-interface.md)

## Constants and Configuration

```python { .api }
DEFAULT_TEASAR_PARAMS = {
    "scale": 1.5,
    "const": 300,
    "pdrf_scale": 100000,
    "pdrf_exponent": 8,
    "soma_acceptance_threshold": 3500,
    "soma_detection_threshold": 750,
    "soma_invalidation_const": 300,
    "soma_invalidation_scale": 2,
    "max_paths": None
}
```
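Callers usually override only a few of these knobs. A plain dict merge keeps the remaining defaults (a sketch of typical usage, with the defaults copied from the constant above):

```python
DEFAULT_TEASAR_PARAMS = {
    "scale": 1.5,
    "const": 300,
    "pdrf_scale": 100000,
    "pdrf_exponent": 8,
    "soma_acceptance_threshold": 3500,
    "soma_detection_threshold": 750,
    "soma_invalidation_const": 300,
    "soma_invalidation_scale": 2,
    "max_paths": None,
}

# Override just the parameters you care about; everything else keeps its default.
teasar_params = {**DEFAULT_TEASAR_PARAMS, "const": 500, "max_paths": 50}
print(teasar_params["const"], teasar_params["scale"])  # -> 500 1.5
```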

## Exception Classes

```python { .api }
class DimensionError(Exception):
    """Raised when input arrays have incompatible dimensions for skeletonization."""
    pass
```