# Laspy

Laspy is a comprehensive Python library for reading, modifying, and creating LAS LiDAR (Light Detection and Ranging) files. It provides native Python support for both the standard LAS format and its compressed LAZ variant, enabling efficient processing of point cloud data from laser scanning, with support for streaming, chunking, and the Cloud Optimized Point Cloud (COPC) layout.

## Package Information

- **Package Name**: laspy
- **Language**: Python
- **Installation**: `pip install laspy` (basic LAS support)
- **Installation with LAZ**: `pip install laspy[lazrs]` or `pip install laspy[laszip]`

## Core Imports

```python
import laspy
```

For specific functionality:

```python
from laspy import LasData, LasHeader, LasReader, LasWriter
from laspy import CopcReader, Bounds  # COPC support
from laspy import PointFormat, ExtraBytesParams  # Point format handling
from laspy import VLR  # Variable Length Records
```

## Basic Usage

### Simple Read and Write

```python
import laspy

# Read entire LAS file into memory
las = laspy.read('input.las')

# Filter points (keep only ground points)
las.points = las.points[las.classification == 2]

# Write to compressed LAZ format
las.write('ground.laz')
```

### Header Inspection

```python
import laspy

# Open file for header inspection (doesn't load points)
with laspy.open('input.las') as f:
    print(f"Point format: {f.header.point_format}")
    print(f"Number of points: {f.header.point_count}")
    print(f"Bounds: {f.header.mins} to {f.header.maxs}")
```

### Chunked Processing

```python
import laspy

# Process large files in chunks to save memory
with laspy.open('large.laz') as input_las:
    with laspy.open('filtered.laz', mode="w", header=input_las.header) as output_las:
        for points in input_las.chunk_iterator(2_000_000):
            # Filter and write ground points
            ground_points = points[points.classification == 2]
            output_las.write_points(ground_points)
```

## Architecture

Laspy follows a hierarchical design centered around key components:

- **LasData**: Main container synchronizing header, points, and VLRs
- **LasHeader**: File metadata, coordinate reference systems, and format specifications
- **Point Records**: Raw (`PackedPointRecord`) and scaled (`ScaleAwarePointRecord`) point data
- **I/O Handlers**: Readers, writers, and appenders for different access patterns
- **Point Formats**: Flexible point format definitions supporting standard and custom dimensions
- **Compression**: Multiple LAZ backends (lazrs, laszip) with selective decompression

## Capabilities

### Core I/O Operations

High-level functions for reading, writing, creating, and memory-mapping LAS files. Supports streaming, chunked processing, and multiple compression backends.

```python { .api }
def read(source, closefd=True, laz_backend=None, decompression_selection=None, encoding_errors="strict") -> LasData: ...
def open(source, mode="r", closefd=True, laz_backend=None, header=None, do_compress=None, encoding_errors="strict", read_evlrs=True, decompression_selection=None): ...
# Returns LasReader (mode="r"), LasWriter (mode="w"), or LasAppender (mode="a")
def create(*, point_format=None, file_version=None) -> LasData: ...
def mmap(filename) -> LasMMAP: ...  # Memory-mapped LAS data (extends LasData)
def convert(source_las, *, point_format_id=None, file_version=None) -> LasData: ...
```

[Core I/O Operations](./core-io.md)

### Point Data Handling

Comprehensive point format management including standard LAS point formats, custom extra dimensions, and efficient point record processing with coordinate scaling.

```python { .api }
class PointFormat:
    def __init__(self, point_format_id: int): ...
    def add_extra_dimension(self, params: ExtraBytesParams): ...
    def dimension_by_name(self, name: str) -> DimensionInfo: ...

class PackedPointRecord:
    @staticmethod
    def zeros(point_count, point_format) -> PackedPointRecord: ...
    def __getitem__(self, key): ...
    def __setitem__(self, key, value): ...

class ScaleAwarePointRecord(PackedPointRecord):
    @staticmethod
    def zeros(point_count, *, point_format=None, scales=None, offsets=None, header=None) -> ScaleAwarePointRecord: ...
    def change_scaling(self, scales=None, offsets=None): ...
```

[Point Data Handling](./point-data.md)

### COPC Operations

Cloud Optimized Point Cloud support for efficient web-based and spatial querying of large LiDAR datasets with HTTP streaming capabilities.

```python { .api }
class CopcReader:
    @classmethod
    def open(cls, source, http_num_threads=None, decompression_selection=None) -> CopcReader: ...
    def query(self, bounds=None, resolution=None, level=None) -> ScaleAwarePointRecord: ...
    def spatial_query(self, bounds: Bounds) -> ScaleAwarePointRecord: ...
    def level_query(self, level) -> ScaleAwarePointRecord: ...

class Bounds:
    def __init__(self, mins, maxs): ...
    def overlaps(self, other: Bounds) -> bool: ...
```

[COPC Operations](./copc.md)

### Data Container Classes

Primary data container classes for managing LAS file components including headers, point data, and metadata integration.

```python { .api }
class LasData:
    def __init__(self, header: LasHeader, points=None): ...
    def add_extra_dim(self, params: ExtraBytesParams): ...
    def write(self, destination, do_compress=None, laz_backend=None): ...
    def change_scaling(self, scales=None, offsets=None): ...

class LasHeader:
    def __init__(self, *, version=None, point_format=None): ...
    def add_extra_dims(self, params: List[ExtraBytesParams]): ...
    def add_crs(self, crs, keep_compatibility=True): ...
    def update(self, points: PackedPointRecord): ...
```

[Data Container Classes](./data-containers.md)

### I/O Handler Classes

Specialized reader, writer, and appender classes for different file access patterns including streaming, chunked processing, and memory mapping.

```python { .api }
class LasReader:
    def __init__(self, source, closefd=True, laz_backend=None, read_evlrs=True, decompression_selection=None): ...
    def read_points(self, n: int) -> ScaleAwarePointRecord: ...
    def chunk_iterator(self, points_per_iteration: int): ...

class LasWriter:
    def __init__(self, dest, header: LasHeader, do_compress=None, laz_backend=None, closefd=True, encoding_errors="strict"): ...
    def write_points(self, points: PackedPointRecord): ...

class LasAppender:
    def append_points(self, points: PackedPointRecord): ...
```

[I/O Handler Classes](./io-handlers.md)

### VLR Management

Variable Length Record handling for storing metadata, coordinate reference systems, and custom application data within LAS files.

```python { .api }
class VLR:
    def __init__(self, user_id, record_id, description="", record_data=b""): ...
    def record_data_bytes(self) -> bytes: ...

# VLR utilities from vlrs module
from laspy.vlrs import geotiff  # GeoTIFF VLR support
```

[VLR Management](./vlr.md)

### Compression Support

LAZ compression backend management with support for multiple compression libraries and selective field decompression for efficient processing.

```python { .api }
class LazBackend(Enum):
    LazrsParallel = 0
    Lazrs = 1
    Laszip = 2

    @classmethod
    def detect_available(cls): ...
    def is_available(self) -> bool: ...

class DecompressionSelection:
    @classmethod
    def all(cls) -> DecompressionSelection: ...
    @classmethod
    def base(cls) -> DecompressionSelection: ...
```

[Compression Support](./compression.md)

## Types

```python { .api }
class DimensionInfo:
    name: str
    kind: DimensionKind
    num_bits: int
    num_elements: int
    is_standard: bool
    description: str
    offsets: Optional[np.ndarray]
    scales: Optional[np.ndarray]
    no_data: Optional[np.ndarray]

class DimensionKind(Enum):
    SignedInteger = 0
    UnsignedInteger = 1
    FloatingPoint = 2
    BitField = 3

class ExtraBytesParams:
    def __init__(self, name: str, type, description="", offsets=None, scales=None, no_data=None): ...

class Version:
    major: int
    minor: int

class LaspyException(Exception): ...
```