tessl/pypi-laspy

Native Python ASPRS LAS read/write library for processing LiDAR point cloud data

Describes: pkg:pypi/laspy@2.6.x

To install, run

npx @tessl/cli install tessl/pypi-laspy@2.6.0


Laspy

Laspy is a comprehensive Python library for reading, modifying, and creating LAS files, the standard format for LiDAR (Light Detection and Ranging) point cloud data. It provides native Python support for both uncompressed LAS and compressed LAZ formats, and enables efficient processing of laser-scanning point clouds through streaming, chunked reads, and the Cloud Optimized Point Cloud (COPC) layout.

Package Information

  • Package Name: laspy
  • Language: Python
  • Installation: pip install laspy (basic LAS support)
  • Installation with LAZ: pip install laspy[lazrs] or pip install laspy[laszip]

Core Imports

import laspy

For specific functionality:

from laspy import LasData, LasHeader, LasReader, LasWriter
from laspy import CopcReader, Bounds  # COPC support
from laspy import PointFormat, ExtraBytesParams  # Point format handling
from laspy import VLR  # Variable Length Records

Basic Usage

Simple Read and Write

import laspy

# Read entire LAS file into memory
las = laspy.read('input.las')

# Filter points (keep only ground points) 
las.points = las.points[las.classification == 2]

# Write to compressed LAZ format
las.write('ground.laz')

Header Inspection

import laspy

# Open file for header inspection (doesn't load points)
with laspy.open('input.las') as f:
    print(f"Point format: {f.header.point_format}")
    print(f"Number of points: {f.header.point_count}")
    print(f"Bounds: {f.header.mins} to {f.header.maxs}")

Chunked Processing

import laspy

# Process large files in chunks to save memory
with laspy.open('large.laz') as input_las:
    with laspy.open('filtered.laz', mode="w", header=input_las.header) as output_las:
        for points in input_las.chunk_iterator(2_000_000):
            # Filter and write ground points
            ground_points = points[points.classification == 2]
            output_las.write_points(ground_points)

Architecture

Laspy follows a hierarchical design built around a few key components (a short usage sketch follows the list):

  • LasData: Main container synchronizing header, points, and VLRs
  • LasHeader: File metadata, coordinate reference systems, and format specifications
  • Point Records: Raw (PackedPointRecord) and scaled (ScaleAwarePointRecord) point data
  • I/O Handlers: Readers, writers, and appenders for different access patterns
  • Point Formats: Flexible point format definitions supporting standard and custom dimensions
  • Compression: Multiple LAZ backends (lazrs, laszip) with selective decompression
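
To make the relationships concrete, here is a minimal sketch of how these pieces surface when reading a file (the file name is a placeholder):

import laspy

# LasData ties together the header, the point record, and the VLRs
las = laspy.read('input.las')

header = las.header                                  # LasHeader: metadata and format
print(header.version, header.point_format.id)

points = las.points                                  # point record backing the data
print(len(points), list(points.point_format.dimension_names))

for vlr in las.vlrs:                                 # attached Variable Length Records
    print(vlr.user_id, vlr.record_id)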

Capabilities

Core I/O Operations

High-level functions for reading, writing, creating, and memory-mapping LAS files. Supports streaming, chunked processing, and multiple compression backends.

def read(source, closefd=True, laz_backend=None, decompression_selection=None, encoding_errors="strict") -> LasData: ...
def open(source, mode="r", closefd=True, laz_backend=None, header=None, do_compress=None, encoding_errors="strict", read_evlrs=True, decompression_selection=None): ...
# Returns LasReader (mode="r"), LasWriter (mode="w"), or LasAppender (mode="a")
def create(*, point_format=None, file_version=None) -> LasData: ...
def mmap(filename) -> LasMMAP: ...  # Memory-mapped LAS data (extends LasData)
def convert(source_las, *, point_format_id=None, file_version=None) -> LasData: ...
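
As a quick illustration, create builds an empty in-memory LasData, while convert re-encodes an existing file to another point format (file names here are placeholders):

import laspy

# Start a new in-memory file with a chosen point format and version
las = laspy.create(point_format=6, file_version="1.4")

# Re-encode an existing file to a different point format
converted = laspy.convert(laspy.read('input.las'), point_format_id=6)
converted.write('converted.las')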

Point Data Handling

Comprehensive point format management including standard LAS point formats, custom extra dimensions, and efficient point record processing with coordinate scaling.

class PointFormat:
    def __init__(self, point_format_id: int): ...
    def add_extra_dimension(self, params: ExtraBytesParams): ...
    def dimension_by_name(self, name: str) -> DimensionInfo: ...

class PackedPointRecord:
    @staticmethod  
    def zeros(point_count, point_format) -> PackedPointRecord: ...
    def __getitem__(self, key): ...
    def __setitem__(self, key, value): ...

class ScaleAwarePointRecord(PackedPointRecord):
    @staticmethod
    def zeros(point_count, *, point_format=None, scales=None, offsets=None, header=None) -> ScaleAwarePointRecord: ...
    def change_scaling(self, scales=None, offsets=None): ...
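
For instance, an extra dimension declared with ExtraBytesParams behaves like any standard field afterwards; a sketch with a made-up dimension name:

import laspy
import numpy as np

las = laspy.read('input.las')

# Register a custom per-point dimension, then fill it like a built-in field
las.add_extra_dim(laspy.ExtraBytesParams(
    name="confidence",                      # hypothetical dimension name
    type=np.float32,
    description="Per-point score",
))
las.confidence = np.zeros(len(las.points), dtype=np.float32)
las.write('with_extra_dim.las')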

COPC Operations

Cloud Optimized Point Cloud (COPC) support for efficient spatial and level-of-detail queries over large LiDAR datasets, including streaming over HTTP.

class CopcReader:
    @classmethod
    def open(cls, source, http_num_threads=None, decompression_selection=None) -> CopcReader: ...
    def query(self, bounds=None, resolution=None, level=None) -> ScaleAwarePointRecord: ...
    def spatial_query(self, bounds: Bounds) -> ScaleAwarePointRecord: ...
    def level_query(self, level) -> ScaleAwarePointRecord: ...

class Bounds:
    def __init__(self, mins, maxs): ...
    def overlaps(self, other: Bounds) -> bool: ...
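
A sketch of a bounded query against a COPC file; the file name and coordinates are placeholders, and resolution limits how deep into the octree the query descends:

import numpy as np
from laspy import Bounds, CopcReader

with CopcReader.open('lidar.copc.laz') as reader:
    # 2D bounds leave z unbounded; coordinates are in the file's CRS
    area = Bounds(mins=np.array([635_000.0, 848_000.0]),
                  maxs=np.array([637_000.0, 850_000.0]))
    points = reader.query(bounds=area, resolution=1.0)
    print(f"{len(points)} points in the requested area")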

Data Container Classes

Primary data container classes for managing LAS file components including headers, point data, and metadata integration.

class LasData:
    def __init__(self, header: LasHeader, points=None): ...
    def add_extra_dim(self, params: ExtraBytesParams): ...
    def write(self, destination, do_compress=None, laz_backend=None): ...
    def change_scaling(self, scales=None, offsets=None): ...

class LasHeader:  
    def __init__(self, *, version=None, point_format=None): ...
    def add_extra_dims(self, params: List[ExtraBytesParams]): ...
    def add_crs(self, crs, keep_compatibility=True): ...
    def update(self, points: PackedPointRecord): ...
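
A sketch of building a file from scratch with these containers, mirroring the pattern from laspy's documentation (values are made up):

import laspy
import numpy as np

# The header fixes version, point format, and coordinate scaling
header = laspy.LasHeader(version="1.4", point_format=6)
header.scales = np.array([0.01, 0.01, 0.01])
header.offsets = np.array([0.0, 0.0, 0.0])

# LasData keeps the header and point record consistent
las = laspy.LasData(header)
las.x = np.random.uniform(0.0, 100.0, 1_000)
las.y = np.random.uniform(0.0, 100.0, 1_000)
las.z = np.random.uniform(0.0, 10.0, 1_000)
las.write('generated.las')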

I/O Handler Classes

Specialized reader, writer, and appender classes for different file access patterns including streaming, chunked processing, and memory mapping.

class LasReader:
    def __init__(self, source, closefd=True, laz_backend=None, read_evlrs=True, decompression_selection=None): ...
    def read_points(self, n: int) -> ScaleAwarePointRecord: ...
    def chunk_iterator(self, points_per_iteration: int): ...

class LasWriter:
    def __init__(self, dest, header: LasHeader, do_compress=None, laz_backend=None, closefd=True, encoding_errors="strict"): ...
    def write_points(self, points: PackedPointRecord): ...

class LasAppender:
    def append_points(self, points: PackedPointRecord): ...
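
For example, opening with mode="a" yields a LasAppender that adds points to an existing file in place; the point formats of source and destination must match (file names are placeholders):

import laspy

# Points to add; their point format must match the destination file's
new_points = laspy.read('extra_points.las').points

with laspy.open('existing.las', mode="a") as appender:
    appender.append_points(new_points)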

VLR Management

Variable Length Record handling for storing metadata, coordinate reference systems, and custom application data within LAS files.

class VLR:
    def __init__(self, user_id, record_id, description="", record_data=b""): ...
    def record_data_bytes(self) -> bytes: ...

# VLR utilities from vlrs module
from laspy.vlrs import geotiff  # GeoTIFF VLR support
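
A sketch of attaching a custom VLR; the user_id, record_id, and payload are invented for illustration:

import laspy
from laspy import VLR

las = laspy.read('input.las')

# Carry arbitrary application bytes inside the file's metadata
vlr = VLR(user_id="MyApp", record_id=42,
          description="Custom metadata (hypothetical)",
          record_data=b"\x01\x02\x03")
las.vlrs.append(vlr)
las.write('with_vlr.las')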

Compression Support

LAZ compression backend management with support for multiple compression libraries and selective field decompression for efficient processing.

class LazBackend(Enum):
    LazrsParallel = 0
    Lazrs = 1  
    Laszip = 2
    
    @classmethod
    def detect_available(cls): ...
    def is_available(self) -> bool: ...

class DecompressionSelection:
    @classmethod 
    def all(cls) -> DecompressionSelection: ...
    @classmethod
    def base(cls) -> DecompressionSelection: ...
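
A sketch of choosing a backend explicitly and decompressing only the base point fields; selective decompression assumes the lazrs backend is installed:

import laspy
from laspy import DecompressionSelection, LazBackend

# List the LAZ backends usable in this environment
print(list(LazBackend.detect_available()))

# Pin a backend and skip decompressing optional point fields
las = laspy.read('input.laz',
                 laz_backend=LazBackend.Lazrs,
                 decompression_selection=DecompressionSelection.base())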

Types

class DimensionInfo:
    name: str
    kind: DimensionKind  
    num_bits: int
    num_elements: int
    is_standard: bool
    description: str
    offsets: Optional[np.ndarray]
    scales: Optional[np.ndarray]
    no_data: Optional[np.ndarray]

class DimensionKind(Enum):
    SignedInteger = 0
    UnsignedInteger = 1
    FloatingPoint = 2
    BitField = 3

class ExtraBytesParams:
    def __init__(self, name: str, type, description="", offsets=None, scales=None, no_data=None): ...

class Version:
    major: int
    minor: int

class LaspyException(Exception): ...