
tessl/pypi-neo

A comprehensive Python library for representing electrophysiology data with support for reading and writing a wide range of neurophysiology file formats.

docs/data-utilities.md

Data Manipulation Utilities

Utility functions for extracting, filtering, and manipulating electrophysiology data objects within Neo's hierarchical structure. These tools provide high-level operations for common analysis workflows, data organization, and dataset management.

Capabilities

Event and Epoch Extraction

Functions for finding and extracting specific events and epochs based on properties and criteria.

def get_events(container, **properties):
    """
    Extract events matching specified criteria from Neo containers.
    
    Parameters:
    - container: Neo container (Block, Segment, or Group)
    - **properties: Event properties to match (name, labels, times, etc.)
    
    Returns:
    list: Event objects matching the specified criteria
    
    Examples:
    - get_events(segment, name='stimulus')
    - get_events(block, labels=['start', 'stop'])
    - get_events(segment, times=GreaterThan(1.0*pq.s))
    """

def get_epochs(container, **properties):
    """
    Extract epochs matching specified criteria from Neo containers.
    
    Parameters:
    - container: Neo container (Block, Segment, or Group)  
    - **properties: Epoch properties to match (name, labels, times, durations, etc.)
    
    Returns:
    list: Epoch objects matching the specified criteria
    
    Examples:
    - get_epochs(segment, name='stimulus_period')
    - get_epochs(block, labels=['baseline', 'response'])
    - get_epochs(segment, durations=GreaterThan(0.5*pq.s))
    """

def add_epoch(container, epoch):
    """
    Add epoch annotations to Neo containers.
    
    Parameters:
    - container: Neo container (Block, Segment, or Group)
    - epoch: Epoch object to add
    
    The epoch is added to all appropriate containers in the hierarchy.
    """

def match_events(container1, container2, **matching_criteria):
    """
    Match events across different Neo containers based on specified criteria.
    
    Parameters:
    - container1, container2: Neo containers to compare
    - **matching_criteria: Criteria for matching events
    
    Returns:
    list: Pairs of matched events from the two containers
    
    Useful for aligning events across different recording sessions
    or comparing events before and after processing.
    """

Data Segmentation and Slicing

Functions for temporal segmentation and extraction of data based on epochs and time windows.

def cut_block_by_epochs(block, epochs):
    """
    Create new Block containing data segmented by epoch boundaries.
    
    Parameters:
    - block: Neo Block object containing data to segment
    - epochs: Epoch objects or list of epochs defining boundaries
    
    Returns:
    Block: New Block with segments corresponding to each epoch
    
    Each epoch becomes a new Segment containing all data objects
    that fall within the epoch's time boundaries.
    """

def cut_segment_by_epoch(segment, epoch):
    """
    Create new Segment containing data within specified epoch boundaries.
    
    Parameters:
    - segment: Neo Segment object to slice
    - epoch: Single Epoch object defining time boundaries
    
    Returns:
    Segment: New Segment containing only data within epoch timespan
    
    All data objects (AnalogSignal, SpikeTrain, Event, etc.) are
    temporally sliced to fit within the epoch boundaries.
    """

Data Compatibility and Validation

Functions for checking data compatibility and validation across different Neo operations.

def is_block_rawio_compatible(block):
    """
    Check if Block is compatible with RawIO interface requirements.
    
    Parameters:
    - block: Neo Block object to validate
    
    Returns:
    bool: True if Block structure is compatible with RawIO operations
    
    Validates that the Block structure, sampling rates, channel
    organizations, and data types are consistent with RawIO
    expectations for high-performance file access.
    """

Dataset Management

Functions for downloading and managing public electrophysiology datasets for testing and examples.

def download_dataset(name, data_home=None):
    """
    Download public electrophysiology datasets for testing and examples.
    
    Parameters:
    - name (str): Name of dataset to download
    - data_home (str, optional): Directory to store datasets
    
    Returns:
    str: Path to downloaded dataset directory
    
    Available datasets include example recordings from various
    hardware systems and file formats for testing I/O capabilities
    and exploring Neo functionality.
    """

def get_local_testing_data_folder():
    """
    Get path to local testing data folder containing example files.
    
    Returns:
    str: Path to directory containing test data files
    
    Returns the location where Neo stores small example files
    for testing different I/O classes and data structures.
    """

Usage Examples

Event and Epoch Operations

import neo
from neo.utils import get_events, get_epochs, cut_segment_by_epoch
from neo.core.filters import Equals, GreaterThan
import quantities as pq

# Load data
io = neo.io.get_io('experiment.abf')
block = io.read_block()
segment = block.segments[0]

# Extract specific events
stimulus_events = get_events(segment, name='stimulus_onset')
response_events = get_events(segment, labels=['button_press', 'lever_pull'])

# Extract events by timing criteria  
late_events = get_events(segment, times=GreaterThan(10.0*pq.s))

# Extract epochs
baseline_epochs = get_epochs(segment, name='baseline')
long_epochs = get_epochs(segment, durations=GreaterThan(2.0*pq.s))

# Segment data by epochs
if baseline_epochs:
    baseline_segment = cut_segment_by_epoch(segment, baseline_epochs[0])
    print(f"Baseline segment duration: {baseline_segment.duration}")
    print(f"Number of spike trains: {len(baseline_segment.spiketrains)}")
    print(f"Number of analog signals: {len(baseline_segment.analogsignals)}")

Data Segmentation Workflow

from neo.utils import cut_block_by_epochs, add_epoch
import neo
import numpy as np
import quantities as pq

# Create trial epochs for segmentation
trial_starts = np.array([0, 30, 60, 90]) * pq.s
trial_duration = 25 * pq.s

trial_epochs = []
for i, start in enumerate(trial_starts):
    epoch = neo.Epoch(
        times=[start],
        durations=[trial_duration], 
        labels=[f'trial_{i+1}']
    )
    trial_epochs.append(epoch)
    add_epoch(segment, epoch)

# Segment entire block by trials
segmented_block = cut_block_by_epochs(block, trial_epochs)

print(f"Original block: {len(block.segments)} segments")
print(f"Segmented block: {len(segmented_block.segments)} segments")

# Process each trial segment
for i, trial_segment in enumerate(segmented_block.segments):
    print(f"Trial {i+1}:")
    print(f"  Duration: {trial_segment.duration}")
    print(f"  Spike trains: {len(trial_segment.spiketrains)}")
    print(f"  Events: {len(trial_segment.events)}")

Event Matching Across Sessions

from neo.utils import match_events
from neo.core.filters import Equals
import quantities as pq

# Load pre and post-treatment sessions
pre_io = neo.io.get_io('pre_treatment.abf')
post_io = neo.io.get_io('post_treatment.abf')

pre_block = pre_io.read_block()
post_block = post_io.read_block()

# Match stimulus events across sessions
matched_stimuli = match_events(
    pre_block.segments[0], 
    post_block.segments[0],
    name=Equals('stimulus'),
    tolerance=0.1*pq.s  # Allow 100ms timing difference
)

print(f"Found {len(matched_stimuli)} matched stimulus events")

# Compare responses to matched stimuli
for pre_event, post_event in matched_stimuli:
    pre_time = pre_event.times[0]
    post_time = post_event.times[0]
    time_diff = post_time - pre_time  
    print(f"Stimulus at {pre_time}: time shift = {time_diff}")

Data Validation and Compatibility

from neo.utils import is_block_rawio_compatible

# Check if data is suitable for high-performance access
if is_block_rawio_compatible(block):
    print("Block is compatible with RawIO - can use high-performance access")
    
    # Use RawIO for efficient data access
    import neo.rawio
    rawio = neo.rawio.get_rawio('data_file.ns5')
    rawio.parse_header()
    
    # Access signal chunks efficiently
    chunk = rawio.get_analogsignal_chunk(
        block_index=0,
        seg_index=0, 
        i_start=1000,
        i_stop=2000,
        channel_indexes=[0, 1, 2]
    )
else:
    print("Block requires standard I/O access")
    # Use regular Neo I/O methods

Dataset Management

import neo
from neo.utils import download_dataset, get_local_testing_data_folder

# Download example datasets
dataset_path = download_dataset('example_abf_files')
print(f"Downloaded dataset to: {dataset_path}")

# Get local test data location
test_data_folder = get_local_testing_data_folder()
print(f"Test data available at: {test_data_folder}")

# Use test data for exploring formats
import os
test_files = os.listdir(test_data_folder)
abf_files = [f for f in test_files if f.endswith('.abf')]

if abf_files:
    test_file = os.path.join(test_data_folder, abf_files[0])
    io = neo.io.AxonIO(test_file)
    block = io.read_block()
    print(f"Test file loaded: {len(block.segments)} segments")

Advanced Filtering and Processing

from neo.utils import get_events, get_epochs, cut_segment_by_epoch
from neo.core.filters import Equals, InRange, GreaterThan
import quantities as pq

# Complex event filtering
movement_events = get_events(
    segment,
    name=Equals('movement'),
    times=InRange(5.0*pq.s, 15.0*pq.s),  # Between 5-15 seconds
    annotations={'movement_type': 'reach'}  # Custom annotation matching
)

# Multi-criteria epoch extraction
response_periods = get_epochs(
    segment,
    labels=['response_start', 'response_end'],
    durations=GreaterThan(0.5*pq.s),
    annotations={'condition': 'experimental'}
)

# Batch processing across multiple segments: re-run the epoch query on each
# segment rather than testing membership with `in` (Neo objects compare
# element-wise, so `epoch in segment.epochs` can raise an ambiguity error)
all_segments = []
for original_segment in block.segments:
    for epoch in get_epochs(original_segment,
                            durations=GreaterThan(0.5*pq.s),
                            annotations={'condition': 'experimental'}):
        processed_segment = cut_segment_by_epoch(original_segment, epoch)
        processed_segment.name = f"{original_segment.name}_{epoch.labels[0]}"
        all_segments.append(processed_segment)

print(f"Created {len(all_segments)} processed segments")

Data Organization and Cleanup

# Reorganize data by extracting specific time windows
def extract_peri_event_data(segment, events, window=(-1.0, 2.0)):
    """Extract data windows around events."""
    peri_event_segments = []
    
    for i, event in enumerate(events):
        event_time = event.times[0]  # assumes each Event carries a single timestamp
        start_time = event_time + window[0]*pq.s
        end_time = event_time + window[1]*pq.s
        
        # Create epoch for this time window
        peri_epoch = neo.Epoch(
            times=[start_time],
            durations=[end_time - start_time],
            labels=[f'peri_event_{i}']
        )
        
        # Extract data for this window
        peri_segment = cut_segment_by_epoch(segment, peri_epoch)
        peri_segment.name = f"event_{i}_window"
        peri_event_segments.append(peri_segment)
    
    return peri_event_segments

# Extract peri-stimulus data
stimulus_events = get_events(segment, name='stimulus')
peri_stim_segments = extract_peri_event_data(
    segment, 
    stimulus_events, 
    window=(-0.5, 1.5)  # 500ms before to 1.5s after stimulus
)

print(f"Extracted {len(peri_stim_segments)} peri-stimulus segments")

Types

# Container types for utilities
Container = Block | Segment | Group     # Any Neo container type
ContainerList = list[Container]         # List of containers

# Event and epoch types
EventList = list[Event]                 # List of Event objects  
EpochList = list[Epoch]                 # List of Epoch objects
EventPair = tuple[Event, Event]         # Matched event pair
MatchedEvents = list[EventPair]         # List of matched event pairs

# Property matching types
PropertyDict = dict[str, Any]           # Property name to value mapping
FilterCriteria = dict[str, FilterCondition]  # Filtering criteria
MatchingCriteria = dict[str, Any]       # Event matching criteria

# Time and boundary types
TimeWindow = tuple[pq.Quantity, pq.Quantity]  # Start and end times
TimeBoundaries = list[TimeWindow]             # Multiple time windows
EpochBoundaries = list[Epoch]                 # Epoch-based boundaries

# Dataset management types
DatasetName = str                       # Name of available dataset
DatasetPath = str                       # Path to dataset directory
TestDataPath = str                      # Path to test data folder

# Validation types
CompatibilityResult = bool              # Compatibility check result
ValidationError = Exception             # Validation error type

Install with Tessl CLI

npx tessl i tessl/pypi-neo
