Modern high-performance serialization utilities for Python
```bash
npx @tessl/cli install tessl/pypi-srsly@1.0.0
```

srsly bundles JSON, MessagePack, and pickle serialization into a single package with a unified API. It provides cross-platform compatibility, optimized performance through C extensions, and comprehensive serialization support with zero external dependencies.
```bash
pip install srsly
```

```python
import srsly
```

All functions are available directly from the main module:
```python
from srsly import json_dumps, json_loads, msgpack_dumps, msgpack_loads, pickle_dumps, pickle_loads
```

For advanced usage, access the underlying modules directly:
```python
import srsly.ujson        # High-performance JSON
import srsly.msgpack      # MessagePack with numpy support
import srsly.cloudpickle  # Enhanced pickle
```
```python
import srsly

# JSON serialization
data = {"name": "John", "age": 30, "scores": [85, 92, 78]}
json_string = srsly.json_dumps(data, indent=2)
parsed_data = srsly.json_loads(json_string)
# File operations
srsly.write_json("data.json", data)
loaded_data = srsly.read_json("data.json")
# JSONL (newline-delimited JSON)
lines = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
srsly.write_jsonl("data.jsonl", lines)
for item in srsly.read_jsonl("data.jsonl"):
    print(item)
# MessagePack serialization
msgpack_bytes = srsly.msgpack_dumps(data)
restored_data = srsly.msgpack_loads(msgpack_bytes)
# Pickle serialization
pickle_bytes = srsly.pickle_dumps(data)
unpickled_data = srsly.pickle_loads(pickle_bytes)
```

srsly integrates optimized forks of leading Python serialization libraries: `ujson` for JSON, `msgpack` for MessagePack, and `cloudpickle` for pickle.
The library handles cross-platform serialization issues (encodings, locales, large files) and provides utilities for standard input/output, gzip compression, and validation.
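For example, a minimal sketch of the gzip, validation, and stdin/stdout helpers working together. The file name `records.json.gz` is arbitrary, and the behavior of `is_json_serializable` on non-serializable values is assumed from its name:

```python
import srsly

records = [{"id": 1, "label": "spam"}, {"id": 2, "label": "ham"}]

# Validation: check whether an object can be round-tripped as JSON
assert srsly.is_json_serializable(records)
assert not srsly.is_json_serializable({"callback": len})  # functions are not JSON types

# Gzip-compressed JSON: same data, written to and read from a .json.gz file
srsly.write_gzip_json("records.json.gz", records)
assert srsly.read_gzip_json("records.json.gz") == records

# "-" as the location writes newline-delimited JSON to stdout instead of a file
srsly.write_jsonl("-", records)
```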
High-performance JSON operations with file I/O support, gzip compression, JSONL format handling, and cross-platform compatibility. Includes validation utilities and support for standard input/output streams.
```python
def json_dumps(data, indent=0, sort_keys=False): ...
def json_loads(data): ...
def read_json(location): ...
def write_json(location, data, indent=2): ...
def read_gzip_json(location): ...
def write_gzip_json(location, data, indent=2): ...
def read_jsonl(location, skip=False): ...
def write_jsonl(location, lines, append=False, append_new_line=True): ...
def is_json_serializable(obj): ...
```
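A hedged sketch of the less common options above. The semantics of `skip` (ignore lines that fail to parse) and `append` (add to an existing file) are inferred from the parameter names rather than verified:

```python
import srsly

entries = [{"id": 1}, {"id": 2}]

# Write a JSONL file, then append another record to it
srsly.write_jsonl("log.jsonl", entries)
srsly.write_jsonl("log.jsonl", [{"id": 3}], append=True)

# skip=True is assumed to silently ignore unparseable lines
for entry in srsly.read_jsonl("log.jsonl", skip=True):
    print(entry)

# Compact vs. pretty-printed, key-sorted output
compact = srsly.json_dumps(entries)
pretty = srsly.json_dumps(entries, indent=2, sort_keys=True)
```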
Binary serialization with numpy array support, optimized for performance and cross-language compatibility. Provides both high-level API functions and access to the underlying packer/unpacker classes.

```python
def msgpack_dumps(data): ...
def msgpack_loads(data, use_list=True): ...
def read_msgpack(location, use_list=True): ...
def write_msgpack(location, data): ...
```
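A sketch of the MessagePack round trip. The numpy part assumes numpy is installed and that the bundled msgpack encodes arrays transparently, as the description above suggests; `payload.msgpack` is an arbitrary file name:

```python
import numpy as np
import srsly

payload = {"label": "demo", "weights": np.zeros((2, 3), dtype="float32")}

# Round-trip through bytes; the array should come back as an ndarray
packed = srsly.msgpack_dumps(payload)
restored = srsly.msgpack_loads(packed)
assert restored["weights"].shape == (2, 3)

# File-based helpers
srsly.write_msgpack("payload.msgpack", payload)
from_disk = srsly.read_msgpack("payload.msgpack")

# use_list=False decodes MessagePack arrays as tuples instead of lists
decoded = srsly.msgpack_loads(srsly.msgpack_dumps({"nums": [1, 2, 3]}), use_list=False)
assert decoded["nums"] == (1, 2, 3)
```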
Enhanced pickle operations using cloudpickle for cloud-computing compatibility and serialization of complex objects, including functions, lambdas, and classes.

```python
def pickle_dumps(data, protocol=None): ...
def pickle_loads(data): ...
```
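A short sketch of why the cloudpickle backend matters: the standard pickle module cannot serialize lambdas or closures, which the description above says these helpers handle:

```python
import srsly

def make_scaler(factor):
    # A closure over `factor`; plain pickle would reject this lambda
    return lambda x: x * factor

double = make_scaler(2)

blob = srsly.pickle_dumps(double)      # protocol=None uses the default protocol
restored = srsly.pickle_loads(blob)
assert restored(21) == 42
```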
Low-level, high-performance JSON serialization with C extensions, providing the fastest possible JSON encoding and decoding. Used internally by the main JSON API functions.

```python
def dumps(obj, indent=0, ensure_ascii=True, double_precision=9, encode_html_chars=False, escape_forward_slashes=True): ...
def loads(data, precise_float=False): ...
def dump(obj, fp, indent=0, ensure_ascii=True, double_precision=9, encode_html_chars=False, escape_forward_slashes=True): ...
def load(fp, precise_float=False): ...
def encode(obj, indent=0, ensure_ascii=True, double_precision=9, encode_html_chars=False, escape_forward_slashes=True): ...
def decode(data, precise_float=False): ...
```
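A sketch of calling the low-level module directly, using only the parameters listed above; the behavior described in the comments is what the parameter names imply rather than a verified guarantee:

```python
import srsly.ujson as ujson

obj = {"url": "https://example.com/a/b", "pi": 3.14159265358979}

# double_precision limits the number of decimal digits kept for floats
compact = ujson.dumps(obj, double_precision=3)

# escape_forward_slashes=False keeps URLs readable ("/" rather than "\/")
readable = ujson.dumps(obj, escape_forward_slashes=False)

# encode/decode mirror dumps/loads in ujson-style APIs
parsed = ujson.loads(readable)
assert parsed["url"] == "https://example.com/a/b"
```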
```python
import pathlib
from typing import Literal

# Path-like objects are supported throughout the API
PathLike = str | pathlib.Path
# Special location values; "-" represents stdin/stdout
Location = PathLike | Literal["-"]
# JSON-serializable data types
JSONSerializable = dict | list | str | int | float | bool | None
```
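A small illustration of the location types: `str` and `pathlib.Path` should be interchangeable anywhere a location is accepted, per the `PathLike` alias above; `settings.json` is just an example file name:

```python
import pathlib
import srsly

data = {"ok": True}

# A plain string and a pathlib.Path refer to the same location
srsly.write_json("settings.json", data)
assert srsly.read_json(pathlib.Path("settings.json")) == data
```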