# Apache Avro: Cross-Language Implementations

References and coordination information for the language-specific Avro implementations, helping developers find the appropriate Avro library for their target platform.
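Every implementation consumes the same JSON schema language, so one `.avsc` file serves all of the languages below. The snippets in later sections assume a schema string (`schemaJson`) or file (`schema.avsc`) is available; as a hypothetical illustration (the record and field names here are invented for this document), such a schema might look like:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "example.avro",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favorite_number", "type": ["null", "int"], "default": null}
  ]
}
```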
## Java

Primary language implementation providing comprehensive Avro functionality with the richest feature set and tooling ecosystem.
```xml
<!-- Maven coordinates for the Java implementation -->
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro</artifactId>
  <version>1.12.0</version>
</dependency>
```
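Build-time code generation is usually wired in through the `avro-maven-plugin` module listed among the key modules below. A hedged sketch of a typical configuration (the directory paths follow common convention and may need adjusting for your project):

```xml
<!-- Sketch: generate Java classes from .avsc files during the build -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.12.0</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/target/generated-sources/avro</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```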
```xml
<!-- Additional Java modules -->
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-compiler</artifactId>
  <version>1.12.0</version>
</dependency>
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-ipc</artifactId>
  <version>1.12.0</version>
</dependency>
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-tools</artifactId>
  <version>1.12.0</version>
</dependency>
```

**Location:** `lang/java`

**Key Modules:**

- `avro`: core serialization and schema libraries
- `avro-compiler`: schema compilation and code generation
- `avro-ipc`: RPC communication framework
- `avro-tools`: command-line utilities and tools
- `avro-mapred`: MapReduce integration
- `avro-maven-plugin`: Maven build integration

**Usage Examples:**
```java
// Basic serialization with the generic API
Schema schema = new Schema.Parser().parse(schemaJson);
GenericRecord record = new GenericData.Record(schema);
record.put("field", value);

// Code generation: builder API of a class generated by avro-compiler
SpecificRecord specificRecord = MyRecord.newBuilder()
    .setField(value)
    .build();
```

## Python

Pure Python implementation providing full Avro compatibility for Python applications and data science workflows.
```shell
# Installation (the official PyPI package is `avro`)
pip install avro==1.12.0
```

**Location:** `lang/py`

**Package Name:** `avro` (the legacy `avro-python3` package is deprecated in favor of `avro`)

**Usage Examples:**
```python
import io

import avro.io
import avro.schema

# Load schema
schema = avro.schema.parse(open("schema.avsc").read())

# Serialize data
writer = avro.io.DatumWriter(schema)
bytes_writer = io.BytesIO()
encoder = avro.io.BinaryEncoder(bytes_writer)
writer.write(record, encoder)
```

## JavaScript

Node.js and browser-compatible implementation for JavaScript/TypeScript applications and web development.
```shell
# npm installation
npm install avro-js@1.12.0

# Yarn installation
yarn add avro-js@1.12.0
```

**Location:** `lang/js`

**Package Name:** `avro-js`

**Usage Examples:**
```javascript
const avro = require('avro-js');

// Parse schema
const schema = avro.parse(schemaJson);

// Serialize/deserialize
const buffer = schema.toBuffer(record);
const deserialized = schema.fromBuffer(buffer);
```

## C++

High-performance C++ implementation for systems programming and performance-critical applications.
```cmake
# CMake configuration
find_package(Avro REQUIRED)
target_link_libraries(your_target Avro::avro)

# Compiler flags (C++11 or later)
set(CMAKE_CXX_STANDARD 11)
```

**Location:** `lang/c++`

**Build System:** CMake

**Usage Examples:**
```cpp
#include <memory>

#include <avro/Decoder.hh>
#include <avro/Encoder.hh>
#include <avro/Stream.hh>

// Serialize: encoders must be bound to a stream before use
std::unique_ptr<avro::OutputStream> out = avro::memoryOutputStream();
avro::EncoderPtr encoder = avro::binaryEncoder();
encoder->init(*out);
avro::encode(*encoder, record);

// Deserialize
std::unique_ptr<avro::InputStream> in = avro::memoryInputStream(*out);
avro::DecoderPtr decoder = avro::binaryDecoder();
decoder->init(*in);
avro::decode(*decoder, record);
```

## C#

.NET implementation for C# and other .NET languages, supporting .NET Framework and .NET Core.
```xml
<!-- NuGet package reference -->
<PackageReference Include="Apache.Avro" Version="1.12.0" />
```

**Location:** `lang/csharp`

**Package Name:** `Apache.Avro`

**Usage Examples:**
```csharp
using Avro;
using Avro.Generic;
using Avro.IO;

// Serialize
var writer = new BinaryEncoder(stream);
var datumWriter = new GenericDatumWriter<GenericRecord>(schema);
datumWriter.Write(record, writer);

// Deserialize
var reader = new BinaryDecoder(stream);
var datumReader = new GenericDatumReader<GenericRecord>(schema, schema);
var result = datumReader.Read(null, reader);
```

## Other Languages

Support for additional programming languages with varying levels of functionality and community maintenance.
```yaml
# Language implementation matrix
implementations:
  c:
    location: "lang/c"
    status: "maintained"
    buildSystem: "autotools"
    features: ["serialization", "deserialization"]
  perl:
    location: "lang/perl"
    status: "maintained"
    package: "Avro"
    features: ["serialization", "deserialization", "schema_validation"]
  php:
    location: "lang/php"
    status: "maintained"
    package: "avro-php"
    features: ["serialization", "deserialization"]
  ruby:
    location: "lang/ruby"
    status: "maintained"
    gem: "avro"
    features: ["serialization", "deserialization", "schema_resolution"]
  rust:
    location: "lang/rust"
    status: "active_development"
    crate: "apache-avro"
    features: ["serialization", "deserialization", "async_io", "compression"]
```

**Usage Examples:**
```c
/* C implementation */
#include <avro.h>

avro_schema_t schema;
avro_schema_from_json_literal(schema_json, &schema);

/* Generic values are created through a value interface, not the schema directly */
avro_value_iface_t *iface = avro_generic_class_from_schema(schema);
avro_value_t value;
avro_generic_value_new(iface, &value);
```

```perl
# Perl implementation
use Avro::Schema;
use Avro::BinaryEncoder;

my $schema = Avro::Schema->parse($schema_json);
my $encoded = '';
Avro::BinaryEncoder->encode(
    schema  => $schema,
    data    => $data,
    emit_cb => sub { $encoded .= ${ $_[0] } },
);
```

```php
<?php
// PHP implementation
require_once 'avro.php';

$schema  = Avro\Schema::parse($schema_json);
$io      = new Avro\IO\StringIO();
$encoder = new Avro\IO\BinaryEncoder($io);
$writer  = new Avro\DataFile\DataFileWriter($io, $encoder, $schema);
```

```ruby
# Ruby implementation
require 'avro'
require 'stringio'

schema  = Avro::Schema.parse(schema_json)
writer  = Avro::IO::DatumWriter.new(schema)
encoder = Avro::IO::BinaryEncoder.new(StringIO.new)
writer.write(datum, encoder)
```

```rust
// Rust implementation
use apache_avro::{Schema, Writer};

let schema = Schema::parse_str(schema_json)?;
let mut writer = Writer::new(&schema, Vec::new());
writer.append(record)?;
```

## Cross-Language Compatibility

Standards and testing procedures that ensure data compatibility across all language implementations.
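Interoperability rests on every implementation producing byte-identical output for the same datum. As a standalone illustration of the shared wire format (using no Avro library), here is a minimal pure-Python sketch of the zig-zag variable-length integer encoding that the Avro specification defines for `int` and `long` values:

```python
# Sketch of Avro's zig-zag + base-128 varint encoding for int/long,
# the rule every language implementation must follow identically.

def encode_long(n: int) -> bytes:
    # Zig-zag: map signed values to unsigned so small magnitudes stay small
    m = (n << 1) ^ (n >> 63)
    out = bytearray()
    # Emit 7 bits per byte, low-order group first; high bit = "more follows"
    while True:
        byte = m & 0x7F
        m >>= 7
        if m:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def decode_long(data: bytes) -> int:
    m = shift = 0
    for byte in data:
        m |= (byte & 0x7F) << shift
        if not byte & 0x80:
            break
        shift += 7
    # Undo zig-zag
    return (m >> 1) ^ -(m & 1)

# Small magnitudes, positive or negative, encode to single bytes
assert encode_long(0) == b"\x00"
assert encode_long(-1) == b"\x01"
assert encode_long(1) == b"\x02"
assert all(decode_long(encode_long(v)) == v for v in (-300, -1, 0, 7, 123456))
```

Because this mapping is fixed by the specification, a value written by any implementation above decodes identically in every other one; the test matrix below exercises exactly that property.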
```yaml
# Compatibility testing matrix
compatibility_tests:
  interop_data:
    location: "share/test/data"
    formats: ["json", "binary"]
    schemas: ["primitive", "complex", "recursive"]
  round_trip_tests:
    serialization: "language_a -> binary -> language_b"
    schema_evolution: "schema_v1 -> data -> schema_v2"
    rpc_communication: "client_lang -> server_lang"
  validation_suite:
    schema_parsing: true
    data_validation: true
    protocol_compliance: true
    performance_benchmarks: true
```

**Usage Examples:**
```shell
# Generate interop test data, then verify it across language implementations
./build.sh interop-data-generate
./build.sh interop-data-test

# Check that a schema parses by compiling it with avro-tools
java -jar avro-tools-1.12.0.jar compile schema schema.avsc generated/

# Reader/writer schema compatibility checks are typically performed with
# external tooling (e.g. a schema registry) rather than avro-tools itself
```

## Implementation Selection

Guidelines for choosing the appropriate language implementation based on project requirements and constraints.
```yaml
# Implementation selection criteria
selection_guide:
  performance_critical:
    recommended: ["java", "c++", "rust"]
    reason: "Optimized native performance"
  web_development:
    recommended: ["javascript", "java", "csharp"]
    reason: "Rich ecosystem and tooling"
  data_science:
    recommended: ["python", "java", "r"]
    reason: "Integration with analytics frameworks"
  systems_programming:
    recommended: ["c", "c++", "rust"]
    reason: "Low-level control and efficiency"
  enterprise_applications:
    recommended: ["java", "csharp", "python"]
    reason: "Enterprise framework integration"
  feature_completeness:
    tier1: ["java"]                          # Full feature set
    tier2: ["python", "c++"]                 # Most features
    tier3: ["javascript", "csharp", "rust"]  # Core features
    tier4: ["c", "perl", "php", "ruby"]      # Basic features
```

## Install with Tessl CLI
```shell
npx tessl i tessl/maven-org-apache-avro--avro-toplevel
```