# Language Implementations

Reference and coordination information for the Avro implementation in each supported programming language, helping developers choose the right library for their target platform.

## Capabilities

### Java Implementation

The primary Avro implementation, providing comprehensive functionality with the richest feature set and tooling ecosystem.

```xml { .api }
<!-- Maven coordinates for the Java implementation -->
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro</artifactId>
  <version>1.12.0</version>
</dependency>

<!-- Additional Java modules -->
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-compiler</artifactId>
  <version>1.12.0</version>
</dependency>

<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-ipc</artifactId>
  <version>1.12.0</version>
</dependency>

<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-tools</artifactId>
  <version>1.12.0</version>
</dependency>
```

**Location**: `lang/java`

**Key Modules**:
- `avro`: Core serialization and schema libraries
- `avro-compiler`: Schema compilation and code generation
- `avro-ipc`: RPC communication framework
- `avro-tools`: Command-line utilities and tools
- `avro-mapred`: MapReduce integration
- `avro-maven-plugin`: Maven build integration

**Usage Examples:**

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

// Basic serialization with the generic API
Schema schema = new Schema.Parser().parse(schemaJson);
GenericRecord record = new GenericData.Record(schema);
record.put("field", value);

// Generated-code (specific) API; MyRecord is produced by avro-compiler
MyRecord specificRecord = MyRecord.newBuilder()
    .setField(value)
    .build();
```

### Python Implementation

Pure Python implementation providing full Avro compatibility for Python applications and data science workflows.

```bash { .api }
# Installation (the PyPI distribution is named "avro";
# the older "avro-python3" package is deprecated)
pip install avro==1.12.0
```

**Location**: `lang/py`

**Package Name**: `avro` (PyPI)

**Usage Examples:**

```python
import io

import avro.io
import avro.schema

# Load schema
schema = avro.schema.parse(open("schema.avsc").read())

# Serialize data (a dict matching the schema)
record = {"field": "value"}
writer = avro.io.DatumWriter(schema)
bytes_writer = io.BytesIO()
encoder = avro.io.BinaryEncoder(bytes_writer)
writer.write(record, encoder)

# Deserialize
reader = avro.io.DatumReader(schema)
decoder = avro.io.BinaryDecoder(io.BytesIO(bytes_writer.getvalue()))
result = reader.read(decoder)
```

### JavaScript Implementation

Node.js and browser-compatible implementation for JavaScript/TypeScript applications and web development.

```bash { .api }
# npm installation
npm install avro-js@1.12.0

# Yarn installation
yarn add avro-js@1.12.0
```

**Location**: `lang/js`

**Package Name**: `avro-js`

**Usage Examples:**

```javascript
const avro = require('avro-js');

// Parse the schema into a type
const type = avro.parse(schemaJson);

// Serialize/deserialize
const buffer = type.toBuffer(record);
const deserialized = type.fromBuffer(buffer);
```

### C++ Implementation

High-performance C++ implementation for systems programming and performance-critical applications.

```cmake { .api }
# CMake configuration
find_package(Avro REQUIRED)
target_link_libraries(your_target Avro::avro)

# Compiler flags
set(CMAKE_CXX_STANDARD 11)
```

**Location**: `lang/c++`

**Build System**: CMake

**Usage Examples:**

```cpp
#include <avro/Decoder.hh>
#include <avro/Encoder.hh>
#include <avro/Specific.hh>
#include <avro/Stream.hh>

// Serialize: encoders write to an OutputStream and must be initialized
std::unique_ptr<avro::OutputStream> out = avro::memoryOutputStream();
avro::EncoderPtr encoder = avro::binaryEncoder();
encoder->init(*out);
avro::encode(*encoder, record);

// Deserialize
std::unique_ptr<avro::InputStream> in = avro::memoryInputStream(*out);
avro::DecoderPtr decoder = avro::binaryDecoder();
decoder->init(*in);
avro::decode(*decoder, record);
```

### C# (.NET) Implementation

.NET implementation for C# and other .NET languages, supporting .NET Framework and .NET Core.

```xml { .api }
<!-- NuGet package reference -->
<PackageReference Include="Apache.Avro" Version="1.12.0" />
```

**Location**: `lang/csharp`

**Package Name**: `Apache.Avro`

**Usage Examples:**

```csharp
using Avro;
using Avro.Generic;
using Avro.IO;

// Serialize
var writer = new BinaryEncoder(stream);
var datumWriter = new GenericDatumWriter<GenericRecord>(schema);
datumWriter.Write(record, writer);

// Deserialize
var reader = new BinaryDecoder(stream);
var datumReader = new GenericDatumReader<GenericRecord>(schema, schema);
var result = datumReader.Read(null, reader);
```

### Additional Language Implementations

Support for additional programming languages with varying levels of functionality and community maintenance.

```yaml { .api }
# Language implementation matrix
implementations:
  c:
    location: "lang/c"
    status: "maintained"
    buildSystem: "cmake"
    features: ["serialization", "deserialization"]

  perl:
    location: "lang/perl"
    status: "maintained"
    package: "Avro"
    features: ["serialization", "deserialization", "schema_validation"]

  php:
    location: "lang/php"
    status: "maintained"
    package: "avro-php"
    features: ["serialization", "deserialization"]

  ruby:
    location: "lang/ruby"
    status: "maintained"
    gem: "avro"
    features: ["serialization", "deserialization", "schema_resolution"]

  rust:
    location: "lang/rust"
    status: "active_development"
    crate: "apache-avro"
    features: ["serialization", "deserialization", "async_io", "compression"]
```

**Usage Examples:**

```c
// C implementation
#include <avro.h>

avro_schema_t schema;
avro_schema_from_json_literal(schema_json, &schema);

// Generic values are created from a value class derived from the schema
avro_value_iface_t *iface = avro_generic_class_from_schema(schema);
avro_value_t value;
avro_generic_value_new(iface, &value);
```

```perl
# Perl implementation
use Avro::Schema;
use Avro::BinaryEncoder;

my $schema = Avro::Schema->parse($schema_json);

# encode() is a class method; serialized bytes arrive via the emit callback
my $encoded = '';
Avro::BinaryEncoder->encode(
    schema  => $schema,
    data    => $data,
    emit_cb => sub { $encoded .= ${ $_[0] } },
);
```

```php
<?php
// PHP implementation (classes are namespaced under Apache\Avro in recent releases)
require_once 'vendor/autoload.php';

use Apache\Avro\Schema\AvroSchema;
use Apache\Avro\IO\AvroStringIO;
use Apache\Avro\Datum\AvroIODatumWriter;
use Apache\Avro\Datum\AvroIOBinaryEncoder;

$schema = AvroSchema::parse($schema_json);
$io = new AvroStringIO();
$writer = new AvroIODatumWriter($schema);
$encoder = new AvroIOBinaryEncoder($io);
$writer->write($data, $encoder);
```

```ruby
# Ruby implementation
require 'avro'
require 'stringio'

schema = Avro::Schema.parse(schema_json)
buffer = StringIO.new
writer = Avro::IO::DatumWriter.new(schema)
encoder = Avro::IO::BinaryEncoder.new(buffer)
writer.write(datum, encoder)
```

```rust
// Rust implementation
use apache_avro::{Schema, Writer};

let schema = Schema::parse_str(schema_json)?;
let mut writer = Writer::new(&schema, Vec::new());
writer.append(record)?;
let encoded: Vec<u8> = writer.into_inner()?;
```

### Cross-Language Compatibility

Standards and testing procedures that ensure data compatibility across all language implementations.

```yaml { .api }
# Compatibility testing matrix
compatibility_tests:
  interop_data:
    location: "share/test/data"
    formats: ["json", "binary"]
    schemas: ["primitive", "complex", "recursive"]

  round_trip_tests:
    serialization: "language_a -> binary -> language_b"
    schema_evolution: "schema_v1 -> data -> schema_v2"
    rpc_communication: "client_lang -> server_lang"

  validation_suite:
    schema_parsing: true
    data_validation: true
    protocol_compliance: true
    performance_benchmarks: true
```
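
Byte-level interop is possible because the Avro specification pins down every encoding exactly. As a spec-level illustration, this pure-Python sketch (standard library only; the function names are illustrative, not part of any Avro library API) reproduces the spec's `long` encoding: a zigzag mapping followed by a little-endian base-128 varint. Every conforming implementation must emit these exact bytes:

```python
def encode_long(n: int) -> bytes:
    """Avro 'long' encoding: zigzag, then little-endian base-128 varint."""
    # Zigzag maps signed to unsigned: 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)


def decode_long(data: bytes) -> int:
    """Inverse: read the varint, then undo the zigzag mapping."""
    z = shift = 0
    for b in data:
        z |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            break
    return (z >> 1) ^ -(z & 1)


# Any two conforming implementations produce (and accept) these exact bytes
assert encode_long(1) == b"\x02"
assert encode_long(-1) == b"\x01"
assert encode_long(64) == b"\x80\x01"
assert decode_long(encode_long(-123456789)) == -123456789
```

Fixing the wire bytes at this level of detail is what the shared interop data under `share/test/data` exercises across all of the implementations above.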

**Usage Examples:**

```bash
# Run cross-language compatibility tests
# (task and subcommand names below are illustrative; check ./build.sh usage
# and avro-tools help output for the exact names in a given release)
./build.sh interop-test

# Test a specific language pair
./build.sh test-interop java python

# Validate schema compatibility
avro-tools validate schema.avsc
avro-tools compatibility --reader new_schema.avsc --writer old_schema.avsc
```

### Implementation Selection Guide

Guidelines for choosing the appropriate language implementation based on project requirements and constraints.

```yaml { .api }
# Implementation selection criteria
selection_guide:
  performance_critical:
    recommended: ["java", "c++", "rust"]
    reason: "Optimized native performance"

  web_development:
    recommended: ["javascript", "java", "csharp"]
    reason: "Rich ecosystem and tooling"

  data_science:
    recommended: ["python", "java", "r"]
    reason: "Integration with analytics frameworks"

  systems_programming:
    recommended: ["c", "c++", "rust"]
    reason: "Low-level control and efficiency"

  enterprise_applications:
    recommended: ["java", "csharp", "python"]
    reason: "Enterprise framework integration"

  feature_completeness:
    tier1: ["java"]                          # Full feature set
    tier2: ["python", "c++"]                 # Most features
    tier3: ["javascript", "csharp", "rust"]  # Core features
    tier4: ["c", "perl", "php", "ruby"]      # Basic features
```