Schema Registry for Apache Kafka - covers schema management (Avro, Protobuf, JSON Schema), compatibility modes, schema evolution, REST API, serializer/deserializer configuration, Kafka Connect converters, Flink SQL integration, and Confluent Cloud.
Topic names, subject names, and schema names are distinct concepts:
The topic name and schema name are independent. A single topic can have multiple schemas (key and value), and a single schema can be used across multiple topics.
The subject name strategy determines how schemas are organized in the registry.
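Confluent ships three built-in strategies (TopicNameStrategy, RecordNameStrategy, TopicRecordNameStrategy), detailed next. Each derives the subject mechanically from the topic and/or record name; a minimal sketch of the naming rules, with illustrative helper names rather than the actual Confluent classes:

```java
// Sketch of how each built-in strategy derives the subject name.
// These helpers are illustrative, not the real Confluent strategy classes.
public class SubjectNames {
    // TopicNameStrategy (the default): <topic>-key or <topic>-value
    public static String topicName(String topic, boolean isKey) {
        return topic + (isKey ? "-key" : "-value");
    }

    // RecordNameStrategy: the fully qualified record name; topic is ignored
    public static String recordName(String fqRecordName) {
        return fqRecordName;
    }

    // TopicRecordNameStrategy: <topic>-<fully.qualified.RecordName>
    public static String topicRecordName(String topic, String fqRecordName) {
        return topic + "-" + fqRecordName;
    }
}
```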
- TopicNameStrategy (the default): subject = `<topic>-key` (for keys) or `<topic>-value` (for values)
- RecordNameStrategy: subject = `<fully.qualified.RecordName>`
- TopicRecordNameStrategy: subject = `<topic>-<fully.qualified.RecordName>`

```java
// Producer/Consumer config
props.put("value.subject.name.strategy",
    "io.confluent.kafka.serializers.subject.TopicNameStrategy");
// or RecordNameStrategy, TopicRecordNameStrategy
props.put("key.subject.name.strategy",
    "io.confluent.kafka.serializers.subject.TopicNameStrategy");
```

Wire format:
```
[0x00][4-byte schema ID][serialized data]
```

Adding a field (backward compatible):
```json
// V1
{"type": "record", "name": "User", "fields": [
  {"name": "id", "type": "int"},
  {"name": "name", "type": "string"}
]}

// V2 - added email with default
{"type": "record", "name": "User", "fields": [
  {"name": "id", "type": "int"},
  {"name": "name", "type": "string"},
  {"name": "email", "type": ["null", "string"], "default": null}
]}
```

Key Avro rules:
- New fields must declare a default value to be backward compatible
- Optional fields are modeled as a union with null, e.g. `["null", "string"]`

Adding a field (backward compatible):
```protobuf
// V1
message User {
  int32 id = 1;
  string name = 2;
}

// V2
message User {
  int32 id = 1;
  string name = 2;
  string email = 3; // new field, consumers ignore unknown fields
}
```

Key Protobuf rules:
- The `reserved` keyword prevents accidental reuse of deleted field numbers
- `oneof` fields have special compatibility considerations

Adding a field (backward compatible with open content model):
```json
// V1
{
  "type": "object",
  "properties": {
    "id": {"type": "integer"},
    "name": {"type": "string"}
  },
  "required": ["id", "name"]
}

// V2
{
  "type": "object",
  "properties": {
    "id": {"type": "integer"},
    "name": {"type": "string"},
    "email": {"type": "string"}
  },
  "required": ["id", "name"]
}
```

Key JSON Schema rules:
- An open content model (`additionalProperties: true`, the default) is more evolution-friendly
- A closed content model (`additionalProperties: false`) restricts adding properties
- Adding a new `required` field is NOT backward compatible
- Removing a `required` constraint is backward compatible
- The `additionalProperties` setting dramatically impacts compatibility

Schema contexts provide logical grouping: separate "sub-registries" within one Schema Registry instance.
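Subjects are addressed within a context by a simple string convention; a sketch of how a context-qualified subject is composed (class and method names here are hypothetical):

```java
// Hypothetical helper showing how a context-qualified subject string
// is composed; subjects with no explicit context live in the default context.
public class ContextSubjects {
    public static String qualified(String contextName, String subject) {
        // format: :.context-name:subject-name
        return ":." + contextName + ":" + subject;
    }
}
```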
Context-qualified subjects use the format `:.context-name:subject-name`.

All three formats support schema references for composing complex schemas:
- Avro: references model imported schemas
- Protobuf: references model `import` statements
- JSON Schema: references model `$ref` pointers
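For illustration, an Avro schema that refers to a separately registered Address type by its fully qualified name might look like this (record and field names are assumed, chosen to match the registration payload that follows):

```json
// Hypothetical referring schema: the "address" field resolves against
// the registered com.example.Address schema via the references list
{
  "type": "record",
  "name": "User",
  "namespace": "com.example",
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "address", "type": "com.example.Address"}
  ]
}
```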
```json
// Registering a schema with references
{
  "schema": "...",
  "schemaType": "AVRO",
  "references": [
    {
      "name": "com.example.Address",
      "subject": "address-value",
      "version": 1
    }
  ]
}
```

Schema Registry stores schemas in an internal Kafka topic (`_schemas` by default).

Install with Tessl CLI
```shell
npx tessl i gamussa/schema-registry@0.2.0
```