gamussa/schema-registry

Schema Registry for Apache Kafka - covers schema management (Avro, Protobuf, JSON Schema), compatibility modes, schema evolution, REST API, serializer/deserializer configuration, Kafka Connect converters, Flink SQL integration, and Confluent Cloud.

Schema Registry

Schema Registry provides a centralized repository for managing and validating schemas (Avro, Protobuf, JSON Schema) used with Apache Kafka. It enforces compatibility rules during schema evolution and integrates with Kafka producers, consumers, Kafka Connect, and Flink SQL.

Key Concepts

  • Schema: Structure definition (Avro record, Protobuf message, JSON Schema object)
  • Subject: Named scope under which schemas evolve (e.g., my-topic-value)
  • Schema ID: Globally unique integer, embedded in serialized messages ([0x00][4-byte ID][data])
  • Compatibility: Rules governing allowed schema changes between versions
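The framing described above can be illustrated with a minimal sketch: a parser for the Confluent wire format, assuming a raw Kafka message value (one `0x00` magic byte, a big-endian 4-byte schema ID, then the serialized record). The function name is illustrative, not part of any client library.

```python
import struct

def parse_confluent_frame(payload: bytes) -> tuple[int, bytes]:
    """Split a Confluent-framed message into (schema_id, data).

    Framing: 1-byte magic (0x00), 4-byte big-endian schema ID, payload.
    """
    if len(payload) < 5 or payload[0] != 0x00:
        raise ValueError("not a Confluent-framed message")
    (schema_id,) = struct.unpack(">I", payload[1:5])
    return schema_id, payload[5:]

# Example: magic byte, schema ID 42, then the serialized record bytes
frame = b"\x00" + struct.pack(">I", 42) + b"serialized-record"
schema_id, data = parse_confluent_frame(frame)
```

In practice the Confluent serializers and deserializers handle this framing for you; parsing it by hand is mainly useful for debugging raw topic data.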

Contents

  • Fundamentals - Schemas, subjects, naming strategies, compatibility types, evolution rules by format
  • REST API - All Schema Registry REST endpoints with curl examples
  • Serializers and Deserializers - Kafka producer/consumer serializers, Kafka Connect converters, Maven dependencies
  • Flink SQL Integration - avro-confluent format, CREATE TABLE patterns, Confluent Cloud Flink SQL
  • Confluent Cloud - Managed Schema Registry, data contracts, schema linking, stream governance
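The naming strategies covered under Fundamentals determine which subject a schema registers under. A sketch of the default TopicNameStrategy, which derives the subject from the topic name plus a `-key` or `-value` suffix (the helper function is hypothetical, for illustration only):

```python
def topic_name_subject(topic: str, is_key: bool = False) -> str:
    """Default TopicNameStrategy: subject is '<topic>-key' or '<topic>-value'."""
    return f"{topic}-{'key' if is_key else 'value'}"

# A subject scopes schema versions and compatibility checks per topic:
value_subject = topic_name_subject("my-topic")            # "my-topic-value"
key_subject = topic_name_subject("my-topic", is_key=True) # "my-topic-key"
```

Other strategies (RecordNameStrategy, TopicRecordNameStrategy) instead derive the subject from the record's fully qualified name, allowing multiple record types per topic.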

Install with Tessl CLI

npx tessl i gamussa/schema-registry