Confluent Avro Support
The Kafka connector supports the use of Apache Avro-formatted messages. Avro is a data serialization system that provides a compact binary format and relies on schemas to define data structures. By integrating with a Confluent Schema Registry, the connector can handle serialization (Produce) and deserialization (Consume/Listen) of messages.
Avro support is available for the following operations:
- Produce
- Consume
- Listen
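As background, Confluent's Avro serialization prepends a small header to every message: a magic byte (always 0) followed by the 4-byte, big-endian ID of the schema in the Schema Registry, with the Avro binary body after it. A minimal Java sketch that reads that header from a raw message payload:

```java
import java.nio.ByteBuffer;

public class ConfluentWireFormat {
    /** Extracts the Schema Registry schema ID from a raw Confluent Avro message. */
    public static int schemaId(byte[] payload) {
        ByteBuffer buf = ByteBuffer.wrap(payload);
        if (buf.get() != 0) { // magic byte: always 0 in the Confluent wire format
            throw new IllegalArgumentException("Not a Confluent Avro-framed message");
        }
        return buf.getInt(); // 4-byte big-endian schema ID; the Avro binary body follows
    }
}
```

This header is how a consumer knows which registered schema to fetch before decoding the Avro body.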
Prerequisites
To implement Avro support in your Kafka operations, ensure you have the following:
- Connectivity from the Boomi runtime to your Confluent Schema Registry.
- Registered Avro schemas in the Schema Registry for the specific topics you intend to use.
- Necessary credentials (Username/Password or API Key/Secret) if the Schema Registry requires authentication.
- Existing Kafka topics configured to process Avro data.
Connection configuration
To enable Avro support, configure the Schema Registry settings within your Kafka connection component (a client-side equivalent is sketched after this list):
- Schema Registry URL — Enter the full URL for your registry (e.g., https://schema-registry.example.com:8081).
- Authentication Method — Basic
- Username — Enter the username for your registry.
- Password — Enter the password for your registry.
- These credentials must have sufficient permissions to read schemas for all operations using this connection.
- When the Schema Registry URL is specified, Test Connection validates that the connection component can connect to both Kafka and the Schema Registry.
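For reference, these settings map onto Confluent's standard client configuration keys. A minimal sketch, assuming Basic authentication; the endpoint and credentials shown are placeholders:

```java
import java.util.Properties;

public class SchemaRegistryConfig {
    public static Properties registryProps() {
        Properties props = new Properties();
        // Placeholder endpoint and credentials; substitute your own values.
        props.put("schema.registry.url", "https://schema-registry.example.com:8081");
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "registry-user:registry-password"); // username:password
        return props;
    }
}
```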
Enabling Avro in Operations
The Kafka connector uses dynamic configuration to manage Avro serialization.
During Browse, in the Import wizard, select the Allow Dynamic Topics and Use Avro Schema checkboxes. The connector lists all registered schemas in the Object Type dropdown; select the schema you want to use. The connector then retrieves the schema and generates a corresponding JSON profile.
Specify the topic name in the Topic Name Browse property or the Topic Name Operation property. Alternatively, pass it as a Dynamic Operation Property.
Schema Updates: If a schema changes in the Registry, you must re-import the object in the operation to update the JSON profile.
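For comparison, the Schema Registry's REST API exposes information comparable to what populates the Object Type dropdown: GET /subjects lists registered subjects, and GET /subjects/{subject}/versions/latest returns a subject's latest schema. A sketch using Java's built-in HTTP client, with a hypothetical orders topic and the default subject naming of <topic>-value:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SchemaLookup {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String registry = "https://schema-registry.example.com:8081"; // placeholder URL

        // List all registered subjects in the registry.
        HttpRequest subjects = HttpRequest.newBuilder()
                .uri(URI.create(registry + "/subjects"))
                .build();
        System.out.println(client.send(subjects, HttpResponse.BodyHandlers.ofString()).body());

        // Fetch the latest schema for a topic; with the default TopicNameStrategy,
        // the subject for message values is "<topic>-value".
        HttpRequest latest = HttpRequest.newBuilder()
                .uri(URI.create(registry + "/subjects/orders-value/versions/latest"))
                .build();
        System.out.println(client.send(latest, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```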
Produce Operation
The Produce operation converts JSON input documents into Avro binary format before sending them to Kafka.
The input JSON document must strictly match the structure of the imported Avro schema. Discrepancies in data types or missing required fields will result in serialization errors.
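Outside the connector, the same Produce-side conversion can be reproduced with the Avro and Confluent client libraries: decode the input JSON against the schema, then send the resulting record through KafkaAvroSerializer, which looks up (or registers) the schema and writes the binary payload. A sketch with a hypothetical Order schema, orders topic, and placeholder broker and registry addresses:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AvroProduceSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema standing in for the one imported into the JSON profile.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        // Decode the input JSON against the schema; a missing required field or a
        // wrong type fails here, mirroring the connector's serialization errors.
        String json = "{\"id\":\"A-100\",\"amount\":42.5}";
        GenericRecord record = new GenericDatumReader<GenericRecord>(schema)
            .read(null, DecoderFactory.get().jsonDecoder(schema, json));

        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "https://schema-registry.example.com:8081"); // placeholder

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", record.get("id").toString(), record));
        }
    }
}
```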
Consume and Listen Operations
Consume and Listen operations read Avro binary data from Kafka and deserialize it into JSON documents.
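The Consume/Listen side can likewise be approximated with KafkaAvroDeserializer, which reads the schema ID from each message header, fetches the schema from the registry, and yields a GenericRecord whose toString() output is a JSON rendering of the record. A sketch with the same placeholder addresses and a hypothetical orders topic:

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class AvroConsumeSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092"); // placeholder broker
        props.put("group.id", "avro-demo");                       // placeholder group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "https://schema-registry.example.com:8081"); // placeholder

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders"));
            ConsumerRecords<String, GenericRecord> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, GenericRecord> rec : records) {
                // GenericRecord renders as JSON, analogous to the JSON
                // documents the connector emits downstream.
                System.out.println(rec.value());
            }
        }
    }
}
```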
Troubleshooting
| Issue | Resolution |
|---|---|
| Schema Not Found | Verify the Schema Registry URL and credentials, and ensure the schema is present in the Confluent Schema Registry. |
| Serialization Errors (Produce) | Ensure the input JSON matches the Avro schema structure and all required fields are present. |
| Deserialization Errors (Consume/Listen) | Verify that messages on the topic were serialized with a schema registered in the Schema Registry and that the connector can retrieve it from the registry. |
| Schema evolution (Produce/Consume/Listen) | Check for schema incompatibility or re-import the object if the registry schema has been updated. |
| "Use Avro Schema" missing | Ensure Allow Dynamic Topics is selected in during Browse |