{prodname} connectors work with the Kafka Connect framework to capture changes in databases and generate change event records.
The Kafka Connect workers then apply any configured transformations to each of the messages generated by the connector,
serialize each message key and value into a binary form by using the configured link:https://kafka.apache.org/documentation/#connect_running[_converters_],
and write each message into the correct Kafka topic.
You can specify converters in several ways:
* In the Kafka Connect worker configuration.
+
In this case, the same converters are used for all connectors that are deployed to that worker's cluster.
* For an individual connector.
+
In this case, the converters that you specify for the connector override the converters that are set in the worker configuration.

Kafka Connect comes with a _JSON converter_ that serializes message keys and values into JSON documents. You can configure the JSON converter to include or exclude the message schema by specifying the `key.converter.schemas.enable` and `value.converter.schemas.enable` properties.
The {prodname} {link-prefix}:{link-tutorial}[tutorial] shows what the messages look like when both payload and schemas are included.
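As a sketch, connector-level configuration along the following lines enables the JSON converter without embedded schemas; the property names are standard Kafka Connect settings, and the values shown are only an example:

```properties
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```

Setting `schemas.enable` to `false` produces smaller messages, at the cost of consumers no longer receiving the schema inline with each record.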
Alternatively, you can serialize the message keys and values by using link:https://avro.apache.org/[Apache Avro].
The Avro binary format is compact and efficient, and Avro schemas make it possible to ensure that the messages have the correct structure.
Avro's schema evolution mechanism makes it possible to evolve the schemas over time, which is essential for {prodname} connectors that dynamically generate the message schemas to match the structure of the database tables.
Over time, the change events captured by {prodname} connectors and written by Kafka Connect into a topic may have different versions of the same schema.
Avro serialization makes it easier for consumers to adapt to the changing schema.
ifdef::product[]
[IMPORTANT]
====
Using Avro to serialize message keys and values is a Technology Preview feature. Technology Preview features are not supported with Red Hat production service-level agreements (SLAs) and might not be functionally complete; therefore, Red Hat does not recommend implementing any Technology Preview features in production environments. This Technology Preview feature provides early access to upcoming product innovations, enabling you to test functionality and provide feedback during the development process. For more information about support scope, see link:https://access.redhat.com/support/offerings/techpreview/[Technology Preview Features Support Scope].
====
endif::product[]
The Apicurio Registry project provides an Avro converter that you can configure in Kafka Connect workers. This converter maps Kafka Connect schemas to Avro schemas, and then uses the Avro schemas to serialize the message keys and values into Avro's compact binary form.
To use the Apicurio Registry with {prodname}, you must add the Apicurio Registry converters and their dependencies to the Kafka Connect container image that you use to run {prodname}.
The Apicurio Registry project also provides a JSON converter that can be used with the registry. This combines the advantage of less verbose messages with human-readable JSON: messages do not contain the schema itself, but only a schema ID.
* Install the Avro converter from link:https://repo1.maven.org/maven2/io/apicurio/apicurio-registry-distro-connect-converter/{apicurio-version}/apicurio-registry-distro-connect-converter-{apicurio-version}-converter.tar.gz[the installation package] into Kafka Connect's _libs_ directory or directly into a plug-in directory.
* Configure a Kafka Connect instance with the following property settings:
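A minimal sketch of such settings, assuming the Apicurio Registry Avro converter and a placeholder registry URL (`http://apicurio:8080/api` stands in for your own registry endpoint; adjust the property names to match your Apicurio Registry version):

```properties
key.converter=io.apicurio.registry.utils.converter.AvroConverter
key.converter.apicurio.registry.url=http://apicurio:8080/api
value.converter=io.apicurio.registry.utils.converter.AvroConverter
value.converter.apicurio.registry.url=http://apicurio:8080/api
```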
. Build a {prodname} image with the Avro converter from this link:https://github.com/debezium/debezium-examples/blob/master/tutorial/debezium-with-apicurio/Dockerfile[Dockerfile]:
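A build invocation might look like the following; the image tag `debezium-connect-avro` and the build-context directory are hypothetical names, not values from the tutorial:

```shell
# Build the image from the directory that contains the Dockerfile.
docker build -t debezium-connect-avro ./debezium-with-apicurio
```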
Avro names must start with a letter or an underscore, and subsequent characters must be alphanumeric or underscores. {prodname} uses a column's name as the basis for the corresponding Avro field name, which can lead to problems during serialization if the column name does not adhere to the Avro naming rules.
Each {prodname} connector provides a configuration property, `sanitize.field.names`, that you can set to `true` if you have columns that do not adhere to the Avro naming rules. Setting `sanitize.field.names` to `true` allows those fields to be serialized without you having to modify your schema.
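The naming constraint can be illustrated with a small sketch. The `sanitize` helper below is hypothetical: it only approximates the idea of name sanitization by replacing disallowed characters with underscores, and it is not {prodname}'s actual implementation.

```python
import re

# Avro names: first character is a letter or underscore,
# the rest are alphanumeric characters or underscores.
AVRO_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def sanitize(name: str) -> str:
    """Hypothetical sketch: map a column name to an Avro-safe field name."""
    if AVRO_NAME.match(name):
        return name
    # Replace every character that is not alphanumeric or underscore.
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name)
    # Names must not start with a digit.
    if cleaned[:1].isdigit():
        cleaned = "_" + cleaned
    return cleaned

print(sanitize("order-total"))  # order_total
print(sanitize("valid_name"))   # valid_name
```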
For an example of a {prodname} deployment that uses the Avro message format, see the https://github.com/debezium/debezium-examples/tree/master/tutorial#using-mysql-and-the-avro-message-format[MySQL and the Avro message format] tutorial example.