This feature is currently in an incubating state; that is, exact semantics, configuration options, and so forth may change in future revisions, based on the feedback we receive. Please let us know if you encounter any problems while using this extension.
The use of custom-developed converters is a Technology Preview feature only.
Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete.
Red Hat does not recommend using them in production.
These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process.
For more information about the support scope of Red Hat Technology Preview features, see link:https://access.redhat.com/support/offerings/techpreview[https://access.redhat.com/support/offerings/techpreview].
While the default mappings are generally sufficient, for some applications you might want to apply an alternate mapping.
For example, you might need a custom mapping if the default mapping exports a column using the format of milliseconds since the UNIX epoch, but your downstream application can only consume the column values as formatted strings.
You customize data type mappings by developing and deploying a custom converter.
You can configure a custom converter to act on all columns of a certain type, or you can narrow its scope so that it applies to a specific table column only.
The converter function intercepts data type conversion requests for any columns that match the specified criteria, and then performs the specified conversion.
The `converters` property specifies the converters that are available to a connector, and can include sub-properties that further modify conversion behavior.
`configure()`::
Passes the properties specified in the connector configuration to the converter instance.
The `configure` method runs when the connector is initialized.
You can use a converter with multiple connectors and modify its behavior based on the connector's property settings. +
The `configure` method accepts the following argument:
`props`::: Contains the properties to pass to the converter instance.
Each property specifies the format for converting the values of a particular type of column.
`converterFor()`::
Registers the converter to process specific columns or fields in the data source.
{prodname} invokes the `converterFor()` method to prompt the converter to call `registration` for the conversion.
The `converterFor` method runs once for each column. +
The method accepts the following arguments:
`field`:::
An object that passes metadata about the field or column that is processed.
The column metadata can include the name of the column or field, the name of the table or collection, the data type, size, and so forth.
`registration`:::
An object of type `io.debezium.spi.converter.CustomConverter.ConverterRegistration` that provides the target schema definition and the code for converting the column data.
When the source column matches the type that the converter should process, the converter calls the `register` method on the `registration` object to define the converter for each column in the schema.
Schemas are represented using the Kafka Connect link:https://kafka.apache.org/31/javadoc/org/apache/kafka/connect/data/SchemaBuilder.html[`SchemaBuilder`] API.
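A minimal sketch of such a converter might look like the following, assuming a source column type named `isbn` and a `schema.name` configuration property; the class name and packaging are illustrative:

[source,java]
----
import java.util.Properties;

import org.apache.kafka.connect.data.SchemaBuilder;

import io.debezium.spi.converter.CustomConverter;
import io.debezium.spi.converter.RelationalColumn;

public class IsbnConverter implements CustomConverter<SchemaBuilder, RelationalColumn> {

    private SchemaBuilder isbnSchema;

    // Runs once when the connector is initialized; reads the schema.name
    // property that the connector configuration passes to the converter.
    @Override
    public void configure(Properties props) {
        isbnSchema = SchemaBuilder.string().name(props.getProperty("schema.name"));
    }

    // Runs once for each column; registers a conversion only for columns
    // whose source data type is "isbn".
    @Override
    public void converterFor(RelationalColumn column,
                             ConverterRegistration<SchemaBuilder> registration) {
        if ("isbn".equalsIgnoreCase(column.typeName())) {
            // Emit the ISBN value as a STRING with the configured schema name.
            registration.register(isbnSchema, value -> value == null ? null : value.toString());
        }
    }
}
----

In outline, this converter performs the following operations: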
* Runs the `configure` method, which configures the converter based on the value of the `schema.name` property that is specified in the connector configuration.
The converter configuration is specific to each instance.
* Runs the `converterFor` method, which registers the converter to process values in source columns for which the data type is set to `isbn`.
** Identifies the target `STRING` schema based on the value that is specified for the `schema.name` property.
** Converts ISBN data in the source column to `String` values.
Custom converters act on specific columns or column types in a source table to specify how to convert the data types in the source to Kafka Connect schema types.
To use a custom converter with a connector, you deploy the converter JAR file alongside the connector file, and then configure the connector to use the converter.
* To use a custom converter with a {prodname} connector, export the Java project to a JAR file, and copy the file to the directory that contains the JAR file for each {prodname} connector that you want to use it with. +
For example, in a typical deployment, the {prodname} connector files are stored in subdirectories of a Kafka Connect directory (`/kafka/connect`), with each connector JAR in its own subdirectory (`/kafka/connect/debezium-connector-db2`, `/kafka/connect/debezium-connector-mysql`, and so forth).
To use a converter with a connector, add the converter JAR file to the connector's subdirectory.
=== Configuring a connector to use a custom converter
To enable a connector to use the custom converter, you add properties to the connector configuration that specify the converter name and class.
If the converter requires further information to customize the formats of specific data types, you can also define other configuration options to provide that information.
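The following sketch shows the general form of these properties; the names in angle brackets are placeholders:

[source,properties]
----
converters=<converterSymbolicName> <1>
<converterSymbolicName>.type=<fullyQualifiedConverterClassName> <2>
----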
<1> The `converters` property is mandatory and enumerates a comma-separated list of symbolic names of the converter instances to use with the connector.
The values listed for this property serve as prefixes in the names of other properties that you specify for the converter.
<2> The `_<converterSymbolicName>_.type` property is mandatory, and specifies the name of the class that implements the converter.
For example, for the earlier xref:example-debezium-simple-custom-converter[custom converter example], you would add the following properties to the connector configuration:
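(In this sketch, the fully qualified class name `com.example.converters.IsbnConverter` is illustrative; substitute the actual package and class name of your converter.)

[source,properties]
----
converters=isbn
isbn.type=com.example.converters.IsbnConverter
----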
For example, to add a property for the preceding `isbn` converter to specify the `schema.name` to pass to the `configure` method in the converter code, add the following property:
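(The schema name value in this sketch is illustrative; use the schema name that your downstream consumers expect.)

[source,properties]
----
isbn.schema.name=com.example.type.Isbn
----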