DBZ-5770 Update documentation regarding the Binary Data Converter

Updated all references from `ByteBufferConverter` to
`BinaryDataConverter`.

Also added special note about changing the settings before upgrading
when running in Kafka Connect (otherwise Kafka Connect will fail to
start).

chore: add my name to copyright and aliases
Enzo Cappa 2022-10-31 09:41:55 -07:00 committed by Jiri Pechanec
parent b7e2042b09
commit 179c359ca6
4 changed files with 10 additions and 5 deletions


@@ -122,6 +122,7 @@ Elan Hasson
Eliran Agranovich
Emmanuel Brard
Emrul Islam
+Enzo Cappa
Eric Slep
Eric Weaver
Eric S. Kreiseir


@@ -221,7 +221,7 @@ value.converter=io.debezium.converters.ByteArrayConverter
By default, the `payload` field value (the Avro data) is the only message value.
Configuration of `ByteArrayConverter` as the value converter propagates the `payload` field value as-is into the Kafka message value.
-Note that this differs from the `ByteBufferConverter` suggested for other SMTs.
+Note that this differs from the `BinaryDataConverter` suggested for other SMTs.
This is due to the different approach MongoDB takes to storing byte arrays internally.
The {prodname} connectors may be configured to emit heartbeat, transaction metadata, or schema change events (support varies by connector).


@@ -222,21 +222,21 @@ apply the following configuration to the connector:
----
transforms=outbox,...
transforms.outbox.type=io.debezium.transforms.outbox.EventRouter
-value.converter=io.debezium.converters.ByteBufferConverter
+value.converter=io.debezium.converters.BinaryDataConverter
----
By default, the `payload` column value (the Avro data) is the only message value.
-Configuration of `ByteBufferConverter` as the value converter propagates the `payload` column value as-is into the Kafka message value.
+Configuration of `BinaryDataConverter` as the value converter propagates the `payload` column value as-is into the Kafka message value.
The {prodname} connectors may be configured to emit heartbeat, transaction metadata, or schema change events (support varies by connector).
-These events cannot be serialized by the `ByteBufferConverter` so additional configuration must be provided so the converter knows how to serialize these events.
+These events cannot be serialized by the `BinaryDataConverter` so additional configuration must be provided so the converter knows how to serialize these events.
As an example, the following configuration illustrates using the Apache Kafka `JsonConverter` with no schemas:
[source]
----
transforms=outbox,...
transforms.outbox.type=io.debezium.transforms.outbox.EventRouter
-value.converter=io.debezium.converters.ByteBufferConverter
+value.converter=io.debezium.converters.BinaryDataConverter
value.converter.delegate.converter.type=org.apache.kafka.connect.json.JsonConverter
value.converter.delegate.converter.type.schemas.enable=false
----
@@ -244,6 +244,9 @@ value.converter.delegate.converter.type.schemas.enable=false
The delegate `Converter` implementation is specified by the `delegate.converter.type` option.
If any extra configuration options are needed by the converter, they can also be specified, such as the disablement of schemas shown above using `schemas.enable=false`.
+[NOTE]
+====
+The converter `io.debezium.converters.ByteBufferConverter` has been deprecated since Debezium version 1.9 and has been removed in 2.0. Furthermore, when using Kafka Connect, the connector's configuration must be updated before upgrading to Debezium 2.x.
+====
// Type: concept
// Title: Emitting additional fields in {prodname} outbox messages
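As a minimal before/after sketch of the migration the note describes (the `value.converter` property name comes from the outbox configuration above; the rest of the connector configuration is assumed):

[source]
----
# Debezium 1.x: ByteBufferConverter, deprecated since 1.9 and removed in 2.0
value.converter=io.debezium.converters.ByteBufferConverter

# Debezium 2.x: update this setting before upgrading,
# otherwise Kafka Connect will fail to start
value.converter=io.debezium.converters.BinaryDataConverter
----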


@@ -165,3 +165,4 @@ rajdangwal,Rajendra Dangwal
Sage-Pierce,Sage Pierce
joschi,Jochen Schalanda
janjwerner-confluent,Jan Werner
+enzo-cappa,Enzo Cappa