DBZ-3518 Update formatting to correct rendering and linking problems

Bob Roldan 2021-05-12 21:29:52 -04:00 committed by Jiri Pechanec
parent 05ce63e06a
commit d0df3c278a
2 changed files with 27 additions and 34 deletions


@@ -2143,7 +2143,6 @@ The {prodname} Db2 connector provides three types of metrics that are in additio
// Type: reference
// ModuleID: monitoring-debezium-during-snapshots-of-db2-databases
// Title: Monitoring {prodname} during snapshots of Db2 databases
-[[db2-monitoring-snapshots]]
[[db2-snapshot-metrics]]
=== Snapshot metrics
@@ -2154,7 +2153,6 @@ include::{partialsdir}/modules/all-connectors/ref-connector-monitoring-snapshot-
// Type: reference
// ModuleID: monitoring-debezium-db2-connector-record-streaming
// Title: Monitoring {prodname} Db2 connector record streaming
-[[db2-monitoring-streaming]]
[[db2-streaming-metrics]]
=== Streaming metrics
@@ -2163,9 +2161,8 @@ The *MBean* is `debezium.db2:type=connector-metrics,context=streaming,server=_<d
include::{partialsdir}/modules/all-connectors/ref-connector-monitoring-streaming-metrics.adoc[leveloffset=+1]
// Type: reference
-// ModuleID: monitoring-debezium-db2-connector-schema history
+// ModuleID: monitoring-debezium-db2-connector-schema-history
// Title: Monitoring {prodname} Db2 connector schema history
-[[db2-monitoring-schema-history]]
[[db2-schema-history-metrics]]
=== Schema history metrics


@@ -13,7 +13,7 @@
toc::[]
link:https://cloudevents.io/[CloudEvents] is a specification for describing event data in a common way. Its aim is to provide interoperability across services, platforms and systems. {prodname} enables you to configure a MongoDB, MySQL, PostgreSQL, or SQL Server connector to emit change event records that conform to the CloudEvents specification.
ifdef::community[]
[NOTE]
@@ -30,28 +30,28 @@ Emitting change event records in CloudEvents format is a Technology Preview feat
====
endif::product[]
The CloudEvents specification defines:
* A set of standardized event attributes
* Rules for defining custom attributes
* Encoding rules for mapping event formats to serialized representations such as JSON or Avro
* Protocol bindings for transport layers such as Apache Kafka, HTTP or AMQP
To configure a {prodname} connector to emit change event records that conform to the CloudEvents specification, {prodname} provides the `io.debezium.converters.CloudEventsConverter`, which is a Kafka Connect message converter.
Currently, only structured mapping mode is supported. The CloudEvents change event envelope can be JSON or Avro and each envelope type supports JSON or Avro as the `data` format. It is expected that a future {prodname} release will support binary mapping mode.
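The envelope encoding and the `data` encoding are selected independently through the converter's `serializer.type` and `data.serializer.type` options, which are described later in this document. As a minimal sketch, assuming the converter is registered as the Kafka Connect value converter (as in the configuration example later in this document), a JSON envelope that carries Avro `data` corresponds to a connector configuration fragment such as the following:
[source,json,indent=0]
----
{
  "value.converter": "io.debezium.converters.CloudEventsConverter",
  "value.converter.serializer.type": "json",
  "value.converter.data.serializer.type": "avro"
}
----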
ifdef::product[]
Information about emitting change events in CloudEvents format is organized as follows:
* xref:example-debezium-change-event-records-in-cloudevents-format[]
* xref:example-of-configuring-debezium-cloudevents-converter[]
* xref:debezium-cloudevents-converter-configuration-options[]
endif::product[]
For information about using Avro, see:
* {link-prefix}:{link-avro-serialization}#avro-serialization[Avro serialization]
* link:https://github.com/Apicurio/apicurio-registry[Apicurio Registry]
@@ -60,7 +60,7 @@ For information about using Avro, see:
// Title: Example {prodname} change event records in CloudEvents format
== Example event format
The following example shows what a CloudEvents change event record emitted by a PostgreSQL connector looks like. In this example, the PostgreSQL connector is configured to use JSON as the CloudEvents format envelope and also as the `data` format.
[source,json,indent=0,subs="+attributes"]
----
@@ -95,19 +95,19 @@ The following example shows what a CloudEvents change event record emitted by a
}
}
----
<1> Unique ID that the connector generates for the change event based on the change event's content.
<2> The source of the event, which is the logical name of the database as specified by the `database.server.name` property in the connector's configuration.
<3> The CloudEvents specification version.
<4> Connector type that generated the change event. The format of this field is `io.debezium._CONNECTOR_TYPE_.datachangeevent`. The value of `_CONNECTOR_TYPE_` is `mongodb`, `mysql`, `postgresql`, or `sqlserver`.
<5> Time of the change in the source database.
<6> Describes the content type of the `data` attribute, which is JSON in this example.
The only alternative is Avro.
<7> An operation identifier. Possible values are `r` for read, `c` for create, `u` for update, or `d` for delete.
<8> All `source` attributes that are known from {prodname} change events are mapped to CloudEvents extension attributes by using the `iodebezium` prefix for the attribute name.
<9> When enabled in the connector, each `transaction` attribute that is known from {prodname} change events is mapped to a CloudEvents extension attribute by using the `iodebeziumtx` prefix for the attribute name.
<10> The actual data change itself. Depending on the operation and the connector, the data might contain `before`, `after` and/or `patch` fields.
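For orientation, the envelope that the callouts describe has roughly the following abbreviated shape. All values are placeholders, and only a few representative `iodebezium`-prefixed extension attributes are shown; this sketch is not a complete record.
[source,json,indent=0]
----
{
  "id": "name:dbserver1;lsn:29274832;txId:565",
  "source": "dbserver1",
  "specversion": "1.0",
  "type": "io.debezium.postgresql.datachangeevent",
  "time": "2021-05-12T13:47:23.603Z",
  "datacontenttype": "application/json",
  "iodebeziumop": "c",
  "iodebeziumconnector": "postgresql",
  "iodebeziumname": "dbserver1",
  "data": {
    "before": null,
    "after": {
      "pk": 1,
      "first_name": "Anne"
    }
  }
}
----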
The following example also shows what a CloudEvents change event record emitted by a PostgreSQL connector looks like. In this example, the PostgreSQL connector is again configured to use JSON as the CloudEvents format envelope, but this time the connector is configured to use Avro for the `data` format.
[source,json,indent=0,subs="+attributes"]
----
@@ -148,7 +148,7 @@ It is also possible to use Avro for the envelope as well as the `data` attribute
// Title: Example of configuring {prodname} CloudEvents converter
== Example configuration
Configure `io.debezium.converters.CloudEventsConverter` in your {prodname} connector configuration.
The following example shows how to configure the CloudEvents converter to emit change event records that have the following characteristics:
* Use JSON as the envelope.
@@ -164,7 +164,7 @@ The following example shows how to configure the CloudEvents converter to emit c
...
----
<1> Specifying the `serializer.type` is optional, because `json` is the default.
The CloudEvents converter converts Kafka record values. In the same connector configuration, you can specify `key.converter` if you want to operate on record keys.
For example, you might specify `StringConverter`, `LongConverter`, `JsonConverter`, or `AvroConverter`.
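As a minimal sketch of that pairing, the following fragment registers the CloudEvents converter for record values and Kafka Connect's `org.apache.kafka.connect.storage.StringConverter` for record keys; it is illustrative rather than a complete connector configuration.
[source,json,indent=0]
----
{
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.debezium.converters.CloudEventsConverter"
}
----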
@@ -174,7 +174,7 @@ For example, you might specify `StringConverter`, `LongConverte
[[cloud-events-converter-configuration-options]]
== Configuration options
When you configure a {prodname} connector to use the CloudEvents converter, you can specify the following options.
.Descriptions of CloudEvents converter configuration options
[cols="30%a,25%a,45%a",subs="+attributes"]
@@ -183,25 +183,21 @@ When you configure a {prodname} connector to use the CloudEvent converter you ca
|Default
|Description
-[id="cloud-events-converter-serializer-type"]
-|{link-prefix}:{link-cloud-events}#cloud-events-converter-serializer-type[`serializer.type`]
+|[[cloud-events-converter-serializer-type]]xref:cloud-events-converter-serializer-type[`serializer.type`]
|`json`
|The encoding type to use for the CloudEvents envelope structure.
The value can be `json` or `avro`.
-[id="cloud-events-converter-data-serializer-type"]
-|{link-prefix}:{link-cloud-events}#cloud-events-converter-data-serializer-type[`data{zwsp}.serializer.type`]
+|[[cloud-events-converter-data-serializer-type]]xref:cloud-events-converter-data-serializer-type[`data.serializer.type`]
|`json`
|The encoding type to use for the `data` attribute.
The value can be `json` or `avro`.
-[id="cloud-events-converter-json"]
-|{link-prefix}:{link-cloud-events}#cloud-events-converter-json[`json. \...`]
+|[[cloud-events-converter-json]]xref:cloud-events-converter-json[`json. \...`]
|N/A
|Any configuration options to be passed through to the underlying converter when using JSON. The `json.` prefix is removed.
-[id="cloud-events-converter-avro"]
-|{link-prefix}:{link-cloud-events}#cloud-events-converter-avro[`avro. \...`]
+|[[cloud-events-converter-avro]]xref:cloud-events-converter-avro[`avro. \...`]
|N/A
|Any configuration options to be passed through to the underlying converter when using Avro. The `avro.` prefix is removed. For example, for Avro `data`, you would specify the `avro.schema.registry.url` option.
|===
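For example, to pass a schema registry URL through to the underlying Avro converter for the `data` attribute, prefix the option with `avro.`. The following fragment is a sketch; the registry URL is a placeholder.
[source,json,indent=0]
----
{
  "value.converter": "io.debezium.converters.CloudEventsConverter",
  "value.converter.data.serializer.type": "avro",
  "value.converter.avro.schema.registry.url": "http://registry.example.com:8080"
}
----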