DBZ-1759 Switch to upstream AK links

Chris Cranford 2020-02-10 14:30:32 -05:00 committed by Gunnar Morling
parent fdd9392ebb
commit abd59a5fca
6 changed files with 5 additions and 7 deletions


```
@@ -120,8 +120,7 @@ asciidoc:
 assemblies: '../assemblies'
 modules: '../../modules'
 mysql-connector-plugin-download: 'https://repo1.maven.org/maven2/io/debezium/debezium-connector-mysql/1.1.0.Final/debezium-connector-mysql-1.1.0.Final-plugin.tar.gz'
-mysql-version: '8.0'
-confluent-platform-version: '5.3.1'
+mysql-version: '8.0'
 strimzi-version: '0.13.0'
```


```
@@ -14,5 +14,4 @@ asciidoc:
 assemblies: '../assemblies'
 modules: '../../modules'
 mysql-version: '8.0'
-confluent-platform-version: '5.3.1'
 strimzi-version: '0.13.0'
```


```
@@ -1104,7 +1104,7 @@ You can even link:/docs/openshift/[run Debezium on OpenShift].
 To use the connector to produce change events for a particular Db2 database or cluster:
 . enable the link:#setting-up-Db2[CDC on Db2] to publish the _CDC_ events in the database
-. create a link:#example-configuration[configuration file for the Db2 Connector] and use the https://docs.confluent.io/{confluent-platform-version}/connect/restapi.html[Kafka Connect REST API] to add that connector to your Kafka Connect cluster.
+. create a link:#example-configuration[configuration file for the Db2 Connector] and use the http://kafka.apache.org/documentation/#connect_rest[Kafka Connect REST API] to add that connector to your Kafka Connect cluster.
 When the connector starts, it will grab a consistent snapshot of the schemas in your Db2 database and start streaming changes, producing events for every inserted, updated, and deleted row.
 You can also choose to produce events for a subset of the schemas and tables.
```
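The hunk above covers registering the Db2 connector through the Kafka Connect REST API. A minimal sketch of what that registration payload might look like; the connector name, host, port, and credentials below are illustrative assumptions, not values taken from this commit:

```shell
# Hypothetical Db2 connector registration payload; every value is an
# illustrative assumption, not taken from this commit.
cat > /tmp/register-db2.json <<'EOF'
{
  "name": "inventory-db2-connector",
  "config": {
    "connector.class": "io.debezium.connector.db2.Db2Connector",
    "database.hostname": "db2server",
    "database.port": "50000",
    "database.user": "db2inst1",
    "database.password": "secret",
    "database.dbname": "TESTDB",
    "database.server.name": "inventory"
  }
}
EOF
# The file would then be POSTed to a running Kafka Connect worker, e.g.:
#   curl -i -X POST -H "Content-Type: application/json" \
#     --data @/tmp/register-db2.json http://localhost:8083/connectors
```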


```
@@ -1534,7 +1534,7 @@ For consistent toasted values handling we recommend to
 == Deploying the PostgreSQL Connector
 ifndef::cdc-product[]
-If you've already installed https://zookeeper.apache.org[Zookeeper], http://kafka.apache.org/[Kafka], and http://kafka.apache.org/documentation.html#connect[Kafka Connect], then using Debezium's PostgreSQL connector is easy. Simply download the https://repo1.maven.org/maven2/io/debezium/debezium-connector-postgres/{debezium-version}/debezium-connector-postgres-{debezium-version}-plugin.tar.gz[connector's plugin archive], extract the JARs into your Kafka Connect environment, and add the directory with the JARs to http://docs.confluent.io/{confluent-platform-version}/connect/userguide.html#installing-plugins[Kafka Connect's classpath]. Restart your Kafka Connect process to pick up the new JARs.
+If you've already installed https://zookeeper.apache.org[Zookeeper], http://kafka.apache.org/[Kafka], and http://kafka.apache.org/documentation.html#connect[Kafka Connect], then using Debezium's PostgreSQL connector is easy. Simply download the https://repo1.maven.org/maven2/io/debezium/debezium-connector-postgres/{debezium-version}/debezium-connector-postgres-{debezium-version}-plugin.tar.gz[connector's plugin archive], extract the JARs into your Kafka Connect environment, and add the directory with the JARs to http://kafka.apache.org/documentation/#connectconfigs[Kafka Connect's plugin.path]. Restart your Kafka Connect process to pick up the new JARs.
 endif::cdc-product[]
 ifdef::cdc-product[]
```
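The replacement link above points at Kafka's own `plugin.path` worker setting rather than Confluent's classpath guide. A minimal sketch of that setup, with all paths assumed for illustration:

```shell
# Sketch of the plugin.path setup described above; all paths are assumptions.
PLUGIN_DIR=/tmp/connect-plugins
mkdir -p "$PLUGIN_DIR"
# Extract the downloaded connector archive into the plugin directory, e.g.:
#   tar -xzf debezium-connector-postgres-<version>-plugin.tar.gz -C "$PLUGIN_DIR"
# Then point the Connect worker at that directory in its properties file and
# restart the worker so it picks up the new JARs:
printf 'plugin.path=%s\n' "$PLUGIN_DIR" >> /tmp/connect-distributed.properties
```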


```
@@ -1038,7 +1038,7 @@ The `connect.decimal.precision` schema parameter contains an integer representin
 ifndef::cdc-product[]
 If you've already installed https://zookeeper.apache.org[Zookeeper], http://kafka.apache.org/[Kafka], and http://kafka.apache.org/documentation.html#connect[Kafka Connect], then using Debezium's SQL Server connector is easy.
-Simply download the https://repo1.maven.org/maven2/io/debezium/debezium-connector-sqlserver/0.9.0.Alpha1/debezium-connector-sqlserver-0.9.0.Alpha1-plugin.tar.gz[connector's plugin archive], extract the JARs into your Kafka Connect environment, and add the directory with the JARs to http://docs.confluent.io/{confluent-platform-version}/connect/userguide.html#installing-plugins[Kafka Connect's classpath].
+Simply download the https://repo1.maven.org/maven2/io/debezium/debezium-connector-sqlserver/0.9.0.Alpha1/debezium-connector-sqlserver-0.9.0.Alpha1-plugin.tar.gz[connector's plugin archive], extract the JARs into your Kafka Connect environment, and add the directory with the JARs to http://kafka.apache.org/documentation/#connectconfigs[Kafka Connect's plugin.path].
 Restart your Kafka Connect process to pick up the new JARs.
 endif::cdc-product[]
```


```
@@ -63,7 +63,7 @@ xref:connectors/sqlserver.adoc#deploying-a-connector[SQL Server Connector],
 xref:connectors/oracle.adoc#deploying-a-connector[Oracle Connector],
 xref:connectors/db2.adoc#deploying-a-connector[Db2 Connector]
 or xref:connectors/cassandra.adoc#deploying-a-connector[Cassandra Connector]
-and use the link:https://docs.confluent.io/{confluent-platform-version}/connect/restapi.html[Kafka Connect REST API] to add that
+and use the link:http://kafka.apache.org/documentation/#connect_rest[Kafka Connect REST API] to add that
 connector configuration to your Kafka Connect cluster. When the connector starts, it will connect to the source and produce events
 for each inserted, updated, and deleted row or document.
```
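For reference, the upstream `connect_rest` endpoints this commit now links to are typically driven as follows; the host, port, and payload file name are assumptions, and the curl calls are shown as comments because they need a running Connect worker:

```shell
# Sketch only; host/port and payload file name are assumptions.
CONNECT_URL=http://localhost:8083
# List connectors already registered with the cluster:
#   curl -s "$CONNECT_URL/connectors"
# Register a new connector from a JSON configuration file:
#   curl -i -X POST -H "Content-Type: application/json" \
#     --data @connector-config.json "$CONNECT_URL/connectors"
echo "Connect REST API base: $CONNECT_URL"
```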