DBZ-4392 US: Oracle reorg; DS: Specify Maven as Db2 driver source
parent c64558ae77
commit 9cb83458af
@@ -1609,9 +1609,9 @@ To deploy a {prodname} Db2 connector, you install the {prodname} Db2 connector a
 .Procedure
-. Download the link:https://repo1.maven.org/maven2/io/debezium/debezium-connector-db2/{debezium-version}/debezium-connector-db2-{debezium-version}-plugin.tar.gz[connector's plug-in archive].
+. Download the link:https://repo1.maven.org/maven2/io/debezium/debezium-connector-db2/{debezium-version}/debezium-connector-db2-{debezium-version}-plugin.tar.gz[{prodname} Db2 connector plug-in archive] from Maven Central.
 . Extract the JAR files into your Kafka Connect environment.
-. Download the link:https://www.ibm.com/support/pages/db2-jdbc-driver-versions-and-downloads[JDBC driver for Db2], and extract the downloaded driver file to the directory that contains the {prodname} Db2 connector JAR file (that is, `debezium-connector-db2-{debezium-version}.jar`).
+. Download the link:https://repo1.maven.org/maven2/com/ibm/db2/jcc/{db2-version}/jcc-{db2-version}.jar[JDBC driver for Db2] from Maven Central, and extract the downloaded driver file to the directory that contains the {prodname} Db2 connector JAR file (that is, `debezium-connector-db2-{debezium-version}.jar`).
 +
 [IMPORTANT]
 ====
@@ -1657,8 +1657,9 @@ Due to licensing requirements, the Db2 JDBC driver file that {prodname} requires
 The driver is available for download from Maven Central.
 Depending on the deployment method that you use, you retrieve the driver by adding a command to the Kafka Connect custom resource or to the Dockerfile that you use to build the connector image.
 
-* If you xref:openshift-streams-db2-connector-deployment[use {StreamsName} to add the connector to your Kafka Connect image], add the Maven Central location for the driver to `builds.plugins.artifact.url` in the `KafkaConnect` custom resource.
-* If you xref:deploying-debezium-db2-connectors[use a Dockerfile to build the connector], add a `curl` command in the Dockerfile to obtain the required driver file from Maven Central.
+* If you use {StreamsName} to add the connector to your Kafka Connect image, add the Maven Central location for the driver to `builds.plugins.artifact.url` in the `KafkaConnect` custom resource as shown in xref:using-streams-to-deploy-debezium-db2-connectors[].
+* If you use a Dockerfile to build a container image for the connector, insert a `curl` command in the Dockerfile to specify the URL for downloading the required driver file from Maven Central.
+For more information, see xref:deploying-debezium-db2-connectors[].
 
 // Type: concept
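For context, the `builds.plugins.artifact.url` approach that the Db2 bullet describes corresponds to listing the connector archive and the Db2 JDBC driver as build artifacts in a `KafkaConnect` custom resource. The fragment below is only an illustration of that shape; the resource name, output image, and version numbers are placeholders, not values from this commit:

```yaml
# Hypothetical fragment of a KafkaConnect custom resource.
# All names, versions, and the output image are placeholders.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster
  annotations:
    strimzi.io/use-connector-resources: "true"
spec:
  build:
    output:
      type: docker
      image: quay.io/example/debezium-connect-db2:latest
    plugins:
      - name: debezium-connector-db2
        artifacts:
          # Connector plug-in archive, fetched from Maven Central
          - type: zip
            url: https://repo1.maven.org/maven2/io/debezium/debezium-connector-db2/2.1.4.Final/debezium-connector-db2-2.1.4.Final-plugin.zip
          # Db2 JDBC driver, also resolved from Maven Central
          - type: jar
            url: https://repo1.maven.org/maven2/com/ibm/db2/jcc/11.5.0.0/jcc-11.5.0.0.jar
```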
@@ -613,7 +613,7 @@ SCN gap detection is available only if the large SCN increment occurs while the
 
 Every data change event that the Oracle connector emits has a key and a value.
 The structures of the key and value depend on the table from which the change events originate.
-For information about how {prodname} constructs topic names, see xref:{link-oracle-connector}#oracle-topic-names[Topic names]).
+For information about how {prodname} constructs topic names, see xref:oracle-topic-names[Topic names].
 
 [WARNING]
 ====
@@ -1875,7 +1875,7 @@ To deploy a {prodname} Oracle connector, you install the {prodname} Oracle conne
 
 . Download the {prodname} https://repo1.maven.org/maven2/io/debezium/debezium-connector-oracle/{debezium-version}/debezium-connector-oracle-{debezium-version}-plugin.tar.gz[Oracle connector plug-in archive].
 . Extract the files into your Kafka Connect environment.
-. xref:obtaining-the-oracle-jdbc-driver[Download the Oracle JDBC driver from Maven Central and extract it to the directory with the connector JAR files.]
+. link:https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/{ojdbc8-version}/ojdbc8-{ojdbc8-version}.jar[Download the Oracle JDBC driver] from Maven Central and extract it to the directory with the connector JAR files.
 . Add the directory with the JAR files to {link-kafka-docs}/#connectconfigs[Kafka Connect's `plugin.path`].
 . Restart your Kafka Connect process to pick up the new JAR files.
 
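The manual download-and-extract flow in this hunk can be sketched as a shell session. The version numbers and plug-in directory below are illustrative placeholders, not values taken from the documentation:

```shell
# Sketch of the manual install steps; versions and paths are illustrative.
DEBEZIUM_VERSION="2.1.4.Final"
OJDBC8_VERSION="21.6.0.0.1"
PLUGIN_DIR="/opt/kafka/connect-plugins"

# Connector plug-in archive and Oracle JDBC driver, both on Maven Central
CONNECTOR_URL="https://repo1.maven.org/maven2/io/debezium/debezium-connector-oracle/${DEBEZIUM_VERSION}/debezium-connector-oracle-${DEBEZIUM_VERSION}-plugin.tar.gz"
DRIVER_URL="https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/${OJDBC8_VERSION}/ojdbc8-${OJDBC8_VERSION}.jar"

echo "$CONNECTOR_URL"
echo "$DRIVER_URL"

# Uncomment to download, extract, and place the driver next to the connector JARs:
# mkdir -p "$PLUGIN_DIR"
# curl -fL "$CONNECTOR_URL" | tar -xz -C "$PLUGIN_DIR"
# curl -fL -o "$PLUGIN_DIR/debezium-connector-oracle/ojdbc8-${OJDBC8_VERSION}.jar" "$DRIVER_URL"
```

After placing the files, `plugin.path` in the Connect worker configuration must include `$PLUGIN_DIR`, and the worker must be restarted, as the numbered steps state.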
@@ -1904,6 +1904,32 @@ For more information, see xref:obtaining-the-oracle-jdbc-driver[Obtaining the Or
 
 * xref:descriptions-of-debezium-oracle-connector-configuration-properties[]
 
+// Type: procedure
+[id="obtaining-the-oracle-jdbc-driver"]
+=== Obtaining the Oracle JDBC driver
+
+Due to licensing requirements, the Oracle JDBC driver file that {prodname} requires to connect to an Oracle database is not included in the {prodname} Oracle connector archive.
+ifdef::product[]
+The driver is available for download from Maven Central.
+Depending on the deployment method that you use, you retrieve the driver by adding a command to the Kafka Connect custom resource or to the Dockerfile that you use to build the connector image.
+
+* If you use {StreamsName} to add the connector to your Kafka Connect image, add the Maven Central location for the driver to `builds.plugins.artifact.url` in the `KafkaConnect` custom resource as shown in xref:using-streams-to-deploy-debezium-oracle-connectors[].
+* If you use a Dockerfile to build a container image for the connector, insert a `curl` command in the Dockerfile to specify the URL for downloading the required driver file from Maven Central.
+For more information, see xref:deploying-debezium-oracle-connectors[Deploying a {prodname} Oracle connector by building a custom Kafka Connect container image from a Dockerfile].
+
+endif::product[]
+ifdef::community[]
+
+NOTE: If you use the {prodname} Oracle connector with Oracle XStream, obtain the JDBC driver as part of the Oracle Instant Client package.
+For more information, see xref:obtaining-oracle-jdbc-driver-and-xstreams-api-files[].
+
+.Procedure
+
+. From a browser, link:https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/{ojdbc8-version}/ojdbc8-{ojdbc8-version}.jar[download the 'ojdbc8.jar' from Maven Central].
+. Copy the downloaded driver file to the directory that contains the {prodname} Oracle connector JAR file (`debezium-connector-oracle-{debezium-version}.jar`).
+endif::community[]
+
+
 // Type: concept
 [id="openshift-streams-oracle-connector-deployment"]
 === {prodname} Oracle connector deployment using {StreamsName}
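The Dockerfile route that the added text describes might look roughly like the fragment below. The base image, directory layout, and driver version are assumptions for illustration only; they are not taken from this commit:

```dockerfile
# Hypothetical Dockerfile fragment; base image, paths, and versions are placeholders.
FROM quay.io/strimzi/kafka:latest-kafka-3.4.0
USER root:root
RUN mkdir -p /opt/kafka/plugins/debezium-connector-oracle
# Pull the Oracle JDBC driver from Maven Central at image build time
RUN curl -fL -o /opt/kafka/plugins/debezium-connector-oracle/ojdbc8.jar \
    https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/21.6.0.0.1/ojdbc8-21.6.0.0.1.jar
USER 1001
```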
@@ -2109,30 +2135,6 @@ oc apply -f inventory-connector.yaml
 
 The preceding command registers `inventory-connector` and the connector starts to run against the `server1` database as defined in the `KafkaConnector` CR.
 endif::product[]
 
-// Type: procedure
-[id="obtaining-the-oracle-jdbc-driver"]
-=== Obtaining the Oracle JDBC driver
-
-Due to licensing requirements, the Oracle JDBC driver file that {prodname} requires to connect to an Oracle database is not included in the {prodname} Oracle connector archive.
-ifdef::product[]
-The driver is available for download from Maven Central.
-Depending on the deployment method that you use, you retrieve the driver by adding a command to the Kafka Connect custom resource or to the Dockerfile that you use to build the connector image.
-
-* If you xref:openshift-streams-oracle-connector-deployment[use {StreamsName} to add the connector to your Kafka Connect image], add the Maven Central location for the driver to `builds.plugins.artifact.url` in the `KafkaConnect` custom resource.
-* If you xref:deploying-debezium-oracle-connectors[use a Dockerfile to build the connector], use a `curl` command in the Dockerfile to obtain the required driver file from Maven Central.
-
-endif::product[]
-ifdef::community[]
-
-NOTE: If you use the {prodname} Oracle connector with Oracle XStream, obtain the JDBC driver as part of the Oracle Instant Client package.
-For more information, see xref:obtaining-oracle-jdbc-driver-and-xstreams-api-files[].
-
-.Procedure
-
-. From a browser, link:https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/{ojdbc8-version}/ojdbc8-{ojdbc8-version}.jar[download the 'ojdbc8.jar' from Maven Central].
-. Copy the downloaded driver file to the directory that contains the {prodname} Oracle connector JAR file (`debezium-connector-oracle-{debezium-version}.jar`).
-endif::community[]
-
 
 ifdef::community[]
 [[oracle-example-configuration]]
 === {prodname} Oracle connector configuration
@@ -3,10 +3,8 @@ The current preferred method for deploying connectors on OpenShift is to use a b
 
 During the build process, the {kafka-streams} Operator transforms input parameters in a `KafkaConnect` custom resource, including {prodname} connector definitions, into a Kafka Connect container image.
 The build downloads the necessary artifacts from the Red Hat Maven repository or another configured HTTP server.
 ifeval::["{context}" == "oracle"]
 You can also configure the custom resource to download the Oracle JDBC driver, which is not included in the connector archive.
 endif::[]
-The newly created container is pushed to the container registry that is specified in `.spec.build.output`, and is used to deploy a Kafka Connect pod.
+The newly created container is pushed to the container registry that is specified in `.spec.build.output`, and is used to deploy a Kafka Connect cluster.
 After {StreamsName} builds the Kafka Connect image, you create `KafkaConnector` custom resources to start the connectors that are included in the build.
 
 .Prerequisites
@@ -3,7 +3,7 @@ In the example that follows, the custom resource is configured to download the f
 * The {prodname} {connector-name} connector archive.
 * The {registry-name-full} archive. The {registry} is an optional component.
 * The {prodname} scripting SMT archive. The SMT archive is an optional component.
-* The {connector-name} JDBC driver, which is required to connecto to {connector-name} databases, but is not included in the connector archive.
+* The {connector-name} JDBC driver, which is required to connect to {connector-name} databases, but is not included in the connector archive.
 
 [source%nowrap,yaml,subs="+attributes,+quotes"]
 ----
@@ -70,6 +70,7 @@ The `type` value must match the type of the file that is referenced in the `url`
 |The value of `artifacts.url` specifies the address of an HTTP server, such as a Maven repository, that stores the file for the connector artifact.
 The OpenShift cluster must have access to the specified server.
 
 |8
+|Specifies the location of the {connector-name} JDBC driver in Maven Central.
 The required driver is not included in the {prodname} {connector-name} connector archive.
@@ -3,7 +3,7 @@ In the example that follows, the custom resource is configured to download the f
 * The {prodname} {connector-name} connector archive.
 * The {registry-name-full} archive. The {registry} is an optional component.
 * The {prodname} scripting SMT archive. The SMT archive is an optional component.
-* The {connector-name} JDBC driver, which is required to connecto to {connector-name} databases, but is not included in the connector archive.
+* The {connector-name} JDBC driver, which is required to connect to {connector-name} databases, but is not included in the connector archive.
 
 [source%nowrap,yaml,subs="+attributes,+quotes"]
 ----
@@ -68,6 +68,7 @@ The `type` value must match the type of the file that is referenced in the `url`
 
 |7
 |The value of `artifacts.url` specifies the address of an HTTP server, such as a Maven repository, that stores the file for the connector artifact.
+{prodname} connector artifacts are available in the Red Hat Maven repository.
 The OpenShift cluster must have access to the specified server.
 
 |8
@@ -60,11 +60,11 @@ For example, you can add Service Registry artifacts, or the {prodname} scripting
 |The value of `artifacts.type` specifies the file type of the artifact specified in the `artifacts.url`.
 Valid types are `zip`, `tgz`, or `jar`.
 {prodname} connector archives are provided in `.zip` file format.
 JDBC driver files are in `.jar` format.
 The `type` value must match the type of the file that is referenced in the `url` field.
 
 |7
 |The value of `artifacts.url` specifies the address of an HTTP server, such as a Maven repository, that stores the file for the connector artifact.
 {prodname} connector artifacts are available in the Red Hat Maven repository.
 The OpenShift cluster must have access to the specified server.
 
 |===
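Putting the two table rows together, a plug-in entry whose `type` values match the file extensions in the corresponding `url` fields might look like this. The URLs and versions are illustrative placeholders following the repository layouts the table describes:

```yaml
# Hypothetical artifacts list; each type matches the extension of its url.
plugins:
  - name: debezium-connector-oracle
    artifacts:
      - type: zip   # matches the .zip connector archive (Red Hat Maven repository)
        url: https://maven.repository.redhat.com/ga/io/debezium/debezium-connector-oracle/2.1.4.Final-redhat-00001/debezium-connector-oracle-2.1.4.Final-redhat-00001-plugin.zip
      - type: jar   # matches the .jar JDBC driver (Maven Central)
        url: https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc8/21.6.0.0.1/ojdbc8-21.6.0.0.1.jar
```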