Update install.adoc

Gunnar Morling 2021-01-27 08:29:59 +01:00 committed by GitHub
parent 8dbea7b80a
commit 9e3487eff1


@@ -53,6 +53,8 @@ endif::[]
If immutable containers are your thing, then check out https://quay.io/organization/debezium[{prodname}'s container images] (https://hub.docker.com/r/debezium/[alternative source] on DockerHub) for Apache Kafka, Kafka Connect and Apache Zookeeper, with the different {prodname} connectors already pre-installed and ready to go. Our xref:tutorial.adoc[tutorial] even walks you through using these images, and this is a great way to learn what {prodname} is all about.
Of course, you can also run {prodname} on Kubernetes and xref:operations/openshift.adoc[OpenShift].
Using the https://strimzi.io/[Strimzi] Kubernetes Operator is recommended for that.
It lets you deploy Apache Kafka, Kafka Connect, and even connectors declaratively via custom Kubernetes resources.
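As a sketch, a declarative connector deployment with Strimzi uses a `KafkaConnector` custom resource; the resource name, cluster label, and database coordinates below are placeholders, and the exact `apiVersion` depends on your Strimzi release:

[source,yaml]
----
apiVersion: kafka.strimzi.io/v1beta2  # may differ per Strimzi version
kind: KafkaConnector
metadata:
  name: inventory-connector           # hypothetical connector name
  labels:
    strimzi.io/cluster: my-connect-cluster  # must match your KafkaConnect resource
spec:
  class: io.debezium.connector.mysql.MySqlConnector
  tasksMax: 1
  config:
    database.hostname: mysql          # placeholder connection values
    database.port: "3306"
    database.user: debezium
    database.password: dbz
    database.server.name: dbserver1
----

Applying this resource (e.g. with `kubectl apply -f`) makes the Strimzi operator create and manage the connector on the targeted Kafka Connect cluster.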
By default, the {prodname} Docker image for Kafka Connect uses the directory _/kafka/connect_ as its plugin directory.
Any additional connectors you wish to use should be added to that directory.
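One way to do that, sketched here with a hypothetical plugin directory and an image tag you would adjust to the current release, is to extend the image and copy the plugin into place:

[source,dockerfile]
----
# Base image tag is illustrative; pick the release you actually run
FROM debezium/connect:1.4

# Copy an additional connector plugin (hypothetical local directory)
# into the default plugin directory of the image
COPY my-connector/ /kafka/connect/my-connector/
----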
@@ -78,9 +80,9 @@ xref:connectors/postgresql.adoc#postgresql-deploying-a-connector[Postgres Connec
xref:connectors/mongodb.adoc#mongodb-deploying-a-connector[MongoDB Connector],
xref:connectors/sqlserver.adoc#sqlserver-deploying-a-connector[SQL Server Connector],
xref:connectors/oracle.adoc#oracle-deploying-a-connector[Oracle Connector],
xref:connectors/db2.adoc#db2-deploying-a-connector[Db2 Connector],
xref:connectors/cassandra.adoc#cassandra-deploying-a-connector[Cassandra Connector],
or xref:connectors/vitess.adoc#vitess-deploying-a-connector[Vitess Connector],
and use the link:{link-kafka-docs}/#connect_rest[Kafka Connect REST API] to add that
connector configuration to your Kafka Connect cluster. When the connector starts, it will connect to the source and produce events
for each inserted, updated, and deleted row or document.
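For illustration, registering a connector via the REST API amounts to POSTing a JSON configuration to the `/connectors` endpoint; the hostname, credentials, and topic names below are placeholders for a hypothetical MySQL setup:

[source,shell]
----
# Kafka Connect REST endpoint (host/port are placeholders)
curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
  -d '{
    "name": "inventory-connector",
    "config": {
      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "database.hostname": "mysql",
      "database.port": "3306",
      "database.user": "debezium",
      "database.password": "dbz",
      "database.server.name": "dbserver1",
      "database.history.kafka.bootstrap.servers": "kafka:9092",
      "database.history.kafka.topic": "schema-changes.inventory"
    }
  }'
----

A subsequent `GET /connectors/inventory-connector/status` request can be used to verify that the connector and its tasks are running.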