diff --git a/documentation/modules/ROOT/pages/install.adoc b/documentation/modules/ROOT/pages/install.adoc
index 0efafa5f9..aef03ccce 100644
--- a/documentation/modules/ROOT/pages/install.adoc
+++ b/documentation/modules/ROOT/pages/install.adoc
@@ -53,6 +53,8 @@ endif::[]
 If immutable containers are your thing, then check out https://quay.io/organization/debezium[{prodname}'s container images] (https://hub.docker.com/r/debezium/[alternative source] on DockerHub) for Apache Kafka, Kafka Connect and Apache Zookeeper, with the different {prodname} connectors already pre-installed and ready to go.
 Our xref:tutorial.adoc[tutorial] even walks you through using these images, and this is a great way to learn what {prodname} is all about.
 Of course you also can run {prodname} on Kubernetes and xref:operations/openshift.adoc[OpenShift].
+Using the https://strimzi.io/[Strimzi] Kubernetes Operator is recommended for that purpose.
+It allows you to deploy Apache Kafka, Kafka Connect, and even the connectors themselves declaratively, via custom Kubernetes resources.
 
 By default, the directory _/kafka/connect_ is used as plugin directory by the {prodname} Docker image for Kafka Connect.
 So any additional connectors you may wish to use should be added to that directory.
@@ -78,9 +80,9 @@
 xref:connectors/postgresql.adoc#postgresql-deploying-a-connector[Postgres Connector],
 xref:connectors/mongodb.adoc#mongodb-deploying-a-connector[MongoDB Connector],
 xref:connectors/sqlserver.adoc#sqlserver-deploying-a-connector[SQL Server Connector],
 xref:connectors/oracle.adoc#oracle-deploying-a-connector[Oracle Connector],
-xref:connectors/db2.adoc#db2-deploying-a-connector[Db2 Connector]
-xref:connectors/cassandra.adoc#cassandra-deploying-a-connector[Cassandra Connector]
-or xref:connectors/vitess.adoc#vitess-deploying-a-connector[Vitess Connector]
+xref:connectors/db2.adoc#db2-deploying-a-connector[Db2 Connector],
+xref:connectors/cassandra.adoc#cassandra-deploying-a-connector[Cassandra Connector],
+or xref:connectors/vitess.adoc#vitess-deploying-a-connector[Vitess Connector],
 and use the link:{link-kafka-docs}/#connect_rest[Kafka Connect REST API] to add that connector configuration to your Kafka Connect cluster.
 When the connector starts, it will connect to the source and produce events for each inserted, updated, and deleted row or document.
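To make the new Strimzi recommendation concrete: with Strimzi, a connector can be declared through its `KafkaConnector` custom resource, roughly as follows. This is a minimal sketch only; the resource and cluster names, the connector class, and all connection properties are placeholders, and the exact `apiVersion` depends on the Strimzi release in use.

[source,yaml]
----
apiVersion: kafka.strimzi.io/v1beta2    # API version varies with the Strimzi release
kind: KafkaConnector
metadata:
  name: inventory-connector             # placeholder name
  labels:
    # Must match the name of your KafkaConnect resource; that resource also
    # needs the strimzi.io/use-connector-resources: "true" annotation.
    strimzi.io/cluster: my-connect-cluster
spec:
  class: io.debezium.connector.postgresql.PostgresConnector
  tasksMax: 1
  config:
    # Illustrative connection details for a Postgres source.
    database.hostname: postgres
    database.port: 5432
    database.user: postgres
    database.password: postgres
    database.dbname: inventory
    database.server.name: dbserver1
----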
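Since the Connect image looks for plug-ins under _/kafka/connect_, one common way to add a connector is to build a derived image that copies the plug-in into that directory. The snippet below is only a sketch; the base image tag and the plug-in directory name are placeholders.

[source,dockerfile]
----
# Base image tag is a placeholder; pick the Debezium version you actually run.
FROM debezium/connect:1.9
# Each plug-in goes into its own subdirectory of the default plugin path.
COPY my-extra-connector/ /kafka/connect/my-extra-connector/
----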
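Registering a connector through the Kafka Connect REST API amounts to POSTing the connector name and configuration to the `/connectors` endpoint. In the sketch below, the Connect host and all connection properties are illustrative; the exact configuration properties depend on the connector and its version.

[source,shell]
----
# POST the connector name and its configuration to the Connect REST API.
# Host, port, and all property values below are placeholders.
curl -i -X POST -H "Content-Type: application/json" \
  http://localhost:8083/connectors \
  -d '{
    "name": "inventory-connector",
    "config": {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "database.hostname": "postgres",
      "database.port": "5432",
      "database.user": "postgres",
      "database.password": "postgres",
      "database.dbname": "inventory",
      "database.server.name": "dbserver1"
    }
  }'
----

A successful request returns `201 Created` along with the stored connector configuration.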