diff --git a/serverless/eventing/event-sources/serverless-kafka-developer-source.adoc b/serverless/eventing/event-sources/serverless-kafka-developer-source.adoc
index deb539119a..834fbb1dee 100644
--- a/serverless/eventing/event-sources/serverless-kafka-developer-source.adoc
+++ b/serverless/eventing/event-sources/serverless-kafka-developer-source.adoc
@@ -8,6 +8,12 @@ toc::[]
 You can create an Apache Kafka source that reads events from an Apache Kafka cluster and passes these events to a sink. You can create a Kafka source by using the {product-title} web console, the Knative (`kn`) CLI, or by creating a `KafkaSource` object directly as a YAML file and using the OpenShift CLI (`oc`) to apply it.
+
+[NOTE]
+====
+See the documentation for xref:../../../serverless/install/installing-knative-eventing.adoc#serverless-install-kafka-odc_installing-knative-eventing[Installing Knative broker for Apache Kafka].
+====
+
 // dev console
 include::modules/serverless-kafka-source-odc.adoc[leveloffset=+1]

 // kn commands
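
The context paragraph above mentions creating a `KafkaSource` object directly as a YAML file and applying it with `oc`. A minimal sketch of such a manifest, assuming a hypothetical bootstrap address, topic name, and Knative service sink (none of these names come from the diff):

```yaml
# Minimal KafkaSource manifest (sketch; all names are placeholders)
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: kafka-source-example
spec:
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092   # hypothetical Kafka bootstrap server
  topics:
    - knative-demo-topic                      # hypothetical topic to read events from
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display                     # hypothetical Knative service receiving the events
```

Saved as `kafka-source.yaml`, this could be applied with `oc apply -f kafka-source.yaml`.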