mirror of https://github.com/openshift/openshift-docs.git, synced 2026-02-05 21:46:22 +01:00

SRVKE-747: Broker docs clean up and add Kafka broker
committed by openshift-cherrypick-robot
parent 7edf83f64a
commit fcbb3dce9c
28 serverless/develop/serverless-event-delivery.adoc Normal file
@@ -0,0 +1,28 @@
:_content-type: ASSEMBLY
include::modules/serverless-document-attributes.adoc[]
[id="serverless-event-delivery"]
= Event delivery
include::modules/common-attributes.adoc[]
:context: serverless-event-delivery

toc::[]

You can configure event delivery parameters for Knative Eventing that are applied in cases where an event fails to be delivered by a xref:../../serverless/develop/serverless-subs.adoc#serverless-subs[subscription] or xref:../../serverless/develop/serverless-triggers.adoc#serverless-triggers[trigger] to a subscriber. Event delivery parameters are configured individually per subscriber.

include::modules/serverless-event-delivery-component-behaviors.adoc[leveloffset=+1]

[id="serverless-event-delivery-parameters"]
== Configurable parameters

The following parameters can be configured for event delivery.

Dead letter sink:: You can configure the `deadLetterSink` delivery parameter so that if an event fails to be delivered, it is sent to the specified event sink.

Retries:: You can set the minimum number of times that delivery must be retried before the event is sent to the dead letter sink by configuring the `retry` delivery parameter with an integer value.

Back off delay:: You can set the `backoffDelay` delivery parameter to specify the time delay before an event delivery retry is attempted after a failure. The duration of the `backoffDelay` parameter is specified using the https://en.wikipedia.org/wiki/ISO_8601#Durations[ISO 8601] format.

Back off policy:: The `backoffPolicy` delivery parameter can be used to specify the retry back off policy. The policy can be specified as either `linear` or `exponential`. When using the `linear` back off policy, the back off delay is the time interval specified between retries. When using the `exponential` back off policy, the back off delay is equal to `backoffDelay*2^<numberOfRetries>`.
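As a sketch of how these parameters combine, the following `Subscription` object sets all four in its `delivery` spec; the channel and sink names here are illustrative, not part of the commit:

```yaml
apiVersion: messaging.knative.dev/v1
kind: Subscription
metadata:
  name: example-subscription       # hypothetical name
spec:
  channel:
    apiVersion: messaging.knative.dev/v1
    kind: Channel
    name: example-channel          # hypothetical channel
  delivery:
    deadLetterSink:                # events that exhaust retries are sent here
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: error-handler        # hypothetical sink service
    retry: 3                       # retry delivery up to 3 times
    backoffPolicy: exponential     # or "linear"
    backoffDelay: "PT0.5S"         # ISO 8601 duration: 0.5 seconds base delay
```

With `exponential` back off, the delays between the three retries would be roughly 1s, 2s, and 4s (`backoffDelay*2^<numberOfRetries>`).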

include::modules/serverless-subscription-event-delivery-config.adoc[leveloffset=+1]
// add docs for configuration in triggers
@@ -7,25 +7,30 @@ include::modules/common-attributes.adoc[]

toc::[]

-Knative Kafka functionality is available in an {ServerlessProductName} installation xref:../../serverless/admin_guide/serverless-kafka-admin.adoc#serverless-kafka-admin[if a cluster administrator has installed the `KnativeKafka` custom resource].
+Knative Kafka functionality is available in an {ServerlessProductName} installation xref:../../serverless/admin_guide/serverless-kafka-admin.adoc#serverless-install-kafka-odc_serverless-kafka-admin[if a cluster administrator has installed the `KnativeKafka` custom resource].
+
+[NOTE]
+====
+Knative Kafka is not currently supported for IBM Z and IBM Power Systems.
+====

Knative Kafka provides additional options, such as:

-* Kafka event source
-* xref:../../serverless/develop/serverless-creating-channels.adoc#serverless-creating-channels[Kafka channel]
-// * Kafka broker
+* Kafka source
+* Kafka channel
+* Kafka broker
+
+:FeatureName: Kafka broker
+include::modules/technology-preview.adoc[leveloffset=+1]
+
+include::modules/serverless-kafka-event-delivery.adoc[leveloffset=+1]

-[id="serverless-kafka-developer-event-source"]
-== Using a Kafka event source
-See the xref:../../serverless/develop/serverless-event-delivery.adoc#serverless-event-delivery[Event delivery] documentation for more information about delivery guarantees.
-
-You can create a Knative Kafka event source that reads events from an Apache Kafka cluster and passes these events to a sink.
+[id="serverless-kafka-developer-source"]
+== Kafka source
+
+You can create a Kafka source that reads events from an Apache Kafka cluster and passes these events to a sink.

// dev console
include::modules/serverless-kafka-source-odc.adoc[leveloffset=+2]
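The "Kafka source that reads events from an Apache Kafka cluster" described above can be sketched as a `KafkaSource` object; the cluster address, topic, and sink names are illustrative assumptions:

```yaml
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: example-kafka-source     # hypothetical name
spec:
  consumerGroup: example-group   # hypothetical Kafka consumer group
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092   # hypothetical bootstrap address
  topics:
    - example-topic              # hypothetical topic to read events from
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display        # hypothetical sink service
```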
@@ -35,8 +40,23 @@ include::modules/specifying-sink-flag-kn.adoc[leveloffset=+3]
// YAML
include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+2]

+[id="serverless-kafka-developer-broker"]
+== Kafka broker
+
+:FeatureName: Kafka broker
+include::modules/technology-preview.adoc[leveloffset=+2]
+
+If a cluster administrator has configured your {ServerlessProductName} deployment to use Kafka broker as the default broker type, xref:../../serverless/develop/serverless-using-brokers.adoc#serverless-using-brokers-creating-brokers[creating a broker by using the default settings] creates a Kafka-based `Broker` object. If your {ServerlessProductName} deployment is not configured to use Kafka broker as the default broker type, you can still use the following procedure to create a Kafka-based broker.
+
+include::modules/serverless-kafka-broker.adoc[leveloffset=+2]

// Kafka channels
include::modules/serverless-create-kafka-channel-yaml.adoc[leveloffset=+1]

[id="additional-resources_serverless-kafka-developer"]
== Additional resources

* See the link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams] documentation for more information about Kafka concepts.
* See xref:../../serverless/discover/knative-event-sources.adoc#knative-event-sources[Event sources].
* See the Red Hat AMQ Streams link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html-single/using_amq_streams_on_rhel/index#assembly-kafka-encryption-and-authentication-str[TLS and SASL on Kafka] documentation.
+* See the xref:../../serverless/develop/serverless-event-delivery.adoc#serverless-event-delivery[Event delivery] documentation for information about configuring event delivery parameters.
+* See the xref:../../serverless/admin_guide/serverless-kafka-admin.adoc#serverless-kafka-admin[Knative Kafka cluster administrator documentation] for information about installing Kafka components and setting default configurations if you have cluster administrator permissions.
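The "Kafka-based `Broker` object" that the new section describes can be sketched as follows; the broker name and the config map reference are illustrative assumptions, not part of the commit:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: example-kafka-broker                   # hypothetical name
  annotations:
    eventing.knative.dev/broker.class: Kafka   # selects the Kafka broker class
spec:
  config:
    apiVersion: v1
    kind: ConfigMap
    name: kafka-broker-config                  # hypothetical broker config map
    namespace: knative-eventing
```

Setting the `eventing.knative.dev/broker.class` annotation is what distinguishes a Kafka-based broker from the default broker class when Kafka broker is not the cluster-wide default.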
@@ -11,7 +11,7 @@ After events have been sent to a channel from an event source or producer, these

image::serverless-event-channel-workflow.png[Channel workflow overview]

-If a subscriber rejects an event, there are no re-delivery attempts by default. Developers can configure xref:../../serverless/knative_eventing/serverless-event-delivery.adoc#serverless-event-delivery[re-delivery attempts] by modifying the `delivery` spec in a `Subscription` object.
+If a subscriber rejects an event, there are no re-delivery attempts by default. Developers can configure xref:../../serverless/develop/serverless-event-delivery.adoc#serverless-event-delivery[re-delivery attempts] by modifying the `delivery` spec in a `Subscription` object.

[id="serverless-subs-creating-subs"]
== Creating subscriptions
29 serverless/develop/serverless-triggers.adoc Normal file
@@ -0,0 +1,29 @@
:_content-type: ASSEMBLY
[id="serverless-triggers"]
= Filtering events from a broker by using triggers
include::modules/common-attributes.adoc[]
include::modules/serverless-document-attributes.adoc[]
:context: serverless-triggers

toc::[]

Using triggers enables you to filter events from the broker for delivery to event sinks.

[id="prerequisites_serverless-triggers"]
== Prerequisites

* You have installed Knative Eventing and the `kn` CLI.
* You have access to an available broker.
* You have access to an available event consumer, such as a Knative service.

// ODC
include::modules/serverless-create-trigger-odc.adoc[leveloffset=+1]
include::modules/serverless-delete-trigger-odc.adoc[leveloffset=+1]

// kn trigger
include::modules/serverless-create-kn-trigger.adoc[leveloffset=+1]
include::modules/kn-trigger-list.adoc[leveloffset=+1]
include::modules/kn-trigger-describe.adoc[leveloffset=+1]
include::modules/kn-trigger-filtering.adoc[leveloffset=+1]
include::modules/kn-trigger-update.adoc[leveloffset=+1]
include::modules/delete-kn-trigger.adoc[leveloffset=+1]
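The event filtering that this assembly describes can be sketched as a `Trigger` object that matches on CloudEvent attributes; the trigger name, event type, and subscriber are illustrative assumptions:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: example-trigger           # hypothetical name
spec:
  broker: default                 # the broker to filter events from
  filter:
    attributes:
      type: dev.knative.example   # hypothetical CloudEvent type to match
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display         # hypothetical event consumer
```

Only events whose `type` attribute exactly matches the filter are delivered to the subscriber; omitting `filter` delivers all events from the broker.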
43 serverless/develop/serverless-using-brokers.adoc Normal file
@@ -0,0 +1,43 @@
:_content-type: ASSEMBLY
[id="serverless-using-brokers"]
= Brokers
include::modules/common-attributes.adoc[]
include::modules/serverless-document-attributes.adoc[]
:context: serverless-using-brokers

toc::[]

Brokers can be used in combination with xref:../../serverless/develop/serverless-triggers.adoc#serverless-triggers[triggers] to deliver events from an xref:../../serverless/discover/knative-event-sources.adoc#knative-event-sources[event source] to an event sink.

image::serverless-event-broker-workflow.png[Broker event delivery overview]

Events can be sent from an event source to a broker as an HTTP `POST` request. After events have entered the broker, they can be filtered by https://github.com/cloudevents/spec/blob/v1.0/spec.md#context-attributes[CloudEvent attributes] using triggers, and sent as an HTTP `POST` request to an event sink.

include::modules/serverless-broker-types.adoc[leveloffset=+1]

:FeatureName: Kafka broker
include::modules/technology-preview.adoc[leveloffset=+2]

.Additional resources

* See the xref:../../serverless/develop/serverless-event-delivery.adoc#serverless-event-delivery[Event delivery] documentation for more information about delivery guarantees.
* See the xref:../../serverless/develop/serverless-kafka-developer.adoc#serverless-kafka-developer-broker[Kafka broker] documentation for information about using Kafka brokers.

[id="serverless-using-brokers-creating-brokers"]
== Creating a broker that uses default settings

{ServerlessProductName} provides a `default` Knative broker that you can create by using the `kn` CLI. You can also create the `default` broker by adding the `eventing.knative.dev/injection: enabled` annotation to a trigger, or by adding the `eventing.knative.dev/injection=enabled` label to a namespace.

include::modules/serverless-create-broker-kn.adoc[leveloffset=+2]
include::modules/serverless-creating-broker-annotation.adoc[leveloffset=+2]
include::modules/serverless-creating-broker-labeling.adoc[leveloffset=+2]
include::modules/serverless-deleting-broker-injection.adoc[leveloffset=+2]

[id="serverless-using-brokers-managing-brokers"]
== Managing brokers

The `kn` CLI provides commands that can be used to list, describe, update, and delete brokers.

include::modules/serverless-list-broker-kn.adoc[leveloffset=+2]
include::modules/serverless-describe-broker-kn.adoc[leveloffset=+2]
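The injection annotation mentioned in the assembly above can be sketched on a `Trigger` object as follows; the trigger name and subscriber are illustrative assumptions:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: example-trigger                        # hypothetical name
  annotations:
    eventing.knative.dev/injection: enabled    # requests creation of the "default" broker
spec:
  broker: default
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display                      # hypothetical subscriber
```

When Knative Eventing reconciles this trigger, it creates the `default` broker in the trigger's namespace if one does not already exist.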