mirror of https://github.com/openshift/openshift-docs.git synced 2026-02-05 12:46:18 +01:00
Modified common-attributes to see if build errors get resolved

minor editorial fixes

Minor editorial changes

Trying to solve build errors

Adding text before xref in Additional Resources

Minor edits

Incorporated Robert's comments

Incorporated peer review comments

Incorporated peer review editorial suggestions

Incorporated peer review suggestions

Edits to resolve rendering issue
This commit is contained in:
Souvik Sarkar
2021-09-02 14:48:05 +05:30
committed by openshift-cherrypick-robot
parent 0bd28956dc
commit 4e1eaf491b
6 changed files with 130 additions and 0 deletions


@@ -1453,6 +1453,8 @@ Topics:
File: using-pods-in-a-privileged-security-context
- Name: Securing webhooks with event listeners
File: securing-webhooks-with-event-listeners
- Name: Viewing pipeline logs using the OpenShift Logging Operator
File: viewing-pipeline-logs-using-the-openshift-logging-operator
- Name: GitOps
Dir: gitops
Distros: openshift-enterprise


@@ -0,0 +1,31 @@
[id="viewing-pipeline-logs-using-the-openshift-logging-operator"]
= Viewing pipeline logs using the OpenShift Logging Operator
include::modules/common-attributes.adoc[]
include::modules/pipelines-document-attributes.adoc[]
:context: viewing-pipeline-logs-using-the-openshift-logging-operator
toc::[]
The logs generated by pipeline runs, task runs, and event listeners are stored in their respective pods. It is useful to review and analyze logs for troubleshooting and audits.
However, retaining the pods indefinitely leads to unnecessary resource consumption and cluttered namespaces.
To eliminate any dependency on the pods for viewing pipeline logs, you can use the OpenShift Elasticsearch Operator and the OpenShift Logging Operator. These Operators help you to view pipeline logs by using the link:https://www.elastic.co/guide/en/kibana/6.8/connect-to-elasticsearch.html[Elasticsearch Kibana] stack, even after you have deleted the pods that contained the logs.
[id="prerequisites_viewing-pipeline-logs-using-the-openshift-logging-operator"]
== Prerequisites
Before trying to view pipeline logs in a Kibana dashboard, ensure the following:
* You have logged in to the cluster as a cluster administrator.
* Logs for pipeline runs and task runs are available.
* The OpenShift Elasticsearch Operator and the OpenShift Logging Operator are installed.
include::modules/op-viewing-pipeline-logs-in-kibana.adoc[leveloffset=+1]
[id="additional-resources_viewing-pipeline-logs-using-the-openshift-logging-operator"]
== Additional resources
* xref:../../logging/cluster-logging-deploying.adoc[Installing OpenShift Logging]
* xref:../../logging/viewing-resource-logs.adoc[Viewing logs for a resource]
* xref:../../logging/cluster-logging-visualizer.adoc[Viewing cluster logs by using Kibana]

(New binary image file, not shown: 189 KiB)

images/grid.png (new binary file, not shown: 1.0 KiB)

images/not-placetools.png (new binary file, not shown: 18 KiB)


@@ -0,0 +1,97 @@
// Module included in the following assemblies:
// cicd/pipelines/viewing-pipeline-logs-using-the-openshift-logging-operator.adoc
//
[id="op-viewing-pipeline-logs-in-kibana_{context}"]
= Viewing pipeline logs in Kibana
To view pipeline logs in the Kibana web console:
.Procedure
. Log in to the {product-title} web console as a cluster administrator.
. In the top right of the menu bar, click the *grid* icon → *Observability* → *Logging*. The Kibana web console is displayed.
. Create an index pattern:
.. On the left navigation panel of the *Kibana* web console, click *Management*.
.. Click *Create index pattern*.
.. Under *Step 1 of 2: Define index pattern* → *Index pattern*, enter a *`pass:[*]`* pattern and click *Next Step*.
.. Under *Step 2 of 2: Configure settings* → *Time filter field name*, select *@timestamp* from the drop-down menu, and click *Create index pattern*.
. Add a filter:
.. On the left navigation panel of the *Kibana* web console, click *Discover*.
.. Click *Add a filter +* → *Edit Query DSL*.
+
[NOTE]
====
* For each of the example filters that follow, edit the query and click *Save*.
* The filters are applied one after another.
====
+
... Filter the containers related to pipelines:
+
.Example query to filter pipelines containers
[source,json]
----
{
"query": {
"match": {
"kubernetes.flat_labels": {
"query": "app_kubernetes_io/managed-by=tekton-pipelines",
"type": "phrase"
}
}
}
}
----
+
... Filter all containers that are not the `place-tools` container. As an illustration, the following example uses the graphical drop-down menus instead of editing the query DSL:
+
.Example of filtering using the drop-down fields
image::../../images/not-placetools.png[Not place-tools]
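+
The same filter can also be expressed in the query DSL. The following is a sketch that assumes the container name is recorded in the `kubernetes.container_name` field:
+
.Example query to exclude the `place-tools` container
[source,json]
----
{
  "query": {
    "bool": {
      "must_not": {
        "match": {
          "kubernetes.container_name": {
            "query": "place-tools",
            "type": "phrase"
          }
        }
      }
    }
  }
}
----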
+
... Filter `pipelinerun` in labels for highlighting:
+
.Example query to filter `pipelinerun` in labels for highlighting
[source,json]
----
{
"query": {
"match": {
"kubernetes.flat_labels": {
"query": "tekton_dev/pipelineRun=",
"type": "phrase"
}
}
}
}
----
+
... Filter `pipeline` in labels for highlighting:
+
.Example query to filter `pipeline` in labels for highlighting
[source,json]
----
{
"query": {
"match": {
"kubernetes.flat_labels": {
"query": "tekton_dev/pipeline=",
"type": "phrase"
}
}
}
}
----
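+
Because the filters are applied one after another, in conjunction, they can also be combined into a single query. The following sketch uses a standard Elasticsearch `bool` query with a `must` clause:
+
.Example combined query
[source,json]
----
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "kubernetes.flat_labels": {
              "query": "app_kubernetes_io/managed-by=tekton-pipelines",
              "type": "phrase"
            }
          }
        },
        {
          "match": {
            "kubernetes.flat_labels": {
              "query": "tekton_dev/pipelineRun=",
              "type": "phrase"
            }
          }
        }
      ]
    }
  }
}
----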
+
.. From the *Available fields* list, select the following fields:
* `kubernetes.flat_labels`
* `message`
+
Ensure that the selected fields are displayed under the *Selected fields* list.
+
.. The filtered logs are displayed under the *message* field.
+
.Filtered messages
image::../../images/filtered-messages.png[Filtered messages]