OpenShift Logging Concepts
Revision as of 18:08, 30 July 2017
External
- https://docs.openshift.com/container-platform/latest/install_config/aggregate_logging.html
- https://docs.openshift.com/container-platform/latest/install_config/install/advanced_install.html#advanced-install-cluster-logging
Internal
Overview
OpenShift provides log aggregation with the EFK stack: Elasticsearch, fluentd and Kibana. fluentd captures logs from nodes, pods and applications and stores the log data in Elasticsearch; Kibana offers a UI on top of Elasticsearch. fluentd, Elasticsearch and Kibana are deployed as OpenShift pods, on dedicated infrastructure nodes, and the logging components communicate with each other securely. They are usually part of the "logging" namespace. Application developers can view the logs of the projects they have view access to; cluster administrators can view all logs.
Logging support is not provided by default, but it can be enabled during installation by setting "openshift_hosted_logging_deploy=true" in the Ansible hosts file.
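A minimal sketch of the relevant inventory fragment, assuming the standard advanced-install layout where cluster-wide variables live in an [OSEv3:vars] section (host names below are placeholders):

 [OSEv3:vars]
 # Deploy the EFK logging stack during installation
 openshift_hosted_logging_deploy=true

The other logging-related variables (Kibana hostname, storage options, etc.) are described in the advanced-install documentation linked above.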
Installation
Organizatorium
Docker Container Logs
By default, Docker uses the json-file logging driver and stores each container's logs in /var/lib/docker/containers/<hash>/<hash>-json.log.
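With the json-file driver, each line of the log file is a JSON object holding the raw log line, the stream it came from (stdout or stderr) and an RFC 3339 timestamp. A small sketch of reading such a record (the sample line is illustrative, not taken from a real container):

```python
import json

# One record as written by Docker's json-file logging driver
# (illustrative sample, not from a real container).
sample = '{"log":"server started\\n","stream":"stdout","time":"2017-07-30T18:08:00.000000000Z"}'

def parse_docker_log_line(line):
    """Parse one json-file driver record into (timestamp, stream, message)."""
    record = json.loads(line)
    # The "log" field keeps the trailing newline of the original line.
    return record["time"], record["stream"], record["log"].rstrip("\n")

timestamp, stream, message = parse_docker_log_line(sample)
print(stream, message)  # stdout server started
```

This per-line JSON framing is what log collectors such as fluentd tail and parse when the json-file driver is in use.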
Aggregated logging is only supported when Docker uses the journald logging driver. More details in https://docs.openshift.com/container-platform/latest/install_config/aggregate_logging.html#fluentd-upgrade-source.
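On RHEL/CentOS hosts the driver is typically switched in the Docker daemon options; a sketch, assuming the /etc/sysconfig/docker layout used there (the --selinux-enabled flag is just an example of a pre-existing option):

 # /etc/sysconfig/docker
 OPTIONS='--selinux-enabled --log-driver=journald'

After changing the driver, restart Docker (systemctl restart docker) for the setting to take effect; logs of already-running containers are not migrated.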