Log Connection to ELK System

This document is intended for developers and operations personnel who use both Kato and the ELK stack.

It applies to the following scenario: logs from service components running on Kato are connected to the ELK system for collection and analysis through the fileBeat log collection plugin.

Prerequisites

  • A deployed Nginx sample service component.
  • The elasticsearch_kibana application, installed with one click from the application market.
  • The default fileBeat log collection plugin available in the team.

This guide takes collecting the logs of the Nginx service component as an example: through the fileBeat log collection plugin, Nginx access and error logs are reported to Elasticsearch and displayed in Kibana.

Connect to ELK

The Nginx component needs to depend on the Elasticsearch and Kibana components so that it can push logs to Elasticsearch using the default configuration of the fileBeat log collection plugin.

After adding these dependencies, update the component for them to take effect.

Plug-in Installation and Activation

On the My Plugins page in the team view, select the fileBeat log collection plugin and click Install. Components can use the plugin once installation is complete.

After installation, open the Plugins page of the component management view, find the fileBeat log collection plugin in the Not Enabled list, and click the Enable button on the right to activate it. The plugin then appears in the Enabled list.

Parameter Configuration

Click the View Configuration button on the right side of the plugin to see its configuration parameters.

| Parameter name    | Default value             | Description                   |
| ----------------- | ------------------------- | ----------------------------- |
| NGINX_ACCESS_PATH | /var/log/nginx/access.log | Access log path               |
| NGINX_ERROR_PATH  | /var/log/nginx/error.log  | Error log path                |
| INPUT_PATH        | /var/log/nginx/*.log      | fileBeat log collection path  |
| ES_HOST           | 127.0.0.1                 | Elasticsearch address         |
| ES_PORT           | 9200                      | Elasticsearch port            |
| ES_USERNAME       | elastic                   | Elasticsearch username        |
| ES_PASS           | changeme                  | Elasticsearch password        |
| KIBANA_HOST       | 127.0.0.1                 | Kibana address                |
| KIBANA_PORT       | 5601                      | Kibana port                   |

All variables in the plugin's configuration have default values; modify them only if necessary. The four variables ES_HOST, ES_PORT, ES_USERNAME, and ES_PASS define the connection information for Elasticsearch, while KIBANA_HOST and KIBANA_PORT specify the connection address of Kibana. The defaults work here because the Nginx component already depends on the Elasticsearch and Kibana components deployed in Kato.
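For reference, these variables map onto a standard Filebeat configuration roughly as follows. This is an illustrative sketch, not the file the plugin actually generates; the plugin's real template may differ.

```yaml
# Illustrative filebeat.yml showing how the plugin variables are typically
# consumed (values shown are the defaults from the table above).
filebeat.inputs:
  - type: log
    paths:
      - /var/log/nginx/*.log        # INPUT_PATH

output.elasticsearch:
  hosts: ["127.0.0.1:9200"]         # ES_HOST:ES_PORT
  username: "elastic"               # ES_USERNAME
  password: "changeme"              # ES_PASS

setup.kibana:
  host: "127.0.0.1:5601"            # KIBANA_HOST:KIBANA_PORT
```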

Shared Storage

To collect logs, the plugin must be able to read the component's log file directory, so that directory has to be shared with the plugin. This is achieved by mounting storage.

On the Storage page of the component management view, add storage of type temporary storage and set the mount path to the directory where the component writes its log files, for example /var/log/nginx.

After configuring the storage, update (restart) the component for it to take effect.

At this point, log collection and analysis through the ELK system via the default fileBeat log collection plugin is complete. If Kibana does not show any logs, recheck the steps above or see Common Problems below.

Understanding the Principle

FileBeat monitors log output in a manner similar to tail -f sth.log and uploads new entries to the specified Elasticsearch. Service components running on Kato share their log files with plugins by persisting the log directory. Through this mechanism, the FileBeat-based plugin can monitor Nginx logs. The plugin's configuration determines the log paths and specifies the connection addresses of Elasticsearch and Kibana.
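The tail -f style mechanism can be illustrated with a toy shell sketch: record how much of the file has already been read, then ship only the bytes appended after that offset. (Filebeat persists a similar offset in its registry; the file name and log lines below are made up for the demo.)

```shell
# Toy illustration of offset-based log following, as Filebeat does it.
LOG=/tmp/demo_access.log
echo 'GET / 200' > "$LOG"

OFFSET=$(wc -c < "$LOG")                  # bytes already read ("registry" state)
echo 'GET /about 404' >> "$LOG"           # a new log line arrives

NEW=$(tail -c +"$((OFFSET + 1))" "$LOG")  # only the appended part gets shipped
echo "$NEW"
```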

Common Problems

The component does not depend on Elasticsearch and Kibana, so logs are not collected

On the Dependencies page of the component management view, add dependencies on the Elasticsearch and Kibana components installed from the application market, then update (restart) the component for them to take effect.

The ES password in the plugin configuration does not match, so logs cannot be collected

Check the environment variables of the Elasticsearch component to confirm the ES password, or check the Dependencies page of the component to verify that the password matches. Update (restart) the component for the change to take effect.
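One quick way to test the connection details is from any shell that can reach the Elasticsearch service. The snippet below assembles the URL from the same variables the plugin uses (defaults mirror the configuration table above); the commented curl call at the end is the actual credential check and assumes basic authentication is enabled on Elasticsearch.

```shell
# Assemble the Elasticsearch URL from the plugin's connection variables
# (defaults mirror the plugin configuration table).
ES_HOST="${ES_HOST:-127.0.0.1}"
ES_PORT="${ES_PORT:-9200}"
ES_USERNAME="${ES_USERNAME:-elastic}"
ES_PASS="${ES_PASS:-changeme}"
ES_URL="http://${ES_HOST}:${ES_PORT}"
echo "$ES_URL"

# Verify the credentials against the cluster (run where ES is reachable):
#   curl -u "${ES_USERNAME}:${ES_PASS}" "${ES_URL}/_cluster/health?pretty"
```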

All dependencies exist and the plugin configuration is correct, but logs are still not collected

Try restarting the component, confirm that all configuration has taken effect, and then check again whether logs are being collected.