In my previous series of articles, I showed how to use the Elastic Stack to analyze Spring Boot microservice logs. These articles are:

  • Elastic: Using Elastic Stack to analyze Spring Boot microservice logs (Part 1)
  • Elastic: Using Elastic Stack to analyze Spring Boot microservice logs (Part 2)

As a careful reader may have noticed, in those articles we used Logstash to parse our logs and turn unstructured log lines into structured documents. That approach is useful in many situations, but the output of the Java application in those examples has a real drawback: it is unstructured, so a Logstash or ingest pipeline is needed to structure the data, which is inefficient in many cases. Can we instead output structured logs at the moment we produce them?

Java is a well-known object-oriented programming language that epitomizes cross-platform software development and helped popularize the “write once, run anywhere” (WORA) concept. Java runs on billions of devices around the world and powers important software such as the popular Android operating system and Elasticsearch. In this tutorial, we’ll show you how to manage Java logs using the Elastic Stack.

Java applications, like applications running in any other language, need to provide visibility into their operations so that the people who manage them can identify problems and troubleshoot them. This can be done by simply logging diagnostic information, but because the observability requirements of these applications grow in proportion to their scope and size, finding the right information can be tedious and time consuming.

Fortunately, Elasticsearch (written in Java) is an excellent tool for storing and searching large amounts of unstructured data. In this blog post, I’ll show you how to write logs from a Java application, how to import Java logs into Elasticsearch, and how to use Kibana to find the information you need.

To help you experiment, please download one of my Java Maven projects: github.com/liu-xiao-gu… . This is a complete Java application in which all of the logging configuration is already in place.

 

Java Logging Overview

There are several different libraries available for logging in Java, all of which have similar capabilities. In this example, I’ll use the popular Log4j2 library. In fact, Elasticsearch itself uses Log4j2 as its logging framework, so you may encounter it when configuring Elasticsearch’s own logging.

Introduction to Log4j2

To use Log4j2, you first need to add the related dependencies to the build tool’s configuration file. For example, if you are using Maven, you need to add the following to the pom.xml file:

<dependencies>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.13.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.13.3</version>
  </dependency>
</dependencies>

With this in place, a simple program like the following is enough to see some log output (the Greeter class comes from the sample project):

package hello;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.LogManager;

import org.joda.time.LocalTime;

public class HelloWorld {
	private static final Logger logger = LogManager.getLogger(HelloWorld.class);
	
	public static void main(String[] args) {
		LocalTime currentTime = new LocalTime();
		System.out.println("The current local time is: " + currentTime);
		Greeter greeter = new Greeter();
		System.out.println(greeter.sayHello());
		
		logger.error("Application is running!");		
	}
}

Above, we use logger.error("Application is running!"); to write a log entry from Java. It produces the following output on the console:

12:30:58.656 [main] ERROR hello.HelloWorld - Application is running!

Obviously this log output is not directed to a file. We need to configure Log4j2 to redirect it to a file, in the JSON format we need.

The Log4j2 configuration is defined in log4j2.xml. Since we haven’t configured anything yet, the default configuration used by the logging library is:

  • Writes output to a ConsoleAppender. Appenders are used to send log data to different local or remote targets, such as files, databases, sockets, and message brokers.
  • Formats the output with a PatternLayout, as shown above. Layouts are used to format log data into JSON, HTML, XML, and other formatted strings, providing whatever format is most appropriate for the log consumer.
  • Uses ERROR as the minimum log level, defined on the root logger in Log4j2. This means that lower-level events (such as INFO) are not written at all. This is why (for illustrative purposes only) we logged at ERROR level a message whose content should really be informational.
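To make the level rule concrete, here is a minimal sketch of how a level threshold suppresses lower-severity events. This is my own illustration of the concept, not Log4j2's actual implementation:

```java
public class LevelFilterSketch {
    // Severity in ascending order, mirroring Log4j2's standard levels.
    enum Level { TRACE, DEBUG, INFO, WARN, ERROR, FATAL }

    // An event is written only if it is at least as severe as the configured level.
    static boolean isLogged(Level event, Level configured) {
        return event.compareTo(configured) >= 0;
    }

    public static void main(String[] args) {
        System.out.println(isLogged(Level.INFO, Level.ERROR));  // false: INFO is suppressed
        System.out.println(isLogged(Level.ERROR, Level.ERROR)); // true: ERROR passes
    }
}
```

With the root level set to ERROR, an INFO event never reaches any appender; lowering the root level to TRACE (as we do below) lets everything through.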

Note that the Java code is independent of these configuration details, so changing them requires no code changes. We need to change these settings to centralize logging in Elasticsearch, in a way that can be used across a wide range of Java applications.

 

Configuration Log4j2

Create a file called log4j2.xml and place it somewhere on the classpath. Add the following to it:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <File name="FileAppender" fileName="/path_to_logs/myapp.log">
      <JSONLayout compact="true" eventEol="true">
        <KeyValuePair key="@timestamp" value="$${date:yyyy-MM-dd'T'HH:mm:ss.SSSZ}" />
      </JSONLayout>
    </File>
  </Appenders>
  <Loggers>
    <Root level="trace">
      <AppenderRef ref="FileAppender"/>
    </Root>
  </Loggers>
</Configuration>

This configuration uses a FileAppender in conjunction with a JSONLayout to write JSON-formatted output to a file, capturing logs at level TRACE and above. It also adds an @timestamp field, which will help Elasticsearch determine the order of the time series data.
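The date pattern in the @timestamp KeyValuePair follows java.text.SimpleDateFormat syntax (as I understand it, Log4j2's ${date:...} lookup is backed by SimpleDateFormat), so you can preview what the field will look like with a few lines of plain Java:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class TimestampPreview {
    public static void main(String[] args) {
        // The same pattern as the KeyValuePair in log4j2.xml.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ");
        // Prints the current time in the local zone, e.g. 2020-10-19T13:00:50.022+0800
        System.out.println(fmt.format(new Date()));
    }
}
```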

You will also need to use the build tool to add the Jackson Databind package as a dependency, since JSONLayout requires it at run time. With Maven, this means adding the following to pom.xml:

<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.11.1</version>
</dependency>

After the above modification, we can run the Java application directly and see the required log information in /path_to_logs/myapp.log.

Now let's take my example. First clone the project:

git clone https://github.com/liu-xiao-guo/gs-maven

Then enter the project:

$ pwd
/Users/liuxg/java/gs-maven/complete
$ ls
dependency-reduced-pom.xml  log4j2.xml  mvnw  mvnw.cmd  pom.xml  src  target
$ mvn clean package

The above command generates a jar package under the target directory. The project's log4j2.xml reads:

log4j2.xml

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <File name="FileAppender" fileName="/Users/liuxg/data/java_logs/java_app.log">
      <JSONLayout compact="true" eventEol="true">
        <KeyValuePair key="@timestamp" value="$${date:yyyy-MM-dd'T'HH:mm:ss.SSSZ}" />
      </JSONLayout>
    </File>
  </Appenders>
  <Loggers>
    <Root level="trace">
      <AppenderRef ref="FileAppender"/>
    </Root>
  </Loggers>
</Configuration>

Above, I define my own fileName path. Create your own path and modify the file accordingly. We can then run the application with the following command:

java -jar -Dlog4j.configurationFile=/Users/liuxg/java/gs-maven/complete/log4j2.xml ./target/gs-maven-0.1.0.jar

The command above shows:

The current local time is: 12:45:40.001
Hello world!

Let’s look at the log path defined in log4j2.xml:
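A screenshot of the log file appeared here in the original post. Each line of /Users/liuxg/data/java_logs/java_app.log is one JSON object; a representative line (values illustrative, field set per JSONLayout defaults) looks roughly like:

```json
{"instant":{"epochSecond":1603080024,"nanoOfSecond":8000000},"thread":"main","level":"ERROR","loggerName":"hello.HelloWorld","message":"Application is running!","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5,"@timestamp":"2020-10-19T13:00:24.008+0800"}
```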

Above, we see the log information in JSON format. We can import this data directly using Filebeat.

 

Import logs to Elasticsearch

Writing logs to files has many benefits. The process is fast and robust, and the application does not need to know anything about the type of storage that will ultimately be used for logging. Elasticsearch provides Beats to help you collect data from a variety of sources (including files) and ship it to Elasticsearch reliably and efficiently. Once the log data is in Elasticsearch, you can analyze it using Kibana.

The log data sent to Elasticsearch needs to be parsed so that Elasticsearch can structure it correctly. Elasticsearch handles JSON data easily; for other formats you can set up more complex parsing.

We create the following file in Filebeat's installation directory:

filebeat_json.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /Users/liuxg/data/java_logs/java_app.log
  json:
    keys_under_root: true
    overwrite_keys: true
    message_key: 'message'
 
processors:
  - decode_json_fields:
      fields: ['message']
      target: json
 
setup.template.enabled: false
setup.ilm.enabled: false
 
output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "java_logs"
  bulk_max_size: 1000

Please note: you must replace the paths above with your own.

We use the following command to import the data:

$ ls filebeat_json.yml 
filebeat_json.yml
 
$ ./filebeat -e -c ./filebeat_json.yml

After running the above command, we check the newly created java_logs index with the following command:

GET _cat/indices

The command above shows:
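A screenshot appeared here in the original post; the response contains a line for the new index, roughly like the following (health, UUID, and sizes are illustrative and will differ on your machine):

```
yellow open java_logs xxxxxxxxxxxxxxxxxxxxxx 1 1 3 0 17.8kb 17.8kb
```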

We see the newly generated java_logs index above. We can view its documents with the following command:

GET java_logs/_search

We can see the following document:

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "java_logs",
        "_type" : "_doc",
        "_id" : "lHU7P3UBwnhF9_ZDScdQ",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2020-10-19T05:00:50.022Z",
          "loggerName" : "hello.HelloWorld",
          "endOfBatch" : false,
          "loggerFqcn" : "org.apache.logging.log4j.spi.AbstractLogger",
          "level" : "ERROR",
          "instant" : { "nanoOfSecond" : 8000000, "epochSecond" : 1603080024 },
          "threadPriority" : 5,
          "threadId" : 1,
          "host" : { "name" : "liuxg" },
          "message" : "Application is running!",
          "thread" : "main",
          "input" : { "type" : "log" },
          "ecs" : { "version" : "1.5.0" },
          "log" : { "offset" : 0, "file" : { "path" : "/Users/liuxg/data/java_logs/java_app.log" } },
          "agent" : { "hostname" : "liuxg", "ephemeral_id" : "0ec0c4ad-00a1-4754-bd30-e753852d4425", "id" : "b04e426a-35f8-4dbe-8702-42542624a45d", "name" : "liuxg", "type" : "filebeat", "version" : "7.9.1" }
        }
      },
      {
        "_index" : "java_logs",
        "_type" : "_doc",
        "_id" : "lXU7P3UBwnhF9_ZDScdQ",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2020-10-19T05:00:50.023Z",
          "log" : { "offset" : 317, "file" : { "path" : "/Users/liuxg/data/java_logs/java_app.log" } },
          "instant" : { "nanoOfSecond" : 613000000, "epochSecond" : 1603080522 },
          "level" : "ERROR",
          "ecs" : { "version" : "1.5.0" },
          "host" : { "name" : "liuxg" },
          "agent" : { "type" : "filebeat", "version" : "7.9.1", "hostname" : "liuxg", "ephemeral_id" : "0ec0c4ad-00a1-4754-bd30-e753852d4425", "id" : "b04e426a-35f8-4dbe-8702-42542624a45d", "name" : "liuxg" },
          "loggerFqcn" : "org.apache.logging.log4j.spi.AbstractLogger",
          "threadPriority" : 5,
          "thread" : "main",
          "message" : "Application is running!",
          "threadId" : 1,
          "input" : { "type" : "log" },
          "endOfBatch" : false,
          "loggerName" : "hello.HelloWorld"
        }
      },
      {
        "_index" : "java_logs",
        "_type" : "_doc",
        "_id" : "lnU7P3UBwnhF9_ZDScdQ",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2020-10-19T05:00:50.023Z",
          "host" : { "name" : "liuxg" },
          "endOfBatch" : false,
          "instant" : { "nanoOfSecond" : 18000000, "epochSecond" : 1603082740 },
          "threadId" : 1,
          "threadPriority" : 5,
          "loggerFqcn" : "org.apache.logging.log4j.spi.AbstractLogger",
          "input" : { "type" : "log" },
          "agent" : { "id" : "b04e426a-35f8-4dbe-8702-42542624a45d", "name" : "liuxg", "type" : "filebeat", "version" : "7.9.1", "hostname" : "liuxg", "ephemeral_id" : "0ec0c4ad-00a1-4754-bd30-e753852d4425" },
          "loggerName" : "hello.HelloWorld",
          "thread" : "main",
          "level" : "ERROR",
          "ecs" : { "version" : "1.5.0" },
          "log" : { "offset" : 636, "file" : { "path" : "/Users/liuxg/data/java_logs/java_app.log" } },
          "message" : "Application is running!"
        }
      }
    ]
  }
}
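Because the log fields are now structured, we can query them directly in Kibana's Dev Tools console. For example, a standard match query on the level field (field names taken from the documents above) finds only ERROR-level events:

```
GET java_logs/_search
{
  "query": {
    "match": {
      "level": "ERROR"
    }
  }
}
```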

At this point, we have successfully imported our Java application logs into Elasticsearch as structured documents. I hope this helps you!