In our previous article “Beats: Log Structuring with Filebeat”, I showed one way to parse a JSON-formatted file and import it into Elasticsearch. In today’s article, I’ll show you a different way to import a JSON file.

 

Prepare the data

As an example, we will use the same file as in the previous article:

sample.json

{"user_name": "arthur", "id": 42, "verified": false, "event": "logged_in"}
{"user_name": "arthur", "id": 42, "verified": true, "event": "changed_state"}

It contains just two JSON documents, one per line.

 

Create a Filebeat configuration file

To parse the JSON file above, we use the following configuration file:

filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /Users/liuxg/data/processors/sample.json

processors:
  - decode_json_fields:
      fields: ["message"]
      target: ""
      overwrite_keys: true

  - drop_fields:
      fields: ["message", "ecs", "agent", "log"]

setup.template.enabled: false
setup.ilm.enabled: false

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "logs_json"
  bulk_max_size: 1000

Above, we use the decode_json_fields processor to parse the JSON logs. The path to sample.json depends on where the file lives on your system, so you will need to adjust it accordingly. This is different from the approach introduced in the previous article, and you can compare the two; a sketch of the earlier style follows below. We also use the drop_fields processor to remove some fields that we don’t need. Finally, since we write to the custom index logs_json, we disable template setup and ILM, which would otherwise override the custom index name.
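For comparison, here is a minimal sketch of the input-level approach used in the previous article (the exact settings there may differ slightly). Instead of a processor, the log input itself decodes each line:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /Users/liuxg/data/processors/sample.json
  # Decode each line as JSON at the input level instead of in a processor
  json.keys_under_root: true   # put the decoded keys at the top level of the event
  json.overwrite_keys: true    # let decoded keys overwrite Filebeat's own fields
  json.add_error_key: true     # add an error.message field if a line fails to decode

With this style, the decoding happens per input, while a processor applies to all inputs in the configuration.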

 

Import data

To import data into Elasticsearch, use the following command:

./filebeat -e -c filebeat.yml
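You can optionally validate the configuration first with ./filebeat test config -c filebeat.yml. Also note that Filebeat keeps a registry of files it has already read, so running the command a second time will not re-ingest sample.json. When experimenting, one common trick (assuming a tarball-style installation, where the registry lives in the data directory next to the filebeat binary) is to remove the registry before re-running:

rm -rf data/registry
./filebeat -e -c filebeat.yml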

After executing the above command, we can check the result in Kibana:

GET logs_json/_search
{ "took" : 0, "timed_out" : false, "_shards" : { "total" : 1, "successful" : 1, "skipped" : 0, "failed" : 0 }, "hits" : {" total ": {" value" : 2, the "base" : "eq"}, "max_score" : 1.0, "hits" : [{" _index ":" logs_json ", "_type" : "_doc", "_id" : "hzdVc3QBk0AMDyd4y0Cq", "_score" : 1.0, "_source" : {" @ timestamp ": "The 2020-09-09 T14: when 338 z", "the host" : {" name ":" liuxg} ", "user_name" : "Arthur", "id" : 42, "input" : {" type ": "log" }, "verified" : false, "event" : "logged_in" } }, { "_index" : "logs_json", "_type" : "_doc", "_id" : "IDdVc3QBk0AMDyd4y0Cq ", "_score" : 1.0, "_source" : {"@timestamp" :" 2020-09-09t14:47:15.338z ", "id" : 42, "verified" : true, "host" : { "name" : "liuxg" }, "input" : { "type" : "log" }, "user_name" : "arthur", "event" : "changed_state" } } ] } }Copy the code

We can see the two imported documents above, with each line’s JSON decoded into top-level fields of the document.
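Because the JSON fields are now top-level fields, we can query them directly. For example, a simple match query on the event field (a hypothetical follow-up query, not from the import itself) should return only the first document:

GET logs_json/_search
{
  "query": {
    "match": {
      "event": "logged_in"
    }
  }
}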