
I've been playing around and researching for a while, but I'm not able to find a solution. I have some huge logs that sometimes also contain huge JSONs. To simplify, imagine the following:

mkdir logs
cd logs/
echo "$(date) [INFO] something" >> huge_log.log
echo "$(date) [INFO] something more" >> huge_log.log
echo "$(date) [INFO] Something with json: {\"foo\": \"bar\"}" >> huge_log.log
tail -n 5 -f huge_log.log | how_to_filter? | jq '.'

Is it possible to see something like this (the JSON formatted as jq '.' would print it)?

Tue Aug 18 12:42:24 CEST 2020 [INFO] something
Tue Aug 18 12:42:29 CEST 2020 [INFO] something more
Tue Aug 18 12:43:05 CEST 2020 [INFO] Something with json: 
{
    "foo": "bar"
}

So, somehow, automatically detect the JSONs while printing the log and show them as they would appear in the output of:

echo "{\"foo\": \"bar\"}" | jq '.'
{
  "foo": "bar"
}


The below worked for your example and seems reasonable to me. It iterates over every line of the log file; if a line matches the regex \{.*\}, it is identified as containing a JSON object and formatted with jq. If the regex doesn't match, the line is printed as-is. I tested it, and it also works well on bigger, more complex files.

tail -n 15 huge_log.log | while IFS= read -r line; do
    if echo "$line" | grep -Eq '\{.*\}'; then
        # Extract the JSON part and pretty-print it with jq
        json=$(echo "$line" | grep -Eoh '\{.*\}' | jq '.')
        # Print the text before the JSON, the formatted JSON, then any trailing text
        echo "$line" | awk -v json="$json" -F '\\{.*\\}' '{printf "%s\n%s\n%s\n", $1, json, $2}'
    else
        echo "$line"
    fi
done
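
As an alternative sketch, the detection and pretty-printing can also be done entirely inside jq, reading each line as a raw string with -R. This assumes a jq build with regex support (oniguruma, the default in official builds):

```shell
# -R reads each line as a raw string; if it contains a {...} span,
# print the leading text and the pretty-printed JSON; otherwise print
# the line unchanged. A limitation of this sketch: a line whose braces
# hold invalid JSON will only print its leading text.
tail -n 15 huge_log.log | jq -Rr '
  . as $line
  | (capture("(?<pre>.*?)(?<obj>\\{.*\\})") | .pre, (.obj | fromjson))?
    // $line'
```

This avoids the shell loop and spawning grep/jq per line, which may matter on genuinely huge logs; it also works unchanged with tail -f.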