Full-stack Engineer's blog

Useful Linux command tips for dealing with log files


Tips for dealing with text files, such as log files, when you need to do quick aggregation. Which tool works best usually depends on the machine spec and the amount of data:


・ Excel: good up to about 50K lines.
・ Sublime Text: good up to about 300K lines.
・ For more data, Linux commands work fine, even at 10M lines.

(Regardless of the data amount, I personally prefer to use Linux commands.)
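To try the commands below at realistic volume, you can generate a throwaway sample log first. The filename hoge.log matches the one used in this post; the contents are made up:

```shell
# generate 100,000 numbered lines as a stand-in for a real log
seq 1 100000 | sed 's/^/request /' > hoge.log

# confirm the size
wc -l hoge.log
```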

Check part of the data (a few lines)

# show first 10 lines 
$ head -n 10 hoge.log
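Besides head, tail and sed can grab other slices of a file. A small sketch, using a made-up 5-line file peek.log:

```shell
# make a 5-line sample file
printf '1\n2\n3\n4\n5\n' > peek.log

head -n 2 peek.log      # first 2 lines
tail -n 2 peek.log      # last 2 lines
sed -n '2,4p' peek.log  # lines 2 through 4 only
```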

# follow the file, filtering for lines containing "piyo"; useful for watching access logs.
$ tail -f hoge.log | grep piyo
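tail -f keeps running until you interrupt it, so for a one-off check it can be easier to grep the whole file instead. A sketch with a made-up sample.log:

```shell
# tiny fake access log
printf 'GET /piyo 200\nGET /foo 404\nPOST /piyo 200\n' > sample.log

grep piyo sample.log     # print the matching lines
grep -c piyo sample.log  # just count them
```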

Aggregation commands

# combine several files into one file.
$ cat *.log >> aaa.log
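One caveat with `cat *.log >> aaa.log`: once aaa.log exists, it matches the glob itself, so on a second run its old contents are copied in again (and cat may even refuse because input and output are the same file). Writing to a name outside the glob avoids this. A sketch with made-up filenames:

```shell
printf 'from one\n' > one.log
printf 'from two\n' > two.log

# the output name does not match *.log, so it is never re-read
cat *.log > combined.txt
```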

# count the lines in the file.
$ cat hoge.log | wc -l
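The same count works without cat: wc accepts a filename directly, or input on stdin, in which case it prints only the number. A sketch with a made-up count.log:

```shell
printf 'a\nb\nc\n' > count.log

wc -l count.log    # prints the count and the filename
wc -l < count.log  # prints just the count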

# keep only lines containing the word "bar"
$ cat hoge.log | grep bar >> aaa.log

# exclude lines containing the word "piyo"
$ cat hoge.log | grep -v piyo >> aaa.log

# replace the word aaa with bbb
$ sed -e 's/aaa/bbb/g' hoge.log >> aaa.log
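A worked sketch of the substitution (the sample file and words are made up); quoting the expression protects it from the shell, and the trailing g replaces every occurrence on each line, not just the first:

```shell
printf 'aaa first\nsecond aaa aaa\n' > sed_demo.log

sed -e 's/aaa/bbb/g' sed_demo.log
```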

# delete duplicated lines without sorting in advance
$ awk '!a[$0]++' hoge.log >> aaa.log
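How the awk one-liner works: a[$0] counts how many times the current line has been seen, and !a[$0]++ is true only on the first sighting, so only first occurrences are printed, preserving the original order (unlike sort | uniq). A small sketch with made-up data:

```shell
printf 'x\ny\nx\nz\ny\n' > dup.log

# keeps the first occurrence of each line, in order
awk '!a[$0]++' dup.log
```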


These commands help you do simple aggregation quickly.