James Turnbull
@kartar
Yes, Logging Can Be Awesome
who
operations chap
Puppet chap
erstwhile Ruby chap
funny accent
(photo by Jennie Rainsford)
other matters
author
hack-n-slash developer
pontification
http://www.jamesturnbull.net
https://github.com/jamtur01
http://www.kartar.net
books
Pro Puppet
Pro Linux System Administration
Pro Nagios 2.0
Hardening Linux
the logstash book
So who are you folks?
so what's a log
(photo by Rick Payette)
timestamp + data = log
May 7 16:07:10 pelin systemd[1]: Starting Command Scheduler...
May 7 16:07:10 < timestamp
pelin systemd[1]: Starting Command Scheduler... < data
lifecycle of a log
actual lifecycle of a log
actual actual lifecycle of a log
so why isn't logging awesome?
I'll tell you a story
123.151.148.182 - - [11/May/2013:20:48:25 -0400] "GET /2010/08/rag-of-the-week-busted/trackback HTTP/1.1" 302 5 "http://www.stumpdinpdx.com/"
"Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
123.151.148.182 - - [11/May/2013:20:48:25 -0400] "GET /2010/08/rag-of-the-week-busted/ HTTP/1.1" 200 11678 "http://www.stumpdinpdx.com/"
"Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
96.126.127.108 - - [11/May/2013:20:48:35 -0400] "POST /wp-cron.php?doing_wp_cron=1368319715.1563251018524169921875 HTTP/1.0" 200 0 "-"
"WordPress/3.5.1; http://www.stumpdinpdx.com"
123.151.148.182 - - [11/May/2013:20:48:35 -0400] "GET /2010/08/rag-of-the-week-busted/feed HTTP/1.1" 301 5 "http://www.stumpdinpdx.com/"
"Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
123.151.148.182 - - [11/May/2013:20:48:35 -0400] "GET /2010/08/rag-of-the-week-busted/feed/ HTTP/1.1" 200 2559 "http://www.stumpdinpdx.com/"
"Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
107.20.202.46 - - [11/May/2013:20:52:34 -0400] "GET /feed/ HTTP/1.1" 200 135969 "-" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US)
AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16"
107.20.202.46 - - [11/May/2013:20:52:34 -0400] "GET /feed/ HTTP/1.1" 200 135969 "-" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US)
AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16"
96.126.127.108 - - [11/May/2013:20:54:02 -0400] "POST /wp-cron.php?doing_wp_cron=1368320042.6065499782562255859375 HTTP/1.0" 200 0 "-"
"WordPress/3.5.1; http://www.stumpdinpdx.com"
92.64.254.225 - - [11/May/2013:20:54:03 -0400] "POST /wp-login.php HTTP/1.0" 200 4452 "-" "Mozilla/3.0 (compatible; Indy Library)"
209.85.238.233 - - [11/May/2013:21:07:01 -0400] "GET /feed/ HTTP/1.1" 200 46099 "-" "Feedfetcher-Google;
(+http://www.google.com/feedfetcher.html; 48 subscribers; feed-id=5312968832043971344)"
121.219.57.195 - - [11/May/2013:21:08:21 -0400] "GET / HTTP/1.1" 200 6142 "-" "Reeder/1020.09.00 CFNetwork/596.3.3 Darwin/12.3.0 (x86_64)
(MacBookPro8%2C2)"
121.219.57.195 - - [11/May/2013:21:08:21 -0400] "GET / HTTP/1.1" 200 6142 "-" "Reeder/1020.09.00 CFNetwork/596.3.3 Darwin/12.3.0 (x86_64)
(MacBookPro8%2C2)"
96.126.127.108 - - [11/May/2013:21:10:51 -0400] "POST /wp-cron.php?doing_wp_cron=1368321051.2980649471282958984375 HTTP/1.0" 200 0 "-"
"WordPress/3.5.1; http://www.stumpdinpdx.com"
94.125.180.90 - - [11/May/2013:21:10:51 -0400] "POST /wp-login.php HTTP/1.0" 200 4452 "-" "Mozilla/3.0 (compatible; Indy Library)"
217.34.181.76 - - [11/May/2013:21:10:51 -0400] "POST /wp-login.php HTTP/1.0" 200 4452 "-" "Mozilla/3.0 (compatible; Indy Library)"
96.126.127.108 - - [11/May/2013:21:12:09 -0400] "POST /wp-cron.php?doing_wp_cron=1368321129.5501360893249511718750 HTTP/1.0" 200 0 "-"
"WordPress/3.5.1; http://www.stumpdinpdx.com"
190.199.60.150 - - [11/May/2013:21:12:09 -0400] "POST /wp-login.php HTTP/1.0" 200 4463 "http://www.stumpdinpdx.com/wp-login.php" "Mozilla/4.0
(compatible; MSIE 6.0; Windows NT 5.1; SV1)"
184.154.100.20 - - [11/May/2013:21:12:56 -0400] "GET /2012/12/50-things-i-will-miss-about-portland/comment-page-1/ HTTP/1.0" 200 12699
"http://www.stumpdinpdx.com/" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; Mozilla/4.0 (compatible; MSIE 6.0; Windows NT
5.1; SV1) ; .NET CLR 3.5.30729)"
96.126.127.108 - - [11/May/2013:21:13:29 -0400] "POST /wp-cron.php?doing_wp_cron=1368321209.4377140998840332031250 HTTP/1.0" 200 0 "-"
"WordPress/3.5.1; http://www.stumpdinpdx.com"
217.91.37.3 - - [11/May/2013:21:13:29 -0400] "POST /wp-login.php HTTP/1.0" 200 4452 "-" "Mozilla/3.0 (compatible; Indy Library)"
80.93.213.249 - - [11/May/2013:21:15:32 -0400] "GET /2010/05/food-carts-of-melbourne-all-four-of-them/ HTTP/1.1" 200 16569
"http://www.stumpdinpdx.com/" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; FunWebProducts; .NET CLR 1.1.4322; PeoplePal 6.2)"
80.93.213.249 - - [11/May/2013:21:15:33 -0400] "GET /2012/12/50-things-i-will-miss-about-portland/comment-page-1/ HTTP/1.1" 200 12720
"http://www.stumpdinpdx.com/" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; FunWebProducts; .NET CLR 1.1.4322; PeoplePal 6.2)"
[11-May-2013 14:10:04 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:11:32 UTC] PHP Fatal error: Call to a member function setting() on a non-object in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/feedwordpress.php on line 606
[11-May-2013 15:21:58 UTC] PHP Fatal error: Call to a member function setting() on a non-object in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/feedwordpress.php on line 606
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 15:50:03 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 17:10:07 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
[11-May-2013 17:10:07 UTC] PHP Warning: Invalid argument supplied for foreach() in /var/www/html/planetdevops/wp-
content/plugins/feedwordpress/magpiefromsimplepie.class.php on line 531
Jun 4, 2011 10:01:06 AM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
Jun 4, 2011 10:24:48 AM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [] created a ThreadLocal with key of type [null] (value [clojure.lang.Var$1@564ca930]) and a value of type
[clojure.lang.Var.Frame] (value [clojure.lang.Var$Frame@42f7ba93]) but failed to remove it when the web application was stopped. This is very
likely to create a memory leak.
Jun 4, 2011 10:24:48 AM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@15fa2b3e]) and a
value of type [clojure.lang.LockingTransaction] (value [clojure.lang.LockingTransaction@5b2cfeb7]) but failed to remove it when the web
application was stopped. This is very likely to create a memory leak.
Jun 4, 2011 10:24:50 AM org.apache.catalina.core.StandardContext resourcesStart
SEVERE: Error starting static Resources
java.lang.IllegalArgumentException: Document base /var/lib/tomcat6/webapps/ROOT does not exist or is not a readable directory
at org.apache.naming.resources.FileDirContext.setDocBase(FileDirContext.java:142)
at org.apache.catalina.core.StandardContext.resourcesStart(StandardContext.java:4249)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4418)
at org.apache.catalina.startup.HostConfig.checkResources(HostConfig.java:1244)
at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1342)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:303)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1337)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1601)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1610)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1590)
at java.lang.Thread.run(Thread.java:662)
Jun 4, 2011 10:24:50 AM org.apache.catalina.core.StandardContext start
SEVERE: Error in resourceStart()
Jun 4, 2011 10:24:50 AM org.apache.catalina.core.StandardContext start
SEVERE: Error getConfigured
all of these logs tell us (useful)
stories
pretty confusing stories though
eh?
so what's wrong?
so many sodding formats
don't even get me started on timestamps
no context
really unhelpful error messages
doesn't scale
enter logstash, parsing heavily
what?
collects, transmits, interprets, stores
free and open source
primarily written by Jordan Sissel
maxim: if a new user has a bad time, it's a bug in logstash
awesome!
logstash architecture
how does it work?
202.46.52.20 - - [21/Jan/2013:14:59:39 -0800] "GET / HTTP/1.1" 200 931 "-"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
119.63.193.196 - - [21/Jan/2013:15:00:27 -0800] "GET / HTTP/1.1" 200 931 "-"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
208.115.113.88 - - [21/Jan/2013:15:04:30 -0800] "GET /robots.txt HTTP/1.1" 404 297
"-" "Mozilla/5.0 (compatible; Ezooms/1.0; ezooms.bot@gmail.com)"
188.138.88.171 - - [21/Jan/2013:15:09:46 -0800] "GET /w00tw00t.at.ISC.SANS.DFind:)
HTTP/1.1" 400 315 "-" "-"
220.181.108.81 - - [21/Jan/2013:15:21:34 -0800] "GET / HTTP/1.1" 200 935 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
123.125.71.31 - - [21/Jan/2013:15:21:58 -0800] "GET / HTTP/1.1" 200 935 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
123.151.148.162 - - [21/Jan/2013:15:37:11 -0800] "GET / HTTP/1.1" 200 931 "-"
"Sosospider+(+http://help.soso.com/webspider.htm)"
119.63.196.28 - - [21/Jan/2013:15:41:28 -0800] "GET / HTTP/1.1" 200 930 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
209.85.238.174 - - [21/Jan/2013:15:45:20 -0800] "GET /?type=atom10 HTTP/1.1" 200
930 "-" "Feedfetcher-Google; (+http://www.google.com/feedfetcher.html; 2
subscribers; feed-id=16157856257601629822)"
188.138.88.171 - - [21/Jan/2013:16:17:06 -0800] "GET /w00tw00t.at.ISC.SANS.DFind:)
HTTP/1.1" 400 315 "-" "-"
123.125.71.35 - - [21/Jan/2013:16:19:22 -0800] "GET / HTTP/1.1" 200 927 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
220.181.108.78 - - [21/Jan/2013:16:19:29 -0800] "GET / HTTP/1.1" 200 927 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
180.76.5.55 - - [21/Jan/2013:16:20:14 -0800] "GET / HTTP/1.1" 200 930 "-"
"Mozilla/5.0 (compatible; Baiduspider/2.0;
+http://www.baidu.com/search/spider.html)"
simple is as simple does
input {
  file {
    type => "web"
    path => "/var/log/httpd/access.log"
  }
}
filter {
  grok {
    type    => "web"
    pattern => "%{COMBINEDAPACHELOG}"
  }
  date {
    type      => "web"
    timestamp => "dd/MMM/yyyy:HH:mm:ss Z"
  }
}
output {
  elasticsearch { }
}
the input
input {
  file {
    type => "web"
    path => "/var/log/httpd/access.log"
  }
}
turns
202.46.63.192 - - [21/Jan/2013:16:41:38 -0800] "GET / HTTP/1.1" 200 935 "-"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
into
{"@source"=>"file://pelin.example.com/var/httpd/access.log", "@tags"=>[], "@fields"=>
{}, "@timestamp"=>"2013-01-21T16:41:38.030Z", "@source_host"=>"pelin.example.com",
"@source_path"=>"/var/log/httpd/access.log", "@message"=>"202.46.63.192 - -
[21/Jan/2013:16:41:38 -0800] GET / HTTP/1.1 200 935 - Mozilla/4.0 (compatible; MSIE
7.0; Windows NT 6.0)", "@type"=>"web"}
still looks like a
mess eh?
but it's now a
structured mess!
structured data
for the win!
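and file is only one input: logstash of this era also ships inputs such as stdin, syslog and tcp. A minimal sketch (the ports and type labels here are illustrative, not from the slides):
input {
  stdin  { type => "test" }
  syslog { type => "syslog" port => 5514 }
  tcp    { type => "app" port => 5000 }
}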
the filters
grok {
  type    => "web"
  pattern => "%{COMBINEDAPACHELOG}"
}
use the power of regex
to add context
instead of ... evil ... like:
(?:(?:rn)?[ t])*(?:(?:(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)?[ t]
)+|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?[ t]))*"(?:(?:
rn)?[ t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?:(?:(
?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?[
t]))*"(?:(?:rn)?[ t])*))*@(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-0
31]+(?:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*
](?:(?:rn)?[ t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+
(?:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:
(?:rn)?[ t])*))*|(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)?[ t])+|Z
|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?[ t]))*"(?:(?:rn)
?[ t])*)*<(?:(?:rn)?[ t])*(?:@(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:
rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:(?:rn)?[
t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)
?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:(?:rn)?[ t]
)*))*(?:,@(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)?[
t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:(?:rn)?[ t])*
)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)?[ t]
)+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:(?:rn)?[ t])*))*)
*:(?:(?:rn)?[ t])*)?(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:rn)?[ t])+
|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?[ t]))*"(?:(?:r
n)?[ t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?:(?:(?:
rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?[ t
]))*"(?:(?:rn)?[ t])*))*@(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031
]+(?:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](
?:(?:rn)?[ t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>@,;:".[] 000-031]+(?
:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|[([^[]r]|.)*](?:(?
:rn)?[ t])*))*>(?:(?:rn)?[ t])*)|(?:[^()<>@,;:".[] 000-031]+(?:(?
:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|.|(?:(?:rn)?
[ t]))*"(?:(?:rn)?[ t])*)*:(?:(?:rn)?[ t])*(?:(?:(?:[^()<>@,;:".[]
000-031]+(?:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|"(?:[^"r]|
.|(?:(?:rn)?[ t]))*"(?:(?:rn)?[ t])*)(?:.(?:(?:rn)?[ t])*(?:[^()<>
@,;:".[] 000-031]+(?:(?:(?:rn)?[ t])+|Z|(?=[["()<>@,;:".[]]))|"
(?:[^"r]|.|(?:(?:rn)?[ t]))*"(?:(?:rn)?[ t])*))*@(?:(?:rn)?[ t]
%{SYNTAX:SEMANTIC}
Log: May 12 03:36:31 pelin dhclient[2335]: DHCPACK from 97.107.143.38 (xid=0x6f62572d)
Grok: %{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{SYSLOGPROG:program}: %{DATA:message}
SYSLOGTIMESTAMP: %{MONTH} +%{MONTHDAY} %{TIME}
HOSTNAME: \b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\.?|\b)
SYSLOGPROG: %{PROG:program}(?:\[%{POSINT:pid}\])?
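those %{SYNTAX:SEMANTIC} building blocks drop straight into a grok filter; a hedged sketch reusing the pattern above (the "syslog" type label is my assumption, not from the slides):
filter {
  grok {
    type    => "syslog"
    pattern => "%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{SYSLOGPROG:program}: %{DATA:message}"
  }
}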
remember this?
{"@source"=>"file://pelin.example.com/var/httpd/access.log", "@tags"=>[], "@fields"=>
{}, "@timestamp"=>"2013-01-21T16:41:38.030Z", "@source_host"=>"pelin.example.com",
"@source_path"=>"/var/log/httpd/access.log", "@message"=>"202.46.63.192 - -
[21/Jan/2013:16:41:38 -0800] GET / HTTP/1.1 200 935 - Mozilla/4.0 (compatible; MSIE
7.0; Windows NT 6.0)", "@type"=>"web"}
with grok it becomes
{"@source" => "file://pelin.example.com/var/httpd/access.log",
"@tags" => [],
"@fields" => {
"clientip": [ "202.46.63.192" ],
"ident": [ "-" ],
"auth": [ "-" ],
"timestamp": [ "21/Jan/2013:16:41:38 -0800" ],
"verb": [ "GET" ],
"request": [ "/" ],
"httpversion": [ "1.1" ],
"response": [ "200" ],
"bytes": [ "935" ],
"referrer": [ ""-"" ],
"agent": [ ""Mozilla/4.0 (compatible; MSIE 7.0; Windows NT
6.0)"" ] },
"@timestamp" => "2013-01-21T16:41:38.030Z",
"@source_host" => "pelin.example.com",
"@source_path" => "/var/log/httpd/access.log",
"@message" => "202.46.63.192 - - [21/Jan/2013:16:41:38 -0800] GET / HTTP/1.1 200
935 - Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
"@type" => "web"}
grok makes better
over 100 patterns
numbers, strings, hosts, network addresses, urls, etc
chain patterns together
easy to extend, easy to test
you can test your patterns
http://grokdebug.herokuapp.com/
or you can even write tests for
your patterns
you write tests right?
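a low-tech way to exercise a pattern locally is to wire stdin to stdout and paste log lines in; a sketch, assuming the stdin input and the stdout output's debug option from logstash of this vintage:
input  { stdin { type => "test" } }
filter { grok { type => "test" pattern => "%{COMBINEDAPACHELOG}" } }
output { stdout { debug => true } }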
did I mention time?
date {
  type      => "web"
  timestamp => "dd/MMM/yyyy:HH:mm:ss Z"
}
problem?
so many fucking time formats
seriously. stop adding time
formats.
solution.
standardize with the date filter.
filters rock
30+ filters
munge, mangle, mutate
lookup, research, aggregate
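the mutate filter, for instance, renames, rewrites and tags fields; a hedged sketch (the field names and option spellings are my recollection of the 1.x mutate filter, not from the slides):
filter {
  mutate {
    type    => "web"
    rename  => [ "clientip", "client_ip" ]
    gsub    => [ "request", "\?.*$", "" ]
    add_tag => [ "normalized" ]
  }
}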
filters turn abstract information
like
202.46.63.192 - - [21/Jan/2013:16:41:38 -0800] "GET / HTTP/1.1" 200 935 "-"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
into
the truth will set you free
... or at least wake you up.
outputs
output {
elasticsearch { }
}
outputs
50+ outputs
search, store, transit
email, irc, alert
graph, aggregate, execute
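outputs stack, so a single event can be indexed, printed and counted at once; a sketch (the statsd output and its increment option are assumptions about the plugins of this era):
output {
  elasticsearch { }
  stdout { debug => true }
  statsd { type => "web" increment => [ "apache.response.%{response}" ] }
}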
all of the pretty things
all of the pretty things
scales like a mofo
all of the logstashes
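the usual scaling pattern is many lightweight shippers feeding a broker (typically redis), with one or more central indexers draining it into elasticsearch; a hedged sketch of the two configs (the broker hostname is made up):
# shipper, on each web host
input  { file { type => "web" path => "/var/log/httpd/access.log" } }
output { redis { host => "broker.example.com" data_type => "list" key => "logstash" } }

# indexer, on the central box
input  { redis { host => "broker.example.com" data_type => "list" key => "logstash" type => "redis" } }
filter { grok { type => "web" pattern => "%{COMBINEDAPACHELOG}" } }
output { elasticsearch { } }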
logstash-users@googlegroups.com
#logstash on freenode irc
logstash.net
logstash.jira.com
Questions?
references
Doctor Who © BBC
He-Man © Mattel
