How to Easily Exclude Unnecessary Hits
Because of its nature, and depending on the size of your website and the
power of your computer, Web Log Storming can become slow. To avoid this, we
strongly recommend using global filters to exclude all hits you are not
interested in. Every site has files that don't affect statistics (logos,
buttons, some scripts, etc.). Almost as a rule, these files receive the most
hits and therefore affect performance significantly.
Web Log Storming is different from other log analyzers.
Generating static reports is an "easy task": other analyzers can free
memory after creating predefined reports, ignore hits and visitors that don't
meet the filter criteria, and ignore items that are not significant enough
(e.g. grouping rare referrers as "Others").
From the beginning, we designed the software around the idea of "live"
(or "on-the-fly") filters. This means that the software
doesn't need to know in advance which reports the user wants to see, or which
filters they will apply to inspect different cross-reports. Furthermore, nothing
is insignificant to Web Log Storming unless the user
explicitly says so. If a user decides to get a Countries report for visitors
who viewed the least popular page, they can get it with just a few clicks,
without re-reading the log files with different filter settings.
To accomplish this kind of full interactivity, we need to keep all relevant log
data in memory, so our main goal was to find a meaningful
compromise between processing, searching, and sorting speed on one
side, and memory consumption on the other. In our opinion, the results are more
than satisfactory, especially if the user keeps these facts in mind and makes a
few tweaks to optimize results.
Just like the others, our software supports the well-known concept of global
filters. While this is the only type of filter that other analyzers recognize,
for Web Log Storming it's just an additional tool, particularly
useful for improving performance. If you exclude hits that you don't need in
the reports, you will save a considerable amount of memory, which will greatly
improve processing speed.
For example, your pages probably contain logos, backgrounds, buttons,
divider images, stylesheets, scripts, etc. Every file of
these types causes the server to write another hit line into the log file.
Let's say that you have 10 links to such files on each of your pages (and you
could easily have many more). That means that every page view is recorded
as 1 + 10 = 11 lines in the log file. Now, if you exclude these
collateral files by defining global filters, in this
particular example you will reduce memory consumption by roughly 90%
(10 of every 11 lines). And that means you will be able to analyze a 10-times-wider
time period without losing performance or useful information.
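If you want to estimate how much a filter would save on your own logs before
defining it, you can count the excludable lines with a short script. The sketch
below is purely illustrative and not part of Web Log Storming; the file name
access.log and the patterns are assumptions, and it expects logs in Common Log
Format.

    # Estimate how much of a log file a set of exclude patterns would remove.
    # Illustrative sketch only; "access.log" and the patterns are hypothetical.
    from fnmatch import fnmatch

    EXCLUDE = ["*.css", "*.gif", "/script/*", "/forum/*"]

    total = excluded = 0
    with open("access.log") as log:
        for line in log:
            total += 1
            # In Common Log Format the requested path is the 7th field:
            # host ident user [date zone] "METHOD /path HTTP/1.x" status size
            parts = line.split()
            path = parts[6] if len(parts) > 6 else ""
            if any(fnmatch(path, pat) for pat in EXCLUDE):
                excluded += 1

    if total:
        print(f"{excluded}/{total} lines ({100 * excluded / total:.0f}%) would be filtered out")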
How exactly you define your filter really depends on your website's
structure. One example could look like this:
-*.css; -*.gif; -/script/*; -/forum/*
Sign "-" in front of wildcards means
"exclude", and wildcards can be delimited with ";"
or ",".
Of course, if you later wish to inspect some of the excluded files, you can
always create another project file with a global filter that includes them.