Improving performance
The size of your web site and log files, the speed of your Internet connection, and your computer's speed and memory all affect how fast Web Log Storming can analyze your files. Every site has files that don't affect statistics (logos, buttons, stylesheets, etc.). Almost as a rule, these files receive the most hits and therefore require the most processing time. Global filters can be used to exclude the hits you are not interested in, which lets Web Log Storming work much more efficiently.
All web log analyzers support the well-known concept of global filters. While this is the only type of filter other analyzers recognize, for Web Log Storming it's just an additional tool that is particularly useful for improving performance. If you use global filters to exclude hits that you don't need in your reports, you will save a considerable amount of memory, which in turn significantly improves processing speed.
For example, your pages probably contain logos, background images, buttons, divider images, stylesheets, scripts, and so on. Each of these collateral files causes the server to write an additional hit line to the log file. Let's say that you have 10 links to such files on each of your pages (and you could easily have more). That means every page view is recorded as 1 + 10 = 11 lines in the log file. If you exclude the collateral files with global filters, in this particular example you discard 10 of every 11 lines and reduce memory consumption by roughly 90%. As a result, you can analyze about 10 times more data without losing performance or useful information.
As we have stated, Web Log Storming is different from other log analyzers. Generating static reports is a comparatively easy task: other analyzers can free memory as soon as their predefined reports are built, they discard hits and visitors that don't meet the filter criteria, and they drop or group report items that are not significant enough (for example, lumping rare referrers together as "Others").
From the beginning, Web Log Storming was designed around the idea of "live" (or "on-the-fly") filters. The software doesn't need to know in advance which reports the user will want, or which filters will be applied to inspect different cross-reports. Furthermore, nothing is insignificant to Web Log Storming unless the user explicitly says so. If a user decides to get a Countries report for visitors who viewed even the least popular page, it takes just a few clicks, without re-reading the log files with different filter settings.
To achieve this kind of full interactivity, we need to keep all relevant log data in memory. Although our team spent a substantial amount of time developing a highly optimized memory model, some users might still experience a noticeable slowdown when analyzing large log files. Our main goal was to find a sensible compromise between processing, searching, and sorting speed on one side, and memory consumption on the other. In our opinion, the results are more than satisfactory, especially if the user keeps these facts in mind and makes a few tweaks to achieve the best results.
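The idea is easiest to see in miniature. The sketch below is plain Python, purely illustrative and unrelated to Web Log Storming's actual implementation (the Hit fields and sample data are invented for the example); it keeps parsed hits in memory and then answers an ad-hoc cross-report question, the countries of visitors who viewed one specific page, without touching the log files again:

# Conceptual sketch only: not Web Log Storming's code, just an illustration of
# why keeping parsed hits in memory makes on-the-fly filters possible.
from collections import Counter
from typing import NamedTuple

class Hit(NamedTuple):
    visitor: str   # e.g. IP address
    path: str      # requested URL path
    country: str   # hypothetical field resolved from the visitor's IP

# In a real analyzer these records would be parsed from log files once
# and kept in memory; here they are hard-coded sample data.
hits = [
    Hit("1.2.3.4", "/index.html", "Germany"),
    Hit("1.2.3.4", "/rare-page.html", "Germany"),
    Hit("5.6.7.8", "/index.html", "Brazil"),
    Hit("9.9.9.9", "/rare-page.html", "Japan"),
]

# "Live" filter: visitors who viewed a particular (even rarely visited) page.
visitors = {h.visitor for h in hits if h.path == "/rare-page.html"}

# Cross-report derived from the same in-memory data, with no re-reading of files.
countries = Counter(h.country for h in hits if h.visitor in visitors)
print(countries)   # Counter({'Germany': 2, 'Japan': 1})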
How exactly you define a global filter depends on your web site structure. One example might look like this:
-*.css; -*.gif; -/script/*; -/forum/*
A "-" sign in front of a wildcard means "exclude", and wildcards can be delimited with ";" or ",".
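To see what such a pattern matches, here is a minimal, purely illustrative Python sketch; it uses standard fnmatch-style wildcards and is not Web Log Storming's own filter implementation, and the sample paths are invented:

# Illustrative sketch of how an exclusion filter like the one above behaves.
from fnmatch import fnmatch

filter_string = "-*.css; -*.gif; -/script/*; -/forum/*"

# Split on ";" or "," and keep the patterns marked with a leading "-".
exclude_patterns = [p.strip().lstrip("-")
                    for p in filter_string.replace(",", ";").split(";")
                    if p.strip().startswith("-")]

def is_excluded(path: str) -> bool:
    return any(fnmatch(path, pattern) for pattern in exclude_patterns)

sample_paths = ["/index.html", "/style/main.css", "/images/logo.gif",
                "/script/menu.js", "/forum/topic42.html"]

for path in sample_paths:
    print(path, "->", "excluded" if is_excluded(path) else "kept")
# Only /index.html is kept; the collateral and forum hits are filtered out.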
Of course, if you eventually wish to inspect some of the excluded files later, you can always create another project file with a global filter that includes all of them, or only the files you wish to analyze.