Here is the list of all features of the free edition of the HttpLogBrowser:
- The application runs on Windows Vista/2008/7/2008 R2/8/2012/8.1/2012 R2/10/2016
Log file loading features
- Up to 2 million log rows can be loaded in memory for analysis (this number can be extended in the professional edition)
- Ability to analyze IIS HTTP logs, including HTTP logs from Azure web applications.
- Ability to analyze Apache access logs in common log format or in combined log format.
- It’s possible to load either selected log files or all log files from a folder.
- A number of days to analyze can be specified so that only the latest log rows are loaded.
- You can define the maximum number of log rows to load to avoid using too much memory. If this number is reached, older records will not be loaded.
- Ability to define profiles for web sites you regularly analyze. Once a profile is configured, its logs can be quickly reloaded.
- You can quickly reload recent log files/folders without needing to create a profile.
- If you are running the application on the web server itself, you can load log files from local IIS web sites directly through the File menu.
- Developers building web sites with Visual Studio can quickly load log files from IIS Express, the web server Visual Studio uses to debug web applications.
- Requests for certain file types (e.g. css, png, json) can be excluded from the analysis to keep only queries to pages.
- Requests to specific URLs on the web site can be ignored, for example a web service you don’t want to include in your analysis.
- For Azure web applications, ability to ignore Always On requests and the X-ARR-LOG-ID variable. If you enable the Always On option for an Azure web application, a request is sent to your web site every 5 minutes to keep it active and avoid a slowdown after a certain idle time. These keep-alive requests are typically not interesting to include in the analysis, so you will usually want to ignore them. Also, in Azure the load balancer systematically adds a variable named X-ARR-LOG-ID with a different value to every query, generating disturbing noise if you want to analyze the HTTP request query parameters.
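As a rough sketch of the kind of clean-up described above, here is what filtering Always On pings and stripping X-ARR-LOG-ID might look like when pre-processing IIS log rows. The field names follow the IIS W3C log convention, and the `clean_azure_row` helper is hypothetical, not part of the tool:

```python
from urllib.parse import parse_qsl, urlencode

# Hypothetical pre-processing of a parsed IIS log row; the HttpLogBrowser
# performs this internally when the corresponding options are enabled.
def clean_azure_row(row):
    # Azure Always On probes typically identify themselves with the
    # "AlwaysOn" user agent, so they are easy to drop.
    if row.get("cs(User-Agent)") == "AlwaysOn":
        return None  # exclude keep-alive pings from the analysis
    # Strip the X-ARR-LOG-ID variable the Azure load balancer appends,
    # so otherwise-identical queries compare equal again.
    pairs = parse_qsl(row.get("cs-uri-query", ""), keep_blank_values=True)
    row["cs-uri-query"] = urlencode(
        [(k, v) for k, v in pairs if k.upper() != "X-ARR-LOG-ID"]
    )
    return row

row = {"cs-uri-stem": "/index.html",
       "cs-uri-query": "page=2&X-ARR-LOG-ID=0a1b2c3d",
       "cs(User-Agent)": "Mozilla/5.0"}
print(clean_azure_row(row)["cs-uri-query"])  # page=2
```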
Log viewing and filtering
- Ability to easily filter on a field value with a single click. It is also possible to exclude a field value from the filter.
- Ability to go to the next/previous row with a specific field value. This lets you find specific log rows without filtering out the surrounding rows, making it possible to detect event coincidences.
- It’s possible to define custom filter conditions with SQL syntax.
- Fields whose value never changes are removed from the log view and from the statistics and are displayed in a dedicated panel, to lighten the logs and focus on what is changing.
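To illustrate what an SQL-syntax filter condition can express, here is a minimal stand-alone sketch using an in-memory SQLite table. The field names and sample rows are illustrative, not necessarily those the HttpLogBrowser exposes:

```python
import sqlite3

# A few invented log rows: method, path, HTTP status, response time.
rows = [
    ("GET",  "/index.html", 200, 12),
    ("GET",  "/missing",    404, 3),
    ("POST", "/api/login",  500, 250),
]
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (method TEXT, path TEXT, status INT, time_ms INT)")
con.executemany("INSERT INTO log VALUES (?, ?, ?, ?)", rows)

# A custom filter condition, as you might type it in a filter box:
# keep only slow error responses.
condition = "status >= 400 AND time_ms > 100"
matches = con.execute(f"SELECT path FROM log WHERE {condition}").fetchall()
print(matches)  # [('/api/login',)]
```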
Statistics and charts
- Statistics are automatically calculated for every field and a chart can be displayed (pie chart, top 5 or histogram if the field is numeric)
- Statistics are immediately visible in a panel with the top 20 values
- Detailed field statistics can be displayed in a separate window, listing all values together with a time chart showing the evolution of the number of distinct values per period of time, or the evolution of the average value or the sum if the field is numeric.
- Statistics of the URL path field can be switched to tree mode to easily see which parts of your web site have the most activity.
- Ability to extract the web site from the referrer URL. This makes it easy, for example, to exclude the web site itself and see only the first access of each visitor.
- Extract fields from the user agent string (e.g. browser, OS, device). These extracted fields help you, for example, exclude bots from the analysis.
- Extract cookie values into a new field. If the web site uses a specific cookie to remember internet users, this lets you track them and see a user’s browsing history.
- Ability to extract the ASP.NET session ID and the PHP session ID from the cookie field.
- You can specify the tracking field (e.g. client IP or an extracted cookie) and the field values to keep from the first request of a visitor (e.g. the referrer). This makes it easy to filter the web traffic generated by a specific source.
- The day of the week and the hour of the day can be extracted into separate fields so you can easily find when in the week or in the day the web site has the most activity.
- Ability to determine the country of the client IP address
- Ability to determine the host name of the client IP address
- Ability to extract keywords from the referrer for some search engines (Bing for now). This lets you see which keywords users search for to find your web site on search engines.
- Ability to extract the gclid for Google Ads landings when auto-tagging is enabled. If you pay for Google Ads, Google’s automatic tracking system adds a variable named gclid with a unique value when an internet user lands on your web site. This allows you to do a quick, independent analysis of how many internet users come to your web site from paid ads, without using Google Analytics.
- For Microsoft Exchange web applications, ability to extract the ActiveSync command and device. Access to mailboxes on a Microsoft Exchange server essentially goes through IIS web applications, so analyzing the IIS logs of an Exchange server is useful for detecting problems, security problems in particular. This is especially true for outside access through Outlook Web Access, Outlook Anywhere or ActiveSync (smartphones). In ActiveSync queries, the command and the device concerned are stored in variables of the query, so extracting these fields can be valuable in the analysis.
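Several of the extractions above (the ActiveSync command and device, the gclid) come down to reading query-string parameters from the logged request. A minimal illustration in Python, with sample parameter values invented for the example:

```python
from urllib.parse import parse_qs

# Generic query-string extractor: returns the first value of each
# requested parameter, or None when it is absent.
def extract(query_string, names):
    params = parse_qs(query_string)
    return {n: params.get(n, [None])[0] for n in names}

# ActiveSync request as logged by IIS on an Exchange server
# (parameter values invented for the example):
q = "Cmd=Sync&User=alice&DeviceId=ABC123&DeviceType=iPhone"
print(extract(q, ["Cmd", "DeviceId"]))  # {'Cmd': 'Sync', 'DeviceId': 'ABC123'}

# Google Ads landing with auto-tagging enabled (gclid value invented):
q = "utm_source=google&gclid=EAIaIQob_example"
print(extract(q, ["gclid"]))  # {'gclid': 'EAIaIQob_example'}
```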
You may also be interested in features available in the professional edition.