Monday, August 27, 2012

MOSS Usage Reports explained


MOSS Usage Reports explained provides very good insight into the data behind each parameter on the Site Collection Usage Summary page.

Mark Arend has written an excellent post with detailed descriptions of the parameters displayed in MOSS 2007’s usage reports on pages such as SpUsageWeb.aspx (Site), SpUsageSite.aspx (Site Collection), SpUsageWebTopPages.aspx (Site), SpUsageSiteTopPages.aspx (Site Collection) and so on.

There are two report pages that are extremely useful, particularly for smaller sites, that cannot be reached through the GUI in MOSS 2007. They actually come from the underlying WSS system, and MOSS inexplicably omits any direct reference to them in the administration pages.

  • /_layouts/usage.aspx - This page pulls data from the content database: the total hits for the page from ALL locations.
  • /_layouts/usageDetails.aspx - This page pulls data from ‘SharedServices_DB’, which is processed through multiple SQL table views and stored. It returns only the hits to the page FROM a specific site collection.

In short, the data on the ‘usagedetails.aspx’ page is calculated for any hit (success or failure) to the location, whereas the data on the ‘spUsageSite.aspx’ page shows which page was accessed (and the number of times it was accessed, in the pie chart) FROM the site collection.

These are the definitions used by WSS and MOSS in summary usage reports (which are stored in the web metainfo):


  • Visit: a total hit that does not come from within the same server; that is, it either has no Referrer header, or it has a Referrer header from another server
  • Total hit: any hit that gets logged in the WSS HTTP logs (hits that result in HTTP error results, or hits to the _layouts directory, are not logged)
  • Hit: any total hit except hits on files with these extensions: "gif", "jpg", "png", "bmp", "css", "mid", "wav", "ico", "xml", "au", "js", "class"
  • Request: requests always measure page views, not all HTTP requests for individual items such as images, style sheets, etc.
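The definitions above form a simple hierarchy: everything starts from a logged request, total hits exclude errors and _layouts, hits further exclude resource files, and visits are total hits arriving from outside the server. A minimal sketch in Python, assuming hypothetical field names (URL, HTTP status, Referrer, server host), not actual WSS code:

```python
# Illustrative classification of log entries per the summary-report
# definitions above. Field names and signatures are hypothetical.

EXCLUDED_EXTENSIONS = {"gif", "jpg", "png", "bmp", "css", "mid", "wav",
                       "ico", "xml", "au", "js", "class"}

def is_total_hit(url, status):
    """A total hit: a logged request that did not error and is not under _layouts."""
    return status < 400 and "/_layouts/" not in url.lower()

def is_hit(url, status):
    """A hit: a total hit whose file extension is not in the excluded list."""
    if not is_total_hit(url, status):
        return False
    ext = url.rsplit(".", 1)[-1].lower() if "." in url else ""
    return ext not in EXCLUDED_EXTENSIONS

def is_visit(url, status, referrer, server_host):
    """A visit: a total hit with no Referrer, or a Referrer from another server."""
    if not is_total_hit(url, status):
        return False
    return referrer is None or server_host not in referrer
```

So a request for a page counts as a hit and possibly a visit, while the images and style sheets it loads count only as total hits.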

How Usage Analysis works

All WFEs behave in the same way as long as the Windows SharePoint Web Services service is running on each WFE. HTTP data from each WFE is collected and stored locally on disk. The method and process by which this data is persisted on disk is described in Usage Event Logging in Windows SharePoint Services 3.0. This behavior is the same for all WFEs.
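The collection step can be pictured as each WFE independently appending request records to its own local, per-web-application, per-day log file. A hypothetical sketch; the folder layout mirrors what is described below, but the record format here is illustrative, not the actual WSS log format (which the referenced article documents):

```python
# Hypothetical per-WFE usage logging: one folder per web application,
# one file per day, written to local disk on this front end only.
# The tab-separated record format is an assumption for illustration.
import os
import datetime

def log_request(log_root, web_app_id, user, url):
    folder = os.path.join(log_root, web_app_id)
    os.makedirs(folder, exist_ok=True)
    path = os.path.join(folder, datetime.date.today().isoformat() + ".log")
    with open(path, "a", encoding="utf-8") as f:
        f.write("\t".join([datetime.datetime.now().isoformat(), user, url]) + "\n")
    return path
```

The point of the sketch is the topology: nothing is centralised at write time; a timer job later gathers these per-WFE files back into the database.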

How this data makes it back to the database differs between WSS and MOSS:


  • In WSS, a timer job called Usage Analysis runs on each WFE and is responsible for parsing the usage log files and updating the information in the site’s content DB.
  • In MOSS, a timer job called Office SharePoint Usage Analytics Log Import runs on each WFE and is responsible for parsing the usage log files and uploading this data to the SSP DB (for SSPs that have usage reporting turned on).


  1. The Office SharePoint Usage Analytics Log Processing job is responsible for parsing and populating the usage report data in the SSP DB’s analytics tables (that use the ANL prefix)
  2. It runs every 15 minutes to check whether there is new data imported (from the Office SharePoint Usage Analytics Log Import job), so that the reports stay up to date
  3. It also expires detailed data (kept for only 30 days) and report data (kept for 365 days)
  4. Windows SharePoint Services 3.0 generates usage event logs daily for each Web application when 'Enable logging' is selected on the Usage Analysis Processing page in SharePoint Central Administration
  5. When logging is enabled, Windows SharePoint Services by default creates log files in the 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Logs' path on the front-end Web server, although you can specify an alternative location
  6. The Logs directory contains a folder for each Web application on the Web server, each named with a GUID that identifies the respective Web application
  7. Windows SharePoint Services 3.0 inserts an ampersand (&) between the top-level site URL and the subsite URL when it processes the log files
  8. This marks the log file as "processed" and prevents data from being counted twice if the usage processing job is accidentally run again on the same day
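Two of the steps above lend themselves to a small sketch: skipping entries already marked as processed (the ampersand inserted between the top-level site URL and the subsite URL), and expiring data past the stated retention windows. This is assumed behaviour reconstructed from the description, not the actual job code; the row shape is hypothetical:

```python
# Illustrative sketch of the processing job's "already processed" check and
# its retention pass. Constants come from step 3 above; everything else is
# an assumption for illustration.
import datetime

DETAIL_RETENTION_DAYS = 30   # detailed data retention (step 3)
REPORT_RETENTION_DAYS = 365  # report data retention (step 3)

def is_processed(entry_url):
    # After processing, an entry URL reads like
    # 'http://server/sites/top&/subsite/page.aspx' (step 7),
    # so the job can skip it on an accidental re-run (step 8).
    return "&" in entry_url

def expire(rows, today, retention_days):
    # Keep only rows whose date falls inside the retention window.
    cutoff = today - datetime.timedelta(days=retention_days)
    return [row for row in rows if row["date"] >= cutoff]
```

The ampersand thus acts as an idempotency marker: re-running the import on the same day sees the mutated URLs and does not double-count the data.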

1 comment:

  1. Are you aware of a way to generate site collection usage information via a command line or script?
