Saturday, January 16, 2016

Website Tracking - Clickstream Data Collection

To analyze website visitor behavior and the performance of online marketing channels, it is important to understand the available tracking methods and how well each one meets your measurement requirements.


Clickstream Data Collection Types


* Web Logs – log files written by the web server (tool type: log analyzer)
* Web Beacons – 1 x 1 transparent images placed in web pages
* JavaScript Tags – a few standard lines of JavaScript code added to each page (tool type: page tagging tool)
* Packet Sniffing – a layer of software that sits between the visitor and the web server and captures traffic at the network level


Page tagging and web logs are the most popular methods for capturing website visitor data, but both have limitations.


Page Tagging: JavaScript code is pasted on each page of the site, and the data is collected via the visitor's browser. Every time a tagged page is opened, the JavaScript executes and visitor information is collected, typically with the help of a cookie.
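To make the mechanism concrete, here is a minimal sketch (in Python, purely for illustration) of the beacon-style request a page tag assembles from visitor data. The endpoint and parameter names are hypothetical, not any specific vendor's API:

```python
from urllib.parse import urlencode

def build_beacon_url(collect_endpoint, page, visitor_id, screen, referrer=""):
    """Assemble the tracking request a page tag fires to the analytics server."""
    params = {
        "dl": page,          # document location (the tagged page)
        "cid": visitor_id,   # cookie-based visitor id
        "sr": screen,        # screen resolution reported by the browser
        "dr": referrer,      # referring page, if any
    }
    return collect_endpoint + "?" + urlencode(params)

url = build_beacon_url(
    "https://analytics.example.com/collect",   # hypothetical endpoint
    page="https://www.example.com/pricing",
    visitor_id="a1b2c3",
    screen="1920x1080",
)
print(url)
```

In a real tag this URL is requested as an invisible image or XHR from the browser, which is why data collection fails when JavaScript is blocked.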


Advantages of Page Tagging:
1) Tracks in-page events such as JavaScript interactions, Flash and video plays
2) Collects and processes data in near real time
3) Provides client-side information such as browser and version, screen resolution, connection speed, etc.


Disadvantages of Page Tagging:
1) Setup errors lead to loss of data
2) Firewalls and privacy settings can block JavaScript tags
3) Can't track search engine spiders (robots) or bandwidth
4) Unable to directly track non-HTML content such as PDFs and downloads
5) Vendor specific


Web Logs: No JavaScript code is needed for tracking here. The data is collected by the web server independently of the visitor's browser: the server logs every request made to it, including pages, images and PDFs.
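For illustration, here is a minimal Python sketch of parsing one such server log entry, assuming the common Apache/Nginx "combined" log format (adjust the pattern to your own server's configuration):

```python
import re

# Assumed Apache/Nginx "combined" log format: ip, identity, user,
# [timestamp], "request line", status, bytes sent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

# A made-up example entry showing a PDF download being logged.
line = ('203.0.113.7 - - [16/Jan/2016:10:15:32 +0000] '
        '"GET /whitepaper.pdf HTTP/1.1" 200 104857')

m = LOG_PATTERN.match(line)
hit = m.groupdict()
print(hit["path"], hit["status"], hit["bytes"])
# → /whitepaper.pdf 200 104857
```

Because the bytes-sent field is part of every entry, bandwidth and completed downloads can be measured directly from the logs, which page tags cannot do.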


Advantages of Web Log Method:
1) Historical data can be reprocessed easily
2) No firewall issues
3) Can track bandwidth and completed downloads
4) Track search engine spiders (robots)
5) Vendor independent


Disadvantages of Web Log Method:
1) Proxies and caching affect data collection – if a page is served from a cache, no record is logged on your web server.
2) Requires data storage and archiving to be performed by your own team
3) No event tracking
4) Robots inflate visit counts


If you are using ClickTracks, for instance, you may observe significant differences in visit numbers – as much as 20%-30% more visits than another tool reports. The standard session duration is 30 minutes: if a visitor is inactive for more than 30 minutes, the session expires and any further activity is counted as a new visit. Changing the session duration therefore changes the total number of visits, so it is essential to understand the basic definitions used by the tool you rely on to measure your website data.
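The 30-minute rule can be sketched as follows – a Python illustration with made-up timestamps for a single visitor, showing how the same hits yield different visit counts under different session timeouts:

```python
from datetime import datetime, timedelta

def count_visits(timestamps, timeout_minutes=30):
    """Count visits: a gap longer than the timeout starts a new session."""
    timeout = timedelta(minutes=timeout_minutes)
    visits = 0
    last = None
    for ts in sorted(timestamps):
        if last is None or ts - last > timeout:
            visits += 1          # inactivity exceeded the timeout -> new visit
        last = ts
    return visits

# One visitor's page views (hypothetical timestamps).
hits = [datetime(2016, 1, 16, 10, 0),
        datetime(2016, 1, 16, 10, 20),   # 20 min gap -> same session
        datetime(2016, 1, 16, 11, 0)]    # 40 min gap -> session expired

print(count_visits(hits, timeout_minutes=30))  # → 2
print(count_visits(hits, timeout_minutes=45))  # → 1
```

The same three page views count as two visits with a 30-minute timeout but only one visit with a 45-minute timeout, which is exactly why two tools with different session definitions report different totals.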
