June 8, 2015

When analytics disappoints (and what to do about it) - Technical aspects

The diagnostics for analytics can take many forms, but the challenge lies in knowing where to start troubleshooting. One approach is to narrow down the source of the problem - errors can lie either in the data collection or in the report settings of your analytics solution. Let's look at a few typical occurrences.

Checking data collection and settings

Addressing data collection issues is usually a good place to start. They are symptomatic of technical problems with an analytics code installation. These problems usually appear as sudden changes in data trends in analytics reports - spikes or steep drops - or as changes in tag details, such as mistyped regular expressions.

First, eliminate the basic analytics installation concerns. Verify that the site pages contain the latest supported version of the web analytics solution being used. If a site has not been updated in a while, it is easy to leave features or settings in place that are no longer useful. Analytics providers also sunset features, so for the latest information, follow a well-supported resource from the solution provider.

Next, troubleshoot to confirm that the analytics page tag is installed correctly. You can always look manually at the source code of a webpage to see if the tag is there. No code means no data about the page is being reported to the analytics solution. Developers typically add the main analytics code near the top or bottom of a webpage.
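For instance, a Universal Analytics installation appears in the page source as the snippet below, so a quick search for "analytics.js" or "ga(" confirms its presence. The UA-XXXXX-Y property ID is a placeholder; other solutions have their own recognizable snippets:

    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-XXXXX-Y', 'auto');  // placeholder property ID
    ga('send', 'pageview');
    </script>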

A manual search of a page for an analytics tag is acceptable for simple sites with a page or two; for sites with a large number of pages, a few tools are available to make verification easier. One easy tool is the notification alert. Located in the upper right of the screen, the notification alert reflects what should be corrected in the settings. (You can learn more about each indicator in this Zimana post.)

A world of debuggers and tag checkers

Given the steady onrush of JavaScript, coupled with design configurations that increasingly account for mobile, a bevy of debuggers are available to manage code-related issues. Among the most popular is WASP, a debugger created by Stephane Hamel of Cardinal Path and considered the gold standard for checkers. It was launched in 2006 and has steadily been refined over the years; a WASP Chrome plugin is now available.

There is also Tag Assistant, a Google Chrome plugin that works with Google Tag Manager and other enterprise-level tagging, as well as Google Analytics.

You can also use a web development JavaScript checker such as JSLint to check whether another JavaScript file being called on an inspected page interferes with the tag. This requires pasting the code into the checker.


Other tools that can help with issues regarding tags and cookies include Firebug (which works mainly with Firefox; Firebug Lite, a version for Google Chrome, is available) and Google Chrome DevTools. These can focus on potential issues that affect how the website code is being interpreted in the browser.
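For a quick spot check, the DevTools console can confirm that the tracking object actually loaded. This is a minimal sketch, assuming a Universal Analytics (analytics.js) installation, whose snippet defines a global ga function:

    // Run in the browser console: is the analytics.js tracker defined?
    if (typeof window.ga === 'function') {
      console.log('analytics.js tracker is present');
    } else {
      console.log('tracker missing - the tag may not have loaded or fired');
    }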

Tools are helpful for verifying how tags are fired when webpages are rendered. Packet sniffers come in handy for webpage render inspections, revealing, for example, where bots crawled without executing a JavaScript call. Fiddler and Charles are the best-known packet sniffers. In general, if the data volume seems lower than usual, inspect any design changes that impact data collection.

You can also record how data is being collected - Fiddler and Charles have recording features, which show the server calls made for site elements when a webpage is rendered during a session.

Reviewing what tags have been added to a site can help highlight which tags should be in place at a website's launch, and what has impacted the data collected from the site.

Confirm Report Settings

Some errors are associated with misconfigured report settings. The result for analytics users is trends that do not make intuitive sense. These errors are usually consistent - examining different time periods or reviewing dimensions can reveal hints about what may be going on.

1. Verify that data is appearing in the report

Use real-time features to quickly confirm that data is being received. Most analytics reports take up to 24 hours to display results online, but many sub-features can be available within a few hours, depending on site visit volume.
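One way to generate a hit you can watch for in the real-time report is to fire a test pageview from the browser console. This again assumes an analytics.js installation; the /real-time-test path is an arbitrary label chosen for this example:

    // Send a test pageview, then watch for /real-time-test in the real-time report
    ga('send', 'pageview', '/real-time-test');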

2. Note misalignment of goals

Know how your site elements are related to conversion goals. That relationship is the core of what should be provided - apps, information, and the social media that complements it.

3. Review the environment in which reporting is issued

This connects to number 2 - review what conditions are being created. Is there a period in which a spike occurred? What marketing activity occurred? Most of the details may not tell you why a spike happened, but they help you focus on possible next steps to correct the problem.

4. Examine the syntax of filters to make sure they are functioning

Filters sort the data that is reported through the use of regular expressions - text characters that match characters in a sequence. These are tricky because writing one takes imagining the text pattern being filtered, and planning a regular expression means accounting for ongoing anomalies. Thus user error or an overlooked sequence can occur.
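Here is a minimal sketch of an overlooked sequence, using a hypothetical filter pattern and sample paths - running samples like these through a console or a regex checker exposes the gap:

    // Hypothetical filter: match landing pages under /offers/
    var pattern = /^\/offers\/[a-z0-9-]+$/;

    var samplePaths = [
      '/offers/spring-sale',      // matches
      '/offers/Spring-Sale',      // misses: uppercase is not in the character class
      '/offers/spring-sale?x=1'   // misses: the query string breaks the $ anchor
    ];

    samplePaths.forEach(function (path) {
      console.log(path + ' -> ' + pattern.test(path));
    });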

There are a few tools available for verifying the validity of regular expressions - RegexPal is an open source checker, though you may have to check whether your analytics platform's regular expression flavor is supported.

This is also a good reminder to keep an untouched profile. For example, Google suggests maintaining an unfiltered analytics profile alongside one that contains filters. Doing so permits quick identification of data anomalies and discrepancies.

5. Journal any changes made by admins

Mark each update or setting change with an annotation in Google Analytics. Doing so helps everyone understand the history of what was added to a website or whether a setting was changed. This can eliminate choices that have been tried before and maintain focus on what can be corrected.

6. Note data changes greater than 10% as a starting point for diagnostics

This is more of a judgment call, but swings in data of more than 10% can indicate a problem in data collection or a difference in a trend. This guide can help zero in on areas where questions can be asked and action taken. Results in E-commerce and AdWords reports, for example, are usually based on traffic that clicked tagged URLs - like a paid search ad URL or a landing page for a commerce offer. Because these clicks rely on multiple tags or a complex site structure, like a subdirectory, click losses greater than 10% can indicate a tag-firing concern.
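As a back-of-the-envelope check, this sketch walks a series of daily session counts (made-up numbers) and flags any day-over-day swing beyond the 10% threshold:

    // Hypothetical daily session counts; flag day-over-day swings over 10%
    var dailySessions = [1200, 1180, 1250, 700, 1220];

    for (var i = 1; i < dailySessions.length; i++) {
      var change = (dailySessions[i] - dailySessions[i - 1]) / dailySessions[i - 1];
      if (Math.abs(change) > 0.10) {
        console.log('Day ' + i + ': ' + (change * 100).toFixed(1) + '% change - investigate');
      }
    }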

7. Conduct periodic team review of report status and concerns

The best diagnostic step for maintaining the report settings is to plan an occasional review with the team to see how the reporting is used. This breaks silos - the imaginary limits of any one person's knowledge of what is occurring on a website. The end result is the right result: getting team members to communicate.
