There are 7 big marketing mistakes agencies make when delivering reporting to corporate clients. I’m excluding small businesses here. The small business has a very different perspective and shouldn’t normally use strategies suited to large corporates.
Now of course many of these mistakes aren’t deliberate. They are probably accidental – but that doesn’t stop them hurting both your agency and the client.
1 Regarding analytics as separable from technical skills
Many people see analytics and data as separate skills from web development. Analytics may bore the brilliant web developer. Or they might be no good at it.
And the analyst might be no good as a programmer.
But knowledge of web development and programming still matters for the analytics expert.
Why? Because modern analytics relies completely on data gathering facilitated by web technologies.
And it’s not just the technologies you might expect.
So the analyst should understand the crucial role of the scripting languages that help produce the HTML output. For instance, scripting languages have looping constructs that can produce duplicate elements within the same page. These duplicates destroy the uniqueness a naive analyst might be relying on.
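A minimal sketch of the problem, using a made-up product listing: a template loop emits the same `id` on every iteration, so an analyst who assumed that `id` identified one unique element would match several.

```javascript
// Hypothetical product-listing loop, as a template engine might emit it.
// Each iteration reuses the same id, so "id" is no longer unique on the page.
const products = ["Anvil", "Rocket", "Magnet"];
const html = products
  .map(p => `<div id="product-card"><button class="buy">Buy ${p}</button></div>`)
  .join("\n");

// Count how many times the supposedly unique id actually appears.
const idCount = (html.match(/id="product-card"/g) || []).length;
console.log(idCount); // 3 - an analyst keying on this id would match three elements
```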
Understanding the network traces in the browser’s developer tools is invaluable. They help diagnose the stubborn measurement errors. The analytics expert who can work with these is really useful.
There are other technologies that have a crucial impact on data gathering. Cascading Style Sheets (CSS) are another important mechanism. Tag managers like Google Tag Manager use CSS selectors to try to retrofit data collection onto existing web pages.
Google Tag Manager can work well, but development expertise is still needed. For example, the single page application jeopardises one of the fundamental tenets of web analytics: it uses a single page address (seen in the browser address bar) for many steps. This makes it hard for the analyst to identify the end of step 1 and the start of step 2. Developers often need to install extra code to sort this out.
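The extra code usually takes the form of "virtual pageview" events. A minimal sketch, with hypothetical event and step names: the real URL never changes, so each step change pushes a synthetic page path into the data layer for the tag manager to report on.

```javascript
// Sketch of the developer fix for a single page application: the URL stays
// constant, so each step change pushes a "virtual pageview" into the data
// layer (event and path names here are illustrative, not a GTM standard).
const dataLayer = [];

function recordStep(stepName) {
  // A synthetic page path gives the analyst a distinct row per step.
  dataLayer.push({ event: "virtualPageview", virtualPagePath: `/checkout/${stepName}` });
}

recordStep("basket");
recordStep("payment");

console.log(dataLayer.length); // 2 distinct steps, despite one real page address
```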
2 Making do with limited in-house expertise
Is it harder than it should be?
Many of those working in marketing understand the concepts of analytics. The marketer may be able to interpret many items as an expert analyst would. For example, looking at different channels or at navigation paths isn’t hard. The challenge is that limited knowledge may well contribute to marketing mistakes. A deeper appreciation of the principles and technologies underpinning the reports will help. It means the agency can present a richer, more valuable view to the client, and helps avoid the client feeling it has been misled.
It’s like the classic 80/20 Pareto analysis. The generalist may be able to deal with 80% of the analytics. But 80% of the value may lie in the other 20%, and in the underlying assumptions that aren’t as easily understood.
To pick some examples:
It’s fairly easy to install the ecommerce add-ins to Google Analytics. And the reports are relatively easy to read. But how does one know whether the numbers are correct?
One has to start with an external system, and with the rules around currencies, tax and shipping applied by the business. These may not be simple. And it may be hard to reconcile them, particularly if there are time lags in the reporting.
For instance the Google Analytics account could be on US Pacific time, but the internal system could be on a different timezone. A specific transaction could then appear on different dates in the two systems.
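A small illustration with a made-up timestamp: the same transaction, reported in UTC and in US Pacific time, lands on different calendar dates, so a day-by-day reconciliation would never match.

```javascript
// One transaction timestamp (illustrative), formatted in two timezones.
const txn = new Date("2024-03-02T02:30:00Z"); // 02:30 UTC on 2 March

// "en-CA" formats dates as YYYY-MM-DD, which keeps the comparison easy to read.
const utcDate = txn.toLocaleDateString("en-CA", { timeZone: "UTC" });
const pacificDate = txn.toLocaleDateString("en-CA", { timeZone: "America/Los_Angeles" });

console.log(utcDate);     // "2024-03-02"
console.log(pacificDate); // "2024-03-01" - a day earlier in the Pacific-time account
```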
Configuring goal funnels may be easy. But often the shopping cart itself interferes with the goal funnel configuration.
For instance the cart may append parameters to the pages. So one needs a regular expression to segregate wanted from unwanted pages. And regular expressions have a particularly awkward syntax that is easy to get wrong.
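A sketch with hypothetical cart URLs: the cart appends a session parameter, so the "same" funnel step appears under many distinct page paths. One way to handle it is a pattern that matches the path regardless of any query string.

```javascript
// Hypothetical cart URLs: appended parameters split one step across many paths.
const pages = [
  "/cart/checkout",
  "/cart/checkout?sessionid=af31&step=2",
  "/cart/thankyou?sessionid=af31",
  "/cart/abandoned",
];

// Match the checkout path with or without a query string, and nothing else.
const checkoutStep = /^\/cart\/checkout(\?.*)?$/;

const matched = pages.filter(p => checkoutStep.test(p));
console.log(matched.length); // 2 - both checkout variants, no other pages
```

Note how easy this is to get wrong: without the `^` and `$` anchors, `/cart/checkout-help` or a longer path containing the fragment could slip into the funnel step.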
Another example. The analytics may appear to be working correctly. But there may be intermittent failures. It takes more expertise to identify this and sort it out.
So wise agencies will consider obtaining “top up expertise”. This helps where the numbers of complex queries don’t justify an extra member of staff. The result can significantly improve the quality of delivery and value to the client.
3 Not trying to fix a broken infrastructure
Your agency may dream of winning that new big client.
Can you leave the can of worms alone?
Many big corporates have a history of mergers and acquisitions. Each acquired company typically brings websites and a variety of marketing techniques. The individual “child” sites may be re-skinned and modified. They may “appear to fit” into their new family. This may be the easy bit to achieve. It can be much harder to mesh the reporting for all these marketing techniques and sites. For instance each may have a separate Google Analytics account. Achieving a comprehensive overview doesn’t fit well with such a fragmented approach.
One can put a single “rollup” Google Analytics snippet on each website. This might replace or go alongside the existing snippet. Then all the reporting would come into one location. But without further work the visitor numbers to the home page would be wrong. The home page visitors to every site would simply be added together. An option is to insert the hostname as a prefix to the URL in Google Analytics. This stops misleading the reader with inflated numbers.
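The hostname-prefix idea can be sketched as a simple transformation (in practice it is typically done with a view filter or in the tag configuration; the hostnames below are made up):

```javascript
// Sketch: prefix the hostname onto the page path before it is reported,
// so /index.html from two different sites no longer collapses into one row.
function rollupPagePath(hostname, path) {
  return `/${hostname}${path}`;
}

console.log(rollupPagePath("shop.example.com", "/index.html"));
// "/shop.example.com/index.html"
console.log(rollupPagePath("blog.example.com", "/index.html"));
// "/blog.example.com/index.html" - the two home pages now report separately
```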
But a “rollup” snippet can still have vast numbers of pages, as we now have the traffic from all the original accounts in one new account. This could lead to sampling problems and other nasty side effects. The naming conventions for some of the customisations are likely to conflict.
The best solution is for an expert to design the reporting solution. This can bring all the individual children together into a coherent structure. Even then it won’t be easy – and the solution may be awkward to implement.
These problems don’t only afflict big corporates. Sometimes small growing companies can have a can of worms. These come from their “early years”. The business may still have IT systems that are not reliable for the greater scale now required.
The system may not fail completely. Instead the surrounding processes provide evidence of a “hand to mouth” existence. At this point many interrelated issues will affect the growing business. These can have serious implications for your agency.
- Processes and practices may need reform, so that they become more robust and capable for the next phase of growth.
- Systems and code that relied on these processes and practices will need auditing. Bringing them to an acceptable baseline may be hard.
Sometimes the solution will be to start again from scratch. However the biggest problem is likely to be that no-one really knows whether a clean break or a fix is better. And in these circumstances people can swap inconclusive anecdotes. A long struggle can continue as people try and decide what to do for the best.
So the honest view has to be that it’s likely to be a hard slog to sort out the can of worms. But leaving the mess can poison the whole client relationship. And reliable reporting is likely to be a casualty.
4 Trying to do everything through Google Tag Manager (GTM)
Google Tag Manager and other tag management systems can be a really useful mechanism. The business advantage in separating data gathering from the development road map and test cycles is clear. However, whilst GTM can lessen the linkage, it doesn’t break it completely.
So there is one simple non technical message to take away. Tag Managers aren’t foolproof – and because they rely on code, they can be broken by code.
Another problem with GTM can occur from the mismatch of objectives between web developer and analyst.
The web developer’s objective when using CSS is to control the look and feel of the elements. Achieving uniformity across the website is important. So for instance – the web developer wants all buttons to look the same and behave in the same way. This is a very different objective from that of the analyst. The analyst will want to know that button A has been clicked. The analyst must differentiate that button A click clearly from any clicks of button B. This difference of perspective means that the CSS designed for uniformity may be almost useless for the analyst using Google Tag Manager.
Without some revision the analyst using GTM may find themselves trying really convoluted and fragile techniques to identify particular elements. It’s a bit like trying to break into a house by hooking the keys out through a letter box. Fine if the keys are in view, but if they are round the corner on an unseen shelf, almost impossible.
These potential problems with GTM can mean that development changes are the best course of action. They can make an almost impossible situation much, much better.
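One common development change is to give each element its own identifying attribute, separate from the styling. A minimal sketch (the attribute name and button labels are hypothetical): both buttons keep the uniform `btn` class the designer wants, while a dedicated data attribute gives the analyst something unambiguous to trigger on.

```javascript
// Before the change, a GTM click trigger on ".btn" cannot tell A from B.
// After the change, each button carries its own (hypothetical) data attribute.
const buttonA = `<button class="btn" data-analytics-id="request-quote">Request a quote</button>`;
const buttonB = `<button class="btn" data-analytics-id="download-brochure">Download brochure</button>`;

// Pull the identifying attribute out of a button's markup.
function analyticsId(buttonHtml) {
  const m = buttonHtml.match(/data-analytics-id="([^"]+)"/);
  return m ? m[1] : null;
}

console.log(analyticsId(buttonA)); // "request-quote"
console.log(analyticsId(buttonB)); // "download-brochure"
```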
5 Trying to avoid development changes that would enable smart data gathering
Smart data gathering doesn’t just happen. Sometimes we must invent it. I learnt to appreciate this in my early career as a mechanical engineer.
I had a project where I had to work out quality assurance criteria for Ford Motor Company. These criteria had to ensure assembled car suspension arms didn’t fall apart. These arms were fitted onto all the then current Ford Mondeos. A tape measure was useless for determining whether the arm would or wouldn’t stay intact. We needed to find the appropriate technique. In this case we monitored the forces used during assembly at particular points.
The challenges with collecting data from online systems are different – but often need the same kind of creative thinking.
So the approach I would recommend is:
- Identify the processes for which you want to count success and failures, or assess progress.
- Identify each step in the process.
- Create a means of identifying beyond doubt which step we’ve reached.
- Develop criteria to help evaluate whether each step succeeded or failed.
- Design and implement measurements for each step. The data collection must enable us to prove the failure or success of each step.
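The steps above can be sketched in miniature. Here a hypothetical "quote-request" process records an explicit outcome for each step, so a failure can be proved and located rather than inferred:

```javascript
// Sketch of the approach: each step is identified unambiguously and records
// an explicit success or failure (process and step names are illustrative).
const measurements = [];

function recordOutcome(processName, step, succeeded) {
  measurements.push({
    process: processName,
    step,
    outcome: succeeded ? "success" : "failure",
  });
}

recordOutcome("quote-request", 1, true);  // step 1: form displayed
recordOutcome("quote-request", 2, true);  // step 2: form validated
recordOutcome("quote-request", 3, false); // step 3: submission failed

const failures = measurements.filter(m => m.outcome === "failure");
console.log(failures.length); // 1 - and we know exactly which step failed
```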
This probably doesn’t fit with the standard configuration that “came out of the box”.
The standard package can’t cater for all the unique challenges that your agency’s client has. We must design, implement and test particular data gathering techniques to fit that client.
In many cases these techniques may not be particularly challenging. Perhaps we need to count particular button clicks, or assess particular form entries, to create a “world of measurement” that properly supports the client’s business. But they may well require development support to make them happen. The way the Cascading Style Sheets and the code are constructed will determine whether Google Tag Manager is viable.
In some cases the techniques themselves aren’t challenging but they have implications for the rest of the process. The speed with which the website moves a user from one page to another might interrupt a data collection. In these cases review the process design.
6 Not recognising when a report “doesn’t work”
The standard reports that “come in the box” with Google Analytics are certainly comprehensive. But that doesn’t mean that they provide answers to the questions the client has; nor that all of them are relevant.
It’s worth reflecting for a moment how assumptions would change if we didn’t have free or near-free cloud based services. Some 20 years ago, a comprehensive package like Google Analytics would have sold for many thousands of pounds. And the vendor wouldn’t just offer the software; they would have pushed hard to sell consulting services to customise the package to suit the customer’s business. The current world enables Google to cross-subsidise the software supply. But if we aren’t careful the customisation just disappears. The lack of a vendor with a commercial reason to sell it misleads the user into thinking customisation isn’t required.
But a mass market standard offering like Google Analytics cannot cater for a customer’s key business processes. Proper customisation takes time and effort.
Some of the customisation will push valuable information into standard reports that were blank until customisation happened. So for instance there are no goal funnels until someone has configured the goals in the Analytics account. Using this stepwise notion of process and the “goal funnel” can achieve a good deal. It can make the standard package reflect the nuances of the unique businesses agencies work with.
Another pitfall arises when too many dimensions are shown in one report, or the wrong dimension is chosen as the primary index.
For instance I’ve seen attempts to show the various channels that web traffic uses. A mini bar chart shows the varying number of sessions for desktop vs mobile vs tablet device for each marketing channel.
Does this work? No.
The problem here is that the number of sessions varies dramatically by channel, so what 100% represents is dramatically different.
It would be clearer to show how popular each channel was for a particular device type.
The way to clarify this is to recognise the perspective of the web visitor. A visitor doesn’t choose the marketing channel to which they wish to respond and then choose the device they would use. Instead a visitor selects a device. For instance they pick up their mobile and react to information found or searched for. Or they sit at their desktop, and so on. So the likelihood of responding to a particular marketing channel for a given choice of device is probably the clearer and more informative presentation.
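The device-first framing can be sketched with made-up session counts. Because the channel totals differ wildly, percentages within a channel mislead; percentages within a device answer the visitor-shaped question directly:

```javascript
// Illustrative session counts (invented) by channel and device.
const sessions = {
  organic: { desktop: 8000, mobile: 1500 },
  email:   { desktop: 200,  mobile: 300 },
};

// Device-first view: for a given device, how popular is each channel?
function channelShareForDevice(device) {
  const total = Object.values(sessions).reduce((sum, c) => sum + c[device], 0);
  const share = {};
  for (const [channel, counts] of Object.entries(sessions)) {
    share[channel] = Math.round((100 * counts[device]) / total);
  }
  return share;
}

console.log(channelShareForDevice("mobile")); // { organic: 83, email: 17 }
```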
7 Failing to implement proper naming conventions
Naming is surprisingly important. There are two key criteria:
- The names of the events, goals, concepts, stages and processes within the reporting need to be easily understood.
- The naming needs to show the user how to navigate any hierarchy that may exist. This is particularly important when users are searching for the answer to a question and are trying to locate the information that will help them.
In most cases the naming should relate specifically to the client’s business. Even if there are standard terms there are likely to be subtleties about the way a particular term is understood in a particular client. For instance “marketing qualified lead” carries with it a set of assumptions about what it took for the prospect to be “qualified”. How subjective or objective was that process?
There should be a glossary of terms that enables newcomers to the reporting to rapidly learn the new terminology. The stakeholders may well have come into the business from competitors who use the terms in a subtly different way. Or be from areas of the business that don’t share the same assumptions.
The challenge of getting naming conventions implemented is normally more about process than fundamental difficulty. However we have to accept that perfection and complete coherency is unattainable. We are simply aiming for a set of names that fulfil the two criteria above.
The developers or analysts need to have the detailed conversation with the stakeholders. This conversation needs to cover
- how to achieve a robust convention that will accept new names later on without destroying the fundamental logic.
- how to communicate to the user any subtleties arising from the data gathering or processing.
I hope this post about Marketing Mistakes in reporting is useful. I realise it’s long and involved. Please get in touch or comment below if you want.