This post isn’t about Brexit, and your views on Brexit shouldn’t be a barrier. Nor does it matter whether you regard Dominic Cummings as a hero or villain. There are huge ideas for marketers hidden inside Dominic’s January 2nd 2020 blog post.
Actually, Dominic is assembling ideas rather than generating all of them; other people had the original ideas he draws on. The ideas themselves have nothing to do with politics, so businesses should be rushing to investigate them. Government and policy making is merely another application area.
I have to warn you there isn’t a ready-to-go solution (oven ready or not). But there is a hugely valuable conversation that’s worth starting. Watch the video and drop your details in the box below – I’ve got a number of ideas for improvement and would like to start that chat.
Making sure your agency reporting contains compelling stories that engage your client and build relationships is vital. It’ll help you upsell the next project. The following ideas are designed to help you do this.
1) Map the reporting onto the client’s organisation.
Any global corporate has a complex structure. There are many regions, and product or service groups. Some of these will have come from past mergers and acquisitions.
So the individuals within the corporate have very different responsibilities and perspectives. And the reporting should reflect that.
Matching reports to responsibilities
Readers ignore reports that don’t help them deal with their own responsibilities. The challenge for an agency is to provide reports that every reader values highly.
Restricting the reporting to a small number of individuals is the wrong answer here. It deprives your agency of one of the most powerful marketing and sales aids it has (see below).
Each report should help specific roles within the corporate. It should answer their particular questions. This is far from simple. It is hard to discover all the people and roles who should consume information from reports. These roles should not just lie in marketing. They should extend into sales, support, finance and elsewhere.
This role identification then permits the design of meaningful reports. These reports should answer the key questions that confront each role.
As you will guess, the total programme here isn’t trivial or short. It is likely to stretch over months. Benefits will grow as the reporting gets more advanced and serves more roles.
This programme can transform the relationship between agency and client. It has the potential to improve the quality of decision making across the corporate.
A trivial example would be to configure the views within Google Analytics as follows.
The modern global corporate has a complex structure and will have many product groups. This structure forms the basis of the roles – and the formal hierarchy. The websites will recognise this structure, even if poorly. So there are opportunities to customise the analytics package, creating custom reports.
But beyond this example lies huge opportunity. One can customise Google Analytics to reflect opportunities that your client’s business faces.
2) Simulate to improve the quality and mitigate the risks of reporting failure.
Agencies often recommend a new website or radical changes to the existing site. Perhaps the data gathering isn’t working well and requires radical change.
Testing the data gathering before go live dramatically reduces the risks. Marketing often push for Go Live to hit a seasonal opportunity and time campaigns to suit. If the reporting on a shopping cart fails then the business could be flying blind for a period. The business would be unable to judge what’s working or what’s failing. Simple errors in the data gathering and measurement infrastructure can lead to this. So tests before the crucial period can be invaluable.
There’s no reason to avoid testing the measurement infrastructure: the usability and features of the website would be tested before “go live”, so why not the measurement too? But we need to simulate visitors, because it isn’t economic to do this testing with human (live) visitors.
But beyond the risk of technical failure lie even more important considerations.
Dealing with the political risk
The proponent of radical change takes a considerable political risk. Particularly when it involves the data collection and analytics for a big corporate.
The customer inside the corporate must weigh the personal risk to their career. Often they will feel the risks to reputation or prospects are just too high. Many people within the corporate may have quietly accepted the existing situation, accepting that the “reporting just doesn’t help”. These people may want a quiet life and want to avoid any change. Whilst others, who like the current analytics setup, could be seriously upset.
The roles adopted are like those of the prophet and an unwilling prospective follower. “If only the follower would accept the prophet’s message”. If they did, “the world would become a happier place”. But the cautious “follower” is torn between the sales message and fear. What if it all goes wrong? Fear of adverse consequences in a corporate scenario is a powerful motivator.
Simulation is the answer.
The best answer to help everyone is a well developed simulation. We use a “robot army” of visitors to prove in the safe “sandbox” that the proposed reporting system works. The simulation can prove the case for change at minimal risk. Adjustments can be made to deal with challenges or objections. The simulation doesn’t affect the live infrastructure or reporting regime.
The sceptical client can see that the change works for them and any stakeholders. And the client also can judge that the reporting improvements are worth any disruption. Any disruption can be explored and steps put in place to mitigate this.
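A simulation of this kind can start very small. Here is a minimal sketch of a “robot visitor” in JavaScript: it replays a scripted journey and records the measurement hit each step would send to a sandbox property. The journey, hit shape and function names are all illustrative, not a real analytics API.

```javascript
// A scripted journey the robot visitor will replay in the sandbox.
const journey = ["/", "/product/anvil", "/cart", "/checkout"];
const hitsSent = [];

// Stand-in for an HTTP call to a sandbox analytics property:
// here we simply record what would have been sent.
function sendHit(page) {
  hitsSent.push({ type: "pageview", page });
}

// Replay the journey step by step.
for (const page of journey) sendHit(page);

// Check that no step of the journey was lost in measurement.
console.log(hitsSent.length === journey.length); // true
```

A real simulation would drive a headless browser through a staging copy of the site, then compare the hits received against the journey script, flagging any step that failed to record.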
3) Create reports that use the full richness of the web, not just a virtual billboard
Dramatic advances in technology haven’t really affected the common online “dashboard”. It’s still stuck in a timewarp. It often shows:
A single set of pictures to all consumers.
There is no way for consumers to comment or otherwise engage with the “broadcaster”.
This is a bit weird. There isn’t any audio or video, and even text is rare. This strangles rich online communication amongst agency experts and client staff at birth.
The sticking plaster is the regular client meeting. Now this certainly allows the client to provide all sorts of feedback. But this has serious disadvantages.
Explaining what is being shown takes time.
Those who can’t make the meeting can’t hear the explanation or contribute.
Follow up doesn’t happen “in context”. It typically relies on email. Follow up can get lost amongst a mass of other messages. The link between the follow up and the original information on the reports is weak. And those who aren’t included on the email don’t see it.
Now there are some conspiratorial reasons for restricting the reporting to this limited set of features – but that’s covered in my book (see elsewhere for details)….
4) Use a distribution system to build and enhance relationships
The typical agency and client relationship relies on a small number of contacts. Your agency will prosper if many client personnel want to engage with it. You can stimulate this urge.
Many corporate staff have little or no information to help predict forward workload. The marketing agency has information that could make a serious difference here. By providing this you can shatter perceptions about the value of marketing. This would help both the agency and its marketing contacts.
Marketing that sees its responsibility as only gaining leads is worth much less to a client.
But if marketing tries to improve the entire customer journey, works to increase client revenue and profit, and provides real value to the client, then the scope is huge.
Marketing often facilitates the website(s), customer communication and support systems. Post-sale support staff could use data about usage of these systems. These aren’t the traditional contents of a marketing report pack. But they can be really valuable.
5) Weaponise reporting to differentiate your agency and repel potential competitors
The Business Marketing Club’s B2B Barometer 2018 survey was interesting. 67% of the larger B2B agencies wanted to improve the marketing of their agency.
Reporting is normally a contractual obligation. The agency has to do it. But why should your agency accept a tickbox approach? Why adopt a hurdle criterion of just “making sure the reporting is adequate”? Should your agency limit its ambition in that way? What about considering how reporting provides a means to differentiate your agency?
Your agency could expand the scope of its reporting for mutual benefit.
Your agency could expand the type of reporting. It could provide more expert insight and build closer relationships.
This could frustrate anyone suggesting your agency is like all the others.
Marketing struggles to get the C-suite to see it as important. It can struggle to get budget and approval for the next campaign. This puts the focus, quite rightly, on the history of wins. Digital marketing agencies supply reporting, and this is a key part of that history. So clear reporting really helps the Chief Marketing Officer (CMO). It helps them win friends and carry the boardroom.
Buyers like retendering to cut costs.
Does your agency have a good defensive strategy?
How do you differentiate?
How do you make the pain of disconnection real?
Are you making it almost impossible to compare your agency with others?
Are you building networks across your clients?
Do clients value the relationships with your agency experts?
Do clients value the reporting?
Relationships are fragile. Corporate marketing staff certainly leave and move regularly. A large, robust network of influencers is crucial. Report distribution makes it easier to build one. Creating these networks within your key accounts is a useful insurance policy. It’ll help retain clients long term.
But adopting the tactics and strategies that most agencies adopt doesn’t help. You need to be different.
Why SaaS hurts digital marketing agencies
Self service helps SaaS vendors grow quickly. Some side effects of SaaS are subtle and nasty. Particularly when the service is marketing reporting. Someone still has to create the reports that the SaaS vendor dodged. If your agency takes this on then you battle with:
A workforce that isn’t excited by another drag-and-drop editor, nor by another series of repetitive tasks.
Features that are constrained by the need to be self-service. Why does that matter? A flexible data processing system is pretty abstract, and marketing users may be poor at using an abstract model of this kind. Data analysts and programmers are much better.
A vendor who acts like a truant. The vendors are remote from end customers. That severely limits the learning vendors gain. The devil is in the detail of the data. It’s hard to produce good marketing reports.
Other side effects of SaaS
Poor incentives. Product innovation tends to be dominated by technical capability push. And the influence of market pull is minimised. The product features aren’t informed by the end user of the marketing reports. So new features are created “because we can”. They are not created “because the end user wanted or valued it”.
Often systems simply re-display the data available from online systems. There is little or no transformation. This makes creating high value information much harder. Sometimes it’s impossible. This means persuasive stories are hard to tell.
Formal requirements documentation is rare; monthly subscription services have discouraged it. When systems cost many £10,000s, formal requirements were essential, and consultancy support was a natural part of the purchase. The buying organisation would ensure the purchased facilities were “fit for purpose”, both in the medium and longer term.
Long term flexibility to deal with your evolving customer needs isn’t assured.
Standard distribution tactics won’t differentiate your digital marketing agency.
Why not create a situation where procurement finds it difficult to swap your agency out?
A subversive approach for agencies
Why not use marketing dashboards to build relationships across the client?
But not just any kind of standard dashboard. If senior management view the dashboard, high-quality reporting is required, and you must “tailor” the message skilfully to suit each recipient. The dashboard should quickly show “what they should care about and why”. There is a premium on quick access and clarity: everyone wants information to be easily assimilated.
The prize ought to be obvious. Any agency wants relationships built and value appreciated. Most importantly by the movers and shakers across their corporate client. This moves your digital agency far away from being a “swappable commodity”.
It becomes harder for procurement to cast your agency aside. Particularly if the only gain is a small reduction in cost.
What kind of reporting do you need?
You’ll need a reporting system that:
Uses organisational context to select the report variant to show.
Enables controlled distribution to differing roles across the corporate client.
Shares compelling stories.
Uses any of the rich media that the web allows.
So what should Digital Marketing Agencies do?
Digital marketing agencies ought to look at solutions that are far from “minimal”, and a long way from the Excel-plus-email mindset.
So a disciplined approach should see the reporting system as a strategic asset. And build on this investment in the long term.
A manufactured product would be introduced via:
Smaller scale usage.
Expanded usage – driven by benefit and organisational fit.
A long-term programme that started small, then expanded to realise major benefits to all concerned.
Clients would support this investment as they realised the advantages from higher quality reporting.
This is a radical approach. It would transform the relationship between marketing agencies and their clients.
There are 7 marketing mistakes agencies make when reporting to corporate clients. Many of these mistakes are accidental. But accidental marketing mistakes can still hurt both your agency and the client.
1. Separating analytics from technical skills.
Many people see analytics as a separate skill from web development. Analytics may bore the brilliant web developer. The developer might be no good at analytics. And the analyst might be no good as a programmer.
Web technologies and programming knowledge still matter for the analytics expert. Why? The analyst is using data gathering that requires web technologies. It’s a marketing mistake to try and separate them. And it’s not just the web technologies you might expect.
Your analyst should understand the crucial role of the web scripting languages, as scripts help produce the HTML output. The scripts can include loops, and loops can produce duplicates on the same page. These duplicates destroy the uniqueness a naive analyst might be relying on.
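A toy sketch of the problem, using a hypothetical template loop: each iteration emits the same “unique” id, so the markup ends up with duplicates.

```javascript
// A naive template loop: every product card reuses the same id.
const products = ["alpha", "beta", "gamma"];

const html = products
  .map(name => `<div id="buy-button" class="cta">Buy ${name}</div>`)
  .join("\n");

// Count how often the supposedly unique id appears in the markup.
const matches = html.match(/id="buy-button"/g) || [];
console.log(matches.length); // 3: the id is no longer unique
```

Any analytics trigger that selects on that id will now fire ambiguously, or only on the first match.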
Understanding a browser page load’s network traffic is invaluable, as it lets the analyst diagnose stubborn measurement errors. The analytics expert who can work with these is much more capable.
Other web technologies also affect the data gathering. Google Tag Manager uses Cascading Style Sheets (CSS) selectors, as CSS allows people to retrofit data collection to existing web pages.
Google Tag Manager works well, but development expertise is still needed. For example a single page application breaks web analytics assumptions. It uses a single page address (seen in the browser address bar) for many steps. This makes it hard for the analyst to know where step 1 ends, and step 2 starts. Developers often need to install extra code to sort this out.
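One common fix, sketched below with assumed event and key names: the single page application’s router pushes a “virtual pageview” into the data layer on every in-app navigation, so a GTM trigger can fire a pageview tag. Adapt the names to whatever your container actually uses.

```javascript
// Stand-in for the global dataLayer array GTM reads from.
const dataLayer = [];

// Called by the SPA router after each in-app navigation.
function trackVirtualPageview(path, title) {
  dataLayer.push({
    event: "virtual_pageview", // custom event a GTM trigger listens for
    page_path: path,
    page_title: title,
  });
}

// Example: the user moves to checkout step 2 without a full page load.
trackVirtualPageview("/checkout/step-2", "Checkout: Delivery");
console.log(dataLayer[0].page_path); // "/checkout/step-2"
```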
2. Making do with limited in house expertise
Many marketing professionals understand the concepts of analytics. The marketer may be able to interpret many items as an expert analyst would. For example, looking at different channels and bounce rates isn’t hard.
But limited knowledge contributes to marketing mistakes. A deeper appreciation of the principles and technologies underpinning the reports helps. An agency can use this to provide more value to its clients. This will help avoid any client feeling shortchanged.
It’s like the classic 80/20 Pareto analysis. The generalist may be able to deal with 80% of the analytics. But 80% of the value may lie in the other 20% and in explaining the subtleties.
To pick some examples.
It’s fairly easy to install the eCommerce addins to Google Analytics. And the reports are relatively easy to read. But how does one know whether the numbers are correct?
One has to start with an external system. And the rules around currencies, tax and shipping applied by the business. These may be complex. The outputs may be hard to reconcile.
There may be time zone differences. The Google Analytics account could be in the US Pacific timezone and the external system is in US Central timezone. A specific transaction could appear on different dates in the two systems.
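A quick way to see the effect: take one transaction instant and format it under both account timezones. The snippet below uses JavaScript’s built-in Intl support; the timestamp is made up.

```javascript
// One purchase instant, recorded in UTC.
const purchase = new Date("2024-03-02T07:00:00Z");

// Format that instant as a calendar date in a given timezone.
const dateIn = tz =>
  new Intl.DateTimeFormat("en-CA", { timeZone: tz, dateStyle: "short" })
    .format(purchase);

console.log(dateIn("America/Los_Angeles")); // "2024-03-01" (Pacific)
console.log(dateIn("America/Chicago"));     // "2024-03-02" (Central)
```

The same transaction lands on different calendar dates in the two systems, so a day-by-day reconciliation will show spurious mismatches around midnight.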
Sometimes the shopping cart makes the goal funnel configuration hard.
For instance shopping carts may append parameters to the pages. These parameters can require regular expressions to sort things out. But sorting can be hard with the awkward syntax of regular expressions.
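A small illustration with a made-up cart URL: one regular expression strips the appended parameters so funnel steps match cleanly, and another shows the kind of pattern a goal-funnel regex field might need.

```javascript
// Hypothetical cart URL with session parameters appended.
const raw = "/cart/checkout?sessionid=a91f&step=payment";

// Canonicalise: keep the path, drop everything from "?" onwards.
const canonical = raw.replace(/\?.*$/, "");
console.log(canonical); // "/cart/checkout"

// Or match the step with an optional-query pattern, as a GA
// regex match-type field might require.
const funnelStep = /^\/cart\/checkout(\?.*)?$/;
console.log(funnelStep.test(raw)); // true
```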
Another example. The analytics may appear to be working correctly. But there may be intermittent failures in the reports. It takes more expertise to identify this and sort it out.
Wise agencies may seek “top up expertise”. This helps where the numbers of complex queries don’t justify an extra member of staff. The result can significantly improve the quality and value delivered to the client.
3. Limping on with a broken infrastructure
Your agency may dream of winning that new big client. Can you leave the can of worms alone? To do so can be one of the biggest marketing mistakes.
Many big corporates have a history of mergers and acquisitions. Each acquired company has websites and different marketing techniques. The individual “child” sites may be re-skinned and modified. They may “appear to fit” into their new family. This may be the easy bit to achieve.
It can be much harder to mesh the reporting for all the websites. For instance each may have a separate google analytics account. Achieving a comprehensive overview is hard with a fragmented approach.
One can put a single “rollup” Google Analytics snippet on each website. This might replace or go alongside the existing snippet. Rollup accounts have all the reporting in one location. But without careful configuration the home page visitor numbers will be wrong. The home page visitors to every site would simply be added together. An option is to insert the hostname as a prefix to the URL in Google Analytics. This stops misleading the reader with inflated numbers.
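The hostname-prefix trick can be sketched in a few lines. In practice this would be done with a GA view filter or by setting the page field in the tracking snippet; the function below just shows the transformation.

```javascript
// Prefix each page path with its hostname so rollup reports keep
// the individual sites distinct.
function rollupPagePath(hostname, path) {
  return "/" + hostname + path;
}

// Two different home pages no longer collapse into a single "/" row.
console.log(rollupPagePath("shop.example.com", "/")); // "/shop.example.com/"
console.log(rollupPagePath("blog.example.com", "/")); // "/blog.example.com/"
```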
But a “rollup” snippet can still leave vast numbers of pages, as we now have the traffic from all the original accounts in one new account. This could lead to sampling problems and other nasty side effects. The naming conventions for some of the customisations are likely to conflict.
The best solution is for an expert to design the reporting solution. This can bring all the individual children together into a coherent structure. Even then it won’t be easy. The solution may be awkward to implement.
These problems don’t only afflict big corporates. Sometimes small growing companies can have a can of worms. These come from their “early years”. The growing business may have IT systems that can’t support the greater scale.
The system may not fail completely. Instead the surrounding processes provide evidence of a “hand to mouth” existence. At this point many interrelated issues will affect the growing business. These can have serious implications for your agency.
Processes and practices may need reform to become more robust and support growth.
Systems and code that relied on these processes and practices will need auditing. Bringing them to an acceptable baseline may be hard.
Sometimes people will want to start again. However serious marketing mistakes arise if evidence isn’t gathered. Swapping inconclusive anecdotes doesn’t help good decision making. A well founded decision for a fresh start or a fix is required.
It may be a hard slog to sort out the mess. But leaving it can poison the whole client relationship.
4. Trying to use Google Tag Manager (GTM) to avoid any development involvement.
Google Tag Manager can be really useful. The business advantage of releasing marketing from the development road map and test cycles is obvious. But marketing mistakes occur if people assume it breaks the link completely.
Google Tag Manager and Analytics rely on the full page being loaded. So any loading problem can stop them working entirely.
Google Tag Manager uses website code to collect data. So if the browser fails to run all of the code on a page, data collection can fail.
Something called “Event Bubbling” can also interfere with a Tag Manager like GTM. This is because they listen to events which “bubble up” to the topmost element. If “event bubbling” is stopped early the event of interest doesn’t reach the tag manager.
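A toy model of bubbling makes the failure concrete. The plain objects below stand in for DOM nodes; the top-level handler plays the role of the tag manager’s click listener.

```javascript
// Walk an event up the parent chain, like DOM bubbling, stopping
// early if a handler has cancelled propagation.
function dispatch(node, event) {
  let current = node;
  while (current) {
    for (const handler of current.handlers || []) handler(event);
    if (event.propagationStopped) return; // bubbling cut short
    current = current.parent;
  }
}

const root = { handlers: [] };                 // stands in for document
const button = { parent: root, handlers: [] }; // the clicked element

// The "tag manager" listens at the top, as GTM's click trigger does.
let tagManagerSaw = 0;
root.handlers.push(() => tagManagerSaw++);

// Site code on the button stops propagation: a common culprit.
button.handlers.push(e => { e.propagationStopped = true; });

dispatch(button, {});
console.log(tagManagerSaw); // 0: the click never reached the tag manager
```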
So there is one simple non technical message to take away. Tag Managers aren’t foolproof. And because they rely on code, they can be broken by code.
The mismatch of objectives between web developer and analyst can cause problems with GTM.
A web developer uses CSS to control the look and feel of elements, to achieve a similar appearance everywhere. For instance, all buttons should look and behave in the same way.
The analyst has a very different objective. The analyst wants to know if button A has been clicked. And not confuse this with clicks on button B. This difference of objectives means that the CSS designed for uniformity may be almost useless for the analyst using Google Tag Manager.
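One common reconciliation, shown here with an illustrative attribute name: keep the shared CSS class for appearance, and add a distinct data attribute per button for measurement.

```javascript
// Both buttons share the ".cta" class for a uniform look, but carry
// a distinct data attribute a GTM click trigger can select on.
const buttons = [
  `<button class="cta" data-analytics-id="request-demo">Request demo</button>`,
  `<button class="cta" data-analytics-id="download-brochure">Download brochure</button>`,
];

// The CSS selector ".cta" matches both; the data attribute does not.
const ids = buttons.map(b => b.match(/data-analytics-id="([^"]+)"/)[1]);
console.log(ids); // ["request-demo", "download-brochure"]
```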
So a big marketing mistake is to stop any development activity.
5. Trying to avoid process changes that would enable smart data gathering
Smart data gathering doesn’t just happen. Sometimes we must invent it. I learnt to appreciate this in my early career as a mechanical engineer.
I had to work out quality assurance criteria for Ford Motor Company. We had to ensure assembled car suspension arms didn’t fall apart.
These arms were fitted onto Ford cars. A tape measure was useless for determining whether the arm would remain intact. We needed to find an appropriate technique. We decided to monitor the forces used during assembly at particular points.
The challenges with collecting data from online systems are different. We need the same kind of creative thinking to avoid marketing mistakes.
So the approach I would recommend is:
Identify the processes for which to count success and failures, or assess progress steps.
Identify each step in the process.
Create a means of identifying beyond doubt which step we’ve reached.
Decide how to evaluate whether each step succeeded or failed.
Design and implement measurements for each step. The data collection must prove the failure or success of each step.
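The steps above can be sketched as a tiny recording helper. The data-layer shape, process name and event key are assumptions to adapt per client.

```javascript
// Stand-in for the dataLayer the measurement hits would be pushed to.
const dataLayer = [];

// Record one step of a defined process, with an explicit outcome.
function recordStep(processName, stepNumber, outcome) {
  dataLayer.push({
    event: "process_step",
    process: processName,
    step: stepNumber,
    outcome, // "success" or "failure"
  });
}

// A made-up two-step quote-request process.
recordStep("quote_request", 1, "success");
recordStep("quote_request", 2, "failure");

// Reporting can now count failures per step, beyond doubt.
const failures = dataLayer.filter(e => e.outcome === "failure").length;
console.log(failures); // 1
```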
This means your analytics is no longer restricted to the standard configuration that “came out of the box”. The standard package can’t cater for all the unique challenges that your agency’s client has. We must design, implement and test data gathering techniques to fit a specific client.
Our aim is to create a “world of measurement” that supports the client’s business. This may require development support. Their help in constructing Cascading Style Sheets and code will determine whether Google Tag Manager is viable.
In many cases these techniques may not be difficult. Perhaps we need to count button clicks, or assess particular form entries.
However the techniques may have implications for the rest of the process. The speed with which the website moves a user between pages might stop the data collection. In these cases review the process design.
6. Not recognising when a report “doesn’t work”.
The standard reports that “come in the box” with Google Analytics are certainly comprehensive. But marketing mistakes arise if it’s assumed they provide answers to all the questions the client has; or all of them are relevant.
It’s worth reflecting for a moment how assumptions would change if we didn’t have free or near free cloud based services.
Back some 20 years ago, a comprehensive package like Google Analytics would have sold for many thousands of pounds.
And the vendor wouldn’t just offer a software package. It would have pushed hard to sell consulting services.
These consultancy services customised the package to suit the customer’s business. Today, Google cross-subsidises the software supply from digital advertising.
But it’s a bad marketing mistake to let customisation just disappear. The lack of a vendor with a commercial reason to sell customisation misleads people. Naively they can think customisation isn’t required.
But a mass market standard offering like Google Analytics cannot cater for a customer’s key business processes. And proper customisation takes time and effort.
Some of the customisation will push valuable information into otherwise empty standard reports. For instance, there are no goal funnels until goals are configured. Using this stepwise process definition means the “goal funnel” report can provide a lot of value. Value that makes Google Analytics reflect the nuances of a unique business.
Other marketing mistakes occur when too many dimensions are shown in one report, or the wrong dimension is chosen as the primary index.
For instance I’ve seen attempts to show the various channels that web traffic uses. A mini bar chart shows the varying number of sessions for desktop vs mobile vs tablet device for each marketing channel.
Does this work? No. The number of sessions per channel can vary dramatically, so the total that represents 100% is far from consistent across channels. It would be clearer to show the popularity of each channel by device type.
Recognising how the web visitor thinks clarifies this.
A visitor doesn’t choose the marketing channel to which they wish to respond. And choose a device afterwards.
A visitor selects a device first. And only afterwards reads information on it.
So it becomes clearer how likely a user is to respond to each marketing channel on a particular device.
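The pivot is easy to sketch with made-up session counts: group by device first, then look at each channel’s share within that device.

```javascript
// Illustrative session counts (made-up numbers) by channel and device.
const sessions = [
  { channel: "organic", device: "mobile",  count: 800 },
  { channel: "organic", device: "desktop", count: 400 },
  { channel: "email",   device: "mobile",  count: 50 },
  { channel: "email",   device: "desktop", count: 150 },
];

// Pivot the way the visitor decides: device first, then channel.
const byDevice = {};
for (const s of sessions) {
  byDevice[s.device] = byDevice[s.device] || {};
  byDevice[s.device][s.channel] =
    (byDevice[s.device][s.channel] || 0) + s.count;
}

// On mobile, organic accounts for 800 of 850 sessions.
const share = (byDevice.mobile.organic / (800 + 50)) * 100;
console.log(share.toFixed(0) + "%"); // "94%"
```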
7. Failing to implement proper naming conventions
More marketing mistakes occur with poor naming. Naming is surprisingly important. There are two important criteria:
The names within reporting need to be easily understood. These might be for events, goals, concepts, stages or processes.
Naming needs to show the user how to navigate the hierarchy. This is vital if users are searching for the answer to a question and are trying to locate information to help them.
In most cases the naming should reflect the client’s business.
There are probably subtleties about the way a particular term is understood by a particular client.
For instance, “marketing qualified lead” carries with it a set of assumptions about what it took for the prospect to be “qualified”.
There should be a glossary of terms so newcomers can rapidly learn any new terminology.
Stakeholders may well have come from competitors who use the terms in a subtly different way. Or from other areas within the same business that don’t share the same assumptions.
Getting naming conventions implemented is normally more about process than difficulty. However we must accept that perfection and complete coherency is unattainable. We are simply looking for a set of names that fulfill the two criteria above.
Conversations about naming conventions need to cover:
how it can be robust and accept future names without destroying the fundamental logic.
how it communicates any subtleties arising from the data gathering or processing
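A glossary can live right next to the reporting code. The terms and definitions below are illustrative; the point is that every name used in a report resolves to one agreed, client-specific definition, and missing names are flagged.

```javascript
// Client-specific definitions behind the names used in reports.
const glossary = {
  MQL: "Marketing qualified lead: downloaded a whitepaper and opened 2+ emails",
  SQL: "Sales qualified lead: an MQL accepted by the sales team after a call",
};

// Look a term up, and flag any name that has no agreed definition.
function define(term) {
  return glossary[term] || `Undefined term "${term}": add it to the glossary`;
}

console.log(define("MQL"));
console.log(define("ROI")); // flags the missing definition
```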
I hope this post about Marketing Mistakes in reporting is useful. I realise it’s long and involved. Please get in touch or comment below if you want.
Does the business executive in you sometimes long for the sense of freedom you had on the beach as a kid?
Where you could build the sandcastles you wanted.
Straightforward, or more elaborate, exploring options as you wished..
Pursuing your ambition…your vision
But few people working to make a business great, needing to use Analytics, will feel they’ve got much of that long lost summer freedom…
You see many people will suggest..
– Your data and reports are a result of the technology you’re using? [picture of high security prison fences]
– Proposing change is risky – particularly if the board could identify you as the culprit? [warning signs..]
– It’s safer to stick with what’s working (supposedly) than disrupt it all in the hope you get easy to understand reporting.
But what if talking to the right people could make all these imagined barriers disappear?
Setting everyone free to develop and realise their vision of what the reports and the business could grow into?
Using simulations to produce real Google Analytics reports..
Using this reality to excite people about opportunities. Stopping Google Analytics being part of the prison wall keeping your