Today’s media zeitgeist calls for new analytics and metrics
March 20, 2017
Sheri De Carlo
“This was a great day. Having practitioners from CPRS Hamilton and IABC Golden Horseshoe join us was a real pleasure. It is good to be able to share the McMaster residency experience with the professional communicators of the Greater Toronto and Hamilton area,” says Alex Sévigny, McMaster MCM Program Director and Past-President on the CPRS Hamilton Board of Directors.
“Now as communicators, while we may be working in this new zeitgeist, we’re still asked to provide metrics to people in our organization to measure what we do each day. If we work for a bank, we may be asked to show what is our share of voice around corporate citizenship, and we could present it like this. If we’re an energy company trying to build a pipeline, we may be asked to show how the level of favourability of our media coverage has changed over the last two weeks, and we may present it like this. Or if we’re an issues manager with a provincial Ministry, we may be asked to show the impact of our pro-active communications on different topics, and we show it like this.”
“Big data is good for me, and I think it’s good for communications generally, despite the complexity,” said Dr. Laing. “It provides a wealth of channels and feedback loops that make it much easier to issue content across different platforms and that, because of the varied use of those platforms, gives a much richer palette to work with, storytelling-wise. But to measure it is problematic.”

“When I started measuring communications in the 1990s, there was no big data, there was no complexity (although we didn’t know it at the time),” says Dr. Laing. “Now, big data means that communications managers are managing greater complexity. And that complexity lies in the relationship between media and audiences.”
“Then, there was a known audience defined by space and time, reached through a limited number of large, centralized media organizations that controlled the messages by controlling content creation and distribution in what was largely a one-way flow of communication. Now, there is a relatively unknown audience, unrestricted by space and time, operating through fragmented, unlimited media channels that form ‘filter bubbles’ or, more accurately, separate media communities, and that participates in a two-way flow of communications.”
Martin Waxman, who teaches digital communications at the University of Toronto, was in attendance at the event. He says: “It’s time for PR professionals to stop saying we’re all about words and begin to understand how to approach and analyze big data. We have an opportunity to demonstrate our value to the C-Suite and show how the programs we execute help achieve business objectives. The challenge is getting over our fear of Excel and then learning how to cut through the massive amounts of data and uncover actionable insights.”
The go-to within the industry has become the Media Relations Rating Points (MRP) system. “The problem with MRP is not so much the calculation per se; it’s that it treats everything in isolation. It was largely designed to track public relations agencies’ activities. Every agency was using different metrics, and MRP solved that,” says Laing. “Media impressions and scores don’t speak to the organization’s larger objective. ‘What do you want to be known for?’ is what organizations should be asking themselves from a strategy viewpoint. Every communications tactic should be tied to this objective.”
“We do the world of PR a disservice if it’s all about getting mentions. Effective campaigns change attitudes and behaviour,” says Dave Scholz, Executive VP at Leger.
According to Laing, a few principles can guide any measurement system. Keep them in mind and they will allow you to build, design or maintain a system that is more sustainable, more relevant and, most of all, more insightful than one that ignores them.
The first principle has to do with how communicators approach measurement: how to connect the activity, on the one hand, with the broader goals of the organization, on the other. “Too often, I’m approached by someone mid- to lower-level within the organization who has a campaign they are about to run: we have a 30-seat bicycle that we are riding across the country, or we have a forty-foot inflatable colon we’re touring in shopping malls around the province to inform people about colon cancer. All true. And they ask: how do I measure it?”
I was already listening attentively to Laing’s observations, but my ears perked up when I heard him mention the ColonCancerCheck campaign. As the communicator who wrote the public relations plan for the launch of the ColonCancerCheck program during my time at the Ministry of Health, I was keenly interested to hear how he would evaluate our success. At the time, I faced skepticism about my tactics, including dressing the Minister of Health in a suit like those Olympians wear to luge, which made him look, interestingly enough, like a friendly colon as he handed out information and shook hands at high-profile spots around Toronto. It wasn’t what you’d call typical in government. However, by combining strategy with tactics, the team and I had made a strong, research-based case for taking an innovative and memorable approach. Now this was getting interesting…
The second principle of communications measurement emphasizes validity, the focus on true numbers over big numbers, and concerns which measures we choose. “When I started, people focused on equivalent advertising value – an idea that has fallen into disrepute but which still persists to this day because it speaks the lingua franca of money. Then it was impressions – the aggregated projected audience reached by content. Now that’s in disrepute,” says Laing. “The problem with media impressions, or even equivalent advertising value for that matter, isn’t so much the concept as how they’ve been misused, misapplied and miscalculated.”
“At the root of the problem is validity. In research, measures must adhere to two qualities – reliability (the ability of the measure to be replicated) and validity (the proximity of the measure – which is an artificial thing – to reality). In the case of impression calculations and validity, what we’re talking about is how closely we can make them reflect the true impact the content may be having on audiences. We are not looking for the biggest number possible, but the truest number possible, and so we have to take into consideration other factors – prominence and placement, time on site, reverb through other channels – to determine the real exposure of content to an audience.”
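To make the idea concrete, here is a minimal sketch of what discounting a raw impression count toward a “truer” number might look like. The weighting scheme, field names and factor values below are all illustrative assumptions, not a formula from Laing or any measurement standard:

```python
# Hypothetical sketch: discount raw impressions by prominence, placement
# and engagement to approximate a "truer" exposure figure.
# All weights here are illustrative assumptions, not an industry formula.

def adjusted_impressions(raw_audience: int,
                         prominence: float,
                         placement_weight: float,
                         avg_time_on_site_sec: float) -> int:
    """Return a discounted audience estimate.

    prominence and placement_weight are fractions between 0 and 1;
    engagement is capped so long visits cannot inflate the number.
    """
    # Treat 60 seconds or more on site as full exposure.
    engagement = min(avg_time_on_site_sec / 60.0, 1.0)
    return round(raw_audience * prominence * placement_weight * engagement)

# Example: a projected audience of 100,000, moderately prominent content,
# mid-page placement, and an average of 30 seconds on site.
print(adjusted_impressions(100_000,
                           prominence=0.8,
                           placement_weight=0.5,
                           avg_time_on_site_sec=30))  # prints 20000
```

The point is not these particular weights but the direction of the arithmetic: each factor can only shrink the headline number, pushing the estimate toward the truest figure rather than the biggest one.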
The bigger the media effect, the harder it is to measure. The third principle is understanding effects relative to exposure. “Everyone in this room will be under pressure to demonstrate that their initiative has had a demonstrable, measurable effect on audiences – often an actual behavioural effect – like people going out to get a colon cancer test after reading about the forty-foot inflatable colon at the local mall, touring it, and signing up for the test.”
“And here I caution communicators to think about this: the more complex the effect you claim for what you’ve done, the less likely you will be able to prove it, let alone measure it. So what do I mean? It means this: at its simplest and most direct, you can measure what is called ‘output’ – the generation of content within channels: we got 10 newspaper articles, 15 radio hits, 50 tweets and 47 shares on Facebook. It’s clear, it’s correct, but so what? Well, we can take it a step further and talk about exposure – we calculate that with this content production, we reached 100,000 people. Better, but did we actually reach those people? Now we’re into conjecture, based on audience demographic data we may have from sources like Numeris and Comscore, but other elements could have interfered with the actual exposure.”
“From exposure, we move to the next stage of complexity: affecting awareness. People were not only exposed to the content, but they reported being aware of that content. That awareness may have come from other sources, and determining that awareness is more expensive than simply measuring exposure, but it is something we were likely trying to achieve when we shipped that forty-foot giant colon around Ontario.”
“The next stage is opinion change. We know they were exposed, we showed that they were aware, and we even showed they changed their opinion about colon cancer testing – that it wasn’t so hard to get. But again, it’s more expensive, and even more factors may come into play that would lead to opinion change.”
“Finally, there is the most complicated media effect: behaviour change. They saw, they’re aware, they’ve changed their mind, and they actually went out and got a test. It’s even more complicated and more expensive, with media content, public opinion and possibly even real-world data on rates of colon cancer screening tests thrown in.”
His comments about the inflatable colon got a chuckle out of the crowd, and at Dr. Laing’s observation I couldn’t help but smile. When I wrote the public relations plan, it was exactly what I’d had in mind. As an undergraduate student in communications, I fell for communications theorists such as Marshall McLuhan. How people interpret what they read and see, and what causes awareness and triggers change in their behaviour, has always interested me. Sure, for some there is a flashiness to public relations, but it is the academic side of it that drew me in and keeps me coming back.

Communicators are often asked to go directly to tactics without spending time on research; this can stunt creativity with purpose and result in an overall strategy and activities that do not reflect the organization’s objectives and key priorities. For instance, communicators are asked to make a big splash in the media with little reasoning behind the intended appearance, or they are asked to measure the effectiveness of a campaign when there is little comprehension within the client’s organization of what the numbers actually mean in relation to its objectives and strategy.

A junior consultant on my team once told me she believed the client would be happy because we had got media to cover the event. I explained that, yes, we were seeking awareness, but ultimately we were seeking for people to change their behaviour and, in this instance, take the colon cancer test. Her eyes widened at this idea. It’s official, I thought: underneath it all, I’m a communications strategy nerd with the best intentions.