10th Jan 2020
3 Min Read

How do you solve a problem like ESN measurement?

Lisa Hawksworth
Culture & Insights

IC consultant Lisa Hawksworth and head of digital Tony Stewart interrogate the data.

You can’t swing a cat in the world of internal communications without hitting someone fretting about measurement, Enterprise Social Networks (ESNs) or both. While ‘measurement’ might be labelled ‘metrics’, and ‘ESN’ could be subbed with ‘apps’ or ‘intranets’, the common question is always the same.

“How do we get meaningful data about our online employee engagement?”

It’s a fair enough query, but one I always counter with: ‘Why bother?’ It may seem a rather dismissive reply for someone who specialises in, and champions, internal communications measurement. But there’s a valid reason.

Vanity sizing

Measurement is only useful if there’s a clear purpose. Too often, gathering data only supports a vanity exercise, packaging those results in a way that reinforces the status quo. True exploration of data should reveal opportunities for understanding and improvement – to build on what’s working and fix what isn’t.

We often see IC teams struggling to achieve meaningful measurement of their online platforms. It’s not for lack of data or dashboards. It’s more an absence of clarity around why they’re measuring in the first place.

Asking why usually helps surface the more valuable reasons:

  • To demonstrate the business value of our online internal comms to the C-Suite
  • To prove a need for investment into a particular area
  • To understand how effective our communications or campaigns actually are.

Objectives in place, we can interrogate our ESNs more strategically. But other limitations mean we still have one hand tied behind our back, according to Tony.

“ESNs are far from perfect. Active Directory is a common frustration. While its data keeps the platform secure, maintaining the quality of that information and splitting it in meaningful ways can be nearly impossible.

“Ambiguous data makes meaningful reporting difficult. ‘Active users’, ‘popular groups’ and ‘trending content’ are useful terms for identifying activity, but what do they really mean? What makes someone ‘active’, and how is ‘trending’ determined? Ambiguity makes it difficult to pin behaviours to outcomes, hindering our ability to tell the success story of our platform.

“Add to these limitations complicated dashboards. Not everyone gets on well with numbers, and the data presented can often be locked up in complex Excel spreadsheets. Even when there is a dashboard available, working out how to present those numbers to the business as a meaningful measurement story can be a daunting task.”
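
One practical way to cut through that ambiguity is to publish your own definitions and calculate them from raw exports, rather than taking a platform’s unexplained labels at face value. The sketch below is purely illustrative, not a real platform API: it assumes you can export activity events as a CSV with user_id, event_type and timestamp columns (hypothetical names), and it states its own definition of ‘active’ so the figure can be explained in any report.

    import csv
    from datetime import datetime, timedelta

    # Illustrative definition: a user is 'active' if they posted, replied or
    # reacted (not merely logged in) within the last 28 days. The CSV columns
    # (user_id, event_type, timestamp) are assumed, not a real export format.
    MEANINGFUL_EVENTS = {"post", "reply", "reaction"}
    WINDOW = timedelta(days=28)

    def active_users(csv_path, as_of):
        """Return the set of user_ids meeting our stated definition of 'active'."""
        active = set()
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                when = datetime.fromisoformat(row["timestamp"])
                if row["event_type"] in MEANINGFUL_EVENTS and timedelta(0) <= as_of - when <= WINDOW:
                    active.add(row["user_id"])
        return active

    # Usage: print(len(active_users("esn_events.csv", datetime.now())))

Because the definition lives in your own reporting rather than a vendor dashboard, it can be challenged, refined and, crucially, explained.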

Getting to the heart of data

Even relevant numbers are only half the story. ESN measurement becomes much more meaningful when viewed against the wider context of your people.

Data may show trends and popularity, but it doesn’t help you understand how your people feel about the platform, what value it gives them and whether it’s helping support the organisation’s ambitions around strategy, change and culture, day-to-day.

What’s the forecast?

Measurement efforts can suffer from interpretations either too broad or too niche to be meaningful. It can help to think of your ESN’s purpose and activities like climate and weather – linked, but distinctive.

Climate is the big picture purpose for the platform. It’s understanding the overall outcomes you’re looking to achieve and knowing the behaviours that will demonstrate that the platform is delivering.

Weather is the constantly changing day-to-day activity on that platform. There can be microclimates, weather patterns and sudden shifts. Your data can help you understand both what’s happening and how to react. A dip in active users – why? A trending topic – what’s on people’s minds, and what would they like to see more of?
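
To make that weather watching concrete, the short sketch below (a hypothetical illustration, not a feature of any ESN) takes a series of weekly active-user counts, however you have chosen to define ‘active’, and flags any week-on-week drop bigger than a chosen threshold so someone can go and ask ‘why?’. The 15% threshold is an assumption to tune, not a standard.

    # Illustrative 'weather watching': given weekly active-user counts
    # (oldest week first), flag any week-on-week dip bigger than a threshold.
    def flag_dips(weekly_active, threshold=0.15):
        """Return indices of weeks whose count fell by more than `threshold`
        (e.g. 0.15 = 15%) versus the previous week."""
        dips = []
        for i in range(1, len(weekly_active)):
            prev, curr = weekly_active[i - 1], weekly_active[i]
            if prev and (prev - curr) / prev > threshold:
                dips.append(i)
        return dips

    # Example: flag_dips([420, 435, 410, 330]) returns [3], a roughly 20% dip
    # in the most recent week - the prompt to go and find out why.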

Making measurement meaningful

If we know why we’re measuring, and what we want to achieve, what does a good process look like?

1. Clarify your vision

For example: “We want to improve customer service”.

2. Identify the behaviours that support the vision

A specific example: “Tricky customer questions are posted on the platform to crowdsource answers”

3. Find data that proves the point

Some information will be irrelevant, for example: “Profile photo completion data doesn’t help us tell this story, so let’s not use or report it.” Other data will be essential, such as: “Membership of customer service employees in a specific ‘answers’ group.”

4. Fill the gaps

Auto-generated data won’t cut it, certainly not for the qualitative aspect, so create the sources you need.

“As a customer service employee, I feel better informed as a member of the Customer Response group”

5. Ask what success looks like

“Experts in the business post timely answers to questions in this group.”

6. Find proof

Gather the feedback that supports your activity.

“In our Customer Response group, 50 questions were posted this month; 35 were answered within 4 hours.”

“As a customer service rep, I think the group is a useful resource.”

7. Report back to the business

Package up this data in a way that’s easy to understand and report during executive meetings.
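
If your platform lets you export the underlying data, those headline figures can be produced the same way every month rather than assembled by hand. The sketch below is a hypothetical illustration, not any vendor’s reporting API: it assumes you can obtain each question’s posting time and the time of its first answer from the (illustrative) Customer Response group, and it simply counts how many were answered within the four-hour window used in the example above.

    from datetime import datetime, timedelta

    # Illustrative only: each entry pairs a question's posting time with the
    # time of its first answer (None if still unanswered). The scenario comes
    # from the worked Customer Response example above.
    def answer_summary(questions, within=timedelta(hours=4)):
        """Return a headline sentence like the one used in the 'Find proof' step."""
        answered = sum(
            1 for posted, first_answer in questions
            if first_answer is not None and first_answer - posted <= within
        )
        hours = int(within.total_seconds() // 3600)
        return (f"{len(questions)} questions were posted; "
                f"{answered} were answered within {hours} hours.")

    # Example with three made-up questions:
    # q1 = (datetime(2020, 1, 6, 9, 0), datetime(2020, 1, 6, 10, 30))  # answered in 1.5 hours
    # q2 = (datetime(2020, 1, 6, 11, 0), datetime(2020, 1, 6, 14, 0))  # answered in 3 hours
    # q3 = (datetime(2020, 1, 6, 12, 0), None)                         # unanswered
    # answer_summary([q1, q2, q3]) -> "3 questions were posted; 2 were answered within 4 hours."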

The importance of robust reporting is only going to intensify as leaders look for empirical proof that their investments are paying dividends. Increasingly, we’re helping our clients make sense of their digital data to understand where their platforms are hitting and missing the mark.
