By Karine Joly

Content is everything online, but creating great content to break through the noise is not an easy task.

So, it’s crucial to measure the performance of your higher ed content to avoid wasting time and resources on what doesn’t reach, engage or move your target audience(s).

The 12 speakers of the 2019 Higher Ed Content Conference have kindly shared how content performance is measured and evaluated at their respective schools. So, you can compare your own measurement process to theirs and find reassurance or inspiration.

How To Measure Higher Ed Content

Corie Martin, Director Web Services & Digital Marketing – Western Kentucky University

Each month, we refer back to the content we shared the month prior. We look at the timing, messaging, platform and the content itself, and we review analytics using Sprout Social, native social media insights tools and Google Analytics to determine what worked and what did not, from both an engagement and a conversion standpoint.

When reviewing content performance from Q4 of 2018, we discovered that some of our news pieces on Facebook were going out too early in the morning when our audience engagement and reach was historically low.

Because those posts were going out too early to reach our audiences, they drove our engagement rates down by almost 8%. That might sound small, but every percentage point matters. Since that discovery, we have changed the timing of our content sharing on Facebook (and Twitter), and we have opted not to share certain things on Facebook at all. Our engagement is on the rise so far in Q1 of this year, and we are on track to gain back the points we lost, and then some, if the trend holds.
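A minimal sketch of this kind of timing review, assuming post-level data (timestamp and engagement rate) has been exported from Sprout Social or native insights to a CSV; the file and column names here are hypothetical:

```python
# Timing review sketch: average engagement rate by posting hour.
# Assumes a CSV export of post-level data with hypothetical columns
# "posted_at" (timestamp) and "engagement_rate" (engagements / reach).
import pandas as pd

posts = pd.read_csv("facebook_posts_q4.csv", parse_dates=["posted_at"])
posts["hour"] = posts["posted_at"].dt.hour

# Mean engagement rate per hour of day, plus post counts; consistently
# weak early-morning hours would flag posts going out too early.
by_hour = posts.groupby("hour")["engagement_rate"].agg(["mean", "count"])
print(by_hour.sort_values("mean", ascending=False))
```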

Shay Galto, Manager of Marketing Analytics – University of Denver

At the University of Denver, we try to measure and evaluate our content holistically.

For our everyday projects, we first set key performance indicators, so we know exactly how the content is moving our mission forward. From there, we work to ensure that we have the correct tracking mechanisms in place, whether through UTMs, Google Tag Manager, or a different tool.

Then, we build dashboards in Google Data Studio that give a quick visualization of the success of our content.

One example of this is our newsroom stories. Each story ladders up to KPIs that our Communications Director set. As the stories are written, they are tagged with buckets that we can later use to segment the data. Before we send them through email, we add UTM tracking codes so we can analyze acquisition to our website. After a story is live, we view the data in Google Analytics, but we also look at a weekly dashboard that shows the success of our content.
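A minimal sketch of the UTM-tagging step, assuming story links are decorated in a script before the email send; the URL and campaign names are hypothetical, but the utm_* parameters are the standard ones Google Analytics reads:

```python
# Sketch: appending UTM parameters to a newsroom story link before an
# email send, so Google Analytics can attribute the resulting traffic.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Return the url with standard utm_* parameters appended."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # where the link lives, e.g. "newsroom-digest"
        "utm_medium": medium,      # channel type, e.g. "email"
        "utm_campaign": campaign,  # the send or story bucket
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical story URL and campaign names, for illustration only.
print(add_utm("https://www.du.edu/news/example-story",
              source="newsroom-digest", medium="email",
              campaign="weekly-digest-2019-03"))
```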

We then regroup and discuss what content resonated – through open rates, volume of users, scroll depth, goals achieved, and CTA clicks on the stories. We use those analytics to inform future content.

Rebecca Stapley, Assistant Director of Social Media – Nazareth College

Our content performance is measured regularly and shared weekly at an all-team content meeting. For specific social and website KPIs, we share updates once a month, offering a summary of key pageviews, engagement/behavior and conversions based on the raw data that my colleagues and I track regularly. We also check in weekly on specific enrollment metrics and active campaigns, such as admissions yield and search.

Based on those metrics, we might adapt our tactics or re-focus our energy on what the data is showing us. For example, when we saw last fall that our early decision numbers were not quite where we wanted them to be, we went into overdrive: we re-focused on creating new Early Decision content and stories and added a paid reminder campaign to a targeted inquiry list.

Jeanna Balreira, Creative Director – Trinity University

Trinity uses both quantitative and qualitative data to measure content success. With tools such as Google Analytics and different social tracking methods, we can determine the reach and impact of a certain piece of content. We know that content has really hit home when people are talking about it on campus or online—especially when they’re sharing something new they learned or are excited about.

To further gauge the impact of our content, we are actively working to put this data into the context of the overarching strategic themes and stories we’re aiming to tell.

  • How can we group content to see what types of content perform better: by topic? theme? type?
  • How can we evaluate which of our audience members like to consume which type of content: by length? media? distribution channel?
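One way to explore the grouping question above, sketched under the assumption that tagged content lives in a CSV with hypothetical "topic", "format" and "engagement_rate" columns:

```python
# Sketch: comparing average engagement across content groupings.
import pandas as pd

content = pd.read_csv("content_log.csv")  # hypothetical tagged export

# Mean engagement rate for each topic x format combination; cells built
# from only a handful of posts should be read with caution.
pivot = content.pivot_table(index="topic", columns="format",
                            values="engagement_rate", aggfunc="mean")
print(pivot.round(3))
```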

At the end of the day (or month, or quarter), our goal is to have made an impact on our viewers, so that our story becomes an organic part of their story.

Jeff Bunch, Web Content Strategist – Gonzaga University

We measure our content across multiple dimensions: website traffic, social sharing activity, and public relations pickup (particularly among our e-newsletter subscribers).

On the web, we’ve gone beyond sheer pageviews to look at content performance by key themes (such as Admissions) and track metrics that show engagement across sections of the website.

On social, we track how our stories are shared, both from our own channels and by others. We also create campaigns in our email platform that set up tracking on our website.

We also use a public relations & social media software suite to track major trends for our brand, especially sentiment. We then report these metrics to our division in a shared document that each of us updates monthly.

Emily Mayock, Online Communications AVP – Case Western Reserve University

We use a variety of quantitative measurements, especially for online content—Google Analytics and Crazy Egg, for example, as well as Siteimprove for insight into content quality, SEO and accessibility.

Web content is constantly changing, so having something to continually monitor our status and show us how we’re doing is critical. We also care about qualitative measurements, including user testing and surveys. It’s important to us to consider feedback we receive to see how—or if—we need to adjust.

For example, we’ve done website redesigns in which our user testing showed new layouts worked, but then our “client” would get confused calls from users. So we have to listen to that feedback and assess it: Is it a major issue for many users, or a small problem for a few?

It’s important not to take people’s gut reactions too seriously or to revert immediately just because a few users didn’t react the way you expected. In general, the people who are vocal in their disapproval are the ones who will be displeased with any change.

Janet Gillis, Communications & Marketing Officer – USF College of Engineering

All content is pushed out to social media through either direct posts or links to websites or videos. I measure the engagement rate and the effectiveness of the content based on the share, like and engagement numbers.

The engagement rate is the biggest tell of content interest. I’m often surprised by how differently content draws interest: most of the time it’s predictable, but sometimes engagement is unexpectedly low or high. If I had a social media analyst on staff, I could delve a lot deeper into the relationship details of our followers and their followers, which ultimately affect these measurements.

I also use Google Analytics to measure website content views, how users are finding us and where our users are located. The number of overseas visitors helps us make compatibility decisions.

Kelly Bennett, Manager of Social Media and Marketing Strategy – Miami University (Ohio)

In order to continuously improve, we keep a close eye on how our content is performing.

If something we share isn’t receiving the engagement we hoped for, we take a look at how else we could slice it. Perhaps the content in a news story should be shared as a podcast or as a video with text overlaid on it.

By experimenting with different forms of content on the various platforms, we’re able to adjust and refine our content and become more strategic in our work.

Salma Nawlo, Assistant Director of Communications – Florida Southern College

The stories we create from the content we collect are shared on the web and on social media.

The interactions and engagement rate a social post receives allow us to determine the success of the story.

We also gauge success when we receive feedback from parents and prospective student visitors.

Pageviews in our website analytics also give us an idea of how well each story, news piece or blog post is doing, along with conversion rates and other metrics. The metrics differ, and we are still working to perfect this type of evaluation.

Jon-Stephen Stansel, Digital Media Specialist – University of Central Arkansas

For us, it’s all about engagement. We want content that our audience is going to interact with.

For the GIFs, it’s a little difficult to measure performance, as GIPHY’s analytics tools are anything but robust. So, we have to rely on our social listening skills to see how students are using them organically.

We’ve even polled students on Instagram to see what kind of GIFs they’d like to see. In fact, that’s where we first got the idea to have GIFs of the president! It’s what the students wanted!

Andrew Cassel, Social Content Strategist – University of Alaska Fairbanks

I report follower numbers for all the accounts. But the real metric my bosses use to measure success is a percentage calculated from how many people saw posts in a given month and how much engagement happened on the page in that month.

The idea is that higher engagement means better content. We look at lifetime numbers for engagement to take into account that some people see stuff that was shared weeks or months ago and engage with it later.
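The percentage described here reduces to simple arithmetic; a sketch with hypothetical monthly figures:

```python
# Monthly engagement percentage: engagements divided by the number of
# people who saw posts that month. Figures below are hypothetical.
monthly_reach = 48_200        # unique people who saw posts this month
monthly_engagements = 3_350   # reactions, comments, shares, clicks

engagement_rate = monthly_engagements / monthly_reach
print(f"Engagement rate: {engagement_rate:.1%}")  # -> 7.0%
```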

Derek DuPont, Social Media Manager – The Ohio State University

Based on the role organic social media plays in the funnel at Ohio State, we look to engagement metrics, such as engagement rates, total engagements and engagements per follower relative to peer institutions, as our main KPIs for content on our social platforms.

We run monthly tests of paid content types to be sure we are continually optimizing our content. We test things such as video vs. photo, copy length, video length, post structure and subject matter.
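A minimal sketch of how one such monthly test could be read, using a standard two-proportion z-test; the engagement and reach counts below are hypothetical, not Ohio State's data:

```python
# Sketch: video vs. photo paid test, read with a two-proportion z-test
# on engagement counts. All numbers are hypothetical.
from math import sqrt
from statistics import NormalDist

video_engaged, video_reached = 1_840, 25_000
photo_engaged, photo_reached = 1_510, 25_000

p_video = video_engaged / video_reached
p_photo = photo_engaged / photo_reached

# Pooled proportion and standard error under the null hypothesis that
# both formats engage at the same rate.
pooled = (video_engaged + photo_engaged) / (video_reached + photo_reached)
se = sqrt(pooled * (1 - pooled) * (1 / video_reached + 1 / photo_reached))
z = (p_video - p_photo) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"video {p_video:.1%} vs photo {p_photo:.1%}, z={z:.2f}, p={p_value:.4g}")
```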

Beyond these isolated tests, we consistently tag our posts and gather data to inform our content decisions, from subject matter to creative approach. We are constantly working toward a data-driven approach to content.

A conference focusing on higher ed content?

The Higher Ed Content Conference (now available on-demand) is a must-attend event for higher ed content professionals and teams looking for new ideas and best practices.

Read below what a few of your higher ed colleagues who attended past editions of the Higher Ed Content Conference say about the experience.
