This is the second of four posts in our content audits made easy series.
In the first article, we looked at the 7 reasons why you should audit your content. Now that we’ve considered the benefits, it’s time to prepare for an effective content audit.
Let’s dive straight in and look at the key stages of doing an audit.
Stage 1: take an inventory
Your content inventory is your starting point. It’s a list of all your content. You might only need to know about one or two sections of your site, or maybe you need to take stock of your entire site (or sites, plural, in some cases) – perhaps because you’re doing a site redesign and need to know how you’ll prepare content for migration.
Whatever the website project ahead of you, creating an inventory can be a grind. And it gets harder the more content you need to audit and the more places that content lives.
The manual approach
Like many content strategists, I’ve tended to take the manual approach in the past. That means starting at the home page and clicking through a website until you’re done. It’s long and arduous, and requires a serious amount of time to do it properly – but at the end you feel like you’re starting to know what the state of play is ahead of any analysis.
At this stage the aim is simply to find out what content you’ve got. You don’t need to capture much more than:
- Title of each piece of content
- Its URL
- Its position or depth in the hierarchy of the site
You may also want to give each piece of content its own numerical ID – an easy, unique identifier to help you refer to it later on.
A typical entry at this stage might be:

ID: W8 · Title: [page title] · URL: [page URL] · Level: 2

In this example, W stands for ‘website’, 8 means it’s the 8th page in the audit, and the page sits at level 2 (the home page is always level 0).
The manual approach works the same for any online or offline channel. If you can find it manually, you can record it manually. The downside: it can take a while.
The automated approach
There are generic web scraping products that can crawl your website and generate long lists of URLs, but it can be hard to discern much insight from this data, and cleaning it up can take just as long as a manual audit.
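If you do want to script the inventory yourself, a simple breadth-first crawl of one site is enough to produce the URL list. Here’s a minimal sketch in Python using only the standard library; the `fetch` function is passed in so you can plug in whatever HTTP client you like, and everything in it (including the example.com URLs below) is illustrative rather than a recommendation of any particular tool:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=500):
    """Breadth-first crawl of one site; `fetch(url)` returns the page HTML."""
    site = urlparse(start_url).netloc
    seen, queue, inventory = {start_url}, [start_url], []
    while queue and len(inventory) < limit:
        url = queue.pop(0)
        inventory.append(url)
        parser = LinkParser()
        parser.feed(fetch(url))
        for href in parser.links:
            absolute = urljoin(url, href)
            # Stay on the same site and skip pages we've already queued
            if urlparse(absolute).netloc == site and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return inventory
```

You’d still need to clean the resulting list and slot it into your inventory spreadsheet, which is where the manual effort mentioned above comes back in.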
Stage 2: choose your quantitative data
Once you have a list of relevant content, now what? The simplest place to start is with quantitative data. These are the objective, unequivocal facts about your content.
Quantitative data typically includes:
Web and social media analytics
For example, how many visits does your content get? How long do people stay on each page? Where do people tend to enter and exit your site? How many social shares or likes has your content had?
Dates
For example, when was a piece of content first published? When was it last edited? When is it scheduled for review or deletion?
Ownership
For example, who wrote a piece of content? Who’s its current editor? Who makes the strategic decisions about it?
Word counts and readability
For example, how many words are on each page? What’s the grade level or reading ease score for a piece of content?
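Word counts and average sentence length are easy to script. Here’s a rough Python sketch – a naive tokeniser for illustration, not a substitute for a proper readability library:

```python
import re

def text_stats(text):
    """Rough word count and average sentence length for a piece of content."""
    # Naive splits: sentences on .!? and words on runs of letters/apostrophes
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_len = len(words) / len(sentences) if sentences else 0
    return {
        "words": len(words),
        "sentences": len(sentences),
        "avg_sentence_length": round(avg_len, 1),
    }
```

Grade-level formulas like Flesch–Kincaid build on exactly these counts (plus syllables), so numbers like these are a reasonable starting point for comparing pages.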
Other metadata could include:
- Content format, e.g. text, image, video, PDF
- Page meta descriptions used to summarise content for search engines
- Any subject keywords or tags used to organise and find content
The list goes on: developers may want more data about things like page load times, accessibility, duplicate content; digital marketers may want more data about SEO, search rankings, inbound links and search terms.
The bottom line is you can’t measure everything, and you don’t have time for data you won’t use. So think before you start auditing. Discuss possible metrics with your team. Which ones are going to have the greatest bearing on your project? Which ones will help you take action?
You should also think about how you’re going to add this data to your audit – if you’re using a spreadsheet, factor in a reasonable amount of time to export data from other sources and marry it up to the right rows.
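If your exports share a common key – usually the URL – the marrying-up can be scripted rather than done row by row. A minimal Python sketch, assuming two small hypothetical CSV exports (an inventory and an analytics report) joined on URL:

```python
import csv
import io

# Hypothetical exports: an inventory and an analytics CSV, both keyed by URL
inventory_csv = """\
id,title,url
W1,Home,/
W2,About,/about
"""

analytics_csv = """\
url,pageviews
/,1200
/about,300
"""

def merge_by_url(inventory_csv, analytics_csv):
    """Attach analytics columns to each inventory row, matching on URL."""
    analytics = {row["url"]: row
                 for row in csv.DictReader(io.StringIO(analytics_csv))}
    merged = []
    for row in csv.DictReader(io.StringIO(inventory_csv)):
        extra = analytics.get(row["url"], {})
        row["pageviews"] = extra.get("pageviews", "")  # blank if no match
        merged.append(row)
    return merged
```

In practice you’d read the exports from files and watch out for mismatched URLs (trailing slashes, http vs https), which is where most of the clean-up time goes.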
Stage 3: choose your qualitative data
The good thing about quantitative data is that it’s usually relatively easy to gather. Alas, the REALLY useful stuff – the data that tells you how well your content is performing, how good it is or whether your audience finds it useful – is harder to gather. Qualitative data tends to be quite subjective, and is sometimes best generated manually.
In her book Content Strategy for the Web, Kristina Halvorson breaks qualitative data down into six categories:
1. Usability
Is your content well structured? Is it broken up into readable chunks? Is it written in short sentences? Does it have clear, accessible links?
2. Knowledge level
How complex is your content? Do you need to be an expert to understand it? Or could a beginner get to grips with it?
3. Findability
How easy is it to find each piece of content on your site? Is it buried? Does it appear in Google searches or internal searches for the right keywords? Is it linked or related to other bits of relevant content?
4. Actionability
What’s the purpose of each piece of content? What is it encouraging the user to do? And how well does it fulfil its purpose? How clear is the call to action?
5. Audience
Who’s the intended audience for a particular piece of content? If your organisation has personas, which one is each piece of content aimed at? If these personas have specific user needs, which needs does each piece of content meet?
6. Accuracy
Is your content factually correct? Is it up-to-date? Does it reflect what the organisation currently believes? Is it still relevant? Does it adhere to style, tone of voice, and brand guidelines?
Avoiding common pitfalls
This is all really useful stuff, but qualitative audits pose a few challenges:
- When the data is subjective, how can multiple auditors achieve consistency across all their answers?
- If each piece of data requires some analysis and thought, how can you prevent qualitative audits taking an unacceptable amount of time?
- How will you manage version control of your audit document to make sure there’s no overlap between auditors?
Use rating scales and leave notes
One way to make subjective decision-making more consistent and easier to record is to use standard rating scales and leave commentary where further explanation is required. Auditors can therefore rate content quality as well as note their observations about it for future discussion.
Take usability: you might want to rate each page out of 5 so you can easily compare pages with low or high usability at the end of your audit.
Where more control is required, another option is to create predefined lists of categories you will tag pages with in the audit. A good example of this might be a list of your audience personas – tagging pages with the audience group they’re aimed at is a useful way to help focus on user need. At the end of the audit, you’ll also be able to see how well represented your various audiences are across all content on the website.
To give a different example, when auditing ‘actionability’, you might want to create a list of all your calls to action and tag each page with the ones it features.
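Once ratings and tags are recorded consistently, summarising them is straightforward. Here’s a small Python sketch over a hypothetical audit (the page IDs, scores, and audience tags are all made up for illustration), showing an average usability score, the low-scoring pages worth a closer look, and how well each audience is represented:

```python
from collections import Counter
from statistics import mean

# Hypothetical audit rows: a 1-5 usability rating and audience tags per page
audit = [
    {"id": "W1", "usability": 4, "audience": ["customers"]},
    {"id": "W2", "usability": 2, "audience": ["customers", "press"]},
    {"id": "W3", "usability": 5, "audience": ["press"]},
]

# Overall quality, pages that need attention, and audience coverage
avg_usability = mean(row["usability"] for row in audit)
low_usability = [row["id"] for row in audit if row["usability"] <= 2]
audience_coverage = Counter(tag for row in audit
                            for tag in row["audience"])
```

The same pattern works for any rating scale or predefined tag list – calls to action included – which is what makes consistent scales so useful at the analysis stage.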
You’re all set
You’ve thought about why you need to run an audit, you’ve got a clear view of the content on your website, and you know the data you’re going to use to measure its effectiveness.
In our next post, we’ll roll up our sleeves and look at running, analysing and acting on your audit.