I start every content request with a Screaming Frog crawl; it's just that good a tool. The crawl gives me a strong foundation for understanding what's really happening with a site and its content, and it surfaces so much information that possible reasons for underperformance start to emerge on their own.
To save time, I connect the Google Search Console and Google Analytics APIs to the crawl, which turns Screaming Frog into a three-in-one tool. Both platforms are invaluable on their own, but from a content performance point of view, layering in data from each helps me understand both search results interaction (impressions and clicks) and actual content engagement (number of sessions and average session duration).
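If you prefer to do this layering yourself outside of Screaming Frog, the same idea works on exported data. Here is a minimal sketch in pandas, assuming you have pulled each source into a table keyed by URL; the column names and sample rows are hypothetical, not actual export formats:

```python
import pandas as pd

# Hypothetical data standing in for a Screaming Frog crawl export and
# GSC/GA API pulls; real exports will have different column names.
crawl = pd.DataFrame({
    "url": ["https://example.com/a", "https://example.com/b"],
    "status_code": [200, 404],
})
gsc = pd.DataFrame({
    "url": ["https://example.com/a", "https://example.com/b"],
    "impressions": [1200, 300],
    "clicks": [80, 2],
})
ga = pd.DataFrame({
    "url": ["https://example.com/a", "https://example.com/b"],
    "sessions": [75, 1],
    "avg_session_duration": [95.0, 12.0],
})

# Layer search interaction (GSC) and engagement (GA) onto the crawled URLs.
# Left joins keep every crawled URL even when a platform has no data for it.
merged = crawl.merge(gsc, on="url", how="left").merge(ga, on="url", how="left")
print(merged)
```

The left joins matter: a page with a 404 or no search traffic still shows up in the merged table, which is exactly the kind of page you want to catch.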
Once the crawl is complete, I have one document with a lot of data points for diving into content performance. To keep information overload and analysis paralysis at bay, I start by asking myself a few short questions tied to my end goal. I create a tab for each question and copy the matching URLs into that tab. Here are a few basic questions to get you started:
- Which pages have 400- and 500-level response codes?
- Which pages have the fewest clicks?
- Which pages have the fewest sessions and the lowest average session duration?
The answers to the questions above help you understand which pages might have technical issues (response codes), which pages need a closer look at page titles and meta descriptions (clicks), and which pages your audience is engaging with the least (sessions). There is so much more you can do with the data, but this is a good starting point for correcting content underperformance.
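Each of those three tabs is really just a filter over the combined data. A minimal sketch in pandas, again with hypothetical column names and sample values:

```python
import pandas as pd

# Hypothetical merged crawl + GSC + GA data; columns are assumptions.
pages = pd.DataFrame({
    "url": ["/a", "/b", "/c", "/d"],
    "status_code": [200, 404, 500, 200],
    "clicks": [120, 4, 0, 35],
    "sessions": [90, 2, 0, 30],
    "avg_session_duration": [80.0, 10.0, 0.0, 45.0],
})

# Tab 1: pages returning 400- or 500-level response codes.
errors = pages[pages["status_code"].between(400, 599)]

# Tab 2: the pages with the fewest clicks (bottom N; here N=2).
low_clicks = pages.nsmallest(2, "clicks")

# Tab 3: the pages with the fewest sessions, breaking ties on the
# shortest average session duration.
low_engagement = pages.sort_values(
    ["sessions", "avg_session_duration"]).head(2)

print(errors["url"].tolist())          # possible technical issues
print(low_clicks["url"].tolist())      # title/meta description candidates
print(low_engagement["url"].tolist())  # lowest audience engagement
```

From there, each filtered frame can be written to its own sheet of a workbook (for example with `DataFrame.to_excel`), mirroring the one-tab-per-question setup described above.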