If you work with an SEO strategist and want to have some fun with them, ask them to report on keyword metrics. It’s a request that’s likely to raise their blood pressure, because keyword-level reporting gives a stilted view of organic performance on their end. On your end, it can create an obsession with individual keywords that ties your campaign to the time-absorbing process of “boosting” a fallen keyword instead of to high-level SEO strategy.
Given that keyword research and selection still underpin almost every well-executed SEO strategy, it can be difficult to explain why individual keywords are hugely important from a process standpoint, but not from a measurement standpoint. I’m going to attempt it in this blog post. Here we go!
On-Page Optimization: Where Your Keyword Strategy Begins
Whether you work with a strategist or perform SEO yourself, your on-page optimization strategy likely begins with keyword mapping. This is the process of performing keyword research to find the 3-5 keywords per page that are an optimal combination of search volume (high), competition/difficulty (low), and relevance (high). Once the keyword research is performed for your priority pages, you should have a “map” of keywords that provides insight into the architecture of your site and the way your content is (or will be) structured. This will help you fill in content gaps, reduce duplication, and decide if you need to split certain landing pages into distinct topics.
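If you like to see the idea in code, here’s a minimal sketch of what a keyword map boils down to. The page, the keywords, the volumes, and the difficulty scores below are all invented for illustration, and the scoring formula is just one reasonable way to blend the three factors - your research tool will have its own numbers and its own weighting.

```python
# A toy keyword map: score candidate keywords per page and keep the top 3-5.
# All numbers below are made up for illustration.
from dataclasses import dataclass


@dataclass
class Keyword:
    phrase: str
    monthly_volume: int   # search volume - higher is better
    difficulty: float     # 0-100 competition score - lower is better
    relevance: float      # 0-1, how well the phrase matches the page topic


def score(kw: Keyword) -> float:
    # One simple blend: reward volume and relevance, penalize difficulty.
    return kw.monthly_volume * kw.relevance * (1 - kw.difficulty / 100)


candidates = {
    "/blue-soap-dye": [
        Keyword("blue soap dye", 1900, 32, 1.0),
        Keyword("blue soap colorant", 720, 25, 0.9),
        Keyword("navy mica powder for soap", 480, 18, 0.8),
        Keyword("soap", 90500, 92, 0.3),  # huge volume, but too broad and competitive
    ],
}

# The "map": each page keeps its 3-5 best-scoring keywords.
keyword_map = {
    page: sorted(kws, key=score, reverse=True)[:5]
    for page, kws in candidates.items()
}

for page, kws in keyword_map.items():
    print(page)
    for kw in kws:
        print(f"  {score(kw):7.0f}  {kw.phrase}")
```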
We’re not expecting a page to rank for 3-5 keywords alone.
We use keyword research to set healthy targets and give us a framework for optimization that has the highest possible likelihood of paying off, but here’s the fun part: we’re not expecting the page to rank for those 3-5 keywords alone. Instead, you can think of each keyword as a bucket that holds up to thousands of related keywords, questions, and long-tail search queries. Aside from the obvious (paid vs. unpaid), this “bucket” phenomenon, called Latent Semantic Indexing, is perhaps the biggest difference between SEO and PPC, and it’s why PPC is better suited to measuring keyword performance - for now. So if you’re used to tracking PPC campaigns, you’re going to have to throw a lot of your logic out the window to understand organic performance.
What is Latent Semantic Indexing?
Latent Semantic Indexing (LSI) is a document retrieval technique that has been around since the 1980s (and “document retrieval” is a fancy way of saying “how to find things”). As it relates to SEO, LSI is still fairly new. It’s a method of search that uses context clues and synonyms to build complex word clouds of interrelated terms and ideas, so the search engine is in a better position to respond to each searcher’s behaviors and needs.
For example, let’s say you run an ecommerce store that sells dyes for soap-making, and one of your pages targets the keyword “blue soap dye.” If someone Googles the keyword “blue soap lake,” LSI allows the search engine to determine that your page is going to be the most relevant and useful one to serve to the user - even if you didn’t use the word “lake.” Further, if somebody Googles “blue lake powder for soap,” the context helps Google decide that even though the most common definition of “blue lake” is “the beautiful, glistening body of water where you’d probably rather be right now,” the user is searching for a soap colorant.
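If you’re curious what that looks like under the hood, here’s a minimal sketch of the classic technique (latent semantic analysis, the same family of math) using scikit-learn. The documents and the query are invented, and a five-document corpus is far too small to show off the quality of the results - it’s only meant to show the mechanics: build a term-document matrix, reduce it to a handful of “concepts,” and score a query against documents in that reduced space rather than by exact word matches. It is, of course, nothing like Google’s actual system.

```python
# A minimal latent semantic analysis (LSA/LSI) sketch with scikit-learn.
# The tiny corpus and query are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "blue soap dye for cold process soap making",
    "red soap colorant powder for melt and pour soap",
    "liquid blue dye pigment for handmade soap bars",
    "lake pigments and micas for soap and cosmetics",
    "hiking trails around the blue lake campground",
]

# Build a term-document matrix, then project it into a low-rank "concept" space.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

svd = TruncatedSVD(n_components=3, random_state=0)
X_lsi = svd.fit_transform(X)

# Score a query against every document in the reduced space, even though the
# query and the dye pages don't share the word "dye".
query = ["blue lake powder for soap"]
q_lsi = svd.transform(vectorizer.transform(query))

scores = cosine_similarity(q_lsi, X_lsi)[0]
for doc, s in sorted(zip(docs, scores), key=lambda pair: -pair[1]):
    print(f"{s:+.2f}  {doc}")
```

In a real corpus with millions of documents, it’s the co-occurrence patterns learned in that concept space that push the soap-colorant pages above the literal body-of-water pages.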
15% of the searches Google sees every day are new queries.
Why the shift to a more context-based approach, away from the straightforwardly keyword-driven Google of the past? Well, because of how people search today. Google processes trillions of searches per year, and despite that astronomical number, they still report that 15% of the searches they see every day are new queries. Let that sink in. Every single day, despite the fact that there are trillions of searches on record, people still somehow find a way to invent a staggering number of brand-new queries. So if Google processed 2 trillion searches last year, that means roughly 300 billion of them had never been searched before, and may never be searched again. 300 billion. I can’t even understand that number.
A searcher today is much more likely to use Google as an extension of their brain.
This doesn’t mean we’re all creative super geniuses who find completely novel things to look up every day. What people search for likely doesn’t change much, with the exceptions being the news (which is always, well, new), trends, and new products or developments. But how they search is evolving rapidly. A searcher today is much more likely to use Google as an extension of their brain, which means they ask the search engine full questions, use slang or colloquialisms, and copy-paste language they’re curious about. And on mobile, we’ve seen a monumental shift toward voice search, including the oft-garbled dictations to Siri or Alexa.
For example, I’m a perpetually anxious person who likes to look up every medical symptom I’ve ever experienced. Before, if I had a rash on my ankle, I could Google “ankle rash” and immediately regret my decision. Now, I can Google “will my ankle rash give me cancer or am I dying” to get the information I want. Over on WebMD’s end, let’s say they have an SEO specialist who’s measuring organic performance month to month. If that specialist were tasked with reporting on keyword rankings, they might set up “ankle rash” as one of their tracked keywords, which means they’ll be able to monitor the traffic that comes in through that keyword. But I didn’t find the ankle rash page via “ankle rash.” I found it via a different, completely obscure search that WebMD’s poor specialist would never have a reason to track.
I’ll Just Go Ahead And Track Priority Keywords Anyway. It Can’t Hurt, Right?
The measuring itself can’t hurt anything, and it can give you a vague picture of what’s going on with your site from a content standpoint, especially if you notice keyword clusters or collections. For example, if all of your top-ranking keywords are related to blue soap dye, you might be able to determine that the content on your red dye page needs work.
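That cluster check doesn’t need anything fancy. As a rough, made-up illustration, you can tag each ranking query with a topic and count them up:

```python
# A back-of-the-envelope cluster check: tag each ranking query with a rough
# topic and count. The queries and topic rules are invented for illustration.
from collections import Counter

ranking_queries = [
    "blue soap dye", "best blue colorant for soap", "navy blue soap dye",
    "blue mica powder for soap", "how to dye soap blue",
    "red soap dye",
]

TOPIC_RULES = {"blue dye": "blue", "red dye": "red"}  # topic -> substring to look for


def topic_for(query: str) -> str:
    for topic, needle in TOPIC_RULES.items():
        if needle in query:
            return topic
    return "other"


counts = Counter(topic_for(q) for q in ranking_queries)
print(counts)  # lots of "blue dye" queries, barely any "red dye" - the red page needs work
```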
The problem is, it’s too easy to let those observations sidetrack or alter your SEO strategy. If you take a look at your keyword data in Search Console and sort top to bottom by clicks, you’re probably going to notice some version of this: your top keywords will be branded. The next ten keywords will be unbranded searches that may look like your “money” keywords because of the amount of traffic they’re bringing to your site, but I’m going to guess that if you add up the traffic for those keywords, it’s going to be a very small percentage of your total organic traffic. Then you’ll see a smattering of keywords that bring in 100 visitors a month or less, and finally, at the bottom, a whole bunch of little keywords and long-tail searches that bring in one or two visitors each.
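You don’t have to take my word for it. Here’s a rough sketch of that sanity check using a CSV exported from Search Console’s Performance report - the column names (“Top queries”, “Clicks”) and the brand terms are assumptions on my part, so adjust them to match your own export:

```python
# A rough sketch of the long-tail pattern described above, run against a CSV
# exported from Search Console's Performance report. Column names and brand
# terms are assumed - change them to match your file.
import pandas as pd

BRAND_TERMS = ("acme", "acme soap")  # hypothetical brand terms - replace with your own

df = pd.read_csv("Queries.csv").sort_values("Clicks", ascending=False)

is_branded = df["Top queries"].str.contains("|".join(BRAND_TERMS), case=False)
total = df["Clicks"].sum()

print(f"Branded share of clicks:      {df.loc[is_branded, 'Clicks'].sum() / total:.1%}")
print(f"Top 10 unbranded share:       {df.loc[~is_branded, 'Clicks'].head(10).sum() / total:.1%}")
print(f"Queries with 1-2 clicks:      {(df['Clicks'] <= 2).sum()} of {len(df)}")
print(f"Long-tail (<=2 clicks) share: {df.loc[df['Clicks'] <= 2, 'Clicks'].sum() / total:.1%}")
```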
It’s very rare for an independent business to pull in a broader, 'big-ticket' keyword.
It’s very rare for an independent business to pull in a broader, “big-ticket” keyword that would change your traffic in a meaningful way with just a couple of rank changes up or down. Those keywords are reserved for much bigger companies with a much wider audience and an SEO budget that would make us mortals weep. The upshot of this is that it’s also rare for the drop of a single keyword to result in a meaningful loss of traffic, since you’re buoyed up by so many diverse search terms.
Think of your keyword strategy as a diversified stock portfolio. You could buy nothing but expensive Amazon stock, but if Amazon takes a hit, your entire investment is in jeopardy. However, if you spread your money throughout a wide range of stocks from different companies in different industries, then it doesn’t matter if you don’t have any “big ticket” stocks as long as the general trend is up. Any one of those companies can take a hit and you’ll still be in the safety net of the others. Can you imagine what would happen if you changed your entire investment strategy every time one stock had a bad month? That’s kind of what happens when you let keyword tracking rule your life.
So If Individual Keywords Don’t Matter, How Can I Have A Keyword Strategy at All?
They still matter! As a starting point, those 3-5 keywords per page are critical guideposts, and optimizing a page for strategically chosen keywords absolutely will make a difference. They’re just really hard to track or measure, because you can’t unbreak an egg: you can engineer the process that builds your keyword momentum, but once it’s working, the resulting traffic is spread across so many related queries that it’s difficult to reverse-engineer which keyword deserves the credit. Let’s just run with another egg metaphor and say your eggs are in a whole lot of different keyword baskets, and that’s a good thing. It’s smart, sustainable, and it allows you to focus on the big-picture strategy and avoid getting caught in a state of reactivity every time a keyword ranking changes.
You can maximize the chances that you’ll pick up traffic from conversion-driven long-tail queries.
Diversity is healthy. It’s healthy for investments, for gene pools, for communities, and for SEO. Does this mean you should just start writing gibberish on your site and hope to capture every long-tail search you can? No, because they still have to be the right long-tail keywords. If you cast the net of your content strategy in a strategic way, you can maximize the chances that you’ll pick up traffic from conversion-driven long-tail queries. And that’s why content strategy is paramount to organic success: it’s the “many eggs in many baskets” method of optimizing your site for keywords, and it leverages the billions of keywords that are up for grabs while the big players are busy competing at the top. It also happens to be a lot of fun.