In 2012, Google hired noted futurist Ray Kurzweil with the goal of bringing “natural language understanding” to the company’s search engine and other products. Today, the results of Kurzweil’s hiring are more evident than ever — often, a page doesn’t even need to contain all of the words in a query to show up on Google’s results page.
To illustrate this, try searching Google for “that movie where the minor league baseball players need to choose wedding presents.” Google understands the intent of your query and displays “Bull Durham” on the results page. This phenomenon isn’t limited to movie-related searches; Google has expanded its language understanding to the point where the search engine can handle longer queries than ever, even when users don’t know exactly what they’re looking for. These changes in Google’s search algorithm are hardly new. In fact, they have been brewing for several years.
Keyword-Based SEO: 1995-2011
In early search engines such as Excite and AltaVista, the method used to provide results for queries was rather simple: search engines looked for pages containing the same keywords used in users’ queries. If a page contained more occurrences of a specific keyword, it was deemed more relevant and was displayed closer to the top of the results page for that keyword. Publishers exploited this system by “stuffing” pages with repeated keywords, with no regard for the relevance of the content.
In 1998, Google launched with a more sophisticated technique for determining the relevance of search results: PageRank. Google’s search results were still keyword-oriented, but preference was given to websites that had a greater number of inbound links. Although Google provided search results superior to those given by older search engines, it was still possible to exploit its system through a combination of keyword stuffing and the creation of inbound links with anchor text matching the keywords for which the publisher wanted to rank.
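The core idea behind PageRank can be sketched in a few lines of code. The toy implementation below is purely illustrative, not Google’s production algorithm; the `pagerank` function and the four-page example site graph are hypothetical, but they show how a page with more inbound links earns a higher score than its neighbors.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> score, where pages with more (and
    better-scored) inbound links end up with higher scores.
    """
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}  # start with equal scores

    for _ in range(iterations):
        # Every page keeps a small baseline score...
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        # ...and the rest is passed along through outbound links.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

# Hypothetical four-page site: three pages link to "home".
web = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home"],
    "contact": ["home"],
}
ranks = pagerank(web)
```

Running this on the example graph scores `home` highest, since three pages link to it, while `blog` and `contact`, which no page links to, stay at the baseline.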
2011: Panda Puts the First Nail in the Keyword Coffin
By 2011, publishers with large budgets had built massively profitable empires out of exploiting Google’s algorithms. A website with sufficient PageRank could be displayed prominently on Google’s results page for nearly any keyword simply by building a page targeting that keyword. So-called “content mills” hired armies of freelance writers to create articles targeting millions of keywords and soon cluttered the results pages for many search queries.
The problem with this strategy was the fact that many of the articles produced by “content mills” were poorly researched and didn’t satisfy users. Google introduced the “Panda” algorithm in 2011 to combat this trend by penalizing websites with low-quality content.
2012: Penguin Combats Keyword-Based Link Spam
Before Panda, the typical strategy for ranking well in Google was twofold. First, the publisher created a page targeting a keyword. The target keyword was used several times in various areas including the title, headings, images and body text. Next, the publisher created or purchased links on other websites that pointed to the article and used its target keyword as the anchor text. With a large budget and website portfolio, a “link spammer” could manipulate Google’s search results without earning a single natural inbound link.
Google introduced the “Penguin” algorithm in 2012 to combat this by penalizing websites that used unnatural inbound links to manipulate search results. Google also further penalized websites with low-quality content by lowering the rankings of sites whose large advertisements forced users to scroll down to reach the content they wanted.
Today: Google Focuses on Intent, Not Keywords
With the release of the “Hummingbird” algorithm in 2013, Google’s investment in hiring Ray Kurzweil finally came to fruition. The purpose of the Hummingbird algorithm was to increase Google’s understanding of natural language queries and answer those queries based on the user’s intent. Rather than simply focusing on a few important words, Hummingbird tries to understand full sentences. Thanks to Google’s new semantic capabilities, the search engine can even handle queries spoken into a smartphone.
So, Are Keywords Outdated?
No, keywords aren’t entirely outdated and shouldn’t be completely discounted when developing a content marketing strategy. After all, Google’s index is text-based and will remain so for the foreseeable future. In order for Google’s algorithm to understand what your articles are about, you’ll need to use the appropriate keywords.
However, the SEO landscape has changed to the point where common SEO tactics such as stuffing keywords in a page’s URL and sub-headings have become outdated. Google’s understanding of language has evolved so that the search engine understands the topic of an article regardless of keyword usage.
When developing your website’s content for the current search engine climate, it’s time to stop focusing on how you use keywords. Instead, think about what users’ intent might be when they search for those keywords. Are they looking for a little background information or a comprehensive overview? Are they comparing products, or are they ready to buy now? Writing with the user’s intent in mind won’t just help you develop better content; it’ll improve your search engine rankings as well. When you focus on writing great content, the keywords will fall into place naturally.
Modernize Your Content Strategy
Do you need help developing new content and updating your existing content to improve your search engine rankings and better serve your users? Black Fin has helped law firms across the country dominate their markets since 2007. Contact us today by calling or filling out our contact form to find out how we can help you.