Recent changes to Google search have improved the quality of search results, helped to better detect black-hat spam, tightened up PageRank guidelines, improved detection of hacked sites and provided a better support system for webmasters.
Changes made to Google Search
In his YouTube video announcing the release of Penguin 2.0 and the SEO changes made to Google search in late May, Matt Cutts, head of the search spam team at Google, discussed how searches will become more accurate and user-friendly for anyone who uses Google as their main search engine. For example, Google has developed a function that makes it less likely for users to see results from the same domain name once that domain has already appeared in the results three to four times. After a cluster of approximately four results from a specific domain, subsequent pages will be less likely to show that domain again.
These types of changes, along with many others, are designed to make Google’s interface easier to navigate for the general public and webmasters.
Below are the changes that have been made:
- Penguin Update: The next generation of Penguin was deployed (Penguin 4, a.k.a. Penguin 2.0). Penguin is a webspam change dedicated to locating black-hat webspam, targeting it and eliminating it. Penguin 2.0 “goes a little bit deeper and has a little more of an impact” than its previous version.
- Advertorials: A stronger focus has been placed on advertorials and native advertising that violate Google’s quality guidelines. According to Cutts, if coverage or an ad is paid for, it shouldn’t flow PageRank; some sites take money, link out to websites and pass PageRank anyway. In response, Google has strengthened its guidelines to require clear and conspicuous disclosure when something is paid for rather than organic.
- Spammers: Google has figured out new ways to go upstream to deny value to link spammers and make their techniques less effective. Cutts also noted that some queries tend to be spammy in nature, such as [pay day loans] or certain pornography-related queries, and that Google’s search spam team was less likely to target these in the past.
- Hackers: Google is developing the next generation of hacked-site detection software, which is more comprehensive and communicates more efficiently with webmasters. The aim is that once someone realizes they’ve been hacked, they can go to a one-stop shop in Webmaster Tools that points them in the right direction.
- Authority: Websites that are recognized leaders in a specific industry or area (such as travel or finance) will receive an “authority boost”, making them more likely to appear in related queries and driving more traffic to those sites.
- Clarification: Google has also been trying to find additional signals to refine its assessment of websites in the “grey zone” for quality, in order to decide whether they are candidates for the high-quality pages presented in Google search results.
The main focus of these recent updates was to get more information to webmasters: more concrete details, more information for sites that have been hacked, better ways to assess problems, and more example URLs that webmasters can use to diagnose their sites.
The History of Domain Searches in Google
According to Marketingland.com, Google’s Penguin algorithm initially launched in April of 2012, and the Panda algorithm launched a year earlier, in February of 2011. The difference between the two programs is that Panda impacts a larger percentage of search results, while Penguin is designed to target specific links pointing to a page. Penguin has had the greatest impact on SEOs, but only affected 3% of search results, as opposed to Panda’s 12%. The recent release, on May 22nd, 2013, impacts only 2.3% of queries, but is much more refined.
The goal of Google search (and of the development of Panda and Penguin) has always been to provide the most diverse results while delivering the highest-quality results that prove useful to the searcher, according to Cutts. Older Google searches had no restriction on how many results from the same domain name could appear per page, until Google added “host clustering”, which prevented more than two results per domain name in the results.
This was soon circumvented by webmasters who placed content in subdomains, so Google relaxed host clustering to show three to four results per domain per page. Then it changed the search to show more diversity on the first page and less on the following pages. Now, Google has made a change to show fewer results, even on the following pages, from a domain that users have already seen approximately four times.
Newest Developments in Google search
Recently, Cutts answered questions at the Search Marketing Expo 2013 (SMX Advanced) in the popular “You&A” format, where he addressed specific questions from the audience on the newest Google search updates. He discussed how more than 500 algorithm changes are made every year, and how it has always been difficult to decide what to share with the public. He also noted that Penguin 1 affected entire sites, whereas Penguin 2 targets more specifically and can impact individual pages.
In terms of penalties, Cutts discussed how Google plans to send example URLs manually; these will contain two or three examples and show webmasters exactly what is wrong and what needs to be changed. If there is a change or update that will directly affect a webmaster’s site, they will be notified through Google Webmaster Central.
Cutts also sat down for an interview with Eric Enge of Stone Temple Consulting in early July to discuss link building and how to approach web marketing today. He stated that not all link building is bad; if the quality of the content is good, then more people will be attracted to it. Most of the time, it is approached in the exact opposite way: people try to get high traffic before creating quality content.
The theme that emerged from the interview is that “SEO is moving steadily towards becoming more of a traditional marketing-type discipline”, according to Enge. Cutts stated:
“SEO used to be this thing that people thought of as happening in the dark corners of the web where you could do anything you wanted and magically, you’d get more search traffic. More importantly, they thought they could do anything they wanted and it wouldn’t impact their brand. For example, they could publish crappy content, and since no one ever read it, it didn’t matter”.
Enge supports Cutts’s statement, saying that many people do indeed focus on getting links to increase their Google search rankings, instead of focusing on creating a high-quality website that visitors will want to tell their friends about, which is the secret to better, more reliable search rankings. Enge stated:
“There are a number of elements, but let’s talk about the concept that a good content marketing plan is using multiple channels for communication, like social media channels, content on a brand’s own site, guest posting on authoritative sites, giving and getting interviews and so on”.
Let’s be honest: SEO matters primarily for Google rankings, and Google has very particular ways of knowing whether you’re important enough to be ranked. In Copyblogger’s e-book “The 10 Rock Solid Elements of Effective Online Marketing”, the authors point out that Google knows when and how many people link to you, and uses those links to index and rank web pages.
Google also has a massive amount of data that shows exactly what you spend time doing online, thanks to programs like Google Toolbar, Analytics, FeedBurner, Google Reader, Gmail and many more that keep you logged in to your Google account. All of this data is used to determine whether the content you are creating is worthy of higher rankings in Google search. Matt Cutts states in the e-book, “To rank well, build a site so fantastic that it makes you an authority in your niche”.
With constant updates and new methods of enhancing the web for users and webmasters, SEO grows stronger and gains value in the list of tools used to drive higher traffic. And with the focus shifting towards content marketing, new steps must be taken in order to succeed.
Trevor Micklow is a business writer and content curator based out of Chicago, IL, US. He specializes in digital strategies, social media, psychology, executive education and business school related topics. He has been working on and coordinating the general content of IntelligentHQ’s business school directory, which gives key information and programme details on the top business schools in the world. He has a BS in Psychology from Central Michigan University.