2012 has been an interesting year for SEO. It’s been more than a full-time job trying to keep up with all the algorithm changes. Google wants to rein in the number of people manipulating its algorithm and also improve search results and quality for the end user. However, the Pandas and Penguins have made the World Wide Web a difficult place to be.
Google has changed its search algorithm more than 500 times over the past year, and while most of the changes were very minor, some of the big ones had website owners scrambling to cope.
Let’s take a look at some of the major changes to the search algorithms that happened this year:
1. Search + Your World – January 10, 2012
Google announced a drastic change in personalization by aggressively pushing social data and user profiles from its own social network – Google+ – into the SERPs. This shift also helped Google drive more user activity on Google+, which in turn promoted its social network. Here is the official post.
As a strategy, we aggressively started creating Google+ business pages for our clients. Apart from this, we also performed the activities listed below.
- Adding related people / businesses to Google Plus circles
- Integration of Google +1 button on the website
- Gathering Google +1 votes via Google Search
- Posting content regularly on Google Plus business page
These activities were later added to our core deliverables, and working on them immensely helped promote our clients’ websites in personalized search.
2. Ads Above The Fold – January 19, 2012
Google announced a change in its page layout algorithm to devalue sites with too much ad space above the “fold”. It was initially suspected to be a Panda refresh since it was directly related to visible content on the web page. The update had no official name, though it was referred to as “Top Heavy” by some SEOs. Here is the official post.
As always, we focused on creating unique, search-engine-friendly content and, most importantly, optimized pages by placing content above the fold.
3. Venice Update – February 27, 2012
As part of their monthly changes, Google mentioned code-name “Venice” which was confirmed in this official post. This update appeared to be more aggressively localizing organic results and more tightly integrating local search data.
With this change, Google started displaying the local 7-pack as well as locally optimized websites from the region where the query was searched (even for generic searches). For example, if you search for “cosmetic dentists” with the location set to Chicago, IL, Google will display the local 7-pack as well as optimized local websites from that particular area. Local optimization factors became more important than ever. After in-depth research into local search optimization factors, we developed strategies for locally targeted websites that gave them a fair chance of appearing in these results.
We cover a comprehensive Local Search Audit Checklist later in this newsletter.
4. Panda Refresh – March 23, 2012
Google announced another Panda update, this time via Twitter as the update was rolling out. Google’s public statements estimated that Panda 3.4 impacted about 1.6% of search results. Throughout 2012, Google announced 12 Panda refreshes affecting a fair percentage of sites; the latest occurred on November 5, 2012.
Here are some recommendations to protect websites from the Panda update:
- Content Freshness – Make sure to update the site at regular intervals with new pages, blog posts, etc.
- Number of Pages – The site should have a healthy number of pages; the previous point should help this cause.
- Site Speed – Although this is often ignored, make sure the site loads quickly. We feel the 2-second limit set by Google is a bit harsh; we recommend keeping the load time under 8 seconds.
- Thin Content – Do not use hidden content or content with tiny font size.
- Avoid duplication of text and page titles. Make sure they are unique.
- Avoid keyword stuffing and hiding.
5. Penguin Update – April 24, 2012
Google finally rolled out the “Webspam Update”, which was later renamed “Penguin”. Penguin targeted a number of spam factors, including spammy link building, over-optimized websites, and keyword stuffing, and affected websites globally. Here is the official post.
Below are some of the preventive measures you can take to avoid being hit by the Penguin update:
- Do not stuff keywords on pages.
- Do not use spun or duplicate content. The content should be at least 85% unique and should be human readable.
- Since social signals are given some importance, make sure to link your website to your Google+ profile using rich snippets. Facebook likes, shares and Twitter retweets might also give a boost.
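As a reference for the third point, linking a site to its Google+ page was typically done with a rel="publisher" link in the page head. A minimal sketch (the profile ID below is a hypothetical placeholder):

```html
<!-- In the <head> of every page; the Google+ profile ID is a placeholder -->
<link rel="publisher" href="https://plus.google.com/111111111111111111111" />
<!-- For individual authors, a rel="author" link to their profile works similarly -->
```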
Although a full recovery is not possible after this update, we recommend taking the measures below if your site has been affected by Penguin:
- Build more positive / quality links. This will lower the proportion of negative links in your profile. Make sure to use a combination of both generic (branded) and keyword-targeted anchor text.
- If the inner pages are affected, copy content from the Penguin affected page to a new page. Start building quality links to the new page. Obviously, disallow the old page from indexing.
- If the home page is affected and nothing seems to be working, the only option is to create a new site and start link building for this new domain from scratch.
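One simple way to keep the old, Penguin-affected page out of the index, as the second point suggests, is a robots meta tag; a sketch:

```html
<!-- Placed in the <head> of the old, affected page -->
<meta name="robots" content="noindex, follow" />
```

Note that blocking the URL in robots.txt instead would stop crawlers from ever seeing the noindex directive, so the meta tag is usually the safer choice here.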
Throughout 2012, Google announced 2 more Penguin refreshes affecting some percentage of sites. The latest update occurred on October 5, 2012.
6. Domain Diversity and Shrinking First Page Results – August 14, 2012
Google made a significant change to the top 10, limiting it to 7 results for many queries; in some cases, 5 to 6 of those results were from the same domain. Google later took care of the latter issue, but even industry experts like Danny Sullivan and Dr. Pete criticized this discrepancy by Google.
7. Exact Match Domain (EMD) Update – September 27, 2012
Google announced a change in the way it was handling exact-match domains (EMDs). This led to a large-scale devaluation of keyword-rich domains. Although the change aimed to target low-quality sites that might be ranking simply on the basis of an exact match, some sites unfortunately dropped in ranking due to factors that had nothing to do with this algorithm update (and were only coincidentally EMDs).
A full recovery might not be possible. However, according to some trusted resources, the following steps can help bring rankings at least partially back:
1. Take enough time to rewrite the low-quality content on your site and make sure it meets the quality guidelines.
2. Make sure the pages are well optimized. Do not stuff or spam, and avoid excessive on-page optimization.
3. Work on getting quality backlinks for your domain. However, choose the anchor text carefully this time: use variations, and do not overuse the “exact match” keyword.
4. Once the above steps are complete, file a reconsideration request with Google using your Webmaster Tools account.
5. Finally, if the above steps do not work out, the last option is to get a new domain (a branded one this time), perform a proper 301 redirect from the old domain to the new one, and then follow steps 1 to 3 for the new domain.
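On an Apache server, the domain-level 301 redirect mentioned above can be sketched in the old domain’s .htaccess file; the domain names here are hypothetical placeholders:

```apache
# .htaccess on the old (EMD) domain – permanent redirect to the new branded domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-emd-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newbrand.com/$1 [R=301,L]
```

The R=301 flag tells search engines the move is permanent, which is what passes the old domain’s signals to the new one.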
As far as domain recommendations are concerned, avoid exact-match or keyword-rich domains. Unless the domain is well established with high DA (Domain Authority), PA (Page Authority), PageRank, and a strong link profile, it is a good idea to avoid these types of domains and go with a partial-match or branded domain.
Remember, Search Engine Optimization is still very important but far too often, website owners get caught up in black hat tactics and they forget the purpose of it all. Even if you move your way up the search results with the right techniques, you won’t keep human readers around or convert visitors to genuine customers unless you have something of value to offer.
Finally, as Google says – always think from the end user’s perspective!
When you initiate a new local optimization strategy or review an existing project, it can be quite difficult to know where to start. This is exactly why we have decided to share our comprehensive local search audit checklist, so you avoid missing the important aspects along the way.
The following points can be used for a comprehensive review of any business’s local search presence and will help you generate a concrete plan of action to fix most problems. We deliberately made this checklist a bit generic so that it applies to the majority of businesses you come across.
1. NAP Accuracy & Consistency
The older the business, the more likely its NAP (Name, Address, Phone) information is to be inconsistent. The more irregularity there is, the more effort you will have to put in, and the longer success will take.
Questions to Answer:
- How much clean-up is needed?
- Does the address use correct USPS address format?
- Get an entire list of old phone numbers, old business addresses, old domain names. Go back at least 10 years (if applicable) for this information. You will need to review all of these to trace irregular NAP citations.
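Tracing irregular NAP citations against a ten-year list of old numbers and addresses is tedious by hand. A rough sketch of how one might normalize name and phone strings before comparing citations (our own illustration, not a standard tool):

```python
import re

def normalize_phone(phone):
    """Reduce a phone number to its digits so formatting differences don't matter."""
    digits = re.sub(r"\D", "", phone)
    # Keep the last 10 digits so a leading US country code doesn't cause a mismatch
    return digits[-10:] if len(digits) >= 10 else digits

def normalize_name(name):
    """Lowercase, strip punctuation, and collapse whitespace for loose matching."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\s+", " ", cleaned).strip()

def inconsistent_citations(reference, citations):
    """Return the citations whose name or phone doesn't match the reference NAP."""
    ref_name = normalize_name(reference["name"])
    ref_phone = normalize_phone(reference["phone"])
    return [c for c in citations
            if normalize_name(c["name"]) != ref_name
            or normalize_phone(c["phone"]) != ref_phone]
```

For example, "ACME Dental Inc" with "+1 312-555-0100" would match a reference of "Acme Dental, Inc." at "(312) 555-0100", while a citation carrying an old phone number would be flagged for clean-up.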
Typical problem areas that lead to inconsistent information on the web:
- Company / business is big and/or older.
- Business has multiple locations.
- Aggressive local search marketing (by unprofessional SEO companies or in-house marketers).
- Use of call tracking (for Internet Yellow Pages – IYP advertisers).
- Business has moved locations, changed phone number or uses many different numbers.
Useful Tools to check for NAP information:
- Whitespark – (be sure to enter all phone numbers)
- USPS Address Listing Checker – returns accurate NAP format
2. Online Profile & Citation List for Businesses
Incomplete listings equal more work. With multi-location businesses, you need to deal with this via data providers and even do bulk uploads in Google and Localeze for multiple locations. Spend some time monitoring and reporting duplicate listings. You also need to understand that a citation list for a local business is as important as an authoritative directory list is for a website.
Questions to Answer:
- Do they exist in the right places?
- Are they complete?
- Are there duplicate listings on Google Maps?
- How do their citations compare to competitors?
Useful Tools to Check Local Business Profiles:
3. Check for Duplicate Listings
The more duplicate listings a business has, the more effort it will take to get rid of them. The process to identify all the listings for each business is outlined below.
Steps to Identify Duplicate Listings:
- Sign out of all accounts and switch to incognito browsing / private browsing mode.
- Set location to location of client and go to Google Maps search.
- Search name(s), address(es) and phone number(s). Be sure to put them in quotes.
- Be sure you also use that list of all the business’ old name(s), address(es) and phone number(s) when searching for duplicate/old listings.
- Be attentive to what Google suggests in the Maps search box as well.
4. Current Rankings & Competition
It’s very important that you research and understand the competition so that you can set realistic expectations with respect to results and time frames. In simple words – tougher competition equals more work.
Questions to Answer:
- How do they rank in maps search results for their top keywords?
- How do they rank in the Organic search results for their top keywords?
Useful Tools to Check Competition Data:
- OpenSiteExplorer.org – Compare domain authority and links
- whitespark.ca/ – Compare their citations to those of the top rankers for their best term(s).
- Places Scout
- SEOmoz.org – Rank tracker
5. Best Practices for Local Optimization
If you think working only on map listings and citations will get you to the top, you are most probably wrong. If you are not willing to make changes on the website, you may not be able to get into the Local Pack or move up to its top positions, depending on the competition.
Questions to Answer:
- Are you using local best practices?
- How does domain strength compare to top rankers?
- How do links compare to top rankers?
Things the website must have:
- Categories/subcategories, location, and business name in page titles. A generic format for titles can be: What? Where? Who?
- Place NAP in Schema.org format on every page (per location).
- Implement rich snippets for each location.
- Use location terms on pages (H1’s, text content, image alt tags) and in internal links. Remember these are considered keywords for organic search too.
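A minimal sketch of NAP marked up in Schema.org microdata, as recommended above (the business details are hypothetical):

```html
<!-- Hypothetical local business; place one block like this per location -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Cosmetic Dentists</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 W Example St</span>,
    <span itemprop="addressLocality">Chicago</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">60601</span>
  </div>
  <span itemprop="telephone">(312) 555-0100</span>
</div>
```

Keeping the marked-up NAP identical on every page (per location) reinforces the consistency discussed in point 1.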
Points to remember while handling multiple locations:
- A unique landing page for every location, with NAP in Schema.org microdata format.
- Footer links to (a few) location landing pages. The rest can be added to the HTML sitemap. The footer links should just be added on the home page (and not the inner pages) to avoid keyword spamming.
- Link from Google+Local page to appropriate location landing page.
6. Customer Reviews
Customer reviews are vital to any business that wants long-term success online. Establishing where a company stands with its reviews is a very important step.
Questions to Answer:
- Do you have enough Google reviews to compete in your niche?
- Do you have an active profile on InsiderPages, CitySearch, Yelp, Yahoo Local, Foursquare, etc., with reviews?
- Are the reviews mostly positive or negative?
- What other online profiles do you have in place?
Useful Tools to Check Reviews:
- getlisted.org shows review numbers and diversity.
- Google+Local pages show review numbers and diversity.
So now you have not only a series of questions to get started with, but some very good procedures as well to complete a thorough Local Search Audit.
In October, Google finally released its official link disavow tool, integrated into Webmaster Tools, which allows site owners to discount shady links pointing at their websites. But as with all things in SEO, it is not quite that simple. The biggest problem with disavowing your links is that, at first glance, it looks like an easy task. Instead of making the effort to remove “bad” backlinks manually, there will be people who assume this tool will do it for them.
Penguin-penalized websites are the main users of the disavow links tool, but because it’s so user-friendly, people may be too quick to jump on the disavowing bandwagon.
If you haven’t actually been penalized and you start disavowing your links, you’re likely to send a signal to Google that you manipulated their system. Make sure that you know you were penalized and it’s not just some random fluctuation in rankings, a sitemap or indexing problem, or an accidentally no-indexed page.
With the disavow links tool, you’re telling Google what you think are the links that might be spammy, but unfortunately there is no way to know if a link is actually hurting you or not.
Remember – with the disavow tool, you are giving Google more power. We believe Google will use this tool to compile a list of domains that come up frequently in disavow submissions. The next step would be a manual review of these compiled domains and, if deemed necessary, a discrediting of them.
So, what can definitely be considered a “bad” link, or a link that should be disavowed?
We conducted several internal sessions and had some lengthy debates on this contentious topic. After in-depth research on the subject, we have come up with the top 10 types of links you should disavow (assuming you are not able to get them removed even after several attempts):
- Site Wide Footer Links or Multiple Links from the Same Domain
- Blogroll Links
- Profile Links Generated through Forum Postings
- Links from Free Directories (which are often of low quality)
- Links from Same Class C IPs
- Links from Spun Content
- Links to Different Industry Websites from the Same Content especially Article Websites and Blogs
- Links from Link Farm Directories or Pages
- Links from Non-English Websites (for example – .ru, .jp, .cn, etc.)
- Blog Commenting – Multiple Comments with Anchor Texts Pointing to Non-Family Safe Sites (Adult, Casino, Pills, etc.)
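For reference, the disavow file itself is a plain .txt upload: a line starting with `domain:` drops every link from that domain, a bare URL drops a single link, and lines starting with `#` are comments. A sketch with hypothetical domains:

```text
# Contacted the site owner on 10/1/2012 and 11/1/2012 – no response
domain: spammy-link-farm-example.com
domain: free-directory-example.net
# Individual spam comment we could not get removed
http://blog-example.org/post-123#comment-456
```

Documenting your removal attempts in the comments also helps if you later file a reconsideration request.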
If you are looking for a quick fix to help boost your SERP rankings after your site is hit by the Penguin update, the truth is that the Google Disavow Tool just isn’t going to help you.
The value of the tool is obvious, especially for genuine websites trying hard to remove the problematic links, but it does not mean that your website will recover overnight. Disavowing links will take time and genuine effort, but if you can’t get them removed yourself, the long-term benefits of using the Disavow Tool should surely be worth the work.
Google is proposing to layer the social-ness of Google+ into every aspect of its search and into every possible Google product. They will have a standing profile for every person and business that details everyone you’re associating and sharing with as well as how you interact with other people, brands and businesses.
At the moment, people simply do not have time to add yet another social network to their already busy lives, but by making “Plus” matter in search results and rankings, Google will force nearly every business that wishes to succeed online to participate. The internet is evolving at a rapid pace, and companies that ignore Google+ risk being left behind.
Google has truly remarkable plans for Google+Local for businesses. It intends to make a company’s Plus page the hub for everything it does or ever hopes to do online. In addition to the social sharing features, consider what Google already has planned for local businesses on Plus:
- Business listing management (what we’re used to doing in Google Places)
- Statistics dashboard (hopefully, with full Google Analytics integration)
- Deals and coupons (mobile coupons are already a reality)
- Contests & Games (via recently-purchased Wildfire)
- Events (including Google Calendar integration – invites, RSVPs, etc. which has already begun)
- Offline buyer loyalty reward program (Punchd)
- PPC Ads (AdWords Express)
- Integrated online payment system (Google Wallet)
- Post-purchase product delivery from local stores to buyers
- Customer service (via Video Chat, TalkBin SMS, Circles, HangOuts)
- Reviews (already integrated)
Google first announced that it would release much of this functionality by September 2012, but for such an ambitious undertaking, this was apparently an overly optimistic goal.
That said, there were some issues as well. For example, many businesses created Google+ Brand or Google+ Organization pages for their businesses when they should have created Google+Local pages instead. Page types cannot be changed, and trying to tie these to Places listings is now causing undue frustration.
Google is still working on the first piece for local businesses – integrating Places listings with Google+. This will be done automatically in phases. If you try to force it and it does not go through, you are very likely to create more problems for yourself.
At the moment, the only businesses that should be merging the profiles are businesses that go to the customer, like plumbers, electricians and carpet cleaners with a single location that have set up a Google+Local page for their business using the Google account that holds their Places listing.
As a final point, new business listings should be created and verified directly at Google+ and the page type option should be Local.
Last month, Dan Petrovic from DejanSEO carried out a successful experiment in which he hijacked a few pages in Google, getting his copied version to show over the original version of the page!
For example, he was able to confuse Google to display a page hosted on his website dejanseo.com.au instead of the original page located at marketbizz.nl
Strange, but how did he do it?
He simply copied the full page (source code and everything) and put it on a new URL on his site. He pointed some links to the page and even +1’d it, and the swap took effect days later. He literally baffled Google into believing that a relatively fresh page should appear in place of the original, even though it sat on another domain. Here is a picture of Google’s search results for the page, using the title of the page:
The same thing was repeated for three other domains with varied success. In some cases, using a rel=canonical tag seemed to prevent the result from being fully hijacked, but not in all cases. Dan Petrovic was even able to hijack the first result for Rand Fishkin’s name! Check out the screenshot below:
The way this seems to work is that Google’s duplicate content system feels that the new URL is the more important page and thus replaces the original page with the more important page even if it is on another domain.
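The rel=canonical tag that seemed to block the hijack in some cases is a one-line hint in the page head; a sketch with a hypothetical URL:

```html
<!-- Tells search engines which URL is the authoritative version of this content -->
<link rel="canonical" href="http://www.example.com/original-page/" />
```

It is only a hint, not a directive, which is consistent with Dan Petrovic’s finding that it did not prevent the hijack in every case.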
This is where Authorship can help. Once Author Rank comes into effect, it’s going to change the way content providers are ranked. These changes may be a concern for those relying on spammy linking schemes and unwanted content (like this, for instance), but people providing valuable content will enjoy the benefits. Author Rank will allow for a more trustworthy and efficient way of ranking your articles or content in the search results.
A month ago, Matt Cutts released a video mentioning how guest blogging can be a good method for acquiring quality links. More recently, however, he created another video about spammy guest blogging and how Google will take action against it.
Some key points from the video:
- Do not involve yourself in posting on guest blogs with low-quality or spun content, since that is a pretty bad indicator of quality.
- Because it is so easy to do, guest blogging is being manipulated by spammers.
- Guest blogging on, and getting links from, such spammy blogs can in fact hurt your site’s reputation.
- As the site owner or as the person who’s trying to get links, you have to think about the quality of the links, the quality of the content, the amount of work that is put into it, and essentially whether users are going to be happy if they land on that page.
- Get links from such blogs only if they provide value to the users through high quality unique content.
The release of the Bing Webmaster Guidelines (finally) came as big news to the SEO community. The Guidelines help webmasters and publishers get their content found and indexed by the search engine.
With the release of the new Bing Webmaster Guidelines, the company is committing to being a webmaster-friendly service that keeps their needs top of mind, and although the Guidelines aren’t much different from Google’s, they do provide a great refresher on how Bing crawls, ranks and indexes sites.
Let’s take a look at some important points:
Webmasters must provide “clear, deep, easy to find content” on their sites that make it more likely to be found, and ultimately indexed and shown in search results. This also includes images, white papers and videos, among other types. Rich, content-heavy sites that engage users and provide valuable and refreshed information are always the most sought after.
Bing frequently points to social sharing signals as a major way that the search engine measures influence. And since Facebook has significantly more users than Google+, Bing may very well become the standard search engine for socially inclined search results.
Bing explains that the best ways to get indexed by the search engine are to either make sure your content is linked so Bing can find it, or to use features that come with Bing Webmaster Tools (e.g. Submit URL or Sitemap Upload) to make the engine aware of your content.
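A minimal XML sitemap of the kind you would upload via Bing Webmaster Tools might look like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-12-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2012-11-15</lastmod>
  </url>
</urlset>
```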
This section of the Bing Webmaster Guidelines examines a variety of different structural aspects of a website that can affect how they are crawled, indexed and eventually ranked on Bing’s SERPs. These include page load time, robots.txt, sitemaps, site technology, redirects and canonical tags.
SEARCH ENGINE OPTIMIZATION
This section identifies the key areas webmasters should emphasize when optimizing their websites, which include title tags, meta description tags, alt tags, <h1> tags, internal links, outbound links, social sharing, “crawlability” (i.e. XML sitemaps, robots.txt, navigational structure, URL structure, etc.), site structure (i.e. links, clean URLs, content hierarchy, etc.) and rich media.
It also breaks down specific on-page SEO information about head copy, body copy, anchor text, content and links. For instance, it suggests using unique, relevant titles and descriptions that are around 65 and 160 characters, respectively. The Guidelines also inform webmasters that they should only use one <h1> tag per page, base their content on keyword research, don’t use images to house content, use targeted keywords as anchor text to support other internal pages, use “rel=canonical” tags to help engines understand which pages they should index and much more.
TACTICS TO AVOID
The Bing Webmaster Guidelines also mention a handful of tactics that webmasters should avoid. In particular, the Guidelines warn against cloaking, link-building schemes (i.e. link farms, three-way linking, etc.), meta refresh redirects, duplicate content and social media schemes.
While the suggestions laid out in the Bing Webmaster Guidelines aren’t revolutionary, they do work as a great source of information for webmasters as they work to optimize their sites for search engines, particularly if they want to improve their Bing rankings.