How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Its Panda updates, now almost three years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 rolled out just last month and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients’ sites. Panda is always evolving, but it typically penalizes:

  1. “Thin” content: If you heard “thin is in,” think again: Google DISLIKES pages with little content. Before Panda, the common recommendation was that articles run around 250 words. After Panda, that guideline rose to a minimum of 450 words, and some studies have since shown Google favoring pages of 1,000 words or more! Of course, you shouldn’t sacrifice readability to meet such a quota: Keep content easy to browse and skim.
    How do you Panda-proof content? Build pages out to 450-1,000 words. Where that’s not possible, try consolidating content onto fewer, stronger pages. And don’t forget to 301 redirect the old locations to the new URLs (see the redirect sketch after this list)!
  2. Duplicate content: Google doesn’t like to find two pages that say the exact same thing. Google doesn’t like to find two pages that say the exact same… well, you get the point. It’s easy for sites to accidentally expose duplicate content to search engines: Tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs such as:

    http://hyperdogmedia.com/index.html
    http://www.hyperdogmedia.com/index.html

    This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one, but onsite links to another?
    Never fear, there are easy fixes:
    a. Block Googlebot from finding the content: Check and fix your internal links so Google doesn’t discover duplicate content during crawling, and use robots meta tags with a “NOINDEX” value and/or a robots.txt file (see the sketch after this list).
    b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location and are usually not the right solution for duplicate content issues (a minimal server configuration sketch follows this list).
    c. Canonical tags can also help. These tags help Google sort out the final, canonical URL for the content it finds. Where the same content lives on multiple websites, canonical tags are still the solution: They work cross-site! (See the one-line example after this list.)

  3. Sitemap.xml files in disarray: Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:
     – Define their pages’ modification dates
     – Set priorities for pages
     – Tell Google how often each page is usually updated
     Here we can define exactly what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility: For webmasters who submit (or leave submitted) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire.
    The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot (a bare-bones sample follows this list)!
    a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
    b. Are the URLs good-quality content, or is your sitemap.xml file filled with thin, duplicate, and missing pages?
    c. Also check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file there?
  4. Large numbers of 404 errors and crawl errors: The sitemap.xml file is just a starting point for Google’s crawling. You should certainly have your most valuable URLs in there, but know that other URLs will be crawled as well. Watch carefully in Google Webmaster Tools for crawl errors, and use other crawling tools such as Moz.com to diagnose your website (a small crawl-check script follows below).

Preparing your site for future Panda updates requires thinking like Googlebot. And once a website is in “tip-top shape,” ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can’t afford to rest!
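
To make the fixes above concrete, here are a few quick sketches. The domains, paths, filenames, and dates in them are placeholder examples, not values to copy verbatim.

First, the 301 redirects from tips 1 and 2b. How you create one depends on your server; this is a minimal sketch for an Apache server using an .htaccess file (it assumes mod_alias and mod_rewrite are enabled, and the page names are hypothetical):

    # Permanently redirect a retired thin page to the consolidated article.
    Redirect 301 /old-thin-page.html /consolidated-guide.html

    # Collapse the homepage onto a single hostname so only one version
    # of each URL is crawlable (requires mod_rewrite).
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^hyperdogmedia\.com$ [NC]
    RewriteRule ^(.*)$ http://www.hyperdogmedia.com/$1 [R=301,L]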
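
Next, blocking Googlebot (tip 2a). A robots meta tag and a robots.txt file work at different stages: robots.txt stops Googlebot from crawling a URL at all, while a NOINDEX meta tag lets the page be crawled but keeps it out of the index. The paths below are hypothetical examples:

    <!-- In the <head> of a duplicate-prone page, such as an internal
         search results page: -->
    <meta name="robots" content="noindex, follow">

    # robots.txt at the site root: keep crawlers out of URL patterns
    # that generate duplicate content.
    User-agent: *
    Disallow: /search
    Disallow: /tag/

Note that a page blocked in robots.txt can never be recrawled to reveal its NOINDEX tag, so choose one mechanism per page rather than stacking both.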
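
The canonical tag from tip 2c is a single line in the <head> of every duplicate variant, pointing at the one URL you want indexed; it works across domains, too:

    <link rel="canonical" href="http://www.hyperdogmedia.com/">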
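
For tip 3, a healthy sitemap.xml can be as simple as the sketch below, which shows the optional modification date, change frequency, and priority fields (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.hyperdogmedia.com/</loc>
        <lastmod>2014-06-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <!-- One <url> entry per page you want indexed; leave out thin,
           duplicate, and removed pages. -->
    </urlset>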
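
Finally, for tip 4, you can catch 404s before Googlebot does by recrawling every URL in your own sitemap. Here is a minimal sketch in Python using only the standard library; the sitemap URL is a placeholder:

    # check_sitemap.py: report every sitemap URL that no longer returns HTTP 200.
    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "http://www.hyperdogmedia.com/sitemap.xml"  # placeholder
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def check_sitemap(sitemap_url):
        # Download and parse the sitemap.
        with urllib.request.urlopen(sitemap_url) as response:
            tree = ET.parse(response)
        # Re-fetch each <loc> URL and flag anything that isn't a 200.
        for loc in tree.findall(".//sm:loc", NS):
            url = loc.text.strip()
            try:
                status = urllib.request.urlopen(url).getcode()
            except urllib.error.HTTPError as err:
                status = err.code
            if status != 200:
                print(status, url)

    if __name__ == "__main__":
        check_sitemap(SITEMAP_URL)

Run it whenever you prune or consolidate pages, and 301 redirect anything it flags.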

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
