Thursday, September 27, 2007

Improve snippets with a meta description makeover



The quality of your snippet — the short text preview we display for each web result — can have a direct impact on the chances of your site being clicked (i.e. the amount of traffic Google sends your way). We use a number of strategies for selecting snippets, and you can control one of them by writing an informative meta description for each URL.

<META NAME="Description" CONTENT="informative description here">

Why does Google care about meta descriptions?
We want snippets to accurately represent the web result. We frequently prefer to display a page's meta description (when available) because it gives users a clear idea of the URL's content. This directs them to good results faster and reduces the click-and-backtrack behavior that frustrates visitors and inflates web traffic metrics. Keep in mind that meta descriptions made up of long strings of keywords don't achieve this goal and are less likely to be displayed in place of a regular, content-based snippet. And it's worth noting that while accurate meta descriptions can improve clickthrough, they won't affect your ranking within search results.

[Screenshot: snippet showing a quality meta description]

[Screenshot: snippet showing a lower-quality meta description]

What are some good meta description strategies?
Differentiate the descriptions for different pages
Using identical or similar descriptions on every page of a site isn't very helpful when individual pages appear in the web results. In these cases we're less likely to display the boilerplate text. Create descriptions that accurately describe each specific page. Use site-level descriptions on the main home page or other aggregation pages, and consider using page-level descriptions everywhere else. You should obviously prioritize parts of your site if you don't have time to create a description for every single page; at the very least, create a description for the critical URLs like your homepage and popular pages.

Include clearly tagged facts in the description
The meta description doesn't just have to be in sentence format; it's also a great place to include structured data about the page. For example, news or blog postings can list the author, date of publication, or byline information. This can give potential visitors very relevant information that might not otherwise appear in the snippet. Similarly, product pages might have the key bits of information -- price, age, manufacturer -- scattered throughout a page, making it unlikely that a snippet will capture all of it. Meta descriptions can bring all this data together. For example, consider the following meta description for the seventh Harry Potter book, taken from a major product aggregator.

Not as desirable:
<META NAME="Description" CONTENT="[domain name redacted]
: Harry Potter and the Deathly Hallows (Book 7): Books: J. K. Rowling,Mary GrandPré by J. K. Rowling,Mary GrandPré">

There are a number of reasons this meta description wouldn't work well as a snippet on our search results page:
  • The title of the book is a complete duplication of information already in the page title.
  • Information within the description itself is duplicated (J. K. Rowling and Mary GrandPré are each listed twice).
  • None of the information in the description is clearly identified; who is Mary GrandPré?
  • The missing spacing and overuse of colons make the description hard to read.

All of this means that the average person viewing a Google results page -- who might spend under a second scanning any given snippet -- is likely to skip this result. As an alternative, consider the meta description below.

Much nicer:
<META NAME="Description" CONTENT="Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages">

What's changed? No duplication, more information, and everything is clearly tagged and separated. No real additional work is required to generate something of this quality: the price and length are the only new data, and they are already displayed on the site.

Programmatically generate descriptions
For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation.
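
As a sketch of what that generation could look like, the PHP below assembles the "much nicer" description from the earlier example out of structured product fields. The field names and the build_meta_description helper are illustrative assumptions, not code from any real aggregator:

<?php
// Illustrative sketch: build a clearly tagged, human-readable meta
// description from structured product data already in your database.
function build_meta_description($product) {
    $parts = array();
    if (!empty($product['author']))      $parts[] = 'Author: ' . $product['author'];
    if (!empty($product['illustrator'])) $parts[] = 'Illustrator: ' . $product['illustrator'];
    if (!empty($product['category']))    $parts[] = 'Category: ' . $product['category'];
    if (!empty($product['price']))       $parts[] = 'Price: ' . $product['price'];
    if (!empty($product['pages']))       $parts[] = 'Length: ' . $product['pages'] . ' pages';
    // Escape the result so it is safe inside an HTML attribute.
    return htmlspecialchars(implode(', ', $parts), ENT_QUOTES);
}

$product = array(
    'author'      => 'J. K. Rowling',
    'illustrator' => 'Mary GrandPré',
    'category'    => 'Books',
    'price'       => '$17.99',
    'pages'       => 784,
);
echo '<META NAME="Description" CONTENT="' . build_meta_description($product) . '">';
?>

Because only fields that exist are included, the same helper produces sensible output across a large, varied catalog.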

Use quality descriptions
Finally, make sure your descriptions are... descriptive. It's easy to get lax about the quality of your meta descriptions, since they're not directly visible to your site's visitors. But meta descriptions might be displayed in Google search results, if the description is of high enough quality. A little extra work on your meta descriptions can go a long way toward showing a relevant snippet in search results, and that's likely to improve the quality and quantity of your user traffic.

Tuesday, September 18, 2007

Quick security checklist for webmasters

Written by Nathan Johns, Search Quality Team

In recent months, there's been a noticeable increase in the number of compromised websites around the web. One explanation is that attackers are hacking sites in order to distribute malware or to spam search results. Regardless of the reason, it's a great time for all of us to review some helpful webmaster security tips.

Obligatory disclaimer: While we've collected tips and pointers below, and we encourage webmasters to "please try the following at home," this is by no means an exhaustive list for your website's security. We hope it's useful, but we recommend that you conduct more thorough research as well.

  • Check your server configuration.
Apache has some security configuration tips on their site, and Microsoft has some tech center resources for IIS on theirs. Some of these tips include information on directory permissions, server-side includes, authentication, and encryption. (A small sketch of Apache hardening directives appears after this list.)

  • Stay up-to-date with the latest software updates and patches.
A common pitfall for many webmasters is to install a forum or blog on their website and then forget about it. Much like taking your car in for a tune-up, it's important to make sure you have all the latest updates for any software you've installed. Need some tips? Blogger Mark Blair has a few good ones, including making a list of all the software and plug-ins used on your website and keeping track of their version numbers and updates. He also suggests taking advantage of any update feeds those software vendors provide.

  • Regularly keep an eye on your log files.
Making this a habit has many great benefits, one of which is added security. You might be surprised by what you find.

  • Check your site for common vulnerabilities.
Avoid having directories with open permissions. This is almost like leaving the front door to your home wide open, with a door mat that reads "Come on in and help yourself!" Also check for any XSS (cross-site scripting) and SQL injection vulnerabilities; a short PHP sketch of the standard fixes appears after this list. Finally, choose good passwords. The Gmail support center has some good guidelines to follow, which can be helpful for choosing passwords in general.

  • Be wary of third-party content providers.
If you're considering installing an application provided by a third party, such as a widget, counter, ad network, or webstat service, be sure to exercise due diligence. While there's lots of great third-party content on the web, it's also possible for providers to use these applications to push exploits, such as dangerous scripts, toward your visitors. Make sure the application comes from a reputable source. Do they have a legitimate website with support and contact information? Have other webmasters used the service?

  • Try a Google site: search to see what's indexed.
This may seem a bit obvious, but it's commonly overlooked. It's always a good idea to do a sanity check and make sure things look normal. If you're not already familiar with the site: search operator, it's a way for you to restrict your search to a specific site. For example, the search site:googleblog.blogspot.com will only return results from the Official Google Blog.
  • Sign up for Google Webmaster Tools.
They're free, and include all kinds of good stuff like a site status wizard and tools for managing how Googlebot crawls your site. Another nice feature is that if Google believes your site has been hacked to host malware, the webmaster console will show more detailed information, such as a sample of harmful URLs. Once you think the malware has been removed, you can then request a reevaluation through Webmaster Tools.

  • Use secure protocols.
SSH and SFTP should be used for data transfer, rather than plain text protocols such as telnet or FTP. SSH and SFTP use encryption and are much safer. For this and many other useful tips, check out StopBadware.org's Tips for Cleaning and Securing Your Website.

There's plenty of other great content about online security and safety around the web, with pointers to lots of useful resources; a good security blog is worth adding to your Google Reader feeds. :)

  • Contact your hosting company for support.
Most hosting companies have helpful and responsive support groups. If you think something may be wrong, or you simply want to make sure you're in the know, visit their website or give 'em a call.
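
To make the server configuration tip more concrete, here's a small sketch of Apache hardening directives. Treat it as a starting point rather than a complete policy; which directives apply, and whether they belong in httpd.conf or .htaccess, depends on your setup:

# Sketch of Apache 2.x hardening directives; adjust to your own setup.
Options -Indexes            # don't show directory listings for folders without an index page
ServerSignature Off         # leave server version info out of error pages
ServerTokens Prod           # send a minimal Server header (httpd.conf only)

# Block direct downloads of backup and include files (Apache 2.2 syntax).
<FilesMatch "\.(bak|inc|sql)$">
    Order allow,deny
    Deny from all
</FilesMatch>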
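
And to illustrate the common-vulnerabilities tip, the PHP sketch below shows the two standard fixes, assuming a hypothetical products table and an existing PDO connection in $db (neither comes from this post):

<?php
// The "item" parameter comes from the user, so treat it as hostile.

// SQL injection fix: bind user input with a prepared statement instead of
// concatenating it into the query string.
$stmt = $db->prepare('SELECT name, price FROM products WHERE item = ?');
$stmt->execute(array($_GET['item']));
$product = $stmt->fetch(PDO::FETCH_ASSOC);

// XSS fix: escape user-supplied values before echoing them back as HTML.
echo 'Results for: ' . htmlspecialchars($_GET['item'], ENT_QUOTES, 'UTF-8');
?>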

We hope you find these tips helpful. If you have some of your own tips you'd like to share, feel free to leave a comment below or start a discussion in the Google Webmaster Help group. Practice safe webmastering!

Friday, September 14, 2007

Subscriber stats and more

We're rolling out some exciting new features in Webmaster Tools.

First of all, subscriber stats are now available. If you publish feeds, Webmaster Tools now shows the aggregate number of subscribers you have from Google services such as Google Reader, iGoogle, and Orkut. We hope this will make it easier to track subscriber statistics across multiple feeds, as well as offer an improvement over parsing through server logs for feed information.


To improve the navigation and look and feel, we've also made some changes to the interface, including:
  • No more tabs! Navigate through the new sidebar.
  • Breadcrumbs in the page title for easier product navigation.
  • A sidebar that expands and contracts to show and hide options based on your current goal.
  • New sidebar topics: Overview, Diagnostics, Statistics, Links, Sitemaps, and Tools.
And last but not least, Webmaster Tools is now available in 20 languages! In addition to US English, UK English, French, Italian, Spanish, German, Dutch, Brazilian Portuguese, Traditional Chinese, Simplified Chinese, Korean, Russian, Japanese, Danish, Finnish, Norwegian, Swedish, and Polish, Webmaster Tools is now in Turkish and Romanian.

Sign in to see these changes for yourself. For questions or feedback, please post in the Google Webmaster Tools section of our Webmaster Help Group.

Update: some of the functionality described in this post is no longer available. More information.

Wednesday, September 12, 2007

Google, duplicate content caused by URL parameters, and you



How can URL parameters, like session IDs or tracking IDs, cause duplicate content?
When user and/or tracking information is stored through URL parameters, duplicate content can arise because the same page is accessible through numerous URLs. It's what Adam Lasnik referred to in "Deftly Dealing with Duplicate Content" as "store items shown (and -- worse yet -- linked) via multiple distinct URLs." In the example below, URL parameters create three URLs which access the same product page.


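For instance, all three of the following URLs might serve the same product page; the parameter names here are illustrative, chosen to match the log examples later in this post:

http://www.example.com/product.php?item=swedish-fish
http://www.example.com/product.php?item=swedish-fish&category=gummy-candy
http://www.example.com/product.php?item=swedish-fish&category=gummy-candy&affiliateid=ABCD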

Why should you care?
When search engines crawl identical content through varied URLs, there may be several negative effects:

1. Having multiple URLs can dilute link popularity. For example, in the case above, rather than 50 links all pointing to your intended display URL, those 50 links may be split three ways among the three distinct URLs.

2. Search results may display user-unfriendly URLs (long URLs with tracking IDs or session IDs), which:
* decrease the chances of a user selecting the listing
* offset your branding efforts


How we help users and webmasters with duplicate content
We've designed algorithms to help prevent duplicate content from negatively affecting webmasters and the user experience.

1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.

2. We select what we think is the "best" URL to represent the cluster in search results.

3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.

Consolidating properties from duplicates into one representative URL often provides users with more accurate search results.
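
As a toy illustration of the clustering idea only (this is not Google's actual algorithm, and the list of strippable parameters below is an assumption for this hypothetical site), one could group URLs under a normalized key:

<?php
// Toy sketch: cluster URLs that differ only in parameters that don't
// change the page's content on this (hypothetical) site.
function canonical_key($url) {
    $strippable = array('sessionid', 'trackingid', 'affiliateid', 'category');
    $parts = parse_url($url);
    $query = array();
    if (isset($parts['query'])) {
        parse_str($parts['query'], $query);
        foreach ($strippable as $param) {
            unset($query[$param]);   // drop parameters that don't change content
        }
        ksort($query);               // parameter order shouldn't matter
    }
    return $parts['host'] . $parts['path'] .
           ($query ? '?' . http_build_query($query) : '');
}

$urls = array(
    'http://www.example.com/product.php?item=swedish-fish',
    'http://www.example.com/product.php?item=swedish-fish&category=gummy-candy',
    'http://www.example.com/product.php?item=swedish-fish&category=gummy-candy&affiliateid=ABCD',
);
$clusters = array();
foreach ($urls as $url) {
    $clusters[canonical_key($url)][] = $url;
}
print_r($clusters);  // one cluster containing all three URLs
?>

Selecting the "best" representative for each cluster and consolidating link popularity onto it are the further steps described above.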


If you find you have duplicate content as mentioned above, can you help search engines understand your site?
First, no worries: many sites on the web use URL parameters, and for valid reasons. But yes, you can help reduce potential problems for search engines by:

1. Removing unnecessary URL parameters -- keep the URL as clean as possible.

2. Submitting a Sitemap with the canonical (i.e. representative) version of each URL. While we can't guarantee that our algorithms will display the Sitemap's URL in search results, it's helpful to indicate the canonical preference.
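
For example, a minimal Sitemap listing just the canonical URL from the example above might look like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/product.php?item=swedish-fish</loc>
  </url>
</urlset>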


How can you design your site to reduce duplicate content?
Because of the way Google handles duplicate content, webmasters need not be overly concerned with the loss of link popularity or loss of PageRank due to duplication. However, to reduce duplicate content more broadly, we suggest:

1. When tracking visitor information, use a 301 redirect to send URLs with parameters such as affiliateID or trackingID to the canonical version.

2. Use a cookie to set the affiliateID and trackingID values.
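
Taken together, those two suggestions might look like the PHP sketch below, written to be consistent with the log lines and session file shown next; the exact parameter handling on a real site will differ:

<?php
// Sketch: stash tracking values in a cookie-backed session, then
// 301-redirect to the canonical, parameter-free URL.
session_start();  // issues a session cookie to the visitor

if (isset($_GET['item'], $_GET['affiliateid'])) {
    $_SESSION['affiliateid'] = $_GET['affiliateid'];
    if (isset($_GET['category'])) {
        $_SESSION['category'] = $_GET['category'];
    }
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: /product.php?item=' . urlencode($_GET['item']));
    exit;
}

// Normal rendering of the canonical page continues here.
?>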

If you follow these guidelines, your web server logs could appear as:

127.0.0.1 - - [19/Jun/2007:14:40:45 -0700] "GET /product.php?category=gummy-candy&item=swedish-fish&affiliateid=ABCD HTTP/1.1" 301 -

127.0.0.1 - - [19/Jun/2007:14:40:45 -0700] "GET /product.php?item=swedish-fish HTTP/1.1" 200 74

And the session file storing the raw cookie information may look like:

category|s:11:"gummy-candy";affiliateid|s:4:"ABCD";

Please be aware that if your site uses cookies, your content (such as product pages) should remain accessible with cookies disabled.


How can we better assist you in the future?
We recently published ideas from SMX Advanced on how search engines can help webmasters with duplicate content. If you have an opinion on the topic, please join our conversation in the Webmaster Help Group (we've already started the thread).

Update: for more information, please see our Help Center article on canonicalization.

Thursday, September 6, 2007

Webmaster Central gets a new look

Written by David Sha, Webmaster Tools Team

We launched Webmaster Central back in August 2006, with the goal of creating a place for you to learn more about Google's crawling and indexing of websites, and to offer tools for submitting sitemaps and other content. In response to your requests and recommendations, we've also been busy behind the scenes over the past year, rolling out exciting new features for Webmaster Tools like internal/external links data and the Message Center.

And so today, we're unveiling a new look for the Webmaster Central landing page at http://www.google.com/webmasters. You'll still find all of the tools and resources you've come to love, like our Webmaster Blog and discussion group; but in addition to these, we've added a few more you might enjoy and find useful. We hope the new layout makes it easier to discover resources that will help you learn even more about improving traffic to your site, submitting content to Google, and enhancing your site's functionality.

Here's a brief look at some of the new additions:

Analyze your visitors. Google Analytics is a free tool for webmasters to better understand their visitor traffic in order to improve site content. With metrics including the amount of time spent on each page and the percentage of new vs. returning visits to a page, webmasters can tailor their site's content around pages that resonate most with visitors.

Add custom search to your pages. Google Custom Search Engine (CSE) is a great way for webmasters to incorporate search into their site and help visitors find what they're looking for. CSE gives webmasters access to an XML API, allowing greater control over the look and feel of the search results, so you can keep visitors on your site focused on your content.

Leverage Google's Developer Tools. Google Code has tons of Google APIs and developer tools to help webmasters put technologies like Google Maps and AJAX Search on their websites.

Add gadgets to your webpage. Google Gadgets for your Webpage are a quick and easy way for webmasters to enhance their sites with free, content-rich gadgets from the Google Gadget directory. Adding gadgets to your webpage can make your site more interactive and useful to visitors, encouraging them to keep coming back.

We'd love to get your feedback on the new site. Feel free to comment below, or join our discussion group.