Tuesday, November 28, 2006

Viva, Webmasters in Vegas

Thanks for visiting us at WebmasterWorld PubCon in Las Vegas a couple of weeks ago. Whether it was at the panel sessions, the exhibitor hall, or the Safe Bets event, we had a blast meeting you and sharing the many Google products available to help webmasters enhance their sites and drive traffic to them. For those who weren't able to join us, here are answers to some of the top questions that we heard:

Q: How do I increase the visibility of my site in search results?
A: There are many factors that can affect your site's visibility in search results. We outlined just a few tips that can make a big difference in increasing your site's visibility in Google search results. First, ensure your site has quality content that is unique. Second, have quality sites link to your site. Third, submit a Sitemap to let us know about all the URLs on your site. Fourth, sign up for a webmaster tools account to get information about how Google sees your site, such as crawl errors, indexing details, and top queries to your site. Lastly, you can visit Webmaster Central and the Webmaster Help Center for more webmaster-related questions and resources.

Q: How much do I have to pay to create a Google Custom Search Engine?

A: Nothing -- it's free. In addition to being able to create your own custom search engine on your site, you can make money on your site using AdSense for Search.

Q: Why is it better to create gadgets rather than create feeds?
A: First, gadgets are much more flexible. As a publisher, you control the format of your content. Second, gadgets are by nature more interactive. They can be built with Flash, HTML, or AJAX, and are generally much more interesting than feeds. Finally, your users can customize a gadget to their liking, making your content a lot more targeted.
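To give you a feel for how little it takes to get started, here's a minimal sketch of a gadget, following the Google Gadgets XML format (the title, preference name, and HTML content here are made-up placeholders, not a working gadget):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Module>
  <!-- ModulePrefs holds the gadget's metadata -->
  <ModulePrefs title="Latest Headlines" />
  <!-- UserPref is what lets users customize the gadget to their liking -->
  <UserPref name="count" display_name="Headlines to show" default_value="5" />
  <!-- The content can be HTML, Flash, or AJAX-driven markup -->
  <Content type="html">
    <![CDATA[
      <div id="headlines">Loading...</div>
    ]]>
  </Content>
</Module>
```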

Q: What is this new ad placement feature for AdSense and how come I don't see it in my account?
A: Ad placements are publisher-defined groups of ad units that advertisers will see when searching for places where they can target their ads. If you don't yet see it in your AdSense account, it's because we've been slowly rolling out this feature to everyone. This exciting feature will be available to all publishers in the next few weeks, so be sure to keep an eye out.

Q: What's the easiest way to put a searchable Google Map on my web page?
A: Use the Map Search Wizard to design a Google Map for your page. The wizard will write all of the code for you; all you need to do is copy and paste the code into your web page, and your users will see your location on a map.

For more information about Google products for webmasters, check out Webmaster Central.
We also wanted to share some photos from PubCon. If you look closely enough, you may be able to see yourself.


Thanks for stopping by, on behalf of the 25 Googlers in attendance!

Monday, November 20, 2006

Introducing Sitemaps for Google News

Good news for webmasters of English-language news sites: If your site is currently included in Google News, you can now create News Sitemaps that tell us exactly which articles to crawl for inclusion in Google News. In addition, you can access crawl errors, which tell you if there were any problems crawling the articles in your News Sitemaps, or, for that matter, any articles on your site that Google News reaches through its normal crawl.

Freshness is important for news, so we recrawl all News Sitemaps frequently. The News Sitemaps XML definition lets you specify a publication date and time for each article to help us process fresh articles in a timely fashion. You can also specify keywords for each article to inform the placement of the articles into sections on Google News.
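As a sketch, a News Sitemap entry with a publication date and keywords might look like this (the tag and namespace names follow the Google News Sitemap extension, but treat the exact schema details as an assumption and check the current documentation; the article URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/business/article123.html</loc>
    <news:news>
      <!-- publication date/time helps us process fresh articles quickly -->
      <news:publication_date>2006-11-20T08:00:00Z</news:publication_date>
      <!-- keywords inform section placement on Google News -->
      <news:keywords>business, mergers, acquisitions</news:keywords>
    </news:news>
  </url>
</urlset>
```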

If your English-language news site is currently included in Google News, the news features are automatically enabled in webmaster tools; just add the site to your account. Here's how the new summary page will look:

The presence of the News crawl link on the left indicates that the news features are enabled. A few things to note:
  • You will only have the news features enabled if your site is currently included in Google News. If it's not, you can request inclusion.

  • In most cases, you should add the site for the hostname under which you publish your articles. For example, if you publish your articles at URLs such as http://www.example.com/business/article123.html, you should add the site http://www.example.com/. Exception: If your site is within a hosting site, you should add the site for your homepage, e.g., http://members.tripod.com/mynewssite/. If you publish articles under multiple hostnames, you should add a site for each of them.

  • You must verify your site to enable the news features.

We'll be working to make the news features available to publishers in more languages as soon as possible.

Wednesday, November 15, 2006

Joint support for the Sitemap Protocol

We're thrilled to tell you that Yahoo! and Microsoft are joining us in supporting the Sitemap protocol.

As part of this development, we're moving the protocol to a new namespace, www.sitemaps.org, and raising the version number to 0.9. The sponsoring companies will continue to collaborate on the protocol and publish enhancements on the jointly-maintained site sitemaps.org.

If you've already submitted a Sitemap to Google using the previous namespace and version number, we'll continue to accept it. If you haven't submitted a Sitemap before, check out the documentation on www.sitemaps.org for information on creating one. You can submit your Sitemap file to Google using Google webmaster tools. See the documentation that Yahoo! and Microsoft provide for information about submitting to them.
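For reference, a minimal Sitemap under the new 0.9 namespace looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- lastmod is optional but helps crawlers prioritize fresh pages -->
    <lastmod>2006-11-15</lastmod>
  </url>
</urlset>
```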

If any website owners, tool writers, or webserver developers haven't gotten around to implementing Sitemaps yet, thinking this was just a crazy Google experiment, we hope this joint announcement shows that the industry is heading in this direction. The more of the web Sitemaps eventually cover, the more we can revolutionize the way web crawlers interact with websites. In our view, the experiment is still underway.

Tuesday, November 14, 2006

Badware alerts for your sites

As part of our efforts to protect users, we have been warning people using Google search before they visit sites that have been determined to distribute badware under the guidelines published by StopBadware. Warning users is only part of the solution, though; the real win comes from helping webmasters protect their own users by alerting them when their sites have been flagged for badware -- and working with them to remove the threats.

It's my pleasure to introduce badware alerts in Google webmaster tools. You can see on the Diagnostic Summary tab if your site has been determined to distribute badware and can access information to help you correct this.

If your site has been flagged and you believe you've since removed the threats, go to http://stopbadware.org/home/review to request a review. If that's successful, your site will no longer be flagged -- and your users will be safer as a result of your diligence.

This version is only the beginning: we plan to continue to provide more data to help webmasters diagnose issues on their sites. We realize that in many cases, badware distribution is unintentional and the result of being hacked or running ads which lead directly to pages with browser exploits. Stay tuned for improvements to this feature and others on webmaster tools.

Update: this post has been updated to provide a link to the new form for requesting a review.


Update: More information is available in our Help Center article on malware and hacked sites.

Monday, November 13, 2006

Las Vegas Pubcon 2006

As if working at Google isn't already a party, today I'm traveling to Las Vegas for WebmasterWorld PubCon 2006! But instead of talking bets and odds, I'll be talking about how Google can help webmasters improve their sites. I love chatting with webmasters about all the work that goes into creating a great website. Several other Googlers will be there too, so if you have a burning question or just wanna talk about random stuff feel free to stop us and say hi. Besides the sessions, we'll be at the Google booth on Wednesday and Thursday, so come by and introduce yourself.

Here's the list of Google events at PubCon:

Tuesday, November 14
  • 10:15 - 11:30: SEO and Big Search (Adam Lasnik, Search Evangelist)
  • 1:30 - 2:45: PPC Search Advertising Programs (Frederick Vallaeys, Senior Product Specialist, AdWords)
  • 2:45 - 4:00: PPC Tracking and Reconciliation (Brett Crosby, Senior Manager, Google Analytics)

Wednesday, November 15
  • 10:15 - 11:30: Contextual Advertising Optimization (Tom Pickett, Online Sales and Operations)
  • 11:35 - 12:50: Site Structure for Crawlability (Vanessa Fox, Product Manager, Google Webmaster Central)
  • 1:30 - 3:10: Duplicate Content Issues (Vanessa Fox, Product Manager, Google Webmaster Central)
  • 5:30 - 7:30: Safe Bets From Google (cocktail party!)

Thursday, November 16
  • 11:35 - 12:50: Spider and DOS Defense (Vanessa Fox, Product Manager, Google Webmaster Central)
  • 1:30 - 3:10: Interactive Site Reviews (Matt Cutts, Software Engineer)
  • 3:30 - 5:00: Super Session (Matt Cutts, Software Engineer)

You can also view this schedule on Google Calendar.

Come to "Safe Bets From Google" on Wednesday 5:30-7:30pm -- it's a cocktail party where you can mingle with other webmasters and Googlers, learn about other Google products for webmasters, and in typical Google style enjoy some great food and drinks. I'll be there with some other engineers from our Seattle office. Don't miss it!

Friday, November 10, 2006

New third-party Sitemaps tools

Hello, webmasters! I'm Maile, and I recently joined the team here at Google webmaster central. I already have good news to report: we've updated our third-party programs and websites information. These third-party tools provide lots of options for easily generating a Sitemap -- from plugins for content management systems to online generators.

Many thanks to this community for continuing to innovate and improve the Sitemap tools. Since most of my work focuses on the Sitemaps protocol, I hope to meet you on our Sitemaps protocol discussion group.

Thursday, November 9, 2006

The number of pages Googlebot crawls

The Googlebot activity reports in webmaster tools show you the number of pages of your site Googlebot has crawled over the last 90 days. We've seen some of you asking why this number might be higher than the total number of pages on your sites.


Googlebot crawls pages of your site based on a number of things including:
  • pages it already knows about
  • links from other web pages (within your site and on other sites)
  • pages listed in your Sitemap file
More specifically, Googlebot doesn't access pages, it accesses URLs. And the same page can often be accessed via several URLs. Consider the home page of a site that can be accessed from the following four URLs:
  • http://www.example.com/
  • http://www.example.com/index.html
  • http://example.com
  • http://example.com/index.html
Although all URLs lead to the same page, all four URLs may be used in links to the page. When Googlebot follows these links, a count of four is added to the activity report.

Many other scenarios can lead to multiple URLs for the same page. For instance, a page may have several named anchors, such as:
  • http://www.example.com/mypage.html#heading1
  • http://www.example.com/mypage.html#heading2
  • http://www.example.com/mypage.html#heading3
And dynamically generated pages often can be reached by multiple URLs, such as:
  • http://www.example.com/furniture?type=chair&brand=123
  • http://www.example.com/hotbuys?type=chair&brand=123
As you can see, when you consider that each page on your site might have multiple URLs that lead to it, the number of URLs that Googlebot crawls can be considerably higher than the number of total pages for your site.
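To see how several crawled URLs can collapse to a single page, here's a small Python sketch. The normalization rules below (treating www and non-www as one host, treating index.html as the directory root, ignoring fragments) are illustrative assumptions for the examples above, not Google's actual canonicalization logic:

```python
from urllib.parse import urlsplit

def canonical_key(url):
    """Collapse URL variants that serve the same page (simplified sketch)."""
    parts = urlsplit(url)          # urlsplit separates out the #fragment,
                                   # so anchors are dropped automatically
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]            # treat www and non-www as one host
    path = parts.path or "/"
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]  # index.html == directory root
    return (host, path, parts.query)

crawled = [
    "http://www.example.com/",
    "http://www.example.com/index.html",
    "http://example.com",
    "http://example.com/index.html",
]
print(len(crawled))                              # 4 URLs crawled
print(len({canonical_key(u) for u in crawled}))  # 1 distinct page
```

Running this shows a crawl count of four against a single underlying page, which is exactly the gap you may see between the activity report and your page count.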

Of course, you (and we) only want one version of the URL to be returned in the search results. Not to worry -- this is exactly what happens. Our algorithms select a version to include, and you can provide input on this selection process.

Redirect to the preferred version of the URL
You can do this using a 301 (permanent) redirect. In the first example that shows four URLs that point to a site's home page, you may want to redirect index.html to www.example.com/. And you may want to redirect example.com to www.example.com so that any URLs that begin with one version are redirected to the other. Note that you can do this latter redirect with the Preferred Domain feature in webmaster tools. (If you also use a 301 redirect, make sure that it matches what you set for the preferred domain.)
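For example, on an Apache server with mod_rewrite enabled, both redirects could be sketched in an .htaccess file like this (the hostname is a placeholder, and the exact rules may need adjusting for your setup):

```apache
RewriteEngine On

# 301-redirect the non-www hostname to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 301-redirect /index.html to the site root
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```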

Block the non-preferred versions of a URL with a robots.txt file
For dynamically generated pages, you may want to block the non-preferred version using pattern matching in your robots.txt file. (Note that not all search engines support pattern matching, so check the guidelines for each search engine bot you're interested in.) For instance, in the third example that shows two URLs that point to a page about the chairs available from brand 123, the "hotbuys" section rotates periodically while the content is always available from a primary and permanent location. In that case, you may want to index the first version and block the "hotbuys" version. To do this, add the following to your robots.txt file:

User-agent: Googlebot
Disallow: /hotbuys?*

To ensure that this directive will actually block and allow what you intend, use the robots.txt analysis tool in webmaster tools. Just add this directive to the robots.txt section on that page, list the URLs you want to check in the "Test URLs" section and click the Check button. For this example, you'd see a result like this:
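If you'd like to sanity-check a pattern offline as well, the matching can be approximated in a few lines of Python. This sketch assumes Googlebot's documented pattern semantics (`*` matches any run of characters, a trailing `$` anchors the pattern to the end of the URL, and everything else is a prefix match); the real crawler may differ in edge cases, so the robots.txt analysis tool remains the authoritative check:

```python
import re

def googlebot_match(pattern, path):
    """Approximate Googlebot-style robots.txt pattern matching."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Translate '*' to a regex wildcard; escape everything else literally
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        return re.fullmatch(regex, path) is not None
    return re.match(regex, path) is not None  # prefix match otherwise

print(googlebot_match("/hotbuys?*", "/hotbuys?type=chair&brand=123"))    # True
print(googlebot_match("/hotbuys?*", "/furniture?type=chair&brand=123"))  # False
```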

Don't worry about links to anchors, because while Googlebot will crawl each link, our algorithms will index the URL without the anchor.

And if you don't provide input such as that described above, our algorithms do a really good job of picking a version to show in the search results.