Wednesday, 30 September 2009

Translate your website with Google: Expand your audience globally

(This has been cross-posted from the Official Google Blog)

How long would it take to translate all the world's web content into 50 languages? Even if all of the translators in the world worked around the clock, with the current growth rate of content being created online and the sheer amount of data on the web, it would take hundreds of years to make even a small dent.

Today, we're happy to announce a new website translator gadget powered by Google Translate that enables you to make your site's content available in 51 languages. Now, when people visit your page, if their language (as determined by their browser settings) is different from the language of your page, they'll be prompted to automatically translate the page into their own language. If the visitor's language is the same as the language of your page, no translation banner will appear.


After clicking the Translate button, the automatic translations are shown directly on your page.


It's easy to install — all you have to do is cut and paste a short snippet into your webpage to increase the global reach of your blog or website.
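
To give a rough idea, the snippet you paste is a small block of HTML and JavaScript generated for you on the gadget's setup page. The sketch below only illustrates the general shape: the element ID, script URL, and pageLanguage value are assumptions based on the setup flow, so use the snippet the setup page actually gives you.

<!-- Illustrative sketch only; copy the real snippet from the gadget setup page -->
<div id="google_translate_element"></div>
<script type="text/javascript">
  function googleTranslateElementInit() {
    // 'en' is the language your page is written in
    new google.translate.TranslateElement({pageLanguage: 'en'},
        'google_translate_element');
  }
</script>
<script type="text/javascript"
        src="//translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>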


Automatic translation is convenient and helps people get a quick gist of the page. However, it's not a perfect substitute for the art of professional translation. Today happens to be International Translation Day, and we'd like to take the opportunity to celebrate the contributions of translators all over the world. These translators play an essential role in enabling global communication, and with the rapid growth and ease of access to digital content, the need for them is greater than ever. We hope that professional translators, along with translation tools such as Google Translator Toolkit and this Translate gadget, will continue to help make the world's content more accessible to everyone.

Friday, 25 September 2009

Using named anchors to identify sections on your pages

We just announced a couple of new features on the Official Google Blog that enable users to get to the information they want faster. Both features provide additional links in the result block, which allow users to jump directly to parts of a larger page. This is useful when a user has a specific interest in mind that is almost entirely covered in a single section of a page. Now they can navigate directly to the relevant section instead of scrolling through the page looking for their information.

We generate these deep links completely algorithmically, based on page structure, so they could be displayed for any site (and of course money isn't involved in any way, so you can't pay to get these links). There are a few things you can do to increase the chances that they might appear on your pages. First, ensure that long, multi-topic pages on your site are well-structured and broken into distinct logical sections. Second, ensure that each section has an associated anchor with a descriptive name (i.e., not just "Section 2.1"), and that your page includes a "table of contents" which links to the individual anchors. The new in-snippet links only appear for relevant queries, so you won't see them in the results all the time, only when we think that a link to a section would be highly useful for a particular query.
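
For example, a minimal sketch of this structure (the section names are invented for illustration) might look like:

<!-- A "table of contents" linking to descriptive named anchors -->
<ul>
  <li><a href="#grooming">Grooming your dog</a></li>
  <li><a href="#feeding">Feeding schedule</a></li>
</ul>

<h2><a name="grooming" id="grooming">Grooming your dog</a></h2>
<p>...</p>

<h2><a name="feeding" id="feeding">Feeding schedule</a></h2>
<p>...</p>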

Monday, 21 September 2009

Google does not use the keywords meta tag in web ranking

Recently we received some questions about how Google uses (or more accurately, doesn't use) the "keywords" meta tag in ranking web search results. Suppose you have two website owners, Alice and Bob. Alice runs a company called AliceCo and Bob runs BobCo. One day while looking at Bob's site, Alice notices that Bob has copied some of the words that she uses in her "keywords" meta tag. Even more interesting, Bob has added the words "AliceCo" to his "keywords" meta tag. Should Alice be concerned?

At least for Google's web search results currently (September 2009), the answer is no. Google doesn't use the "keywords" meta tag in our web search ranking. This video explains more, or you can read the questions and answers below.


Q: Does Google ever use the "keywords" meta tag in its web search ranking?
A: In a word, no. Google does sell a Google Search Appliance, and that product has the ability to match meta tags, which could include the keywords meta tag. But that's an enterprise search appliance that is completely separate from our main web search. Our web search (the well-known search at Google.com that hundreds of millions of people use each day) disregards keywords meta tags completely. They simply don't have any effect in our search ranking at present.

Q: Why doesn't Google use the keywords meta tag?
A: About a decade ago, search engines judged pages only on the content of web pages, not any so-called "off-page" factors such as the links pointing to a web page. In those days, keyword meta tags quickly became an area where someone could stuff often-irrelevant keywords without typical visitors ever seeing those keywords. Because the keywords meta tag was so often abused, many years ago Google began disregarding the keywords meta tag.

Q: Does this mean that Google ignores all meta tags?
A: No, Google does support several other meta tags. This meta tags page has more information about several meta tags that we do use. For example, we do sometimes use the "description" meta tag as the text for our search result snippets, as this screenshot shows:


Even though we sometimes use the description meta tag for the snippets we show, we still don't use it in our ranking.
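
To make the distinction concrete, here's an invented sketch of what the two tags might look like on AliceCo's page (the content values are made up):

<!-- Not used in Google's web search ranking: -->
<meta name="keywords" content="AliceCo, widgets, buy widgets, cheap widgets" />
<!-- Also not used for ranking, but sometimes shown as the result snippet: -->
<meta name="description" content="AliceCo makes handcrafted widgets and ships them worldwide." />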

Q: Does this mean that Google will always ignore the keywords meta tag?
A: It's possible that Google could use this information in the future, but it's unlikely. Google has ignored the keywords meta tag for years and currently we see no need to change that policy.

Thursday, 17 September 2009

Spanish Site Clinic now live

The Google Webmaster Central blog in Spanish has launched a Site Clinic especially for the Spanish-speaking market. We're offering to analyze a series of websites in order to share some best practices with our community using real web sites. The plan is to offer constructive advice on accessibility and improvements that can lead to better visibility in Google's search results.
During this month we will be accepting submissions from any legitimate website, as long as it is primarily in Spanish. Before you submit your site, please visit the original post, and if you want to participate, fill out the form as soon as possible: we will only be selecting 3-5 websites from the first 200 submitted for this Site Clinic, so don't miss out!

Tuesday, 15 September 2009

Duplicate content and multiple site issues

Webmaster Level: All

Last month, I gave a talk at the Search Engine Strategies San Jose conference on Duplicate Content and Multiple Site Issues. For those who couldn't make it to the conference or would like a recap, we've recorded the talk again for the Google Webmaster Central YouTube Channel. Below you can watch the short video based on the content presented at SES:



You can view the slides here:



Monday, 14 September 2009

Recommendations for webmaster-friendly freehosts

Most of the recommendations we've made in the past are for individual webmasters running their own websites. We thought we'd offer up some best practices for websites that allow users to create their own websites or host users' data, like Blogger or Google Sites. This class of websites is often referred to as freehosts, although these recommendations apply to certain "non-free" providers as well.

  • Make sure your users can verify their website in website management suites such as Google's Webmaster Tools.

    Webmaster Tools provides your users with detailed reports about their website's visibility in Google. Before we can grant your users access, we need to verify that they own their particular websites. Verifying ownership of a site in Webmaster Tools can be done using a custom HTML file, a meta tag, or seamless integration in your system via Google Services for Websites. Other website management suites such as Yahoo! Site Explorer and Bing Webmaster Tools may use similar verification methods; we recommend making sure your users can access each of these suites.

  • Choose a unique directory or hostname for each user.

    Webmaster Tools verifies websites based on a single URL, but assumes that users should be able to see data for all URLs 'beneath' this URL in the site's URL hierarchy. See our article on verifying subdomains and subdirectories for more information. Beyond Webmaster Tools, many automated systems on the web, such as search engines and aggregators, expect websites to be structured in this way; structuring your site like this makes it easier for those systems to find and organize your content.

  • Set useful and descriptive page titles.

    Let users set their own titles, or automatically give each page on your users' websites a title that describes the content of that page. For example, all of your users' page titles should not be "Blogger: Create your free blog". Similarly, if a user's website has more than one page with different content, those pages should not all have the same title, such as "User XYZ's Homepage".

  • Allow the addition of tags to a page.

    Certain meta tags are reasonably useful for search engines, and users may want to control them. These include tags with a name attribute of "robots", "description", "googlebot", "slurp", or "msnbot". See the documentation for each name attribute to learn more about what these tags do; there's also a short sketch of some of these tags after this list.

  • Allow your users to use third-party analytics packages such as Google Analytics.

    Google Analytics is free, enterprise-class analytics software that can run on a website by just adding a snippet of JavaScript to the page. If you don't want to allow users to add arbitrary JavaScript for security reasons, keep in mind that the Google Analytics code only varies by one simple ID: if you let your users tell you their Google Analytics ID, you can set up the rest for them. Users get more value out of your service if they can understand their traffic better. For example, see Weebly's support page on adding Google Analytics. We recommend considering similar methods for enabling access to other third-party applications.

  • Help your users move around.

    Tastes change. Someone on your service might want to change their account name or even move to another site altogether. Help them by allowing them to access their own data, and by letting them tell search engines when they move part or all of their site via 301 redirects to the new destination. Similarly, if users want to remove a page or site instead of moving it, please return a 404 HTTP response code so that search engines know that the page or site is no longer around. This allows users to use the urgent URL removal tool (if necessary), and makes sure that these pages drop out of search results as soon as possible.

  • Help search engines find the good content from your users.

    Search engines continue to crawl more and more of the web. Help our crawlers find the best content across your site. Allow us to crawl users' content, including media like user-uploaded images. Help us find users' content using XML Sitemaps. Help us steer clear of duplicate versions of the same content, so we can find more of the good content your users are creating: create only one URL for each piece of content when possible, and specify your canonical URLs when that's not possible (see the sketch after this list). If you're hosting blogs, create RSS feeds that we can discover in Google Blog Search. If your site is down or showing errors, please return 5xx response codes. This helps us avoid indexing lots of "We'll be right back" pages, by letting crawlers know that the content is temporarily unavailable.
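
To pull a few of these recommendations together, here is a hedged sketch of the kind of markup a freehost could expose to each user on their own pages (the title, verification token, and URLs are invented placeholders):

<!-- Illustrative <head> for one hosted user page; all values are placeholders -->
<head>
  <!-- a descriptive, per-page title rather than a generic service-wide one -->
  <title>XYZ's Bakery - Weekly bread schedule</title>

  <!-- ownership verification meta tag issued by Webmaster Tools -->
  <meta name="google-site-verification" content="PLACEHOLDER_TOKEN" />

  <!-- meta tags users may reasonably want to control -->
  <meta name="description" content="Which breads XYZ's Bakery bakes on which days." />
  <meta name="robots" content="noindex" />

  <!-- the preferred (canonical) URL for this piece of content -->
  <link rel="canonical" href="http://xyz.example.com/bread-schedule" />
</head>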

Can you think of any other best practices that you would recommend for sites that host users' data or pages?

Supporting Facebook Share and RDFa for videos

Have you ever wondered how to increase the chances of your videos appearing in Google's results? Over the last year, the Video Search team has been working hard to improve our index of video on the Web. Today, we're beginning the first in a series of posts to explain some best practices for sites hosting video content.

We previously talked about the importance of submitting a Video Sitemap or mRSS feed to Google and following Google's webmaster guidelines. However, we wanted to offer webmasters an additional tool, so today we're taking a page from the rich snippets playbook and announcing support for Facebook Share and Yahoo! SearchMonkey RDFa. Both of these markup formats allow you to specify information essential to video indexing, such as a video's title and description, within the HTML of a video page. While we've become smarter at discovering this information on our own, we'd certainly appreciate some hints directly from webmasters. Also, to maximize the chances that we find the markup on your video pages, you should make sure it appears in the HTML without the execution of JavaScript or Flash.

So, check out Facebook Share and RDFa and help Google find your videos!

Facebook Share:
<meta name="title" content="Baroo? - cute puppies" />
<meta name="description" content="The cutest canine head tilts on the Internet!" />
<link rel="image_src" href="http://example.com/thumbnail_preview.jpg" />
<link rel="video_src" href="http://example.com/video_object.swf?id=12345"/>
<meta name="video_height" content="296" />
<meta name="video_width" content="512" />
<meta name="video_type" content="application/x-shockwave-flash" />
RDFa (Yahoo! SearchMonkey):
<object width="512" height="296" rel="media:video"
resource="http://example.com/video_object.swf?id=12345"
xmlns:media="http://search.yahoo.com/searchmonkey/media/"
xmlns:dc="http://purl.org/dc/terms/">
<param name="movie" value="http://example.com/video_object.swf?id=12345" />
<embed src="http://example.com/video_object.swf?id=12345"
type="application/x-shockwave-flash" width="512" height="296"></embed>
<a rel="media:thumbnail" href="http://example.com/thumbnail_preview.jpg" />
<a rel="dc:license" href="http://example.com/terms_of_service.html" />
<span property="dc:description" content="Cute Overload defines Baroo? as: Dogspeak for 'Whut the...?'
Frequently accompanied by the Canine Tilt and/or wrinkled brow for enhanced effect." />
<span property="media:title" content="Baroo? - cute puppies" />
<span property="media:width" content="512" />
<span property="media:height" content="296" />
<span property="media:type" content="application/x-shockwave-flash" />
<span property="media:region" content="us" />
<span property="media:region" content="uk" />
<span property="media:duration" content="63" />
</object>

Tuesday, 01 September 2009

Tips for News Search

Webmaster Level: All

During my stint on the "How Google Works Tour: Seattle", I heard plenty of questions regarding News Search from esteemed members of the press, such as The Stranger, The Seattle Times and Seattle Weekly. After careful note-taking throughout our conversations, the News team and I compiled this presentation to provide background and FAQs for all publishers interested in Google News Search.



Along with the FAQs about News Sitemaps and PageRank in the video above, here's additional Q&A to get you started:

Would adding a city name to my paper—for example, changing our name from "The Times" to "The San Francisco Bay Area Times"—help me target my local audience in News Search?
No, this won't help News rankings. We extract geography and location information from the article itself (see video). Changing your name to include relevant keywords or adding a local address in your footer won't help you target a specific audience in our News rankings.
What happens if I accidentally include URLs in my News Sitemap that are older than 72 hours?
We want only the most recently added URLs in your News Sitemap, since it directs Googlebot to your breaking information. If you include older URLs, no worries: there's no penalty unless you're perceived as maliciously spamming (a rare case); we just won't include those URLs in our next News crawl.
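
For reference, a News Sitemap is a regular XML Sitemap with news-specific extensions, so each fresh article gets its own <url> entry. The sketch below is only an illustration: the URL and values are invented, and the exact set of supported tags is defined in the News Sitemap documentation.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/business/2009/09/big-merger-announced.html</loc>
    <news:news>
      <!-- only URLs added within the last 72 hours belong here -->
      <news:publication_date>2009-09-01</news:publication_date>
      <news:keywords>business, merger, acquisition</news:keywords>
    </news:news>
  </url>
</urlset>
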
To get the full scoop, check out the video!