Thursday, 26 November 2009

GENERIC CIALIS on my website? I think my site has been hacked!

How to use "Fetch as Googlebot", part 1
Webmaster level: Intermediate

Has your site ever dropped suddenly from the index or disappeared mysteriously from search results? Have you ever received a notice that your site is using cloaking techniques? Unfortunately, sometimes a malicious party "hacks" a website: they penetrate the security of a site and insert undesirable content. Sophisticated attackers can camouflage this spammy or dangerous content so that it doesn't appear for normal users, and appears only to Googlebot, which could negatively impact your site in Google's results.

In such cases it used to be very difficult to detect the problem, because the site would appear normal to regular visitors: often only requests with a Googlebot User-agent, coming from Googlebot's IP addresses, would be served the hidden content. But that's over: with Fetch as Googlebot, the new Labs feature in Webmaster Tools, you can see exactly what Googlebot sees and uncover this kind of cloaking. We'll show you how, right after a quick do-it-yourself check:
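As a rough first pass, you can compare your site's response for a normal browser User-agent against its response for Googlebot's User-agent. Here is a minimal sketch in Python (not a Google tool; the browser User-agent string is just an example):

    import urllib.request

    def fetch(url, user_agent):
        # Request the page, presenting the given User-agent string.
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    url = "http://www.example.com/"
    as_browser = fetch(url, "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko")
    as_googlebot = fetch(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                              "+http://www.google.com/bot.html)")
    if as_browser != as_googlebot:
        print("Responses differ by User-agent: possible cloaked content.")
    else:
        print("No User-agent cloaking seen (IP-based cloaking may remain).")

Note that this comparison can't expose attackers who also check the visitor's IP address; Fetch as Googlebot can, because the request is issued by Google itself.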

Let's imagine that Bob, the administrator of www.example.com, is searching for his site but he finds this instead:

That's strange, because when he looks at the source code of www.example.com, it looks fine:

To his surprise, Bob may receive a notice from Google warning him that his site is not complying with Google's quality guidelines. Fortunately, he has his site registered with Webmaster Tools. Let's see how he can check what Googlebot sees:

First Bob logs into Webmaster Tools and selects www.example.com. The Fetch as Googlebot feature will be at the bottom of the navigation menu, in the Labs section:

The page contains a field where you can enter the URL to fetch; you can also leave it blank to fetch the homepage.

Bob can simply click Fetch and wait a few seconds. After refreshing the page, he can see the status of the fetch request. If it succeeds, he can click on the "Success" link...

...and that will show the details, with the content of the fetched page:

Aha! There's the spammy content! Now Bob can be certain that www.example.com has been hacked.

Confirming that the website has been hacked (and perhaps is still hacked) is an important step. It is, however, only the beginning. For more information, we strongly suggest getting help from your server administrator or hosting provider and reading our previous blog posts on the subject of hacked sites:


If you have any questions about how to use the Fetch as Googlebot feature, feel free to drop by the Webmaster Help Forum. If you suspect that your website has been hacked but are having trouble resolving the problem, you might want to ask the experts in our "Malware and Hacked sites" category.

P.S. Keep in mind that once you have removed the hacked content from your site, it will generally still take some time for us to update our search results accordingly. A number of factors affect the crawling and indexing of your content, so it's impossible to give an exact time frame.

Wednesday, 25 November 2009

Hard facts about comment spam

Webmaster Level: Beginner

It has probably happened to you: you're reading articles or watching videos on the web, and you come across some unrelated, gibberish comments. You may wonder what this is all about. Some webmasters abuse other sites by exploiting their comment fields, posting tons of links that point back to the poster's site in an attempt to boost their site's ranking. Others might tweak this approach a bit by posting a generic comment (like "Nice site!") with a commercial user name linking to their site.

Why is it bad?

FACT: Abusing comment fields of innocent sites is a bad and risky way of getting links to your site. If you choose to do so, you are tarnishing other people's hard work and lowering the quality of the web, transforming a potentially good resource of additional information into a list of nonsense keywords.

FACT: Comment spammers are often trying to improve their site's organic search ranking by creating dubious inbound links to their site. Google has an understanding of the link graph of the web, and has algorithmic ways of discovering those alterations and tackling them. At best, a link spammer might spend hours doing spammy linkdrops which would count for little or nothing because Google is pretty good at devaluing these types of links. Think of all the more productive things one could do with that time and energy that would provide much more value for one's site in the long run.


Promote your site without comment spam

If you want to improve your site's visibility in the search results, spamming comments is definitely not the way to go. Instead, think about whether your site offers what people are looking for, such as useful information and tools.

FACT: Having original and useful content and making your site search engine friendly is the best strategy for better ranking. With an appealing site, you'll be recognized by the web community as a reliable source and links to your site will build naturally.

Moreover, Google provides advice to help improve the crawlability and indexability of your site. Check out our Search Engine Optimization Starter Guide.

What can I do to avoid spam on my site?

Comments can be a really good source of information and an efficient way of engaging a site's users in discussions. This valuable content shouldn't be drowned out by gibberish keywords and links. Fortunately, there are many ways of securing your application and discouraging spammers:
  • Disallow anonymous posting.
  • Use CAPTCHAs and other methods to prevent automated comment spamming.
  • Turn on comment moderation.
  • Use the "nofollow" attribute for links in the comment field.
  • Disallow hyperlinks in comments.
  • Block comment pages using robots.txt or meta tags.
For detailed information about these topics, check out our Help Center document on comment spam.
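For illustration, here is how two of the measures above might look in practice. The /comments/ path is hypothetical; adjust it to wherever your comment pages actually live. First, a comment link carrying the "nofollow" attribute; second, a robots.txt rule keeping crawlers away from comment pages:

    <a href="http://www.example.com/" rel="nofollow">Commenter's site</a>

    User-agent: *
    Disallow: /comments/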

My site is full of comment spam, what should I do?

It's never too late! Don't let spammers ruin the experience for others. Adopt the security measures discussed above to stop the spam activity, then invest some time in cleaning up the spammy comments and banning the spammers from your site. Depending on your site's system, you may be able to save time by banning spammers and removing their comments all at once, rather than one by one.

If I spammed comment fields of third party sites, what should I do?

If you used this approach in the past and want to fix the issue, have a look at your incoming links in Webmaster Tools. To do so, go to the Your site on the web section and click Links to your site. If you see suspicious links coming from blogs or other platforms that allow comments, check those URLs. If you find a spammy link you created, try to delete it yourself; otherwise, contact the webmaster and ask them to remove it. Once you've cleared the spammy inbound links you made, you can file a reconsideration request.

For more information about this topic and to discuss it with others, join us in the Webmaster Help Forum. (But don't leave spammy comments!)

Friday, 20 November 2009

'New software version' notifications for your site

Webmaster level: All

One of the great things about working at Google is that we get to take advantage of an enormous amount of computing power to do some really cool things. One idea we tried out was to let webmasters know about their potentially hackable websites. The initial effort was successful enough that we thought we would take it one step further by expanding our efforts to cover other types of web applications—for example, more content management systems (CMSs), forum/bulletin-board applications, stat-trackers, and so on.

This time, however, our goal is not just to isolate vulnerable or hackable software packages, but also to notify webmasters about newer versions of the software packages or plugins they're running on their websites. For example, there might be a Drupal module or Joomla extension update available that some folks haven't installed. There are a few reasons a webmaster might not upgrade to a newer version, and one could simply be that they don't know it exists. This is where we think we can help. We hope to let webmasters know about new versions of their software by sending them a message via Webmaster Tools, so they can make an informed decision about whether or not to upgrade.

One of the ways we identify sites to notify is by parsing the source code of web pages that we crawl. For example, WordPress and other CMS applications include a generator meta tag that specifies the version number. This has proven to be tremendously helpful in our efforts to notify webmasters. So if you're a software developer and would like us to help notify your users about newer versions of your software, a great way to start is to include a generator meta tag that states the version number of your software. If you're a plugin or widget developer, including a version number in the source you provide to your users is a great way to help too.
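For example, the generator meta tag that WordPress writes into its pages looks like the following (the version number shown is just illustrative):

    <meta name="generator" content="WordPress 2.8.6" />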

We've seen divided opinions over time about whether it's a good security practice to include a version number in source code, because it lets hackers or worm writers know that the website might be vulnerable to a particular type of exploit. But as Matt Mullenweg pointed out, "Where [a worm writer's] 1.0 might have checked for version numbers, 2.0 just tests [a website's] capabilities...". Meanwhile, the advantage of a version number is that it can help alert site owners when they need to update their site. In the end, we tend to think that including a version number can do more good than harm.

We plan to begin sending out the first of these messages soon and hope that webmasters find them useful! If you have any questions or feedback, feel free to comment here.

Wednesday, 18 November 2009

Running desktop and mobile versions of your site

(This post was largely translated from our Japanese version of the Webmaster Central Blog.)

Recently I introduced several methods to ensure your mobile site is properly indexed by Google. Today I'd like to share information useful for webmasters who manage both desktop and mobile phone versions of a site.

One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version appears for users on a desktop computer, or that the desktop version appears when someone finds the site from a mobile device. In dealing with this scenario, here are two viable options:

Redirect mobile users to the correct version
When a mobile user or crawler (like Googlebot-Mobile) accesses the desktop version of a URL, you can redirect them to the corresponding mobile version of the same page. Google notices the relationship between the two versions of the URL and displays the standard version for searches from desktops and the mobile version for mobile searches.

If you redirect users, please make sure that the content on the corresponding mobile/desktop URL matches as closely as possible. For example, if you run a shopping site and there's an access from a mobile phone to a desktop-version URL, make sure that the user is redirected to the mobile version of the page for the same product, and not to the homepage of the mobile version of the site. We occasionally find sites using this kind of redirect in an attempt to boost their search rankings, but this practice only results in a negative user experience, and so should be avoided at all costs.
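As a minimal sketch of this option (the mobile host name and User-agent hints below are assumptions, not a Google specification), a redirect handler might preserve the requested path like this:

    # Hypothetical User-agent hints; real mobile detection is broader.
    MOBILE_UA_HINTS = ("Googlebot-Mobile", "iPhone", "Android", "DoCoMo")

    def mobile_redirect_target(user_agent, path):
        """Return the mobile URL for mobile clients, or None for desktop."""
        if any(hint in user_agent for hint in MOBILE_UA_HINTS):
            # Same page on the mobile host, never the mobile homepage.
            return "http://m.example.com" + path
        return None

    # mobile_redirect_target("Googlebot-Mobile/2.1", "/products/42")
    # returns "http://m.example.com/products/42"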

On the other hand, when there's an access to a mobile-version URL from a desktop browser or by our web crawler, Googlebot, it's not necessary to redirect them to the desktop-version. For instance, Google doesn't automatically redirect desktop users from their mobile site to their desktop site, instead they include a link on the mobile-version page to the desktop version. These links are especially helpful when a mobile site doesn't provide the full functionality of the desktop version -- users can easily navigate to the desktop-version if they prefer.

Switch content based on User-agent
Some sites have the same URL for both desktop and mobile content, but change their format according to User-agent. In other words, both mobile users and desktop users access the same URL (i.e. no redirects), but the content/format changes slightly according to the User-agent. In this case, the same URL will appear for both mobile search and desktop search, and desktop users can see a desktop version of the content while mobile users can see a mobile version of the content.
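A sketch of this approach (again with assumed User-agent hints): the same URL returns different markup depending on the client, and a "Vary: User-Agent" header tells caches that the response depends on the User-agent:

    DESKTOP_BODY = "<html><body>Full desktop page</body></html>"
    MOBILE_BODY = "<html><body>Lightweight mobile page</body></html>"

    def render(user_agent):
        # Hypothetical hint strings; real detection would be broader.
        is_mobile = any(h in user_agent
                        for h in ("Googlebot-Mobile", "iPhone", "DoCoMo"))
        headers = {"Content-Type": "text/html",
                   # Tell caches the response depends on the User-agent.
                   "Vary": "User-Agent"}
        return headers, (MOBILE_BODY if is_mobile else DESKTOP_BODY)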

However, note that if you fail to configure your site correctly, it could be considered cloaking, which can lead to your site disappearing from our search results. Cloaking refers to an attempt to boost search result rankings by serving different content to Googlebot than to regular users. This causes problems such as less relevant results (pages appear in search results even though their content is actually unrelated to what users see or want), so we take cloaking very seriously.

So what does "the page that the user sees" mean if you serve both versions from one URL? As I mentioned in the previous post, Google uses "Googlebot" for web search and "Googlebot-Mobile" for mobile search. To remain within our guidelines, serve Googlebot the same content a typical desktop user would see, and serve Googlebot-Mobile the same content you would serve to the browser on a typical mobile device. It's fine if the content served to Googlebot differs from the content served to Googlebot-Mobile.

One example of how you could unintentionally be flagged for cloaking: your site returns a message like "Please access from mobile phones" to desktop browsers, but serves the full mobile version to both crawlers (so Googlebot receives the mobile version). In this case, the page that web search users see (e.g. "Please access from mobile phones") is different from the page that Googlebot crawls (e.g. "Welcome to my site"). Again, we watch for cloaking because we want to serve users the same relevant content that Googlebot or Googlebot-Mobile crawled.

Diagram of serving content from your mobile-enabled site


We're working daily to improve search results and solve problems, but because the relationship between desktop and mobile versions of a web site can be nuanced, we appreciate the cooperation of webmasters. Your help will result in more mobile content being indexed by Google, improving the search results provided to users. Thank you for your cooperation in improving the mobile search user experience.

Tuesday, 17 November 2009

Pros and cons of watermarked images

Webmaster Level: All

What's our take on watermarked images for Image Search? It's a complicated topic. I talked with Peter Linsley—my friend at the 'plex, video star, and Product Manager for Image Search—to hear his thoughts.

Maile: So, Peter... "watermarked images". Can you break it down for us?
Peter: It's understandable that webmasters find watermarking images beneficial.
Pros of watermarked images
  • Photographers can claim credit/be recognized for their art.
  • Unknown usage of the image is deterred.
If search traffic is important to a webmaster, then he/she may also want to consider some of our findings:
Findings relevant to watermarked images
  • Users prefer large, high-quality images (high-resolution, in-focus).
  • Users are more likely to click on quality thumbnails in search results. Quality pictures (again, high-res and in-focus) often look better at thumbnail size.
  • Distracting features such as loud watermarks, text over the image, and borders are likely to make the image look cluttered when reduced to thumbnail size.
In summary, if a feature such as watermarking reduces the user-perceived quality of your image or your image's thumbnail, then searchers may select it less often. Preview your images at thumbnail size to get an idea of how users might perceive them.
Maile: Ahh, I see: Webmasters concerned with search traffic likely want to balance the positives of watermarking with the preferences of their users -- keeping in mind that sites that use clean images without distracting artifacts tend to be more popular, and that this can also impact rankings. Will Google rank an image differently just because it's watermarked?
Peter: Nope. The presence of a watermark doesn't itself cause an image to be ranked higher or lower.

Do you have questions or opinions on the topic? Let's chat in the webmaster forum.

Friday, 13 November 2009

Help Google index your mobile site

(This post was largely translated from our Japanese Webmaster Central Blog.)

It seems the world is going mobile: many people use mobile phones on a daily basis, and a large user base searches on Google's mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. As a mobile search engineer, it's clear to me that while many mobile sites were designed with mobile viewing in mind, they weren't designed to be search friendly. I'd like to help ensure that your mobile site is also available to users of mobile search.

Here are troubleshooting tips to help ensure that your site is properly crawled and indexed:

Verify that your mobile site is indexed by Google

If your web site doesn't show up in the results of a Google mobile search even using the 'site:' operator, it may be that your site has one or both of the following issues:
Googlebot may not be able to find your site
Googlebot, our crawler, must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site's existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, in the same way as a standard Sitemap.
Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow access to any User-agent that includes "Googlebot-Mobile". You should also be aware that Google may change its User-agent information at any time without notice, so do not check whether the User-agent exactly matches "Googlebot-Mobile" (the string used at present); instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS lookups to verify Googlebot, as in the sketch below.
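In code, the substring check and the DNS round-trip might look like this minimal Python sketch (not an official implementation):

    import socket

    def is_mobile_googlebot_ua(user_agent):
        # Substring check: the exact User-agent string may change over time.
        return "Googlebot-Mobile" in user_agent

    def verify_googlebot_ip(ip):
        """Reverse DNS lookup, then forward-confirm the returned host name."""
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # The host name must resolve back to the original IP address.
        return ip in socket.gethostbyname_ex(host)[2]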

Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile has crawled your URLs, we check whether each URL is viewable on a mobile device. Pages we determine aren't viewable on a mobile phone won't be included in our mobile site index (although they may be included in the regular web index). This determination is based on a variety of factors, one of which is the DTD (Document Type Definition) declaration. Check that your mobile-friendly URLs declare a DTD for an appropriate mobile format such as XHTML Mobile or Compact HTML (see the example below). If it's in a compatible format, the page is eligible for the mobile search index. For more information, see the Mobile Webmaster Guidelines.
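For example, a page served as XHTML Mobile 1.0 begins with this DTD declaration:

    <!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
        "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">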

If you have any questions about mobile sites, post them to our Webmaster Help Forum, where webmasters from around the world, as well as Googlers, will be happy to help you with your problem.

Wednesday, 11 November 2009

Post-Halloween Treat: New Keywords User Interface!

Our team had an awesome Halloween and we hope you did too. Yes, the picture below is our team; we take our Halloween costumes pretty seriously. :)


As a post-Halloween treat, we're happy to announce a brand new user interface for our Keywords feature. We'll now update the data daily, provide details on how often we found a specific keyword, and display a handful of URLs that contain it. The significance column compares the frequency of a keyword to the frequency of the most popular keyword on your site. When you click a keyword to view more details, you'll get a list of up to 10 URLs that contain that keyword.

This will be really useful when you re-implement your site on a new technology framework, or need to identify which URLs may have been hacked. For example, if you start noticing your site appearing in search results for terms totally unrelated to your website (for example, "Viagra" or "casino"), you can use this feature to find those keywords and identify the pages that contain them. This will enable you to eliminate any hacked content quickly.

Let us know what you think!

Wednesday, 04 November 2009

New personalization features in Google Friend Connect

Webmaster Level: All

Update: The described product or service is no longer available.


Just a few weeks ago, we made Google Friend Connect a lot easier to use by dramatically simplifying the setup process. Today, we're excited to announce several new features that make it possible for website owners to get to know their users, encourage users to get to know each other, and match their site content (including Google ads) to visitors' interests.

To learn more about these new features, check out the Google Social Web Blog.

Tuesday, 03 November 2009

Get your site ready for the holidays: Webmasters - make your list and check it twice!

Webmaster Level: All

Are the holidays an important season for your website or online business? We think so! And to help make sure you're in good shape, we wanted to invite you to our Holiday Webmaster Webinar.

This Webex will be hosted by Senior Search Quality Engineer Greg Grothaus and AdWords Evangelist Fred Vallaeys. They'll discuss a range of webmaster best practices and useful Google tools, followed by a Q&A session to make sure you and your site are well primed for the holiday rush!

Topic: Holiday Webmaster Webinar
Date: Friday, November 13, 2009
Time: 10:00 am, Pacific Standard Time (GMT -08:00, San Francisco)
Meeting Number: 574 659 815
Meeting Password: webmaster

Please click the link below to see more information, or to join the meeting.

-------------------------------------------------------
To join the online meeting (Now from iPhones too!)
-------------------------------------------------------
1. Go to https://googleonline.webex.com/googleonline/j.php?ED=133402392&UID=0&PW=db339c4e641e0f525412171e5646
2. Enter your name and email address.
3. Enter the meeting password: webmaster
4. Click "Join Now".

-------------------------------------------------------
To join the teleconference only
-------------------------------------------------------
Call-in toll-free number (US/Canada): 866-469-3239
Call-in toll number (US/Canada): 1-650-429-3300
Toll-free dialing restrictions: http://www.webex.com/pdf/tollfree_restrictions.pdf

-------------------------------------------------------
For assistance
-------------------------------------------------------
1. Go to https://googleonline.webex.com/googleonline/mc
2. On the left navigation bar, click "Support".