Wednesday, June 20, 2007

We're back from SES Milan!

...with a couple of clarifications

Ciao everybody! We just got back from Italy—great weather there, I must say! We attended SES in Milan on the 29th and 30th of May. The conference was a great opportunity to talk to many of you. We really had a good time and want to thank all the people who stopped by simply to say "hi" or to talk to us in more detail about search engine strategies. This gave us a chance to talk to many participants and to some of the key players in the Italian SEO and web search marketing worlds. We discussed recent developments in the Italian internet market, SEO strategies and evangelizing.

A number of you have raised interesting questions, and we'd like to go through two of these in more detail.

This is a situation a webmaster might find himself/herself in: I optimized this site using some sneaky techniques that are not in accordance with Google's Webmaster Guidelines. I got away with it for a while, and it helped me rank in second position for certain keywords. Then, suddenly, I got an email from Google saying my site had been banned from the index because of those techniques (in these emails there is always an example of one of the infractions found). I have now cleaned up the site, and after a few days it was back in the index.
Why on earth doesn't my site rank in second position anymore, even though I've removed all the sneaky techniques we used?

OK, before answering let me ask you a couple of questions:

  • Didn't you optimize your site with those techniques in order to artificially boost the ranking?
  • Didn't you think those techniques would work (at least in the short term)?

So, if there has been spamming going on, we encourage a site that has gotten an email from Google to take this notification seriously. Many people clean up their sites after receiving a notification from us. But we must also take into account that besides the shady SEO techniques used on a particular site (for instance hidden text, redirecting doorway pages, etc.), there are often off-site SEO techniques in use, such as creating artificial link popularity in order to gain a high position in Google's SERPs.

So, to make it straightforward: once those manipulations made to rank a site unnaturally high are removed, the site gains the position it merits based on its content and its natural link popularity. Note that of course the ranking of your site also depends on other sites related to the same topic, and these sites might have been optimized in accordance with our guidelines in the meantime, which might affect the ranking of your site.

Note that a site does not keep a stain or any residual negative effect from a prior breach of our webmaster guidelines, after it has been cleaned up.

That is why we first and foremost recommend working hard on the content you make for your site's audience, as content is a decisive factor in building natural link popularity. We all know how powerful strong natural link popularity can be.

Search quality, content quality and your visitor's experience.

During our conversations about search-related issues, another topic that came up frequently was landing pages and writing for search engines, which are often related when we consider organic search results.

So, think of your visitors who have searched for something with Google and have found your page. Now, what kind of welcome are you offering? A good search experience consists of finding a page that contains enough information to satisfy their original query.

A common mistake in writing optimized content for search engines is to forget about the user and focus only on that particular query. One might say, that's how the user landed on my page!

At the end of the day, exaggerating this attitude might lead to creating pages made only to satisfy that query but with no actual content on them. Such pages often rely on techniques such as mere repetition of keywords and duplicated content, and overall offer very little value. In general, they might be in line with the keywords of the query – but for your visitor, they're useless. In other words, you have written pages solely for the search engine and forgotten about the user. As a result, your visitor will find a page that is apparently on topic but totally meaningless.

These “meaningless” pages, artificially made to generate search engine traffic, do not represent a good search experience. Even if they do not employ other discouraged techniques, such as hidden text or hidden links, they are made solely for the purpose of ranking for particular keywords, or sets of keywords, and in themselves do not offer a satisfying search result.

A first step to identify whether you are causing a bad search experience for your visitors consists of checking that the pages they find are actually useful. Such pages will have topical content that satisfies the query for which your visitor found them, and they are overall meaningful and relevant. You might want to start with the pages that are most frequently found and extend your check to your entire site. To sum up, as general advice: even if you want to make a page that is easily found via search engines, remember that the users are your audience, and that a page optimized for the search engine does not necessarily meet the user's expectations in terms of quality and content. So if you find yourself writing content for a search engine, you should ask yourself what the value is for the user!

Google's email communication with webmasters

Posted by Ríona MacNamara, Webmaster Tools Team

In recent days there have again been attempts to unsettle German webmasters with fake emails claiming to come from Google. These emails are not from Google. Google stopped notifying webmasters by email several weeks ago and is currently working on a more reliable webmaster communication process.

We've noticed that someone is again trying to spoof the emails that Google sends to webmasters to alert them to issues with their site. These emails are not coming from Google, and in fact several weeks ago we temporarily discontinued sending these emails to webmasters while we explore different, secure ways of communicating with webmasters. Watch this space for more news - but in the meantime, you can safely assume that any such email message you receive is not, in fact, from us.

Monday, June 18, 2007

Revamping the Webmaster Tools Help Center



A while ago, I posted in the Webmaster user group, looking for feedback on our Help Center and how we can improve the assistance we provide to our webmasters. And wow, did we get a lot of feedback - both in the group and in the blogosphere. I'm amazed at the webmaster community and your willingness to share your thoughts with us: thank you!

Here's a selection of what we're hearing:

You want Help to be more discoverable

  • It's not as easy as it should be to find the information you're looking for. You'd like Google to do a better job of surfacing the answers to the most common questions. The browse structure doesn't make it easy for users to find help, and sometimes search depends on users knowing exactly the right term to search for.
  • You like the idea of context-sensitive help - on-the-spot assistance (often shown in a tooltip that appears when you hover over an item) that doesn't require you to click to a different Help page.
  • Right now, it's not clear when new Help information - or new features - are added, and you'd like Google to look at calling these out.

You want Help to be more useful

  • You'd like Google to look at adding videos and graphics.
  • You'd like us to provide the kind of information that's relevant to the average webmaster, who may not have a deep knowledge of SEO techniques. You're looking for good, understandable answers to common questions.
  • You'd like us to expand the actual content, and do a much better job of explaining potential reasons why sites may have dropped in the rankings.

What's next?

Well, over the next several weeks, we'll be working on lots of changes to the Help Center, both in its content and its organization. We'll be looking at all the feedback we've gotten, and we're taking it very seriously: believe me, I have a long task list for this area. But it can always grow: if you have some great thoughts or ideas, jump into the discussion, or just leave a comment right here.

Wednesday, June 13, 2007

Expanding the webmaster central team

You've probably already figured this out if you use webmaster tools, the webmaster help center, or our webmaster discussion forum, but the webmaster central team is a fantastic group of people. You have seen some of them helping out in the discussion forums, and you may have met a few more at conferences, but there are lots of others behind the scenes who you don't see, working on expanding webmaster tools, writing content, and generally doing all they can for you, the webmaster. Even the team members you don't see are paying close attention to your feedback: reading our discussion forum, as well as blogs and message boards. We introduced you to a few of the team before SES NY and Danny Sullivan told you about a few Googler alternatives before SES Chicago. We also have several interns working with us right now, including Marcel, who seems to have been the hit of the party at SMX Advanced.

I am truly pleased to welcome a new addition to the team, although she'll be a familiar face to many of you already. Susan Moskwa is joining Jonathan Simon as a webmaster trends analyst! She's already started posting on the forums and is doing lots of work behind the scenes. Jonathan does a wonderful job answering your questions and investigating issues that come up and he and Susan will make a great team. Susan is a bit of a linguistic genius, so she'll also be helping out in some of the international forums, where Dublin Googlers have started reading and replying to your questions. Want to know more about Susan? You just never know what you find when you do a Google search.

Duplicate content summit at SMX Advanced

Last week, I participated in the duplicate content summit at SMX Advanced. I couldn't resist the opportunity to show how Buffy is applicable to the everyday search marketing world, but mostly I was there to get input from you on the duplicate content issues you face and to brainstorm how search engines can help.

A few months ago, Adam wrote a great post on dealing with duplicate content. The most important things to know about duplicate content are:
  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don't want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn't cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn't cause your site to be placed in the supplemental index. Duplication may indirectly influence this, however, if links to your pages are split among the various versions, causing lower per-page PageRank.
At the summit at SMX Advanced, we asked what duplicate content issues were most worrisome. Those in the audience were concerned about scraper sites, syndication, and internal duplication. We discussed lots of potential solutions to these issues and we'll definitely consider these options along with others as we continue to evolve our toolset. Here's the list of some of the potential solutions we discussed so that those of you who couldn't attend can get in on the conversation.

Specifying the preferred version of a URL in the site's Sitemap file
One thing we discussed was the possibility of specifying the preferred version of a URL in a Sitemap file, with the suggestion that if we encountered multiple URLs that point to the same content, we could consolidate links to that page and could index the preferred version.
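To make the idea concrete, here is a minimal sketch of a Sitemap file that lists only the preferred version of a page (the URL is made up for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical example: list only the preferred URL, not its duplicates -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/widgets/</loc>
      </url>
    </urlset>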

Providing a method for indicating parameters that should be stripped from a URL during indexing
We discussed providing this either in an interface such as webmaster tools or in the site's robots.txt file. For instance, if a URL contains session IDs, the webmaster could indicate the variable for the session ID, which would help search engines index the clean version of the URL and consolidate links to it. The audience leaned towards an addition to robots.txt for this.
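As a simple illustration (the URLs and parameter name are invented), the same page might be crawled under several URLs that differ only by a session ID, while the clean version is the one you would want indexed:

    http://www.example.com/product.php?id=12&sessionid=A1B2C3   (duplicate)
    http://www.example.com/product.php?id=12&sessionid=Z9Y8X7   (duplicate)
    http://www.example.com/product.php?id=12                    (clean version)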

Providing a way to authenticate ownership of content
This would provide search engines with extra information to help ensure we index the original version of an article, rather than a scraped or syndicated version. Note that we do a pretty good job of this now, and not many people in the audience mentioned this as a primary issue. However, the audience was interested in a way of authenticating content as extra protection. Some suggested using the page with the earliest date, but creation dates aren't always reliable. Someone also suggested allowing site owners to register content, although that could raise issues as well, as non-savvy site owners wouldn't know to register content and someone else could take the content and register it instead. We currently rely on a number of factors, such as the site's authority and the number of links to the page. If you syndicate content, we suggest that you ask the sites that are using your content to block their version with a robots.txt file as part of the syndication arrangement, to help ensure your version is served in results.
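For example, a syndication partner could keep its copy of your articles out of the search results with a robots.txt file along these lines (the directory name is just an illustration):

    # On the syndication partner's site; /syndicated/ is a hypothetical directory
    User-agent: *
    Disallow: /syndicated/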

Making a duplicate content report available for site owners
There was great support for the idea of a duplicate content report that would list pages within a site that search engines see as duplicate, as well as pages that are seen as duplicates of pages on other sites. In addition, we discussed the possibility of adding an alert system to this report so site owners could be notified via email or RSS of new duplication issues (particularly external duplication).

Working with blogging software and content management systems to address duplicate content issues
Some duplicate content issues within a site are due to how the software powering the site structures URLs. For instance, a blog may have the same content on the home page, a permalink page, a category page, and an archive page. We are definitely open to talking with software makers about the best way to provide easy solutions for content creators.
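As a sketch of that kind of duplication (the URLs are invented), a single blog post might be reachable at several different URLs:

    http://blog.example.com/                          (home page)
    http://blog.example.com/2007/06/my-post.html      (permalink)
    http://blog.example.com/category/widgets/         (category page)
    http://blog.example.com/2007/06/                  (monthly archive)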

In addition to discussing potential solutions to duplicate content issues, the audience had a few questions.

Q: If I nofollow a substantial number of my internal links to reduce duplicate content issues, will this raise a red flag with the search engines?
The number of nofollow links on a site won't raise any red flags, but that is probably not the best method of blocking the search engines from crawling duplicate pages, as other sites may link to those pages. A better method may be to block pages you don't want crawled with a robots.txt file.
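As a rough sketch, assuming your duplicate versions live under a single directory such as a printer-friendly area of the site, the robots.txt entry could look like this:

    # /print/ is a hypothetical directory holding printer-friendly duplicates
    User-agent: *
    Disallow: /print/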

Q: Are the search engines continuing the Sitemaps alliance?
We launched sitemaps.org in November of last year and have continued to meet regularly since then. In April, we added the ability for you to let us know about your Sitemap in your robots.txt file. We plan to continue to work together on initiatives such as this to make the lives of webmasters easier.
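That robots.txt entry is a single line pointing to the full URL of your Sitemap (replace the URL with your own):

    Sitemap: http://www.example.com/sitemap.xml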

Q: Many pages on my site primarily consist of graphs. Although the graphs are different on each page, how can I ensure that search engines don't see these pages as duplicate since they don't read images?
To ensure that search engines see these pages as unique, include unique text on each page (for instance, a different title, caption, and description for each graph) and include unique alt text for each image. (For instance, rather than using alt="graph", use something like alt="graph that shows Willow's evil trending over time".)
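Here is a minimal HTML sketch of that advice (the file name is made up; the alt text is the example from above):

    <!-- Descriptive, page-specific alt text instead of a generic label -->
    <img src="willow-evil-trend.png"
         alt="graph that shows Willow's evil trending over time">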

Q: I've syndicated my content to many affiliates and now some of those sites are ranking for this content rather than my site. What can I do?
If you've freely distributed your content, you may need to enhance and expand the content on your site to make it unique.

Q: As a searcher, I want to see duplicates in search results. Can you add this as an option?
We've found that most searchers prefer not to have duplicate results. The audience member in particular commented that she may not want to get information from one site and would like other choices, but for that case, other sites will likely not have identical information and therefore will show up in the results. Bear in mind that you can add the "&filter=0" parameter to the end of a Google web search URL to see additional results which might be similar.
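For example, appending that parameter to an ordinary search URL (the query here is arbitrary) shows the otherwise-filtered results:

    http://www.google.com/search?q=example+query&filter=0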

I've brought all the issues and potential solutions that we discussed at the summit back to my team and others within Google, and we'll continue to work on providing the best search results and expanding our partnership with you, the webmaster. If you have additional thoughts, we'd love to hear about them!

Tuesday, June 12, 2007

More ways for you to give us input

At Google, we are always working hard to provide searchers with the best possible results. We've found that our spam reporting form is a great way to get your input as we continue to improve our results. Some of you have asked for a way to report paid links as well.

Links are an important signal in our PageRank calculations, as they tend to indicate when someone has found a page useful. Links that are purchased are great for advertising and traffic purposes, but aren't useful for PageRank calculations. Buying or selling links to manipulate results and deceive search engines violates our guidelines.

Today, in response to your request, we're providing a paid links reporting form within Webmaster Tools. To use the form, simply log in and provide information on the sites buying and selling links for purposes of search engine manipulation. We'll review each report we get and use this feedback to improve our algorithms and our search results. In some cases, we may also take individual action on sites.

If you are selling links for advertising purposes, there are many ways you can designate this, including the following (both options are sketched briefly after this list):
  • Adding a rel="nofollow" attribute to the link's <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
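As a rough sketch of the first option, a paid link can be marked up like this (the URL is hypothetical):

    <a href="http://www.example.com/" rel="nofollow">Our sponsor</a>

And for the second option, the intermediate redirect page can be kept out of search engines with a robots.txt entry such as (the /out/ directory is just an example):

    User-agent: *
    Disallow: /out/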
We value your input and look forward to continuing to improve our great partnership with you.

Thursday, June 7, 2007

More details about our webmaster guidelines

At SMX Advanced on Monday, Matt Cutts talked about our webmaster guidelines. Later, during Q&A, someone asked about adding more detail to the guidelines: more explanation about violations and more actionable help on how to improve sites. You ask -- we deliver! On Tuesday, Matt told the SMX crowd that we'd updated the guidelines overnight to include exactly those things! We work fast around here. (OK, maybe we had been working on some of it already.)

So, what's new? Well, the guidelines themselves haven't changed. But the specific quality guidelines now link to expanded information to help you better understand how to spot and fix any issues. That section is below so you can click through to explore these new details.

Quality guidelines - specific guidelines

As Ríona MacNamara recently posted in our discussion forum, we are working to expand our webmaster help content even further and want your input. If you have suggestions, please post them in either the thread or as a comment to this post. We would love to hear from you!