Wednesday, December 17, 2008

Google Webmaster Event

There was a Google Webmaster event today at the Dubai Press Club in the Dubai Media City. Matthew "Chewy" Trewhella, a developer advocate from Google, talked about how common search engines work.

When you specify META keywords on your page, modern search engines compare the keywords with the content of the page to identify web spam.

When Google displays the description for a search result, it usually picks it up from the META description. If this is not specified, Google picks it up from the Open Directory Project (ODP). If there isn't an ODP entry either, Google generates a snippet on its own.
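For illustration, the META description (and the keywords tag mentioned earlier) sit in the page's head; the values below are made-up placeholders:

```html
<head>
  <title>Example Store - Hand-made Leather Bags</title>
  <!-- Search engines often show this text as the result snippet -->
  <meta name="description" content="Hand-made leather bags, shipped worldwide from Dubai.">
  <!-- Keywords are compared against the page content to detect web spam -->
  <meta name="keywords" content="leather bags, handbags, Dubai">
</head>
```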

You often find websites that use Flash and JavaScript navigation. While there's nothing inherently wrong with that, search engines find such sites difficult to crawl, so ideally a text-only version of the site should be built too. (Did you ever notice the text-only link for Google Maps? That's for search engines!)

Some firms have a corporate website and a micro-website for a product or a project. Google can treat the two sites as one if they have enough cross-links, thus giving the micro-website the PageRank advantage that the parent website has.

The frequency with which Google crawls a website depends on how frequently it is updated. PageRank takes about 1-3 months to stabilize, so when you launch a new website or change your domain, you'll have to wait and watch for a while.

When you host paid links or user-generated content such as forums on your site, make sure you add rel="nofollow" to those links to prevent the target websites from negatively affecting your site's PageRank.
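In markup, nofollow is just an attribute on the link; the URL below is a placeholder:

```html
<!-- A paid or user-submitted link; rel="nofollow" tells crawlers
     not to pass PageRank through this link -->
<a href="http://www.example.com/" rel="nofollow">Example advertiser</a>
```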

For more control over how search engines visit your website, use a robots.txt file at the root of your website. You can specify User-agent, Allow, and Disallow directives to indicate which paths the search engine should index and which it should avoid. A sitemap also helps Google crawl the website.
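A minimal robots.txt using those directives might look like this (paths and sitemap URL are illustrative):

```text
# Applies to all crawlers
User-agent: *
# Keep crawlers out of this directory
Disallow: /private/
# Explicitly permit this path
Allow: /public/
# Point crawlers at the sitemap
Sitemap: http://www.example.com/sitemap.xml
```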

The folks at Google are constantly making changes to search. There are over 200 quality attributes that are used in computing the PageRank and each attribute is managed by a team of about 5 developers. When a tweak is made to a quality attribute, the results are measured and the change is either kept or rolled back.

Trawhella presented the Google Webmaster Tools and explained the various options provided by it. There's a lot there that controls how Google treats your website. He also suggested using the Lynx/Links text browser to see what search engines see on your website.

I didn't stay for the second half in which Trawhella was to speak about Google Analytics and site optimization.


Nitin R.K. said...

I'm sorry - I accidentally spelled 'Trewhella' as 'Trawhella'.

You can find a pic of him at

Gerald Hibbert said...

Hey Nitin,

Chewy here!

I just wanted to clarify one thing.

You mention:

There are over 200 quality attributes that are used in computing the PageRank and each attribute is managed by a team of about 5 developers.

I actually said (or at least meant to say!):
"Imagine that each signal is maintained by a team of 5 engineers, who sometimes change it a little bit and measure the results."

Just wanted to make sure no-one got the wrong impression.

Thanks for the write-up, and I hope you found it useful.

Nitin R.K. said...

Hi Chewy!

Thanks for the clarification. I must've been busy jotting down interesting statistics while you were flipping through the slides so I think I missed the word "imagine" :-(

BTW, I've been asked to talk about the event at the office, along with the 2 other guys from work who dropped by at the Dubai Press Club :-)