
Get your Website Rated Higher


With over two trillion Web pages and more being added, removed and updated every moment, how do you ensure that your site turns up when someone goes searching? We give you some tips on getting your website on a smart search engine’s first page.


First the basics. When you submit a site to a search engine, it gets registered in its database. After that a spider periodically visits your site and indexes your pages. This is called spidering. Spiders today are quite advanced and can even index Word documents, Acrobat PDF files and PowerPoint slides. 
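
Incidentally, well-behaved spiders first look for a robots.txt file at the root of your site to learn which areas they may index. A minimal sketch (the directory names here are only placeholders):

  # robots.txt, placed at the root of the website
  # Applies to all spiders; keeps them out of areas with no real content
  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /drafts/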

Each engine uses its own (though not necessarily proprietary) algorithms to determine how important a page is in relation to what the user is looking for, and orders the results accordingly before showing them to you.

Now let’s look at some ways of making your site show up higher in search-engine results.


Tell the search engine about your website



The first thing you need to do is tell the search engine that your website exists. The simplest way is to find the ‘Add URL’ page on the search engine’s website and submit your website’s URL there. If you have several websites or sub-domains running under your domain, consider submitting each one separately, even if they are linked to one another.

Don’t use certain META tags



Many websites (especially the obnoxious ones) stuff their META tags with misleading information in a bid to draw higher traffic. Search-engine and spider developers are well aware of this trick and give these tags little weight, so you don’t need to rely on the description and keywords META tags.
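
For reference, these are the tags in question. If you do keep them, make sure their contents honestly describe the page; the values below are only placeholders:

  <head>
    <title>PCQuest: Get your website rated higher</title>
    <!-- description and keywords META tags; spiders give these little weight on their own -->
    <meta name="description" content="Tips on getting a website indexed and ranked well by search engines">
    <meta name="keywords" content="search engine, spider, ranking, indexing">
  </head>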

Don’t create link/banner exchange URLs and Web-rings



Good spiders, such as Google’s, recognize link- and banner-exchange URLs as link farms and simply ignore such links. The same goes for Web rings, where many websites in a similar interest category link to each other, forming a ring. So creating such links, banners and Web rings will not get the spider to notice you.


Have more pages on the website 



The more pages your website has, the higher the Web spiders rank it. A key requirement, though, is that the pages must have real content. Moreover, since a spider can only index and rank pages that it knows about, each page of your site must have at least one link pointing to it from some other page of the site.
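
For instance, a small navigation block repeated on every page guarantees that each page has at least one link pointing to it from elsewhere on the site (the file names are only placeholders):

  <!-- common navigation block, included on every page of the site -->
  <p>
    <a href="index.html">Home</a> |
    <a href="articles.html">Articles</a> |
    <a href="downloads.html">Downloads</a> |
    <a href="contact.html">Contact us</a>
  </p>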

Have inward links



Try to have as many inward links from other websites and pages as possible, since they count towards the ranking of your own page. The rank of the page carrying the link to your website also matters. Note that search engines do not count inward links from known ‘bad’ or blacklisted sites.

Have a ‘no frames’ alternative



While frames are a convenient way to present information when parts of the page change independently, spiders do not follow the content inside them. So you should be prepared with a ‘no frames’ alternative; the NOFRAMES tag lets you send such content automatically.

Another way around this is to detect the spider’s signature early and send it a specially constructed page containing the protected text. This method can also be used to let the spider index library or reference pages that would normally require the user to sign in. For example, on the PCQuest website, content from the last three published issues is protected. To have the spider index these pages, a script in the global.asa file (since this is an ASP-application website) detects the spider signature and allows access to the latest content; when you visit the same page through a Web browser, you get a login form instead. The catch is that this method requires knowing all the spider signatures in advance. A Registry of Spiders is available at www.robotstxt.org/wc/active.html.
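
To illustrate the first suggestion, a minimal ‘no frames’ alternative might look like this (the page names are placeholders); whatever you put inside NOFRAMES is what the spider, or a browser without frame support, will see:

  <frameset cols="25%,75%">
    <frame src="menu.html">
    <frame src="content.html">
    <noframes>
      <body>
        <!-- repeat the essential links and text here for spiders and frame-less browsers -->
        <p><a href="menu.html">Menu</a> | <a href="content.html">Main content</a></p>
      </body>
    </noframes>
  </frameset>

And here is a rough sketch of the spider-signature idea in classic ASP. This is not PCQuest’s actual script; the signatures and page names are merely illustrative:

  <%
    ' Compare the visitor's user-agent string against a few known spider signatures
    Dim agent
    agent = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    If InStr(agent, "googlebot") > 0 Or InStr(agent, "slurp") > 0 Then
      ' Known spider: serve the full article text so it can be indexed
      Response.Redirect "article-full.asp"
    Else
      ' Ordinary visitor: ask for a login first
      Response.Redirect "login.asp"
    End If
  %>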


Send browser invisible text



A spider also ignores anything it does not recognize as a valid link, which in effect means everything that is not inside an SRC or HREF attribute. So if your website’s navigation is built with JavaScript or Flash-based menus, the chances are that the spider will never find those links, and nothing beyond your homepage will get indexed. You must then resort to other means to get the spider to follow such links. The favorite trick is to send text that is invisible in the browser (that is, in the same color as the page background) and that carries the links and keywords the spider would otherwise miss. It is not of much use to keep these in an HTML comment, as spiders nowadays filter all comments out of the HTML before parsing it.
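
One way of doing this, in line with the suggestion above, is a block of ordinary HREF links whose text is set to the page’s background color. Assuming a white page, it might look like this (the file names are placeholders):

  <body bgcolor="#ffffff">
    <!-- the normal JavaScript or Flash menu goes here -->
    <!-- duplicate links in the background color: visitors do not notice them, but the spider follows the HREFs -->
    <p>
      <a href="reviews.html"><font color="#ffffff">Reviews</font></a>
      <a href="howtos.html"><font color="#ffffff">How-tos</font></a>
      <a href="archive.html"><font color="#ffffff">Archive</font></a>
    </p>
  </body>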

More user clicks give higher ranking



Each time your website comes up in a user’s search and gets clicked, the search engine recognizes the click and your website’s rank goes up; more user clicks mean a higher ranking. Of course, to visibly notice your page moving from the thousandth page to the first may keep you awake and online for a couple of lifetimes, since other people would be clicking on one or more of the other links as well.

To sum up, pertinent content on the Web page and what you say elsewhere about the page (the META tags, the title text and even the URL of the page) must come together to get a favorable review from your favorite search engine. The more often this happens, the better the chance that the intelligent Web spider will recognize your page and serve it as a valid result for a search.


You can visit these websites for more information:

http://webworkshop.net/pagerank.html
www.robotstxt.org/wc/active.html
www.google.com/technology/index.html

Sujay V Sarma, Developer Support .NET, EMEA Group
