SEARCH ENGINE OPTIMIZATION

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results for targeted keywords. Usually, the earlier a site is presented in the search engine results pages (SERPs), or the higher it "ranks", the more searchers will visit that site. SEO can also target different kinds of searches, including image search, local search, and industry-specific vertical search engines.

As a marketing strategy for increasing a site's relevance, SEO considers how search algorithms work and what people search for. SEO efforts may involve a site's coding, presentation, and structure, as well as fixing problems that could prevent search engine indexing programs from fully spidering a site. Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms and keyword stuffing that tend to harm search engine user experience. Search engines look for sites that employ these techniques and may remove them from their indices.

The initialism "SEO" can also refer to "search engine optimizers", a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems, URLs, and shopping carts that are easy to optimize.

How Do Search Engines Work?

The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.


Crawler-Based Search Engines

Crawler-based search engines, such as Google, create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found.

If you change your web pages, crawler-based search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.


Human-Powered Directories

A human-powered directory, such as the Open Directory, depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted.

Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.


"Hybrid Search Engines" Or Mixed Results

In the web's early days, a search engine typically presented either crawler-based results or human-powered listings. Today, it is extremely common for both types of results to be presented. Usually, a hybrid search engine will favor one type of listing over another. For example, MSN Search is more likely to present human-powered listings from LookSmart. However, it also presents crawler-based results (as provided by Inktomi), especially for more obscure queries.


The Parts of a Crawler-Based Search Engine

Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes.

Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with new information.

Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine.

Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant. You can learn more about how search engine software ranks web pages on the aptly-named How Search Engines Rank Web Pages page.
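The three parts described above can be sketched in a few lines of code. This is an illustrative toy, not any real engine's implementation: the miniature "web" below is an in-memory stand-in for real pages, the spider follows links from page to page, the index catalogs every word it finds, and the search software ranks matches by how often the query term appears.

```python
# A minimal sketch of a crawler-based search engine's three parts.
# The in-memory WEB dict (all names invented) stands in for real
# HTTP fetches so the example is self-contained.
from collections import defaultdict, deque

WEB = {  # page -> (text, outgoing links)
    "home": ("welcome to our widget store", ["products", "about"]),
    "products": ("widget widget catalog and prices", ["home"]),
    "about": ("about our widget company", ["home"]),
}

def spider(start):
    """The spider: visit a page, read it, then follow its links."""
    seen, queue, pages = set(), deque([start]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        text, links = WEB[url]
        pages[url] = text
        queue.extend(links)
    return pages

def build_index(pages):
    """The index: a catalog mapping each word to the pages containing it."""
    index = defaultdict(dict)
    for url, text in pages.items():
        for word in text.split():
            index[word][url] = index[word].get(url, 0) + 1
    return index

def search(index, term):
    """The search software: find matches and rank by term frequency."""
    hits = index.get(term, {})
    return sorted(hits, key=hits.get, reverse=True)

index = build_index(spider("home"))
print(search(index, "widget"))  # "products" ranks first: "widget" appears there twice
```

Real engines use far more ranking signals than raw term frequency, but the spider-index-software pipeline is the same shape.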


Major Search Engines: The Same, But Different

All crawler-based search engines have the basic parts described above, but there are differences in how these parts are tuned. That is why the same search on different search engines often produces different results. Some of the significant differences between the major crawler-based search engines are summarized on the Search Engine Features Page. Information on this page has been drawn from the help pages of each search engine, along with knowledge gained from articles, reviews, books, independent research, tips from others and additional information received directly from the various search engines.

Now let's look more closely at how crawler-based search engines rank the listings they gather.

Differences in Top Search Engine Ranking Criteria

The most recent Internet size estimate is 1 billion pages - and growing. 85-90% of all Internet users rely on search engines to locate sites, but only 7% of them look past the first three pages of search results. Those top slots are valuable and competition for them is intense.

Let's look at several ways to move your site to the top.

Content, Content, Content

You've heard it before, but we can't stress it enough: the three most important factors in your search engine rank are: content, content, content.

Good content is critical to a good search engine score because many elements of search engine algorithms rely on page content to score Web sites. It also increases the probability that Yahoo or other popular directories will list your site.

Good content is vital. It's fundamental to every legitimate search engine strategy.

Link Popularity

Common optimization techniques (TITLE tags, META tags, and keyword frequency) are important because most search engines rely on them to score pages. Automated tools can help you simplify this task and leave you free to focus on another increasingly popular ranking strategy: link popularity (the total number of Web sites that link to yours). This technique requires no additional coding - just old-fashioned networking. Content is still critical since your site must contain valuable information that other sites want to share with their visitors.

Search engines determine your link popularity score by counting the number of outside links to your site (your internal page links don't count). Some use more complex algorithms that consider link importance - they rank the importance of the links and calculate a weighted link popularity score. Sites linked to "important" sites are more likely to be ranked higher. For instance, if Web Developer's Journal were to link to your site, that link could be worth more than 20 links from your friends' personal Web pages. In fact, it may be worth more since some search engines refuse to include links to free sites (like Geocities homepages), because spammers can use them to set up bogus links.
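The weighted idea can be sketched numerically. The principle (the one behind Google's PageRank) is that each page passes its own score along its outgoing links, so a single link from a page that is itself heavily linked-to can outweigh several links from obscure pages. The link graph and site names below are invented for illustration:

```python
# Sketch of weighted link popularity. Three sites cite "journal",
# making it important, so its single link to "yoursite" ends up
# outweighing the two links pointing at "friendsite".
LINKS = {  # page -> pages it links to (illustrative graph)
    "fan1": ["journal"], "fan2": ["journal"], "fan3": ["journal"],
    "journal": ["yoursite"],
    "friend1": ["friendsite"], "friend2": ["friendsite"],
    "yoursite": [], "friendsite": [],
}

def link_popularity(links, iterations=20, damping=0.85):
    """Repeatedly pass each page's score along its outgoing links."""
    n = len(links)
    score = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new = {page: (1 - damping) / n for page in links}
        for page, outs in links.items():
            for target in outs:
                new[target] += damping * score[page] / len(outs)
        score = new
    return score

scores = link_popularity(LINKS)
print(scores["yoursite"] > scores["friendsite"])  # True: one good link beats two weak ones
```

A plain link count would rank "friendsite" (two links) above "yoursite" (one link); weighting by the linker's own importance reverses that.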

Many search engines are giving link popularity greater weight in their algorithms because they believe it indicates quality. After all, other sites are most likely to link to a site that displays good content, design, and usability. Google relies heavily on link popularity to rank sites. Other search engines factor it into their algorithms.

Look at how some of the largest search engines use link popularity:

· AltaVista - Uses link analysis and ranks sites based on "good" link popularity. Tends to ignore links generated through "link exchange" programs.

· Excite - Uses link popularity and quality data to determine relevancy.

· Inktomi - Link popularity is one ranking criterion.

· GO - Link popularity is one ranking criterion.

· Google - Uses weighted link popularity and analyzes link content almost exclusively to determine site rankings. Recently partnered with Yahoo - the largest directory.

· Infoseek - Link popularity is considered in the new retrieval algorithm.

Site rankings based on link popularity impose huge penalties on new sites that haven't accumulated many links. This is where schmoozing counts. When you contact webmasters, offer to link to their site in return for a link and remind them how important link popularity can be to their overall ranking. While you're building links, remember to pay close attention to your HTML tags, keywords, and content. Until you have a large number of "good" links, those basic techniques are your best bet to improve your ranking.

Avoid Spam and HTML "Tricks"

As part of their continuing battle against spammers, many search engines have tightened their site eligibility policies. AltaVista recently instituted one of the most restrictive in the industry, banning sites for one or more of the following reasons:

· Using a hosting service that also hosts adult sites or documented spammers.

· Improper use of Gateway pages - also called Doorway or Jump pages.

· Submitting the same URL repeatedly or a large number of URLs from the same site.

· Excessive keyword repetition.

· Inserting keywords unrelated to the page's content.

· Hidden text.

· META refresh commands set to less than 30 seconds.

The first two items may surprise you. Most beginning webmasters look for a Web host based on cost first, then speed and reliability, when their provider's policy on adult sites may be just as important. AltaVista sometimes retaliates against adult sites' spam techniques by blocking those sites' underlying IP addresses entirely (as does GO.com). If you host with the same provider, your site may share that banned IP address. Choose your host carefully: you are judged by the company you keep!

AltaVista also seems to be taking a hard line against gateway pages - which they define as spam if the pages contain little or no real content. This is controversial; AltaVista dropped some sites that thought they were using them legitimately. Actually, the legitimacy of gateway pages has long been an issue with search engines. Many engines tolerated - but did not encourage - the pages. Now that AltaVista has become openly hostile, avoid submitting gateway pages to them and monitor the other search engines' policies closely.

The balance of AltaVista's criteria is common throughout the industry. Some search engine algorithms are so strict that you can be penalized for innocent design mistakes such as inadvertently using hidden text. Learn the pitfalls before you submit. Several online tools will scan your Web pages and warn you about possible violations. Net Mechanic will analyze a page for free at http://www.netmechanic.com/powerpack/optimize.htm.
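As a rough illustration of what such scanners check (a sketch, not Net Mechanic's actual method), a short script can flag two of the violations listed above: a META refresh set to less than 30 seconds, and font text whose color matches the page background (hidden text). Real scanners parse HTML properly; this uses simple regular expressions on a made-up page.

```python
# Flag two common spam/"trick" violations in a page's HTML source:
# fast META refresh and text colored to match the background.
import re

def scan_page(html):
    warnings = []
    # META refresh with a delay under 30 seconds
    refresh = re.search(r'http-equiv="refresh"\s+content="(\d+)', html, re.I)
    if refresh and int(refresh.group(1)) < 30:
        warnings.append("META refresh under 30 seconds")
    # Font color identical to the BODY background color
    body = re.search(r'<body[^>]*bgcolor="(#\w+)"', html, re.I)
    if body and re.search(r'<font[^>]*color="%s"' % re.escape(body.group(1)), html, re.I):
        warnings.append("possible hidden text (font color matches background)")
    return warnings

page = '''<html><head>
<meta http-equiv="refresh" content="5;url=http://example.com/real-page">
</head><body bgcolor="#ffffff">
<font color="#ffffff">widgets widgets widgets</font>
</body></html>'''
print(scan_page(page))  # flags both violations
```

Running a check like this before you submit is far cheaper than being dropped from an index afterward.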

Be careful. Tricks may boost your site temporarily - then get it banned permanently.

Ask the Experts

You spend months tuning your Web site to achieve high rankings, and then have it drop. Or no matter what you try, your site never gets a good ranking. Do you know why? If you don't have the time or expertise to ferret out the reasons yourself, consider paying for expert advice.

Thousands of consultants are eager to advise you about all aspects of the Internet. You can even hire a consultant to advise you on which consultant to hire! Expect to pay a consulting firm anywhere from $35 per page to $10,000+ for complete site analysis and optimization of large sites. The quality of advice varies: carefully investigate the company's background and methods before committing.

· Look closely at their own Web site: is it professional and appealing?

· What exactly do they guarantee to do for you? Be skeptical of services that "guarantee your site a Top 10 listing!"

· What methods do they use? You certainly don't want to hire a consultant who uses techniques that can get you banned.

· Will they provide references?

Good consultants supply focused, personalized service, but many businesses can't afford the expense. Webmasters for smaller sites often find less expensive automated search engine tools to be an efficient way to tune their Web sites. This requires a more do-it-yourself approach. While a consultant might personally optimize your page code and content, most automated tools require you to make the changes yourself.

You can purchase or subscribe to tools that provide a full suite of search engine and page optimization services. The tools are simple to use and give you advice that is easy to understand and implement on your own. Some companies sell software packages for your PC that analyze your pages and monitor your search engine rankings, while others offer similar tools online. Net Mechanic's Search Engine Power Pack is an online tool that tracks your site's ranking and provides keyword assistance to improve your position.

Expert advice can come at a high price, but it doesn't have to. Get the best value for your money by carefully researching your options and evaluating your requirements. If you need immediate, reliable advice, a well-designed online tool may be your most cost-effective investment.

Constant Monitoring Is Crucial

Search engine ranking strategy is an ongoing process that begins during design and never stops. You may spend more time tweaking your site than you spent designing it! At a minimum, you must:

· Monitor your site's rankings by keyword on a weekly basis.

· Experiment with keyword and page content modifications.

· Be alert to changes in search engine policies and requirements. Today's legal design technique may be spam tomorrow if search engine policies change.
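The weekly monitoring step above can be sketched as a simple script. The ranking history here is made-up sample data; a real tool would query each search engine for each keyword every week and append the new positions.

```python
# Record each keyword's weekly position (1 = top result) and flag
# any keyword whose rank dropped since last week.
def rank_changes(history):
    """Return (keyword, last_week, this_week) for every rank drop."""
    alerts = []
    for keyword, positions in history.items():
        if len(positions) >= 2 and positions[-1] > positions[-2]:
            alerts.append((keyword, positions[-2], positions[-1]))
    return alerts

history = {  # illustrative data only
    "widgets": [4, 3, 9],          # dropped from 3rd to 9th this week
    "blue widgets": [12, 11, 10],  # still improving
}
print(rank_changes(history))  # [('widgets', 3, 9)]
```

A sudden drop like the one flagged here is your cue to check for policy changes or algorithm updates before experimenting with content changes.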

If you depend on search engines to deliver traffic to your Web site, then a high search engine ranking is critical to your success. Think you can't afford to spend the time and effort it takes to get there? You can't afford not to.

Why Are Search Engines Important?

Important Search Engines

Search directories are a hugely important part of the web. It is truly impressive what you can turn up by searching for keywords on just about any subject. There are really two types of web index: the search spiders or worms, which crawl over the web tracing and indexing URLs they find starting from those submitted, and the structured indices, where each URL submitted is checked and classified.

Despite constant updating, all of these directories index only a small fraction of the whole web, which is growing at a phenomenal rate. The spiders seem to have a hard time just keeping up with the URLs which are submitted to them. I used to put a lot of effort into submitting the URLs of my pages to these directories and following how they were taken up. It always takes a few weeks for them to come round, and even after a few months I had only managed to get about half my pages indexed on any of them. Then one weekend almost all my pages disappeared from some of the larger directories. I suppose that the CompuServe web page server must have been down when they visited, so the URLs got wiped off.

Maintaining these directories must require a huge effort, so those that work deserve applause. When the web was new, several of these services popped up. Now they have almost all either gone commercial or fallen into disuse. The best of what remains are listed here. Perhaps market forces will be sufficient to encourage the invention of a technique to make these directories more complete.