The Truth About Search Engines

Much incorrect information is disseminated about search engine positioning. Whether the information is intentionally misleading or just out of date, you need to know the facts in order to make intelligent decisions on how to spend your positioning dollars and effort.

Metatags Aren’t The Magic Answer

The popular myth is: all you need to score well in search engines is to find the ‘right’ metatags. This is just not true. The search engines are all different, and optimizing for good placement at each of the three most important ones is a formidable task.

However, you do need good keyword phrases, because those phrases are weighed within the context of your page titles and text to raise or lower your rankings.

Relevancy and Rankings

When someone uses a search engine to find Web pages, the order of the pages returned is determined by a relevancy ranking assigned by the search engine.

How do search engines determine relevancy? They use algorithms/formulas which generally assume that pages where the keywords appear in the title of a page are the most relevant to the search requested. Consequently, you should spend time crafting your page titles.

Each page in your site should have a title that makes use of the most important phrases you would like to rank for at the top of the search engines. With real estate sites, for example, this is generally “your city realestate” or “your city real estate”. But it might also be “your city condos”, “your city listings”, “your city golf course homes”, “your city schools”, etc., depending on your marketing niche.
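For example, a title built around these phrases might look like the following (the city name and phrases here are placeholders; substitute your own market and niche):

```html
<title>Springfield Real Estate - Springfield Homes, Condos and Listings</title>
```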

In addition to title relevancy, search engines also check to see if your keyword phrases appear near the top of the page, in the headline, and in the first few paragraphs of text. They assume that the page most relevant to the topic will mention those words right from the beginning.
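So a page targeting the phrase “Springfield real estate” (a placeholder city) might open like this, putting the keyword phrase in the headline and the first sentence of text:

```html
<h1>Springfield Real Estate</h1>
<p>Searching for Springfield real estate? Browse current Springfield
listings, condos and golf course homes below.</p>
```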

How often keywords appear on a page is a major factor in how search engines determine relevancy. A search engine will analyze how frequently keywords appear in relation to the other words on a Web page. Pages with a higher keyword frequency are often considered more relevant than other Web pages.
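The kind of frequency calculation described above can be sketched roughly as follows. The exact formulas each engine uses are proprietary and far more involved; this is only an illustration of the basic idea:

```python
def keyword_density(page_text, keyword):
    """Fraction of words on the page matching the keyword (illustrative only)."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A hypothetical snippet of page text: 2 of its 9 words are "Springfield".
sample = "Springfield real estate listings for Springfield homes and condos"
print(round(keyword_density(sample, "Springfield"), 2))  # prints 0.22
```

A real engine would also weight where on the page the words appear, not just how often.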

Things NOT To Do

Search engines may penalize pages or exclude pages (or entire sites), if they detect search engine spamming or keyword stuffing. Keyword stuffing worked way back in the dark ages. It no longer works, so don’t waste your time or listen to anyone who recommends you do it.

What Is Keyword Stuffing?

  • Repeating the same keyword(s) over and over in metatags
  • Sticking keywords in “comments” visible only in the source code of the page
  • Adding invisible text that repeats the same keywords over and over, usually at the bottom of the page – e.g., white text on a white background (invisible in browsers) but visible to robots and to any human who looks at the source code
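In markup, the techniques above look something like this – shown only so you can recognize them, not so you can use them:

```html
<meta name="keywords" content="real estate, real estate, real estate, real estate">
<!-- real estate real estate real estate real estate -->
<font color="#FFFFFF" size="1">real estate real estate real estate real estate</font>
```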

Search engines watch for common spamming methods in a variety of ways. They sometimes even go so far as to follow up on complaints from your competitors!

Human Visitors vs. Robots – The Conundrum

Site owners want their Web sites to be eye-catching for site visitors. Assuming you and your competitors all have good content, a distinctive look or a touch of pizzazz is what differentiates one site from another. The great search engine conundrum is human visitors vs. robots.

While a few search engines (and all directories!) use people to examine your site for inclusion, most search engines send out spiders or robots to collect material from your site in order to index it for the search engine. Unlike people, these robots can easily navigate only simple, straightforward site construction. They don’t like frames, JavaScript navigation, Flash, extra-small font sizes, and other design elements meant for humans. Just about anything that makes your site “spiffy” is anathema to spiders and robots. Pages designed to be pleasing to people are frequently not completely acceptable to spiders or robots.

A good site optimization specialist knows what to do to counteract the conundrum. The services of these folks are not inexpensive. They are paid for their very specialized knowledge and all the research time it takes to stay on top of the frequent changes in search engine ‘rules’.

Special Doorway Pages

Creating a page that is maximized for robots can result in a page that’s boring, boring, boring for people.

Some ill-advised search engine optimization techniques include sophisticated and expensive technology based on CGI scripting or PHP programming to “serve” pages to spiders and robots that are different from the pages they serve to browsers (site visitors). This is called cloaking or IP delivery. If you encounter a highly ranked competitor site with no apparent reason for the high ranking (no metatags, bare-bones titles, etc.), the site is probably cloaking. While cloaking can be effective in achieving top placement, many search engines will ban any URL that is discovered practicing this method.
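The sort of server-side logic a cloaking script uses can be sketched as below. The spider names are a hypothetical list, and this is shown only so you can recognize the technique when examining a competitor – deploying it risks getting your site banned:

```python
# Illustration only: how a cloaking script decides which page to serve.
# Do NOT deploy this; search engines ban sites caught doing it.
SPIDER_AGENTS = ("googlebot", "slurp", "scooter")  # hypothetical spider list

def page_for(user_agent):
    """Return a keyword-optimized page to robots, the normal page to people."""
    if any(bot in user_agent.lower() for bot in SPIDER_AGENTS):
        return "keyword-optimized.html"
    return "index.html"

print(page_for("Mozilla/4.0 (compatible; Googlebot/2.1)"))  # keyword-optimized.html
print(page_for("Mozilla/4.0 (compatible; MSIE 5.0)"))       # index.html
```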

Cloaking sometimes depends on installing a meta refresh tag that automatically redirects a human visitor to another page. Some search engines strongly object to redirects and will ban sites that use them as a tool to fool spiders.
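The tag in question looks like this – a zero-second refresh that instantly forwards the visitor to another page (the destination URL is a placeholder):

```html
<meta http-equiv="refresh" content="0; url=http://www.example.com/real-page.html">
```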

Sites That Seem to Violate All the Rules

When you start examining your competition to see why they get good rankings, you will find sites that violate all the guidelines we’ve stated. Some sites that use old techniques – keyword stuffing, for example – are simply grandfathered in (accepted before the current rules went into effect) and have not yet been purged from the search engine. Some search engines almost never purge; others do it infrequently. The moral of the story: should you choose to emulate techniques that look like they are working despite what we have told you here, make sure you are looking at a recent submission. Emulating an old submission under new rules will not get you accepted and may even get you banned.

Sometimes, the page you see in your browser is a bait and switch – a different page was originally submitted to the search engine but swapped out once the site was accepted for inclusion.

Just How Important Are The Search Engines?

We’d never want to say search engines are unimportant, but many site owners are obsessed with the rankings issue when they really should be spending time and money to promote their sites in other ways.

In general, you should expect to get anywhere from 15 to 50 percent of your traffic from search engines. That’s not a scientific number — it’s just based on the opinion of people in the Web marketing industry.

If you get very little traffic from search engines, you should hire a search engine positioning/optimization specialist to improve your rankings. Don’t overlook methods besides search engine positioning to get traffic to your site! Especially if you need instant results, traditional search engine submission is not the way to get them. The exception that will get you instant results in search engines is pay per click programs (PPC) that some search engines and directories have implemented. If you pay and are accepted, you CAN get instant results.