Website Promotion  

Tutorial for Search Engines and Directories 

 
 

 
 

Search engines and directories can be one of the most effective ways of attracting interested visitors to your website.  Learning this process can be a challenge because of the misinformation that has been perpetuated over time.  Search engine and directory systems are constantly changing, and optimization techniques that worked five or ten years ago may no longer be valid.  This document has been created by search marketing experts with the most current information available and presents a proven approach to promoting your website effectively through search engines and directories.

 

After you complete this tutorial, you should be able to:

1)  get your website listed (indexed) with search engines and directories;
2)  make your website more crawler-friendly in support of the natural-crawl search engines;
3)  promote your website listings within the search engine results pages (SERPs);
4)  prevent problems that could cause your site to be delisted or banned from a search engine; and
5)  receive more targeted visitors from search engine referrals.

 
 
 

 
 

Page 1    

 

Version 1.3 

Copyright 2002-2007 Position Technologies, Inc. 


Index

Introduction to search engines
    Search Engines, Directories and PPC Engines
        Introduction
        Four steps toward search marketing success
        Formulate a long term strategy
Optimizing your website
    Search engine optimization
        Introduction to search engine optimization
        Introduction to search engine spiders
        Meet the spiders
        Keep your website online
        Validate HTML
        Stay away from “alternative-HTML”
    Links and navigation
        Introduction to links and navigation
        Use hypertext that can be followed by search engines
        Image maps or navigation buttons
        Flash, Macromedia and Java Applets
        Javascript and DHTML
        Non-HTML formats (PDF, MS Word etc)
    Text Content and Meta data
        Introduction to Text Content and the Meta data
        Introduction to keyword research
        Adding keywords to your website
        Focus on a few keywords for each Web page
        Important places to use your keywords
        Introduction to page titles
        How to write good page titles
        Introduction to META data
        META data containers that are important to use
        Adding META data to your Web pages
        Writing good META descriptions
        Writing good META keywords
        Will META keywords give me top rankings?
    Popularity measurement
        Introduction to Popularity
        Link popularity
        Internal link popularity
    Common problems
        Introduction to common problems
        Frames
        Dynamic websites and Web pages
        Flash and Macromedia Director
        Java Applets and other client side applications
        IP-delivery, agent-delivery and personalization
        Cloaking
        Cookies
Submit and index pages
    Getting indexed
        Introduction
    Submitting to search engines
        Introduction to submission
        How to submit to search engines
        Do not over-submit
        Paid Inclusion (PFI) programs
    Excluding pages from getting indexed
        Introduction to exclusion of Web pages
        Robots.txt
        META robots
FAQ
    Can I pay the search engines to improve my rankings?
    Why do my Web pages not rank well?
    I have paid for inclusion, why do I not rank better?
    How deep do search engines browse my website?
    What do the search engines consider spam?
    Can search engines index my dynamic website?
    Can you tell me how the search engine algorithms work?
    The number of daily visitors from search engines has dropped – why?
    Help! My site got dumped – what do I do?

 

 
 


Introduction to search engines 

Search Engines, Directories and PPC Engines 

Introduction 
Search Engines are usually categorized in the following groups: 
 

1.  Search engines (crawler-based) 
2. Directories 
3.  PPC engines (pay-per-click) 

 
Search engines like Yahoo use sophisticated computer programs – called “spiders” – to browse the Web, analyze Web pages and programmatically index the content into large searchable databases.
 
Search engines provide a database of website pages that are ranked by popularity and relevance for particular keywords or phrases, and search results are displayed as a list in that order.  This ranking is based on mathematical equations that use many factors to determine a Web page's popularity and relevance.  By understanding these factors you can maximize the traffic and success of your website.

 

Directories are human-powered hierarchical structures of links to websites.  The most popular directories are Yahoo! [http://dir.yahoo.com/] and ODP [http://dmoz.org/].

 
PPC engines are special purpose search engines that offer an auction-style advertising 
program.  Advertisers bid on keywords (or key phrases) for which their Web pages will be found 
during a search.  The term Pay-Per-Click (PPC) refers to the cost of delivering one visitor to your 
website, known as a “click-through”.   Typically, the highest bidder gets the top result, the next 
highest the second listing and so on.   
 
In this tutorial we will focus on the natural-crawl search engines and how to optimize your website to gain visibility in them and attract more relevant, interested visitors.

Four steps toward search marketing success 

This tutorial will guide you through a four-step process for effective search engine optimization: 

 
1.  Optimize your website
2.  Ensure your pages are indexed
3.  Build popularity
4.  Monitor and analyze results

 
The Frequently Asked Questions (FAQ) section at the end of this tutorial addresses some of the issues most often encountered by experienced webmasters. 
 

Formulate a long term strategy 
Search engine optimization (SEO) is a journey: it requires a long-term, holistic view rather than a quick fix to your search marketing needs.
 
There are many campaign strategies that can be utilized to increase the number of targeted visitors to your website.  Some popular tactics include:
 


•  optimize your website for ranking in the natural listings of search engines by increasing your website's relevance through the development of unique, high-quality content;
•  increase your website's popularity by establishing quality in-links (links to your website);
•  pay for inclusion in natural listings using paid inclusion programs;
•  leverage search listings that can be purchased through auctions within search engines supporting PPC programs; and
•  subscribe to free and paid directories with powerful advertising copy.

 
This tutorial will focus specifically on search engine optimization (SEO).  However, you are 
encouraged to investigate other viable methods of getting qualified targeted visitors from search 
engines.  
 
In developing your strategy, focus on the methods most likely to generate the best results given the budget you have set in terms of money, time and resources.  Each method will vary in the quality of visitors it delivers, and there is no way to predict the exact solution that is best for your business.  It is strongly recommended that you test each method carefully before committing your entire budget (potentially to the wrong choice).  Take it step by step and see what works on a limited budget before increasing your spend to its maximum.
 
When optimizing your site for the search engines, it is important to remember that the search engine itself will never buy anything you are selling.  Creating a site that is focused on usability and accessibility will greatly increase the chances of your visitors converting to customers.  A navigational structure that allows users to find exactly what they are looking for from any page of the site will ensure a positive user experience.  Help visitors make choices by prompting them to perform the action you are hoping for with calls-to-action and benefit statements.  Calls-to-action are text or hypertext links that guide a visitor to “click here,” “buy x” or “learn about x” and help them understand what choice to make.  By stating the benefit of the choice – for example “free shipping” or “30 day guarantee” – the likelihood that they will click through to the appropriate page is even higher.  This whole process is called the path to “conversion” and should be carefully measured and highly valued.  A path to conversion is the route a visitor follows from the moment they enter the website until they perform the action desired by the website owner – that is, until they “convert.”
 
A general suggestion is to focus primarily on building a usable and professional site while always heeding the optimization suggestions in this tutorial.  You will find that many of the features that characterize a usable website (e.g. well-thought-out navigation) are also key to optimizing your site for the engines.  Most engines provide a “guide” to follow when building your site, and those suggestions are included in this tutorial.
 
Be realistic about how much work you are able to do in-house and how much of it you need to 
outsource.  Also, be sure to select methods that match your goals and are within your budget.  
 

Optimizing your website 

Search engine optimization  

Introduction to search engine optimization 
Every search engine has a proprietary method of determining the ranking of documents in search 
results. The ranking is the resulting ordered list that appears when a user enters a word or phrase 
and submits.  These words and phrases are referred to as “keywords” and “key phrases.”  An 
algorithm is (in this case) a complex set of computer-based rules meant to understand which Web 
pages on the internet are most relevant to a particular user, based on a given query.  For obvious 
reasons, search engines do not publicly reveal the detailed logic behind their algorithms.  


 
There are a lot of similarities in the way the search engine algorithms function, even between 
competitors.  This tutorial will give you an introduction to the most fundamental parts, which 
should be enough to get you started on the right track.   
 
We will take you through the three most important elements of search engine optimization: 
 

•  links and navigation;
•  text and HTML elements (including Meta data); and
•  popularity measurement.

Introduction to search engine spiders 
Search engine web spiders (also known as web crawlers or web robots) are special purpose computer applications used to gather up-to-date information from the web.  Think of a search engine spider as a simple browser (e.g. Firefox) powered by a computer program.  It traverses the Internet like a human would with a browser – just many times faster and fully automated.  Web spiders follow navigational links and gather all of the readable information they encounter to be referenced in the search engine index.  It is critical to understand that web spiders have some limitations in what they can process: they readily process textual data, but can handle graphical data (GIF, JPEG, MPEG files, etc.) only on a limited basis.
 
Common text elements used by the spiders include page titles, Meta data and ALT attributes, as well as the on-page content.
 
TIP: 
It is possible to monitor the activity of the search engine spiders on your website.  Most HTTP 
server access logs include the IP and User-Agent for all requests.  You can use this information to 
determine whether a particular spider has accessed your site and understand what pages the 
spider accessed and in what order.  For more information about the particular spiders and their 
User-Agents please follow the links provided in the following section. 
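As a concrete sketch of this tip, the short script below scans Apache-style “combined” access log lines for requests whose User-Agent contains a well-known spider name.  The log lines and the list of agent substrings here are illustrative examples, not an exhaustive or authoritative list:

```python
import re

# Substrings that identify some well-known search engine spiders.
# This list is illustrative, not exhaustive.
SPIDER_AGENTS = ["Googlebot", "Slurp", "msnbot", "Teoma"]

# Minimal pattern for an Apache "combined" log line:
# IP, timestamp, request line, status, size, referer, user-agent.
LOG_PATTERN = re.compile(
    r'^(\S+) \S+ \S+ \[.*?\] "(?:GET|POST|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def spider_hits(log_lines):
    """Return (ip, path, agent) for each request made by a known spider."""
    hits = []
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue
        ip, path, agent = match.groups()
        if any(s in agent for s in SPIDER_AGENTS):
            hits.append((ip, path, agent))
    return hits

# Example log lines (fabricated for illustration):
sample = [
    '66.249.0.1 - - [10/Jan/2007:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [10/Jan/2007:10:00:05 +0000] "GET /about.html HTTP/1.1" 200 1024 "-" "Mozilla/4.0 (compatible; MSIE 6.0)"',
]

for ip, path, agent in spider_hits(sample):
    print(path, agent)
```

Because the hits come back in request order, you can also use them to see which pages a spider crawled first.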

Keep your website online 
It is imperative that your site is accessible 24 hours a day, 7 days a week, 365 days a year. 
 
To ensure that their Indexes are up-to-date, search engines visit billions of Web pages each 
month to capture new information.  The search spiders usually only have time to request each 
Web page once.  If your web server is not accessible at the time the search engine spider visits 
your website, your page(s) will not be included in the database and/or updates to your website will 
not be captured and subsequently will not be indexed. 
 
If this happens to you, do not panic! 
 
Most search engine web spiders crawl the web on a regular basis.  If some of your Web pages 
have been dropped because your server was down, they will probably get back into the index the 
next time the search engine visits your website. 
 
TIP:  
There are several online services and software packages available that can monitor your website 
and report any downtime.  If your site experiences drop-outs in search engines it might be a good 
idea to use one of these services to make sure your web server does not have too many 
prolonged drop-outs.   


Validate HTML 
HTML code is the most fundamental part of every Web page on the Internet.  When displaying 
your Web page with a browser, you typically do not see the actual HTML code but only the final 
result – how the Web page appears.  Most of the popular web browsers have a feature that will 
display the actual HTML source that is “behind” the visible Web page. 
 
Most browsers are good at interpreting HTML code, even code that may contain syntax errors.  
Even with fatal syntax errors some browsers will still show a nice looking page.  However, you 
never know how a search engine spider will interpret the same errors.  It may or may not index a 
Web page with syntax errors.  It may just skip the part that uses a wrong syntax or it may skip the 
entire page.  There is no way to predict what the web spider will do when it encounters HTML that 
it doesn’t understand.  
 
Although you may want a site that is visually attractive, keep in mind that search engine spiders are unable to see.  Spiders will not appreciate fancy graphics or the use of certain techniques, like frames.  Spiders do not translate the HTML code into a visual presentation like browsers do – they simply process and store textual content from the raw HTML code.
 
If search engines cannot decode your website, follow its links and process the text, you will not 
get rankings.  It’s not enough to verify your Web pages by looking at them in your browser – the 
HTML syntax needs to be correct.  Therefore, it is very important to validate your HTML with an 
HTML validator.   
 
TIP: 
There are many commercial products available for HTML editing and validation.  Use them religiously!  If you do not have access to any of these validators, we strongly recommend the free validators at [http://www.htmlvalidator.com/csehtmlvalidator.php] or [http://validator.w3.org/].
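For reference, here is a minimal, syntactically valid page skeleton that a validator will accept (the HTML 4.01 doctype matches the era of this tutorial; the title text is a made-up example):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title>Example Page Title</title>
</head>
<body>
<p>Page content goes here.</p>
</body>
</html>
```

Every tag that is opened is closed, which is exactly the kind of predictable structure a spider can process without guessing.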

 

Stay away from “alternative-HTML” 
Some search engine optimization “experts” will advise you to use non-standard HTML to squeeze extra keywords into hidden places in your HTML code.  Do not do that!  Search engines penalize websites that use hidden keywords, either by de-ranking them or by removing them completely from the index.  Always stick to valid HTML.

Links and navigation 

Introduction to links and navigation 
In this section we will look at the most important aspects of search engine friendly navigation and 
how to build links that will be followed by search engines spiders. 
 
By having a good navigation structure you will make it easier for the search engines to browse your website – and easier for your human visitors too.
 
If it is easy for users to navigate your site they are more likely to become customers.  If it is easy 
for search engines to follow your links it is more likely that they will index many of your website 
pages.  
 
The more of your website pages that are indexed by the search engines, the more likely it is that 
one of your Web pages will be found in a search that can lead to a targeted visit to your website. 
 


Use hypertext that can be followed by search engines  
Search engine spiders will not follow every kind of link that web developers use.  In fact, the only kind of link you can be certain all search engine spiders will follow is the standard hypertext link.
 
In HTML code, a standard hypertext link is written like this:

<a href="http://www.domain.com">Link Text</a>

On a Web page, the link will be displayed as the clickable text “Link Text”.

 
All other forms of links can be problematic for some search engines.  You can find more about 
what type of links the particular engines can interpret by visiting the URLs listed under the ‘Meet 
the Spiders’ section. 
 
TIP: 
If you are using a what-you-see-is-what-you-get (WYSIWYG) editor, or content management 
system, it will usually output the correct HTML code for you when you insert a text link.  You can 
verify that it is a valid text link by highlighting the link text in your browser.  If you can highlight it 
letter by letter like most other text on a Web page then the HTML code should be safe. 

Image maps or navigation buttons 
Sometimes designers want to use graphics for navigational links.  There are basically two ways of 
creating such graphical links:   
 

•  Navigation buttons
•  Image maps

 
Navigation buttons are single images that each link to a specific Web page.  Most search engine spiders will readily follow this type of link.  However, if you want to be absolutely sure that all search engines can properly interpret the image's meaning, utilize the ALT attribute as an added measure.
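Such a button might look like the following (the file names are hypothetical); the ALT text tells spiders – and screen readers – what the image means:

```html
<a href="products.html"><img src="btn-products.gif"
   alt="Our Products" width="120" height="30" border="0"></a>
```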
 
Image maps allow different areas of one image to link to different Web pages.  That is a very common way to build top- or side-navigation bars, or clickable country maps.  Providing the same links as hypertext is also good for visitors who do not use a graphics-capable browser, who surf with graphics turned off to save bandwidth, or who simply do not realize you want them to click on the image.  Visitors who are blind or otherwise disabled can use hypertext links with a reader that “speaks” the text on the page and helps them navigate the web.  Image maps add complexity that may confuse the search engine spiders.
 

Flash, Macromedia and Java Applets 
Flash and Java Applets both require the user to have a special program installed on their 
computer to be able to open and play the files.  Most new browsers come with the necessary 
programs installed to play the most common file types.  However, search engines usually do not 
open such files.  
 
It is recommended that you build your navigation links without Flash or Java.  If you choose to 
build your Web pages using these techniques, you should only serve the high-tech version of your 
website to users that have the right plug-ins installed and have a low-tech version ready for all 
others. The low-tech version should include regular text-based links. This will work well with the 
search engines as well as the users that do not have these plug-ins installed. Javascript can be 
used to help browsers detect if a plugin is needed to display these parts of the site. 
 


Javascript and DHTML 
Javascript is widely used to perform simple client-side operations (such as adding an item to a shopping cart).  The script is loaded into the user's browser along with the HTML and can then be executed without having to contact the server again.  This works well, especially if you utilize shopping carts and dynamic navigation schemes.  Be aware, however, that search engine spiders do not read Javascript! 
 
It is very important that your main navigation is NOT pure Javascript.  If you choose to use Javascript for your main navigation, be sure that the same links are also available as standard text links, in a sitemap or a urllist.txt file. 
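One common pattern, sketched below with hypothetical file and page names, is to keep the Javascript menu but repeat the same destinations as plain text links that spiders can follow:

```html
<!-- Fancy Javascript drop-down menu (invisible to spiders) -->
<script type="text/javascript" src="menu.js"></script>

<!-- ...page content... -->

<!-- The same destinations as plain hypertext in the page footer;
     spiders and browsers without Javascript can follow these -->
<p>
  <a href="index.html">Home</a> |
  <a href="products.html">Products</a> |
  <a href="contact.html">Contact</a>
</p>
```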
 

Non-HTML formats (PDF, MS Word etc) 
Even though some search engines have begun to read and index non-HTML file formats such as 
Microsoft Word, Microsoft Excel and Adobe Portable Document Format (PDF) files, none of the 
search engines will follow links within these formats.  If you want search engines to follow links on 
your website they should always be placed in your regular Web pages – the HTML documents. 
 
It is highly recommended that you NOT place important content in these formats alone; it will not be found by the search engines.  The content should also be accessible in HTML format – as a regular Web page.  Some search engines will index non-HTML formats; however, many encounter errors indexing them, and in most cases it is easier to get ordinary HTML pages to rank well.
 
TIP: 
Sitemaps and similar files can be created to help search engines navigate your website easily.  A sitemap is a page consisting of hypertext links to each of your website's pages.  A urllist.txt file is similar in nature but comprised solely of URLs listed one per line.
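For example, a urllist.txt for a small site (the domain and page names are hypothetical) is nothing more than:

```
http://www.example.com/
http://www.example.com/products.html
http://www.example.com/about.html
http://www.example.com/contact.html
```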

Text Content and Meta data 

Introduction to Text Content and the Meta data  
If you want good rankings in search engines, then you need quality content. Content is King!  The 
more quality content you have the higher your chance of being found. 
 
One important element of HTML, the TITLE container, is given special ranking privilege.  The text written in the TITLE container of a page appears in its search results (as the link text), in bookmarks, on taskbar buttons and in the browser's title bar when viewing the page.
 
Text in other HTML containers is not as easily seen.  Usually, Meta data can only be found by viewing the page source code.  Meta data is not typically very helpful for ranking in the major search engines, but it is useful nonetheless and should be added.  The two important containers to have for each page are the Meta Keywords and the Meta Description.
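Placed in the HEAD of a page, the two containers look like this (the title, description and keyword text are made-up examples; each page should get its own unique values):

```html
<head>
<title>Discount Caribbean Vacation Packages</title>
<meta name="description"
      content="Discount vacation packages to the Caribbean.
               Compare prices on flights and hotels.">
<meta name="keywords"
      content="caribbean vacation, discount travel, vacation packages">
</head>
```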

 

ALT attributes of your images are hidden from view until you hover the cursor over an image; the text appears in a small box after a second or two.  Be sure the text in an image's ALT attribute actually describes that image.  Misleading ALT text is particularly troublesome because visually impaired visitors have few other ways to know what is going on.


Introduction to keyword research 
Before you start optimizing your website you need to know what keywords your target audience is 
using to find your products and/or services.  Keywords are the words and phrases that people 
might use to find your products, brands, services or information, via search engines. 
 
For example, if your business is in the travel industry, text such as: “travel”, “vacation”, and 
“holiday” would be considered important keywords.  This is just a small sample; deep research 
into the topic of “travel” would probably show more than 100,000 different keywords people use 
when they search for travel information.  Your success in this area will depend on your ability to 
get into the minds of your target audience, to best predict their search behaviors. 
 
You may find out that the majority of searches for “travel” come from people using combinations 
of “travel” and a certain city, country or region.  Or you may verify that people use “travel” more 
than “vacation.” 
 
All research and insight should be used to help tailor your Web pages and the way you write your 
text, titles and META data.  If most people in your target audience use the term “travel” to 
describe your product, then so should you. 
 
TIP: 
Your most important keywords will often include some of the following:

•  your company name, if it is important for branding purposes
•  all product and brand names that are featured on your website
•  countries, regions or cities you have an association with – often in combination with one of the words above
•  relevant generic terms for your business, brands or services (e.g. car, house or boat)
•  combinations of 2 and 3 keywords – most people search for multi-word phrases rather than single words

 
The following tool can be useful when doing keyword research:

http://inventory.overture.com/d/searchinventory/suggestion/

Adding keywords to your website  
Once you have collected the keywords that will help bring targeted traffic to your website, it is time to incorporate them into your Web pages.  The question you should ask yourself is whether adding all these keywords will dramatically change the tone and feel of your site's text.  If so, you may want to compromise and fit only the best keywords from the list.  You are very fortunate if most of the keywords fit perfectly, or can otherwise easily fit, your website.
 
Some people confuse adding keywords to a Web page with simply adding Meta Keywords.  Meta 
Keywords is Meta data as described above in Text Content and Meta data.  Meta Keywords are 
mainly useful for you and only have a modest impact on rankings in major search engines.  They 
are still good to publish, and you will want a unique list for each page.  Use mainly words found on 
corresponding pages. 
 
Adding keywords to your Web page really means changing the page text.  Each element of HTML 
carries its own level of importance for rankings, and each search engine weighs these factors a 
little differently.  In the next section, we will examine each element of HTML as well as its relative 
importance for ranking in all major search engines. 
 
When adding keywords to your website, consider your target audience.  This is especially 
important if you are fully rewriting pages to incorporate new keywords.  Write text that has a 
natural style, not overly dense with a keyword or two packed into every possible spot on the 
page.  Also consider readability and write text that is easily comprehended by those you are 
trying to reach. 

Focus on a few keywords for each Web page 
In most cases a Web page will only rank well for a few keywords – the ones that the search 
engines determine to be the most relevant for that page.  You cannot expect to rank well for 
thousands of keywords if you only optimize your front page.  Instead, you have to optimize each 
of the Web pages on your website that have content people are looking for – the keywords you 
have developed through your research. 

Important places to use your keywords 
Search engines weigh keywords according to their placement on a Web page.  There are some 
placements that are more important than others – places where you should always remember to 
use the most important keywords for each Web page.  Included are: 
 

• Page titles 
The title of your Web page is the single most important place to have your keywords.  
Keep titles concise (60 characters or less), well written, and use natural language.  

• Headlines 
Headlines can be important elements on your Web pages.  The text found in headlines 
usually identifies a particular theme or section of content.  Headlines can be formatted 
using the HTML elements <H1> to <H6> so that they are readily recognized by search engines. 

• Body copy 
Many people lose sight of the fact that the Body copy is the most obvious place a search 
engine looks for relevant content.  You have to use the keywords in the main viewable text 
on your Web page.  This is what users will actually see, whether processed by humans or 
machines.  Generally speaking, keywords incorporated in the Body copy should appear on 
the viewable page.  

• Links 
The words on your Web page that link to other Web pages are weighed heavily.  
Keep this in mind when building hypertext links from one page to another. 

• META data 
META keywords should contain words that appear on the Web page.  As a general rule, if 
it is on the page then the words can be included in the META data.  However, the page 
will not rank well on their use alone.  You can read more about META data in the next 
section.  Generally, the search engines will give weight to keywords that are used 
consistently across all of the areas described (e.g. titles, headlines, body copy, etc.). 

• ALT text 
The ALT attribute is also called “alternate text” and is used to describe images.  You can 
see the text when you hover your mouse over an image on a Web page (that is, if the 
author has added the ALT attribute).  Some search engines recognize the text in ALT 
attributes, but the weight (importance) given varies from engine to engine.  
 

 
Key point:  Remember to only use keywords that are relevant to each of your Web pages. 
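Putting the placements above together, a single minimal page might look like the illustrative fragment below. The clinic, file names and the phrase "dental services boston" are hypothetical examples:

```html
<html>
<head>
  <!-- Title: the single most important placement -->
  <title>Dental Services in Boston - Open 24 Hours</title>
  <meta name="description" content="Emergency and routine dental services in Boston, 24 hours a day.">
  <meta name="keywords" content="dental services, boston, emergency dentist">
</head>
<body>
  <!-- Headline -->
  <h1>Dental Services in Boston</h1>
  <!-- Body copy using the same keywords -->
  <p>Our Boston clinic provides emergency and routine dental services around the clock.</p>
  <!-- Link text is weighed heavily -->
  <p><a href="emergency.html">Emergency dental services</a></p>
  <!-- ALT text describing an image -->
  <img src="clinic.jpg" alt="Boston dental services clinic">
</body>
</html>
```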


Introduction to page titles  
The page title is the single most important piece of information on your page, when it comes to 
achieving good rankings in search engines. The words you use in your page title carry much more 
weight than anything else on your page.  Use this limited space wisely. 
 
Page titles are written this way in HTML code: 
 

<title>Page title goes here</title> 

 
The page title is placed inside the Head tag at the top of the HTML page: 
 

<head> 

<title>Page title goes here</title> 

</head> 

 
TIP: 
You should limit your page titles to 50-60 characters, as most search engines do not consider 
more than this number of characters. 
 

How to write good page titles 
First of all you should make sure that all your Web pages have unique titles that are highly 
relevant to the Web page.  Go through each of your Web pages and write a title that makes use of 
the most important keywords.  
 
Keep in mind that the page title is what users will see as the first line of search results and will 
become the text link to your Web page.  The title text should compel searchers to click on the 
link and visit your site! 
 
The goal is to make titles that both make people click and that make use of your primary 
keywords for each page.  If you want a page to rank well on “dental services Boston” make sure 
to use those exact words in the title.  For example, a title such as  “Dental Services in Boston – 
open 24 hours a day”, would be appropriate and work well for you (that is, if you do in fact supply 
24 hour services). 
 
TIP: 
Find the right balance between the use of your keywords and writing catchy titles.  If you can’t 
afford a consumer panel, it is suggested that you write a few candidate titles for each page.  Do 
not think too much about each of them – just write them down on a piece of paper.  When you 
have a few good candidates, you can compare them side by side.  Usually the best choice 
becomes obvious.  Consider combining elements from multiple candidate titles to form a 
hybrid.  Again, always keep in mind that your audience is both the search engine and the potential 
visitor. 

Introduction to META data 
META data containers are used to help classify or describe the content of your page.  META data 
literally means “data about data” – in this case, data about your Web page.  
 
META data can help search engines better understand and present links to your website. It is 
highly recommended that you populate the META data on all your Web pages.  
 

META data containers that are important to use 
You only need to focus on the META description and the META keywords. 
 


They are written this way in HTML code: 
 

<META NAME="DESCRIPTION" CONTENT="Put your description here ...."> 
<META NAME="KEYWORDS" CONTENT="Put your keywords here ...."> 

 
The META data are placed inside the header tag at the top of the HTML code just below the page 
title: 
 

<head> 

<title>Page title here</title> 
<META NAME="DESCRIPTION" CONTENT="Put your description here ...."> 
<META NAME="KEYWORDS" CONTENT="Put your keywords here ...."> 

</head> 

Adding META data to your Web pages 
Adding META data to your Web pages will depend on how you maintain your website.  You 
should refer to the manual of the editor or content management system you are using for detailed 
instructions.  
 
In most cases there will be a simple box somewhere with two fields to fill out: The META 
description and the META keywords, or just “description” and “keywords.”  Your program will 
usually insert the necessary HTML code in the right place for you.  
 
If you are coding your Web pages by hand or in a simple HTML editor, make sure you do not 
make any syntax errors.  If the META data containers have not been coded correctly search 
engines will not be able to process them.  

Writing good META descriptions 
The META description is used by some search engines to present a summary of the Web page 
along with the title and link to the page.  
 
Many search engines use the META description, but not on a consistent basis.  Some search 
engines use text from your Web pages and some use a combination.  It is best to make sure you 
have valid META descriptions on all your Web pages so the search engines that do use them 
have something useful. 
 
TIP: 
You should limit your META descriptions to 150-200 characters, as most search engines do not 
use more than this number of characters. 
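A small script can flag pages that exceed these limits before you publish. Here is a sketch; the 60- and 200-character limits follow the tips in this tutorial, and the sample title and description are hypothetical:

```python
def check_lengths(title, description, title_limit=60, desc_limit=200):
    """Return a list of warnings for a page title or META description
    that exceeds the commonly cited character limits."""
    warnings = []
    if len(title) > title_limit:
        warnings.append(f"title is {len(title)} chars (limit {title_limit})")
    if len(description) > desc_limit:
        warnings.append(f"description is {len(description)} chars (limit {desc_limit})")
    return warnings

# An acceptable pair produces no warnings.
print(check_lengths("Dental Services in Boston - Open 24 Hours",
                    "Emergency and routine dental services in Boston."))
```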

Writing good META keywords 
META keywords are very often misused and misunderstood.  Some webmasters put hundreds of 
keywords in this container hoping they will get good rankings for everything listed.  This is 
completely unrealistic.  Some webmasters add keywords that have absolutely no relevance to the 
actual Web page or website.  In reality, these practices are more likely to harm your placement in 
the search engines than help it.   
 
This has caused the search engines to not rely heavily on the META keywords as a determination 
of what a Web page is about.  However, some Web search engines and site search engines do 
use the META keywords, so we recommend that you craft relevant keywords for all your Web 
pages. 
 
Just as with META descriptions and titles you have to write a unique set of keywords for each 
Web page.  You should not copy the same set of keywords for all your Web pages.  
 


Do not add keywords that are not 100% directly related to each Web page. In fact, we 
recommend that you only use keywords that are found in the visual text of your Web pages to 
have the greatest effect.  
 
It is often discussed whether or not to use commas to demarcate keywords.  Some search 
engines remove the commas before reading the keywords and some search engines may use the 
comma.  To find the exact keyword and keyword phrase matches we recommend using commas, 
but you should be aware that you should not have too many instances of a single word.  Avoid 
going beyond 3 instances unless absolutely necessary.  You will never get penalized for either 
using or not using commas.  Some people use commas because it makes it easier to read in the 
raw code. This can be helpful if you want to edit the keywords at a later date. 
 
TIP: 
The limit for the META keywords container is 1000 characters; however, you should never add 
that many keywords.  Include the 3 to 5 most important keywords for each Web page – no more!  
The more keywords you use beyond that range, the more their value is diminished. 
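Since the recommendation is to draw META keywords from words that actually appear on the page, candidates can be pulled straight from the body text. A minimal sketch (the stopword list and sample text are placeholders, not a production vocabulary):

```python
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "of", "in", "to", "for"}  # minimal placeholder list

def suggest_meta_keywords(body_text, limit=5):
    """Return the most frequent non-stopword terms from the visible body
    text as candidate META keywords, capped per the 3-to-5 guideline."""
    words = re.findall(r"[a-z]+", body_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(limit)]

body = "Travel deals for Boston. Boston travel guides and Boston vacation tips."
print(", ".join(suggest_meta_keywords(body)))
```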
 

Will META keywords give me top rankings? 
You will not achieve top rankings on keywords if you only use them in your META data.  META 
keywords are not magic bullets. 
 
In the search engines that do use them, unique, optimized and relevant META keywords on all 
your pages give you a better chance of reaching relevant visitors.  Remember, for the best 
results, use in the META keywords only those words that appear in the main body of the page.  

Popularity measurement 

Introduction to Popularity 
When a search engine tries to determine which pages on the Web are the most relevant to a 
given search query, it must consider more variables than just the content of each Web page.  
The search engines try to incorporate a measure of “popularity”.  To this end, most search 
engines today analyze both link structures and click-streams on the Web to determine the most 
appropriate Web pages to include in the SERPS. 
 
Getting high quality, relevant websites to link to your website is not something you do overnight.  
It takes time and effort.  By implementing a sound strategy to improve your Link popularity, you 
can gain a long-term advantage but it is not a quick solution to better rankings. It is like building a 
good reputation, a trusted brand – it takes time.  But once you have accomplished this goal, the 
ranking benefits are significant. 
 
Getting the right, relevant websites to link to you will not only build your Link popularity as 
measured by search engines; it will also lead to qualified visitors directly from the referring websites.  
 
Tip: 
 
The following list is a quick guide for where to look for relevant links: 
 

• All the major and local directories, such as Yahoo or ODP 
• All trade, business or industry related directories 
• Suppliers, happy customers, relevant sister companies, and partners 
• Related but not competing sites 


 

Link popularity 
Link popularity considers the number of links between Web pages on the Internet.  There are two 
kinds of links to focus on: inbound links (also known as in links) and outbound links - links to your 
website and links from your website. 
 
Links can be thought of as a form of endorsement.  When you link to a website you are endorsing 
that site and recommending it to your visitors.  Likewise, when an external website links to your 
website, it is recommending your site to its visitors.  This form of endorsement adds 
to the Link popularity score of your website.  The higher the number of quality links from other 
popular websites, the more relevant the destination site is considered.  Some search engines 
consider this the single most important factor in determining relevance. 
 
 
 
 
TIP: 
Not all links count the same!  For example, links from recognized authorities in your industry count 
more than links from a small private website on a free host.  
 
Do not use free submission software or services to submit to hundreds of thousands of free 
directories and search engines just to gain more inbound links.  Links from most of these places 
won’t do you any good and there is even a risk that some of the links you get this way will harm 
your rankings.  
 
Do not participate in organized exchanging of unrelated links between websites, to boost your 
Link popularity factor.  Most search engines consider that to be harmful to relevance.  Instead, 
focus on getting inbound links from major directories and important related websites within your 
industry as they are the only ones that really count.  
 

Internal link popularity 
Links to and from external websites are important, but not the only consideration when developing 
your Link popularity strategy.  The link structure within your own website has an important role in 
determining the value of each of your Web pages. 
 
Your Web pages that are most often linked to will gain the most popularity.  If one of your Web 
pages has 500 internal pages pointing to it and another has only 10, then the first Web page 
is more likely to be valuable to users, as there are more pages “voting” for it. 
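This internal "voting" can be tallied from a simple map of which pages link where. A sketch with a hypothetical three-page link structure:

```python
from collections import Counter

def internal_link_counts(site):
    """Count internal in-links, given a mapping of each page to the list
    of pages it links to."""
    counts = Counter()
    for targets in site.values():
        counts.update(targets)
    return counts

site = {  # hypothetical link structure
    "index.html": ["products.html", "about.html"],
    "about.html": ["products.html"],
    "news.html": ["products.html", "about.html"],
}
print(internal_link_counts(site).most_common())
```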
 
TIP: 
 
Typically, a website has a navigation bar of some kind that points to the 5-8 most important 
sections of the website.  This navigation bar will be on all Web pages on the website and 
therefore boost internal Link popularity on those sections.  Make sure to include links to the Web 
pages you most want to rank well in your cross-site navigation and make sure that the pages the 
navigation links to have good content.  The pages you link to in the navigation bar will be easier to 
rank well in search engines.  For large sites, a dynamic navigation can be a great solution: one 
that changes based on the page you are currently viewing and offers related areas to navigate 
further. 


Common problems 

Introduction to common problems 
There are a number of things that can impede a search engine from indexing your website 
correctly.  The following section will address some of the most common problems. 

Frames  
Frames have always been problematic for search engines when they are crawling for information.  
Some studies have indicated that users also have difficulty performing simple navigational tasks 
when they are confronted with a website using frames. 

 

Jakob Nielsen, an acknowledged usability expert, addresses problems with frames in an 
important article titled: Why Frames Suck (Most of The Time) 
[http://www.useit.com/alertbox/9612.html/].  One of the peskiest problems with frames occurs 
when users want to bookmark the Web page.  When users save a Web page from a frames 
website into their browser’s bookmarks they are directed to the website’s homepage whenever 
they use the bookmark. 

 

Likewise, search engines cannot "bookmark" a deep page (using frames) of your website to 
reference it within a search listing.  Search engines will try to browse through the site and look for 
individual pages they can refer to, but users that follow search results can then get trapped 
without navigation!  So, search engines prefer to avoid ranking frames-based sites altogether.   
Only if there are some compensating factors (like substantial popularity) will a frames site perform 
well with the search engines. 

 

We recommend avoiding frames if you want good visibility in search engines.  There is a work-
around for frames, but problems remain.  A clever developer can use JavaScript to re-fetch 
content pages when search engines index individual pages from a frames site.  This solves part 
of the equation; still, getting those pages indexed and ranking well is a bit of a problem. 

 

Some search engines record content in an alternate container called NOFRAMES.  NOFRAMES 
is meant for browsers that don't support frames technology.  Text and links in NOFRAMES can 
help a search engine index a frameset document.  However, maintaining a frameset document for 
every page of your site can be unwieldy. 
 
The HTML code looks like this: 
 

<noframes>Place content here</noframes> 

 

Browsers – and search engines – that do not support frames can display the content you place 
inside the NOFRAMES container. 
 
The suggested use of this space is to:   

• Fill the section with the same text that you have on the visible Web page.  Use the raw 
text without images – but do not use any text that is not visible to the users. 
• Remember to add links in the NOFRAMES container too, so that they can be followed by 
search engines.  Use the same navigational links as you do on the page you show your 
visitors, but keep them as normal text links. 
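Putting both suggestions together, a frameset page might carry a NOFRAMES fallback like the illustrative fragment below. The site name, file names and text are hypothetical:

```html
<html>
<head><title>Acme Travel - Vacation Deals</title></head>
<frameset cols="25%,75%">
  <frame src="nav.html">
  <frame src="content.html">
  <noframes>
    <body>
      <!-- Same visible text as the framed content, as raw text -->
      <p>Acme Travel offers vacation deals to destinations worldwide.</p>
      <!-- Same navigation, kept as normal text links -->
      <p><a href="deals.html">Vacation deals</a> |
         <a href="destinations.html">Destinations</a> |
         <a href="contact.html">Contact us</a></p>
    </body>
  </noframes>
</frameset>
</html>
```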

 
Using these suggestions is not a guarantee that your Web pages will get indexed and rank well.  
Many search engines will neither read the content nor follow the links in the NOFRAMES section. 

 


Dynamic websites and Web pages  
Dynamic Web pages usually contain content (e.g., images, text, form fields, etc.) that can 
change/move without the Web page being reloaded.  These pages are often produced on-the-fly 
by server-side programs.   
 
A traditional static website is made up of a number of individual files usually ending with .html.  
For example, index.html, products.html etc.  Each page is a unique file and usually has unique 
“static” content. 
 
On the other hand, a dynamic website very often has only one or a few files – called “templates.”  
The template dictates how to present the visible content but contains no content itself.  All the 
content is stored in a database.  To show a page on a dynamic website, the template code 
knows what to load from the database.  Technically, that is done by adding “parameters” to the URL 
while browsing from page to page.  
 
If the template is called: pages.asp and you would want to load the content with ID #54 the URL 
could end up looking something like this: 
 

www.YourDomain.com/pages.asp?page_id=54 

 
This seems fairly simple, but can get very complicated when many parameters are used to 
support the navigation of the underlying database.  The same URL with a few more parameters 
may look like this: 
 

www.YourDomain.com/pages.asp?page_id=54&manufacturer_id=acmeco&color_code=brown&style_code=modern&size_code=xxl&upc=12345678 

 
This more complex URL (Web address) is difficult for search engines to crawl.  There is 
simply no way for a search engine to ascertain which parameters identify a new page and which 
are just a sort order of the content, a navigation setting or something else that does 
not justify indexing the page as a unique Web page. 
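One common remedy is to canonicalize such URLs, keeping only the parameters that identify content. Here is a sketch using Python's standard urllib; the assumption that only `page_id` identifies unique content is purely illustrative:

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

CONTENT_PARAMS = {"page_id"}  # assumed, for illustration, to identify unique content

def canonical_url(url):
    """Strip parameters that only control sort order or presentation,
    so each piece of content maps to a single URL."""
    parts = urlsplit(url)
    kept = {k: v for k, v in parse_qs(parts.query).items() if k in CONTENT_PARAMS}
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept, doseq=True), ""))

url = "http://www.YourDomain.com/pages.asp?page_id=54&color_code=brown&size_code=xxl"
print(canonical_url(url))  # http://www.YourDomain.com/pages.asp?page_id=54
```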
 
There are other complicating factors related to having dynamic websites and websites built on 
content management systems indexed in search engines.  This tutorial is unable to cover all of 
them. 
 
There are a growing number of tools, techniques and services to help get dynamic websites 
indexed in the search engines.  For further information on how to get your dynamic website 
indexed we recommend that you hire an experienced search engine optimization expert with 
expertise in these areas.  Most often it takes a closer examination of your website to determine 
the best strategy. 

Flash and Macromedia Director  
Search engines do not read Flash files; therefore, content and links that you place in any of these 
formats will not be accessible to search engines. 
 
You can read more about Flash, Macromedia and Java Applets in the Links and Navigation 
section. 

Java Applets and other client side applications 
Search engines basically read regular text on a Web page.  They do not read text or follow links in 
Java Applets or any other formats that require the user to have a program, run-time environment 
or plugin installed.  
 
You can read more about Flash, Macromedia and Java Applets in the Links and Navigation 
section. 


IP-delivery, agent-delivery and personalization 
It has become increasingly popular for websites to implement varying forms of personalization for 
their visitors.   These websites gather numerous pieces of information about individuals which can 
influence the way they will serve Web pages to individual users.  
 
A simple example is a server-side program that checks what browser people are using and serves 
up a specially tailored version for that browser.  The same kind of program can also be used to 
check whether people have specific plug-ins installed, so that they get a version of the website 
they can read.  
 
A more advanced use would be to check what country the user is coming from to serve a local 
version of a Web page.  Some portals, search engines and cross-national websites have 
employed these techniques.  There are many legitimate reasons to do so including business 
strategies, marketing, and legal issues with products only allowed in certain countries. 
 
Growing sophistication in personalization is evidenced by some of the larger e-commerce 
websites that track a variety of individual information, like purchase history, buying behaviors, 
usage patterns, etc.   

Cloaking 
Similar techniques for tracking information about website visitors can be applied to the search 
engines that crawl a website.  A Web server can detect a visit from a crawler and selectively 
serve different content.  This technique is referred to as “cloaking”.    
 
There may be some legitimate reasons to use cloaking.  However, in most cases search engines 
do not like the use of cloaking and it is considered a spamming tactic.  We recommend that you 
do not use cloaking unless you have a very good reason to do so, that you fully master the 
necessary techniques and that you understand the possible consequences.  

Cookies 
A cookie is a small text file that a web server can save in a user’s browser for later retrieval when 
that same user makes a subsequent visit.  It can be used to store log-in information or other 
preferences to make a particular website easier or better to use.  
 
Cookies are safe to use in the sense that they cannot be read or shared across different users or 
websites.  If a cookie is set in a browser, then the website that wrote it is the only website that can 
read it.  Also, other users of a website cannot get to the information in your cookie – it cannot be 
transferred or shared. 
 
It is important to note that search engine spiders do not accept cookies.  Therefore, if your 
website uses cookies, you have to make sure that all of the Web pages you want indexed can be 
accessed without accepting a cookie.  
 
Tip: 
You can turn off cookies in your own browser to test whether your website can be accessed without 
them.  Refer to the manual or help files of the browser you are using.  This information can 
usually be found under “advanced options”. 
 
 
 


Submit and index pages 

Getting indexed 

Introduction 
Search engines do not always include all Web pages from a website.  Usually they will only 
include a sample of your pages – the ones they determine to be most valuable.  
 
Some of your Web pages will be more important to have indexed than others.  A product 
information page is far more important to have indexed than a contact form, as it is more likely 
someone will search for your products than for your contact form text. 
 
Search engines do not always find the right pages to index by themselves.  Sometimes they need 
a little help and guidance.  In this section you will learn how to submit your Web pages to search 
engines to get them indexed. 

Submitting to search engines 

Introduction to submission 
Most search engines advertise a free feature to submit your website by adding your URL.  
Submitting a website to the search engines is no guarantee that it will be included in the index.  
Unfortunately, a significant volume of undesirable websites are submitted in this fashion.  
Therefore, the search engines will generally evaluate a candidate website before including it in 
their index.   

How to submit to search engines 
Most search engines have a form you must fill out to submit your website.  You will usually find 
the link at the bottom of the front page, labeled “Add URL”. 
 
Generally you should only submit your front page as the search engines will follow links from that 
page to the rest of your website.  
 
However, if you have important sections of your website that are not directly accessible through 
the regular navigation you can also submit them.  If you have a site map (a page with links to all 
the Web pages on your website) you can submit that too, to help the search engine spiders find 
all of your content. 
 
TIP:  
The easiest way to get your website into the search engines is by having it included in the major 
directories, such as Yahoo and ODP.  Many directories are used to provide input to the search 
engines.  Most search engines will consider a website that is included in the major directories to 
be of higher value. 

Do not over-submit 
You should never submit a Web page to a search engine if it is already indexed.  Some 
questionable SEOs will recommend that you resubmit your Web pages on a regular basis to keep 
your rankings.  This advice is simply wrong and can cause undesirable results. 
 


You can buy software or services that make it easy for you to submit all of your Web pages to 
hundreds of search engines as often as you like.  BUYER BEWARE, do not use such programs 
or services!  
 
The reality is that there are only a handful of major search engines and directories that serve the 
vast majority of all search users – either directly through their own search portals or through 
distribution of their indexes to other portals. 

Paid Inclusion (PFI) programs 
Some of the search engines (e.g. Yahoo) and their partners offer a commercial service to submit 
your Web pages.  This is a time-tested service that offers some measurable benefits.  You can 
pick exactly the pages from your website you want to have indexed, including deep, dynamic 
pages.  As long as you have the direct address (URL) of the page, you can have it included 
through a Paid Inclusion program.  Generally, these programs offer some ancillary benefits, like 
regular refreshes, performance reporting, etc.  The Yahoo Search Submit Basic program is such 
a service. 

Excluding pages from getting indexed 

Introduction to exclusion of Web pages 
There are instances when you may not want a website crawled and indexed by the search 
engines.  This can be accomplished with one of the following methods: 
 
• Robots.txt 
• META robots code 

Robots.txt  
Robots.txt is a file that provides instructions for all robots that attempt to crawl your website.  It is 
placed in the root directory of your Web server.  The file uses a simple syntax to exclude specific 
types of user agents from parts of your website. You can either exclude specific search engine 
spiders or all spiders.  To exclude all search engine spiders as well as all other robotic user 
agents from all directories on your web server, the robots.txt file should contain the following: 

 
User-agent: * 
Disallow: / 

 
Note: This would disallow everything including your home page! 
 
Learn more about how to write robots.txt files at SearchTools.com 
[http://www.searchtools.com/robots/robots-txt.html/] 
 
TIP: 
We recommend that you validate your robots.txt file before uploading it.  There is no way to 
predict how a search engine will interpret a robots.txt file with errors.  
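One way to sanity-check a robots.txt file before uploading it is Python's standard urllib.robotparser, which applies the same User-agent/Disallow rules a well-behaved crawler would. The /private/ rule and example URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks one directory for all robots.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Public pages stay fetchable; everything under /private/ is blocked.
print(parser.can_fetch("*", "http://www.example.com/index.html"))
print(parser.can_fetch("*", "http://www.example.com/private/report.html"))
```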

META robots  
META robots is a container that you can place in the HEAD container of your HTML documents.  
You can use the META robots container if you don’t have access to your web server’s root 
directory or if you want to exclude single pages on your website.  You can read more about how 
to use the META robots code at http://www.searchtools.com/robots/robots-meta.html
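For a single page, the META robots container looks like this; the example below asks robots neither to index the page nor to follow its links (the title is a hypothetical example):

```html
<head> 
  <title>Internal page to keep out of the index</title> 
  <!-- Ask robots not to index this page and not to follow its links --> 
  <meta name="robots" content="noindex, nofollow"> 
</head> 
```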
 
 


 
 

FAQ 

Can I pay the search engines to improve my rankings? 
No, major search engines like Yahoo do not sell improvement of rankings.  Some of the search 
engines have inclusion programs in which you pay to get included in the index but you are not 
guaranteed a specific ranking.  
 
However, you can buy sponsored links and keyword-targeted banner ads in most major search 
engines and in this way get a premium position for your favorite keywords.  Do not confuse 
this with the regular search results, as such listings are clearly labeled as ads – separated from 
the free search results.  

Why do my Web pages not rank well? 
Unfortunately there is not one simple answer to that question.  The first place to start would be 
with a question: “For what keywords?” 
 
Maybe you have been unrealistic about the keywords you are targeting.  Maybe the competition 
for the keywords you are trying to rank well for is just too high.  You should probably not expect to 
rank as number one for “Microsoft” just because your online bookstore has a used book about 
Windows XP in stock, just as you should not expect to get top rankings for broad, competitive 
terms such as “Travel” and “Mortgage”. 
 
Be realistic about your goals! 
 
If that’s not the problem then maybe you need to optimize your website further.  This tutorial can 
serve as a good starting point for the basic things to learn.  If and when you want to learn more 
we recommend that you visit http://www.searchenginewatch.com where you will find a huge 
collection of excellent articles and search engine references. 

I have paid for inclusion, why do I not rank better? 
For those search engines that offer this type of program you are guaranteed a fast inclusion and 
refresh in their index.  However, you are not guaranteed top rankings for your keywords.  That 
alone is determined by the search engine’s relevancy algorithms. 
 
You can read more about how the search engines work at Position Tech’s Learning Center 
[http://www.positiontech.com/learningcenter/index.html] or Search Engine Watch 
[http://www.searchenginewatch.com/]. 

How deep do search engines browse my website? 
There is nothing to keep search engines from reading and indexing content on your website – 
even in very deep directory structures.  There are no set rules on how deep search engines go.  
In some cases – with some websites - they might go deeper than in others.  
 
Generally, a good internal link structure will help you get as many pages as possible indexed.  If 
you have valuable pages buried deep down in sub-directories, it may be worth linking directly to 
them from other pages higher in the directory structure, to “point out” to the search engines (as 
well as human visitors) how important these pages are. 
 
However, we recommend that you do not put your most important content in deep sub-directories.  
You should place your most important pages in the root, or in a directory directly below the root, 
as in the example below. 
 

http://www.YourCompany.com/index.html 
http://www.YourCompany.com/about_us.html 
http://www.YourCompany.com/news.html 
http://www.YourCompany.com/products/index.html 
http://www.YourCompany.com/services/index.html 
http://www.YourCompany.com/shopping/index.html 
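The earlier advice on linking directly to valuable deep pages can be sketched in plain HTML; the page and path below are hypothetical:

```html
<!-- On index.html: a direct link to a page buried two directories down,
     so both crawlers and human visitors can reach it in a single click -->
<a href="/products/widgets/blue-widget-specs.html">Blue widget specifications</a>
```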

What do the search engines consider spam? 
There are unfortunately not any agreed standards on this topic.  However, there are a few basic 
rules that you should follow to stay out of trouble.  
 
Do not … 
 
•  repeat keywords over and over just to get better rankings 
•  target Web pages to keywords that are unrelated to the actual Web page 
•  use hidden text or links on your Web pages 
•  submit Web pages that are already indexed 
•  participate in link-exchange programs to boost link popularity 
•  link to criminal, illegal or very offensive websites 

 

 
We recommend that you look at the search engines’ ‘Help’ pages for more information.  Many of 
them have published guidelines on the web describing what they consider to be spam.  
 
TIP: 
If you are ever in doubt whether what you are about to do is spam, don’t do it! Instead, study and 
investigate the technique or method further.  Do not blindly trust what people tell you in 
public forums – use your own judgment and common sense.  
 
If something sounds too good to be true, it probably is.  Easy tricks never work in the long run. 

Can search engines index my dynamic website? 
Most search engines have difficulty interpreting a dynamic website.  Dynamic websites 
generally have a much more complex URL structure, which can easily make the search engine 
spiders get lost or confused while trying to read and browse such sites.  A confused spider can 
clutter the index, so most search engines tend to be cautious with dynamic websites. 
 
However, if you are using Pay For Inclusion or Trusted Feeds, you can tell the search 
engine exactly which URLs you want indexed, thereby making it safe for the search engines to 
spider those Web pages. 
 
There are also a number of other solutions you can implement to improve the search engine 
friendliness of your dynamic website.  It is however beyond the scope of this tutorial to get into the 
details of these methods … 
 
You can learn more about dynamic websites in the Optimization section of this tutorial. 
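As an illustration (both URLs below are hypothetical), compare a parameter-heavy dynamic URL with a rewritten, crawler-friendly version of the same page:

```html
<!-- A typical dynamic URL: multiple query parameters, including a session ID
     that produces a different-looking URL on every visit -->
<a href="/products.php?cat=12&amp;item=348&amp;sessionid=A7F3">Blue widget</a>

<!-- The same page behind a rewritten, static-looking URL -->
<a href="/products/widgets/blue-widget.html">Blue widget</a>
```

Session IDs in URLs are a particularly common source of spider confusion, since every crawl visit appears to discover a brand-new set of pages.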

Can you tell me how the search engine algorithms work? 
No, the algorithms are well-kept secrets of the search engines.  However, there are many 
similarities in the way they work, and we recommend that you study them further if you want to 
improve your rankings.  


The number of daily visitors from search engines has dropped – why? 
Search engines are constantly adding new pages to their indexes and changing the algorithms to 
improve search results.  That is part of the job of maintaining a good search engine. 
 
If the number of visitors you receive has dropped, the reason could be that your rankings are not 
as good as they were before, or that fewer of your Web pages are included in the search engine 
indexes. 
 
If there have been a lot of new Web pages added within your category it will be harder for you to 
rank well.  In that case you may have to optimize your Web pages further to keep decent rankings 
for your most relevant keywords.  
 
 
But a drop in visitors can also be due to changes in search trends.  If you have a website that 
sells Christmas decorations, you will most likely (hopefully) get a lot of search visitors in December. 
But you should not be surprised, or blame your rankings, if you experience a dramatic decrease 
in traffic by mid-January and February. 
 
Searches for ski tours, summer vacations, camping, boating, swimsuits and fur coats all have 
their own season.  People search for what they need – when they need it. 

Help! My site got dumped – what do I do? 
First, verify that your website is actually no longer in the index, which search engines it 
has been dropped from, and how many pages were lost – if not all.  It may be that your website 
was not excluded at all and your rankings simply changed a bit.  If that is the case, you will have 
to analyze why and make the necessary changes to your website or search engine 
optimization strategy. 
 
If some, or all, of your pages have been excluded from a search engine there can be several 
reasons for that.  Most commonly it is because your web server was down the last time the 
search engine spider was trying to browse your website.  If that is the case then you should not 
worry – the next time the spider is out it will most likely pick up your site again and re-index your 
Web pages. 
 
It is not very often that a website gets banned by search engines.  Compared to the billions of 
Web pages that do get indexed, it is only a small fraction of sites that search engines choose 
to permanently exclude from their indexes.  These include extreme adult sites and sites promoting 
hate, violence or racism – sites that most of us can agree search engines should not link to.  In 
some cases search engines also ban websites that have taken radical steps to manipulate their 
rankings.  
 
If you have been optimizing your website to the limit and suspect you have been banned for 
crossing the line then you should contact the search engine for verification.  In some cases you 
can get back in if you make the necessary changes to the website and promise not to mess up 
again. 
 
 
