Search Engine Optimization (SEO)

by Bill & Sara Newman (updated 2/7/2018)

Page Link: In case this is read in print, this page can be found at the following URL:

SEO Defined: Search Engine Optimization (SEO) is the study and task of optimizing web pages to rank highly (usually within the first 30 sites found) in search engines.  We wish to generally increase traffic (site visitors) but also to bring highly targeted visitors to our client’s site on a regular basis.  

The Goals of SEO: The two main goals of search engine optimization are:

  1. To be found at the top of any web search that specifically targets our client’s domain name
  2. Have our client’s web pages be as highly ranked as possible on searches for targeted terms (keywords)

The following details the specific plumbing involved and safe tactics to maximize site page ranks.

Misconceptions: Search engine optimization is not about gaining as many page hits as possible. Users who are forced to view a web page may become angry about being redirected, skew your client's server log results, and take up bandwidth. There are many practices deployed by SEO "experts" that are sketchy and risky:  Myths & Misconceptions about SEO

Top Engines Only: We should optimize based on the top three to five search engines in terms of market share. At the time of this writing (May 2012), those are Google, Yahoo, Baidu and Bing, with Google taking over 80% of the market share and none of the others over 8%! To check the current numbers, check with netmarketshare

Match Search Engines' Goals To Our Own: The goal of a search engine is to correctly match a consumer need with a solution. We can align the search engines' goals with ours by designing and optimizing our websites to clearly provide a single solution (or set of related solutions).  We can also follow the advice provided by the search engines themselves, for example by signing up for Google Webmaster Tools

While Google currently holds the lion's share of the market, you may also want to get a second opinion via Bing's Webmaster Tools

Anatomy of a SERP: Search engines rank pages and show pieces and parts of our web pages on a SERP (Search Engine Results Page).  View the following excellent Anatomy of a SERP. If you want to practice your SERP snippet creation, try this handy Search Engine Snippet Optimizer

History and 301 Redirects: One of the ways your client’s website will maintain a good search engine rank is to stay listed in one place over time.  The longer a site has been ranked, the better.  Your first job is to maintain what search engine rank your client already has.

Therefore the individual page addresses must be maintained or carefully redirected if the page addresses must change.  If you do need to move a client’s website or redirect pages to new addresses, please view the following:  Moving Websites

If you are considering changing the site pages of a client's website that has a history with its web pages (including the extension, for example, htm switching to php) and need advice, please contact Bill.  Your client could also have inbound links, for example from a local newspaper's website, etc.
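To illustrate, a 301 ("moved permanently") redirect preserves a page's history when its address changes. This is a sketch for an Apache server with .htaccess overrides enabled; the file names are placeholders:

```apache
# .htaccess in the site's web root (hypothetical paths)
# Permanently redirect the old .htm address to its new .php address
Redirect 301 /services.htm /services.php
```

Search engines treat the 301 status as a signal to carry the old page's ranking history over to the new address, which is exactly what we want when an extension or page name changes.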

An SEO Campaign:  An SEO campaign is an attempt to improve a client's website page ranks with a specific technique.  Usually a campaign is done and assessed one at a time to see exactly how effective the technique used was.  Here are the steps for an SEO campaign:

  1. Document baseline traffic via server logs
  2. Select search keywords & phrases
  3. Document current rank on keywords & phrases
  4. Optimize site pages based on keywords & phrases and advice of Google Webmaster Tools, etc.
  5. Create sitemap.xml and robots.txt files
  6. Submit site pages to search engines to be indexed
  7. Document traffic and rank to identify effect of campaign

We'll go through the details of each of the steps below. Once our pages are indexed (which could take a week or more), check the results and see how many more visitors and sales a client may have as a result!  Document all of this to show how much of a difference you have made, and add the percent improvement to your resume!

Best Time for an SEO Campaign? Before Launch!: Clients frequently expect immediate results, but when a site is launched it's unlikely to have a quality ranking.  The sooner you get a web page up and indexed, the better!  Also check to be sure the domain name of the client isn't 90% the same as their competition's.  Having a domain name with close competition could put them behind from the start!

Save As PDF: Many of the things we'll be tracking below are important but temporal in nature.  We'll need to store copies of important data to establish a baseline of results that shows our clients how much progress we've made.  In the Chrome browser, when printing, select Save As PDF

Do Baseline Searches for Page Rank: Before you get too fancy, get an idea of where your client currently ranks.  Do searches in Google, for example, on the following:

The words that you identify above, and the current rank, should be written down and stored.  You should search Google, for instance, at intervals for these words and record whether or not your client is ranked in the top 3 pages (top 30 sites).  If not, the goal is to get into that area for key searches.  This may get rather detailed.  See Keywords and Keyword Phrases below.  Copy, date and print all relevant search pages into PDFs if possible.  On the printed copies, write down the search engine rank (if any) and highlight it.
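The rank check described above can be sketched in a few lines of Python. The result URLs would be copied by hand from a search results page (automated scraping of results may violate an engine's terms of use), and the domains below are made up:

```python
# Sketch: record where a client ranks in a hand-gathered list of result URLs.
# Domains are hypothetical; collect the URLs manually from a search page.

from urllib.parse import urlparse

def find_rank(result_urls, client_domain):
    """Return the 1-based rank of the first result on client_domain, or None."""
    for rank, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        # Match the bare domain and any subdomain (e.g. www.)
        if host == client_domain or host.endswith("." + client_domain):
            return rank
    return None

results = [
    "http://www.competitor-a.com/tax-help",
    "http://www.example-cpa.com/services",
    "http://www.competitor-b.com/",
]
print(find_rank(results, "example-cpa.com"))  # → 2, i.e. within the top 30
```

Run at intervals against fresh result lists, this gives dated rank numbers you can store alongside the PDFs.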

See How All Site Pages Are Currently Indexed: On a site that has a history (a year old, for example) be sure to do searches that identify all of the site pages and see how they are indexed.  You can do this in Google by typing in:

site:clientdomain.com

Where clientdomain.com above is the client's domain.  In this way you can see what a search engine has indexed for an entire site. Copy, date and print all relevant search pages into PDFs if possible.

Traffic Statistics via Web Server Logs: In order to identify baseline (current) web site traffic we need to access our client’s web server logs.  These log files store every page hit, every browser version used, every search engine, every search term used and even sites that provided a link to reach our site. Contact the client’s hosting company and ask them how your client can view the server logs.  Frequently the hosting company will provide a web server log analysis application such as webalizer or webtrends.

Once you can view the server log data, copy, date and print all relevant data into PDFs if possible.  The server log data will refresh, usually once a week, and you may not be able to get data that is older than a week!

These reports will show keywords used to find your client's site and the search engines in play.  Hits on broken links can be determined as well.  If the server logs are not accessible, you can try to determine some traffic via third party tools.  View the following on Assessing Website Traffic
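As a sketch of what a log analysis tool does with referrers, here is a minimal Python pass that pulls search phrases out of referrer URLs. The "q"/"p" parameter convention matches engines of this era (modern engines usually strip the query), and the log data below is invented:

```python
# Sketch: tally the search phrases embedded in referrer URLs from a server
# log. Illustrative only: the referrer URLs here are made up.

from collections import Counter
from urllib.parse import urlparse, parse_qs

def search_terms(referrers):
    """Count the search phrases found in a list of referrer URLs."""
    counts = Counter()
    for ref in referrers:
        params = parse_qs(urlparse(ref).query)
        for key in ("q", "p"):          # q: Google/Bing, p: older Yahoo
            for phrase in params.get(key, []):
                counts[phrase.lower()] += 1
    return counts

log_referrers = [
    "http://www.google.com/search?q=dallas+tax+accountant",
    "http://www.bing.com/search?q=dallas+tax+accountant",
    "http://search.yahoo.com/search?p=cpa+dallas",
]
print(search_terms(log_referrers).most_common(1))  # → [('dallas tax accountant', 2)]
```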

Keywords: In order to have our web pages categorized correctly by a search engine, we should pay attention to the words that users type in order to find our page.  These words are called keywords.  These keywords should be found in the body of the text, the title tag, the alt attributes of images, the meta keywords tag and the meta description tag.

The search engines will expect these keywords and phrases to appear prominently in the text on our pages.  If we do a good job of matching the search engines' expectations without creating a difficult read for our users, we have succeeded!
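As a sketch, here is how a hypothetical Dallas tax firm (borrowing the fictitious example used later on this page) might work its keywords into each of those locations; all names and filenames are made up:

```html
<head>
  <title>Tax Accountants - McGruber &amp; Son - Dallas CPA</title>
  <meta name="description" content="McGruber and Son are Dallas tax accountants offering tax preparation, CPA services and small business accounting." />
  <meta name="keywords" content="tax accountant,dallas cpa,tax preparation,small business accounting" />
</head>
<body>
  <h1>Dallas Tax Accountants</h1>
  <p>McGruber &amp; Son provide tax preparation and CPA services to
     Dallas businesses and families...</p>
  <img src="dallas-tax-office.jpg" alt="McGruber and Son tax accountants, Dallas" />
</body>
```

Notice the same keyphrases ("tax accountants", "Dallas CPA") recur in the title, meta tags, heading, body copy and alt text without making the page hard to read.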

Keyword Identification: Use online tools such as wordtracker to determine which words appear frequently in your text.  Are these the keywords you expected to find?  Are these the keywords users would type to find your page?  Search on these words to see what other sites come up in a search.  Are there other words you should have used?  Consider adapting your page to include additional or different keywords based on your research.

Identify Keywords: Establish 10 keywords that describe the solution your client's site provides.  These keywords must match what users type on a search, not necessarily what we would use for keywords.

Keyword Resources: In order to get started generating keywords view the following tutorial: Keyword Research

To find out if the keywords we have selected match what users actually type, we can use tools provided by Wordtracker and Google's AdWords

At Wordtracker, you can take a trial to determine whether the keywords you have brainstormed are actually searched on the web, as Wordtracker tracks the terms users search for on most of the search engines and directories on a daily basis.

Want to see some keywords?  View the source code of the pages of your competition!  Pay special attention to those who rank highly in a search!  Check the keywords they are using. Particularly pay attention to the title tag, meta keywords, meta description, image names, etc. Keywords are NOT subject to copyright (unless you are using the name of your competition - don’t!!)!

Keyword Phrases: After the keywords are established they need to be worked into keyphrases that will be used to draw your targeted customers in, and simultaneously get well crawled by search engines and highly placed in the search engines and directories.

A search engine will penalize or reward sites that overuse or wrongly place the keywords or keyphrases, so placement and frequency are key.

The best combination is to have keyphrases that jump out of the search engine list, and content that delivers what is promised. The process of writing keyphrases is well addressed in the following tutorial: Choosing Keyword Phrases

Optimizing Site Pages: Once you have determined what keywords and changes need to be made, apply these changes to all visible site pages.  Below is some advice regarding key areas of pages:

Title Tag: This is the most important tag for a search engine.  A fragment of the title tag is used as the link to the page on a search engine result page.  A search engine expects each title tag to be unique, or our site’s rank may be penalized.  

Your "titles" must match your content, your keywords and your meta tags. This presents a difficulty, because your title tags also tell the user what page they are currently on, and for that purpose, they should be short and clear. This contradicts what the search engines appreciate. Favor the search engines, while giving brief navigation info for the user.

Many site owners mistakenly believe they should put their company names in this tag.  This is only a good idea if you are a well-known company that people will be searching for by name, such as Coca-Cola or McDonald's. Otherwise, you should assume that most potential customers will be searching for specific products or services, not a particular company name.  An effective title tag for a Dallas Tax Firm might be:

 <title>Tax Accountants - McGruber & Son - Dallas CPA</title>

Some experts say a title tag should be as little as 40 characters, and others say you should use 60 to over 100, so perhaps 40-60 is a good compromise.

The key here is that the keyphrases need to appear in the title.  View a search and count the characters available on a Google result page.

Be sure to examine the title tags (and other meta tags) of high ranking sites to get more ideas!

Body Text and Visible HTML: After your keyphrases are built, they need to be used as frequently as possible in the body of your targeted pages (the pages you target for submission). The safest page, from a search engine's perspective, has at least 200-300 words of quality copy (content) on it. Search engines love content, especially when it is peppered with your well-built keyphrases.

The key is to get your keyphrase on your page as often as possible, and in as many variations as possible, to cover any possible search for similar topics, while enticing the user to click further into your site. Some engines will rank you higher on a particular search if the keywords are clustered closely together.

Try to include your keyphrases as creatively as possible, in as many places as possible. Use them in "alt" tags for images. Use them in the links on your pages. Some proponents claim the "density" (how often the keywords appear, and how close to each other) is important.
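As a rough sketch, that "density" notion can be estimated with a short Python function. The per-100-words measure and the sample copy below are purely illustrative; no engine publishes its actual thresholds:

```python
# Sketch: a crude keyword-density estimate for a page's visible copy.
# "Density" here is the percent of the page's words taken up by the
# keyphrase; the sample text is invented.

import re

def keyword_density(text, phrase):
    """Percent of the text's words accounted for by occurrences of phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words) if words else 0.0

copy = ("Our Dallas tax accountants prepare returns year round. "
        "Ask our tax accountants about small business bookkeeping.")
print(keyword_density(copy, "tax accountants"))  # → 25.0
```

A number this high would read as stuffing on a real page; the point is to measure and compare drafts, not to chase a target.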

Keep the first paragraph as close to the <body> tag and the top of the page as possible. Limit graphics or other HTML in front of your first paragraph. Also, use an <h1> (or similar heading) tag to identify the purpose of the page or emphasize your opening sentence.

Meta & other Hidden HTML Tags: Meta tags were originally included for search engines to categorize web pages. These tags are invisible to browsers, but are designed to include keywords and keyphrases for web page categorization.

There was a time that all you needed to be included in a search engine was the information in the meta tags. This has been so abused that now most search engines ignore much of the meta tag information, and use the actual information in the body of the site, and the title.

Developers would "spam" (repeat) keywords in meta tags, since they were not visible, to draw more hits for a certain topic. Now sites are penalized for using a keyword more than twice. Using a keyword twice (not in succession) is a method many developers stand by, however.

Description Meta Tag: This is likely the most important meta tag. The beginning text of the description meta tag shows up under the linked title text in a search engine result.

Try not to repeat words in this tag; however, you can use various forms of words in this tag, i.e., plural/singular, "ed" or "ing" forms of words, etc. Always make sure this tag is an actual sentence, not just a list of keywords.

Regarding rank, the search engines don't give this tag nearly as much weight as they give the title tag. However, some engines do index the words in this tag, and therefore it is important to get some keywords into it. Here’s an example:

<meta name="description" content="Nobscot's online exit interviews streamline your exit interview process to improve employee retention and reduce employee turnover. Nobscot Enterprises advises all businesses on personnel management issues in Seattle and the greater Puget Sound area including Tacoma, Olympia and Everett" />

Never insert the same word twice in a row in this tag, even if you're using different variations. (Plurals, ALL CAPS, different tenses, etc.) You can use the same word in different phrases, but never use that word more than three or four times within the tag, even if you're using different variations of it.

If the client is local to a particular area, add the city, state or other area identifiable info (“homemade muffins in West Seattle”, etc.)

Keywords Meta Tag: Put the keywords from the title of the page in the meta keyword tag. The first words in any tag are assumed to be given more weight, so these are most important.  Also place other keywords you generated here.  

After every important word or phrase from the text is included, add some common misspellings of some of these same words. An example from a popular Sauna building company:

<meta name="keywords" content="sauna,airwall,airwall saunas,american sauna,buy a sauna,buy a sauna kit,steam sauna,do it yourself sauna,sauna kit,european sauna,healthmate sauna,home sauna,home sauna kit,infra red,infrared sauna,infrared sauna heater,portable sauna" />

Note that this company disregarded the suggestion to only put words once, but they included the same word repeatedly in different contexts, between commas.

Robots Meta Tag: The robots meta tag indicates to search engines whether a page should be indexed or not and whether the links on the page should be followed and indexed as well.

The content of the robots meta tag contains directives separated by commas. To index the current page, and follow all links:

<meta name="robots" content="index,follow" />

To disallow indexing this page, but follow the links on the page:

<meta name="robots" content="noindex,follow" />

To disallow indexing this page and following any of its links:

<meta name="robots" content="noindex,nofollow" />

To index this page, but not follow its links:

<meta name="robots" content="index,nofollow" />

When we are building a site for a client on our web server, it's important to be sure all our pages have the noindex,nofollow designation to keep our pages from being indexed during site construction!

robots.txt & Sitemap.xml: Before you have your site indexed, you may want to specify pages that you do not want the engines to index (crawl and categorize). There are two different files we can use to help search engines index certain pages and totally avoid others.  These files are named robots.txt and sitemap.xml

To use a robots.txt file or Sitemap.xml to help search engines index our site properly, we must place the file in the web root of the server space.

If you do not have access to the root directory, you can still specify the action of search engines in individual pages via the robots meta tag. This would be the case if more than one company shared a web domain, which is not a good idea for search engine optimization.

If you do have access to the "root" directory of your server, you can choose to exclude files that are private, back-end oriented, login only, etc. This is done by including a special text file, named "robots.txt", that is placed in the root directory (the top-level directory) of the web site.

In the robots.txt file, you indicate which pages or directories to exclude for indexing, and you can indicate specific search engines.

To exclude ALL search engines from your site (which you would NOT want to do), the file would only include:

User-agent: *

Disallow: /

This indicates all engines (User-agent, with the wildcard "*") and that the directory to be disallowed (not indexed) is the root directory, or "/", the top-level directory. Since the top-level directory is disallowed, NO OTHER directory stemming from it will be indexed either.

This example demonstrates how powerful this technique is, so it must be used wisely.

Here is an example of being more selective about which pages to exclude:

# /robots.txt file for

# mail for constructive criticism

User-agent: webcrawler

Disallow:

User-agent: lycra

Disallow: /

User-agent: *

Disallow: /tmp

Disallow: /logs

The first two lines, starting with '#', are comments. The first section, naming 'webcrawler', has nothing disallowed: all pages may be indexed.

The next two lines indicate 'lycra' has all relative URLs starting with '/' disallowed. Because all relative URLs on a server start with '/', this means the entire site will not be indexed.

The last three lines indicate all other engines should not visit URLs starting with /tmp or /logs.

Note the '*' is a special token, meaning "any other User-agent"; you cannot use wildcard patterns in either User-agent or Disallow lines.

A drawback of this single-file approach is that you must have access to the root directory of your server. If you are sharing the server with other sites, all sites must be addressed inside the one robots.txt file. If a robots.txt file is empty, it will be ignored.

Individual files can be excluded. Here is an example to exclude one file from all engines:

User-agent: *

Disallow: /administrators.htm

Multiple files and directories can be singled out, and excluded in this way.
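For instance, a robots.txt excluding several hypothetical private files and directories from all engines could read:

```
User-agent: *
Disallow: /administrators.htm
Disallow: /logs/
Disallow: /cgi-bin/
```

Each Disallow line names one path prefix; everything else on the site remains open for indexing.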

Sitemap.xml: Placing a file named Sitemap.xml (capital S) in the root of your server space takes an opposite approach.  Instead of disallowing pages from being searched, it identifies pages you wish to be searched:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- created with Free Online Sitemap Generator -->
  <!-- example.com below is a placeholder domain -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/services.php</loc>
  </url>
</urlset>
This is a standard mechanism used by Google.  It is particularly useful for dynamically generated pages, or those not linked in an obvious manner from other pages.  It also tells the search engine specifically which pages to index, so that it will avoid the others.  I recommend using both Sitemap.xml and robots.txt, as they do different things and Sitemap.xml may only apply to Google.

To build a sitemap, go to a free online sitemap generator. Then edit it as appropriate: Online Sitemap Generator

More about Sitemaps

Canonical URLs: When we start to link and get our pages indexed, we can use the optional www in our links or not.  However, for search engine purposes, we should stick with one or the other!  To have a split between www and non-www links is to potentially split our search results between our own pages!  View the following from Google's Matt Cutts on Canonicalization
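One common way to enforce a single form is a server-side 301 redirect. This sketch assumes an Apache server with mod_rewrite enabled, and example.com is a placeholder:

```apache
# .htaccess: send every www request to the non-www canonical address
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The same pattern works in reverse if the client prefers the www form; the point is to pick one and redirect the other.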

Clean The Site Up for Submission:  Be sure your site pages validate via the W3C Validator.  Next check to be sure you don’t have broken links via the W3C Link Checker

Search Engine Optimization Scanners: Once you believe you've built a search engine friendly page, try one or more of the following SEO scanners to get further advice:

Feel free to search for other such tools.  Once you've identified tactics to employ to improve the SEO capabilities of your site, write down the things you've done to improve and see if you can improve your results!

Match Current Results Against Baseline: Once you've applied all these techniques to the client's site, it's time to see if there has been a difference in rank.  Again check for keyword searches and print the results.  Have they improved? How much? Document the results for your client and for your resume!

Hopefully you have already signed up for Google Webmaster Tools on behalf of your client.  If so, compare the previous results with your current results.  If you have not seen an improvement, address the items found via the many tools provided.  For an overview of tools see SEOMOZ's Updated Guide to GWT

Site Submission: Once you're satisfied that you've improved your client's site page rank sufficiently, you may wish to submit your site officially to search engines.  As of 2012 it may be less necessary; however, there is no reason to tempt fate!  Also, if a site is going live for the first time, we should definitely submit the site to Google for indexing:

The first page you will submit, and the most important page to target will be your index or default homepage.

For best results, this page should not have any redirects (automatic redirection to a different site), dynamic linking (ampersands, question marks, etc. in the link) or Flash on it.

To submit your site to Bing, you will likely need to sign up for their webmaster tools first.  Here’s the link to Site Submission for Bing

Remember that search engines penalize for frequent submission, automated submission, or re-submission of sites with no content changes.

Each engine is different, and each must be researched for each submission! This is one reason for keeping an accurate log of your submissions.

On a redesign, it is not the design that must change here, but the words, especially the content and possibly the keyphrases.

Changing and adding quality content is key to success, both for the robots and the humans involved, so don't let your site sit stagnant.

Keep diligent track of the dates, methods, changes, etc. of submissions you have sent. Track your traffic, both before and after submission. Check your traffic as you submit to more engines.


After Submission: Once the site has been submitted, you can watch the server log files and do test searches to see where the site pages rank after the submission.  Be sure to store PDFs of the search pages, highlight where the site now ranks, and compare against your baseline.  This process may take a week (or weeks) as no search engine guarantees how soon (or if) a site will be indexed!

Link Promotion: Another effective tactic for generating traffic and getting more highly placed is to have as many inbound links to your site as possible from other highly ranked sites.  Link Promotion is the process of getting your site linked by as many quality sites as possible.

Search for, and send thoughtful emails to, webmasters whose sites complement yours, and invite them to trade links (link to each other's sites).

Having a site linked by quality sites is the newest way Google and other search engines are ranking the quality of sites for their customers.  Here's how this works: PageRank

Keep It Organic: The SEO tactics on this page focus on what is simple and honest.  This approach has been deemed organic search engine optimization, in contrast to pay-per-click and other techniques that may gain greater but also potentially temporary results.  Leave such tactics to the professional SEO specialists!

Stay Informed: SEO is a complicated and ever changing process. It is difficult to keep on top of, and many sites opt to hire a specialist. Even if that is the eventual choice, knowing about the process will help you select the proper firm to hire, as there are many questionable and harmful practices. The damage done by improper submission can take years to correct!

SEO Per Google: View the following for a great overview of Search Engine Optimization from Google's point of view: Google SEO Starter Guide

SEO Presentation Slides: Here’s a link to our SEO Presentation Slides

Original SEO Page: Sara & Bill have more detailed info on an original SEO page