Friday, March 9, 2007

Choosing Keywords Efficiently for SEO

Search Engine Optimization (SEO) is paramount for businesses that choose to make themselves known via the Internet.

Keyword research and, ultimately, keyword selection are extremely important parts of the overall SEO process. Selecting keywords relevant to your field of activity is essential. Make sure you conduct proper, in-depth keyword research before you begin placing keywords on your web site.

If you want your web site to rank high in search engines, find out which words searchers use to get to your site and place them in the appropriate places.

At the same time, avoid the trap of placing too many keywords on a single page: many search engines now use spam filters, and a page that trips them might not appear among the search results at all.

Below are some tips on how to perform proper keyword research and how to use the keywords you have selected to optimize your site and rank it as high as possible in the most commonly used search engines. Always be aware that keyword selection is fundamental. Choosing keywords that people are unlikely to use to find your site and your products/services will not bring you the traffic you expect.

What you need is relevant traffic, namely visits that will eventually convert into sales/contracts. The higher the conversion rate is, the better for you and your business.
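As a quick sketch of the arithmetic (all the visit and sale figures below are invented for illustration), a narrowly targeted keyword can beat a much broader one on conversions:

```python
def conversion_rate(visits, sales):
    """Percentage of visits that convert into sales/contracts."""
    if visits == 0:
        return 0.0
    return 100.0 * sales / visits

# Hypothetical figures: a broad keyword vs. a targeted one.
print(conversion_rate(5000, 25))  # broad keyword: 0.5 (%)
print(conversion_rate(800, 24))   # targeted keyword: 3.0 (%)
```

The second keyword brings far fewer visitors, yet almost as many sales, which is exactly the "relevant traffic" the article is after.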

Given the fact that the main purpose of a web page is to provide information, and that the Internet offers a vast array of choices for prospective customers, it is of utmost importance to have the right keywords embedded in your web pages. To achieve this, there are some steps that you need to follow.

Brainstorming

The first step is simply to sit down and write down keywords. Write everything; do not discard any idea that comes to mind. You will prune that extensive list later on. Consult your old web site (if there is one) and even documentation from the sales department. Consult your internal search logs (the logs on your server) to track where your visitors came from and, if possible, extract more keyword combinations. Essentially, you will find out what you are missing. Even if you're not sure a certain keyword will prove relevant for your purpose, write it down anyway. It may lead you to something more closely related.
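For the server-log part, a small script can pull search phrases out of referrer URLs. This is only a sketch: the sample URL is invented, and the `q`/`p`/`query` parameter names are assumptions based on how the major engines passed queries at the time.

```python
from urllib.parse import urlparse, parse_qs

def search_terms_from_referrer(referrer):
    """Extract the search phrase from a search-engine referrer URL.
    Major engines carried the query in a 'q' or 'p' parameter."""
    query = parse_qs(urlparse(referrer).query)
    for param in ("q", "p", "query"):
        if param in query:
            return query[param][0].lower()
    return None

# Hypothetical referrer line taken from an access log:
ref = "http://www.google.com/search?q=cheap+web+hosting&hl=en"
print(search_terms_from_referrer(ref))  # cheap web hosting
```

Running this over a month of logs and counting the phrases gives you a ready-made list of keywords that already work for your site.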

If you feel you have writer's block, hold a brainstorming session. The main purpose of this session is to make you and the other participants think outside the box. Interact with people, ask for their opinions and get their feedback. You can get invaluable ideas and suggestions from a multitude of sources, such as your colleagues, clients, or friends. Remember to always think like a customer; this is a rule of thumb in marketing. Remain focused on the prospective customers and try to understand how they think.

Keep in mind that searchers often mistype words when searching the Internet. Try to think of the most common spelling mistakes for your keywords and include them in your list. Don't neglect words that are written sometimes with a hyphen and sometimes without, or with or without spaces between them. Add them to your list too.
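Generating the hyphen/space variants mechanically is straightforward; a minimal sketch:

```python
def keyword_variants(phrase):
    """Generate hyphenated, spaced and joined variants of a phrase."""
    words = phrase.split()
    variants = {phrase}
    if len(words) > 1:
        variants.add("-".join(words))  # e.g. "web-site"
        variants.add("".join(words))   # e.g. "website"
    return sorted(variants)

print(keyword_variants("web site"))
# ['web site', 'web-site', 'website']
```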

Another thing to pay attention to is the language your target audience uses. Take jargon into account, and make use of it if you consider it necessary.

Finally, begin pruning your list. Remove any words that, even if they describe your products/services perfectly, are simply too common or aren't searched for often enough. Work on your list until you get a set of keywords that you consider the closest match to what you are looking for.

Keyword Research Tools

Now that you have narrowed down your keyword list, it's time to evaluate the result. Technical assistance is welcome, and help is available online: there are numerous resources and applications that will help you in your endeavor, free of charge or for a fee. Nevertheless, remember that no tool is perfect. Don't be afraid to use your own intelligence and common sense. These tools will help you select which keywords to use, but they will by no means predict the amount of (relevant) traffic you'll get by using them.

Free


  • Overture Inventory (currently a Yahoo! Search Marketing product) - generates a list of related searches that include your term and also shows how many times the term was searched for during the last month.
  • Related Pages - generates a list of possible keyword combinations based on lists of keywords that you provide.
  • Google's Keyword Tool - generates potential keywords for your ad campaign and reports their Google statistics, including search performance and seasonal trends.
  • Google Suggest - offers suggestions as you type.
  • SEO Book Keyword Suggestion Tool - shows top keyword phrases from Overture; links to other keyword research tools and to various vertical databases, blog searches and tagging systems to show you top results from other information sources.
  • Digital Point Keyword Research Tool - shows the results of your query from both Wordtracker and Overture, for determining which phrases are searched for most often.
  • Good Keywords - free Windows software for finding the perfect set of keywords for your web pages.

Fee Based

  • Wordtracker (free trial also available) - discover the best keywords to target on your website.
  • KeywordDiscovery (free trial also available) - find the best keywords for your website.
  • Keyword Intelligence - drive high-quality customers to your website using the top search terms used successfully by millions of people across all major search engines, including Google, Yahoo! and MSN.

When you consider that you have narrowed down your list to what you were looking for, submit each of the remaining keywords to some of the most popular search engines and examine the results. If the results consist of web pages that are not even remotely related to your own, it is a sign that you are still a long way from the expected result and have some more work to do.

Competition Analysis

View Source Code

The simplest way is to view the source code of your competitors' web pages. This does not mean copying what they have (which may not even be possible if it is trademarked), but it will help you gain more insight into what you need to include in your own source code. Observe the keywords they have used in places such as the title, headings and description, and use this knowledge to generate new ideas of your own.

Google

Another method is to enter the keyword you consider "primary" for a specific page (the one around which the whole optimization process revolves) into a search engine such as Google.

This is the toughest search engine at present, and most SEO techniques nowadays are designed with it in mind. (You can read our article about it here: Is Outsourcing SEO Services a Good Idea for Your Business?) Focus your attention on the number of "hits" that Google returns for your keyword. It will probably be a rather large number, so the next step is to refine your search and try to get fewer and fewer hits for your keyword. Two methods you can use are the "all-in-title" method and the "all-in-anchor" method.

All in Title

This method targets the keyword embedded in the title tag of a web page. Type allintitle: "your keyword" in the search field of your search engine, and it will return the sites that contain your keyword in their titles.

All in Anchor

An anchor is the clickable part of a web page (it has a link associated with it, and it can be either an image or some text). To find out which of your competitors use this keyword placement technique, type allinanchor: "your keyword" in the search field of your search engine. The results are the sites whose anchor text contains your keyword.
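If you check many keywords, you can build these operator queries programmatically. A sketch, assuming the Google search URL format in common use at the time of writing:

```python
from urllib.parse import quote_plus

def operator_query(operator, keyword):
    """Build a Google search URL for a competition check using the
    allintitle: or allinanchor: operator."""
    return "http://www.google.com/search?q=" + quote_plus(
        '%s:"%s"' % (operator, keyword))

print(operator_query("allintitle", "keyword research"))
# http://www.google.com/search?q=allintitle%3A%22keyword+research%22
```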

Online Help

Of course, you can analyze your competition by means of other resources available online, such as:

  • GoogSpy - free; shows a small sample of the terms that competing sites use to get ranked in search results.
  • AdGooRoo.com - fee-based, but a trial version is also available.
  • HitWise - fee-based.

Trends Monitoring

If you want to perform an analysis and a comparison of the searchability of your selected keywords, then you can use Google Trends, which analyzes a portion of the Google web searches and calculates how many searches have been done for the terms that you enter relative to the total number of searches done on Google over time. A graph is displayed where you can see the interest taken in your keywords. Use this information to make adjustments and to improve your keyword selection and placement strategy.

Analytics

Keep track of your visitors and the performance of your keywords by using Google Analytics. Don't expect to do great from the very beginning: keyword placement is a process that never stops. Analyze your results and learn from your mistakes, and keep experimenting based on the results you get. To gain more experience and knowledge, regularly check forums and consult people with experience. There's always something to learn or work on.

16 Rules of Thumb

There are some basic rules that you need to follow when choosing keywords in order to receive relevant traffic on your web site:

  • Don't use single words. Phrases are better than words.
  • Be relevant. Make sure that the description of your web site reflects what you have to offer.
  • Be a mind reader. Try to anticipate your prospective visitors' needs and meet them.
  • Be specific. Choose keywords with a narrower focus to face less competition.
  • Localize your keywords: make use of your geographical location, but avoid getting too specific. Also, be aware of the differences between words in different locations (e.g. elevator (US) - lift (UK), sweater (US) - jumper (UK)). Perform your optimization process accordingly to get the best results.
  • Choose a primary keyword for which to optimize your web page, and some secondary keywords that can be helpful in increasing the number of your potential visitors who use them in their search.
  • Use the available keyword research tools. They can be of great help, but dont neglect your own abilities.
  • Test your final list on Google (home page). It's time-consuming, but it's a must. Otherwise you will end up using keywords that have very little chance of ranking even reasonably well in search engines.
  • Make sure you use 3-4 different (but related) keywords on each of the pages of your web site. This approach has been proven to be quite successful.
  • Pay attention to your keyword placement. The best places in the HTML layer are title tags, header tags, ALT tags, anchor texts, bold/italic tag content, URLs, meta-tags and comments. You can use Page Primer, a tool that "gives you detailed, customized tips to boost your search engine ranking by analyzing your page and providing personalized recommendations for your individual pages".
  • Don't overlook misspellings of your selected keywords. Add them to your list.
  • Review and rewrite your site to include the keywords you have selected.
  • Use an internal site search function if you have a large site. Your visitors will navigate and find the information they're looking for much more easily, and you can see exactly which terms they search on.
  • Check keyword density (the number of times a keyword is used on a page divided by the number of words on that page). In theory, the higher it is, the better, but this can turn against you: you risk getting penalties from search engines.
  • Target highly searched terms that have as little competition as possible. Nevertheless, if you happen to find a keyword with little competition, don't include it on your web site for that reason alone.
  • Focus on the conversion rate: the percentage of Internet users that perform searches using the keywords that you have selected and convert into buyers (of goods or services).
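The density check from the list above is easy to automate for single-word keywords; a minimal sketch (the sample sentence is invented):

```python
import re

def keyword_density(page_text, keyword):
    """Occurrences of a single-word keyword divided by the total
    word count of the page, as a percentage."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

text = "SEO tips: choose SEO keywords carefully and test your SEO pages."
print(round(keyword_density(text, "seo"), 1))  # 27.3
```

A figure that high on a real page would almost certainly look like keyword stuffing, which is exactly the penalty risk the rule warns about.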

Finally, make sure that you have seen this process through and that you haven't overlooked any of the steps and procedures recommended for SEO. And don't forget that less traffic with more sales is always better than more traffic with fewer sales.

Process of SEO

There is a lot of speculation about how search engines index websites. The exact workings of the indexing process are shrouded in mystery, since most search engines offer little information about how they architect it. Webmasters get some clues by checking their log reports for crawler visits, but they do not know how the indexing happens or which pages of their website were actually crawled.

While the speculation about the indexing process may continue, here is a theory, based on experience, research and clues, about how search engines may go about indexing 8 to 10 billion web pages ever so often, and why there is a delay before newly added pages show up in the index. This discussion is centered on Google, but we believe that most popular search engines, such as Yahoo and MSN, follow a similar pattern.


Google runs from about 10 Internet Data Centers (IDCs), each with 1,000 to 2,000 Pentium 3 or Pentium 4 servers running Linux.

Google has over 200 (some think over 1,000) crawlers/bots scanning the web each day. These do not necessarily follow an exclusive pattern, which means different crawlers may visit the same site on the same day, not knowing other crawlers have been there before. This is probably what produces a daily visit record in your traffic log reports, keeping webmasters very happy about the frequent visits.


Some crawlers' only job is to grab new URLs (let's call them URL grabbers for convenience). The URL grabbers collect the links and URLs they detect on various websites (including links pointing to your site) and the old and new URLs they detect on your site. They also capture the date stamps of files when they visit your website, so that they can identify new or updated content pages. The URL grabbers respect your robots.txt file and robots meta tags, so that they can include or exclude the URLs you do or do not want indexed. (Note: the same URL with different session IDs is recorded as several unique URLs. For this reason, session IDs are best avoided; otherwise the pages can be mistaken for duplicate content.) The URL grabbers spend very little time and bandwidth on your website, since their job is rather simple. However, just so you know, they need to scan 8 to 10 billion URLs on the web each month; not a petty job in itself, even for 1,000 crawlers.
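For reference, a minimal robots.txt (the paths here are placeholders, not recommendations for any particular site) that lets crawlers in while keeping script-generated and search-result URLs out looks like this:

```
# Applies to all crawlers
User-agent: *
# Keep script-generated and session-ID pages out of the index
Disallow: /cgi-bin/
Disallow: /search
```

The page-level equivalent is a robots meta tag, such as <meta name="robots" content="noindex, nofollow">, placed in the page's head section.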

The URL grabbers write the captured URLs, with their date stamps and other status information, into a Master URL List, so that they can be deep-indexed by other, specialized crawlers.

The master list is then processed and the URLs classified somewhat like this:

a) New URLs detected
b) Old URLs with new date stamp
c) 301 & 302 redirected URLs
d) Old URLs with old date stamp
e) 404 error URLs
f) Other URLs
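A toy sketch of that classification step (the categories are the article's; the logic is our guess at what it describes, not Google's actual code):

```python
def classify_url(status, old_stamp, new_stamp):
    """Assign a crawled URL to one of the categories above.
    Date stamps are simple YYYYMMDD integers; old_stamp is None
    for URLs never seen before."""
    if status == 404:
        return "404 error URL"
    if status in (301, 302):
        return "301 & 302 redirected URL"
    if old_stamp is None:
        return "New URL detected"
    if new_stamp > old_stamp:
        return "Old URL with new date stamp"
    return "Old URL with old date stamp"

print(classify_url(200, None, 20070309))      # New URL detected
print(classify_url(200, 20070101, 20070309))  # Old URL with new date stamp
```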

The real indexing is done by (what we're calling) Deep Crawlers. A deep crawler's job is to pick URLs from the master list, deep-crawl each one and capture all the content: text, HTML, images, Flash, etc.

Priority is given to ‘Old URLs with new date stamp’, as they relate to already indexed but updated content. ‘301 & 302 redirected URLs’ come next in priority, followed by ‘New URLs detected’. High priority is given to URLs whose links appear on several other sites; these are classified as important URLs. Sites and URLs whose date stamp and content change on a daily or hourly basis are flagged as news sites, which are indexed hourly or even on a minute-by-minute basis.

‘Old URLs with old date stamp’ and ‘404 error URLs’ are skipped altogether. There is no point wasting resources indexing ‘Old URLs with old date stamp’, since the search engine already has their (unchanged) content indexed. ‘404 error URLs’ are URLs collected from various sites that turn out to be broken links or error pages; they do not show any content.

The ‘Other URLs’ category may contain dynamic URLs, URLs with session IDs, PDF documents, Word documents, PowerPoint presentations, multimedia files, etc. Google needs to process these further and assess which ones are worth indexing, and to what depth. It perhaps allocates the indexing of these to special crawlers.

When Google schedules the Deep Crawlers to index ‘New URLs’ and ‘301 & 302 redirected URLs’, just the URLs (without descriptions) start appearing in the search engine results pages when you run the search site:www.domain.com in Google. These are called supplemental results, which means the Deep Crawlers will index the content as soon as they get the time to do so.

Since Deep Crawlers need to crawl billions of web pages each month, they can take as long as 4 to 8 weeks to index even updated content. New URLs may take longer.

Once the Deep Crawlers index the content, it goes into their originating IDCs. The content is then processed, sorted and replicated (synchronized) to the rest of the IDCs. A few years back, when the data size was manageable, this synchronization happened once a month and lasted about five days; it was called the Google Dance. Nowadays, synchronization happens constantly, which some people call Everflux.

When you hit www.google.com from your browser, you can land at any of the 10 IDCs, depending on their speed and availability. Since the data at any given time is slightly different at each IDC, you may get different results at different times, or on repeated searches of the same term (the Google Dance).

The bottom line is that you may need to wait as long as 8 to 12 weeks to see full indexing in Google. Consider it cooking time in Google's kitchen. Unless you can increase the importance of your web pages by getting several incoming links from good sites, there is no way to speed up the indexing process (unless, of course, you personally know Sergey Brin and Larry Page and have significant influence over them).

Dynamic URLs may take longer to index (and sometimes do not get indexed at all), since even a small amount of data can generate unlimited URLs, which can clutter Google's index with duplicate content.

What to do:

Make sure you have cleared all roadblocks for crawlers, so that they can freely visit your site and capture all your URLs. Help crawlers by creating good interlinking and sitemaps on your website.
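A minimal sitemap in the sitemaps.org XML format (example.com and the dates are placeholders) gives crawlers an explicit list of your URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-03-09</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/services.html</loc>
    <lastmod>2007-02-20</lastmod>
  </url>
</urlset>
```

The lastmod values also give the URL grabbers the date stamps they rely on to spot updated content.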

Get lots of good incoming links to your pages from other websites to improve their importance. There is no special need to submit your website to search engines: links to your website on other websites are sufficient.