Google Keyword Sandbox:
Free tool from Google AdWords.
Offers likely synonyms to the word you type in, but does not approximate traffic.
To approximate traffic you would need to set up a Google AdWords campaign and track the number of times your ad displays.
Be careful in doing this because it can get rather expensive if you create random ads for the wrong words and/or bid highly for your keywords.
WordTracker:
Using the above tool you can get a good idea of which words would be good to target.
In addition, there is a tool called WordTracker which is slightly more robust.
WordTracker takes sampled data from a couple meta search engines and projects future search rates for different words.
The data pool WordTracker uses offers better data since it separates plural listings from singular versions and also tracks meta search clickthroughs rather than tracking search engine ads.
Since WordTracker makes money by providing accurate statistics rather than by selling keywords, their traffic numbers tend to be a more fair representation of actual web traffic.
Keep in mind that their meta search user breakdown might be different than the cross section of normal web surfers, and very low search counts will likely have many anomalies.
Downloadable Keyword Software:
I usually do not recommend many downloadable software tools, but Good Keywords is free and offers some useful features.
I do not use it often, but it saves your keyword searches and can be well worth the free download for doing preliminary keyword research.
Checking Keyword Competition:
Many people will look at the number of pages listed for a phrase and think that is a fair estimate of competition level.
It is not.
That is just a measure of how many pages have those words somewhere in the content or in links that are pointing at their pages.
A better measure of competition is to search for "keyword A keyword B" in quotes, as that will at least give you the number of pages which have that exact phrase on them.
You also can further target your competition estimation by using allintitle: and allinanchor: search functions.
Pages which have your keyword phrases in their title may be optimized, and pages which have them in their inbound links stand a good chance of being fairly well optimized.
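For example (using a hypothetical target phrase), searching Google for
allintitle:used books
returns only pages with both words in the page title, while
allinanchor:used books
returns only pages with both words in the anchor text of their inbound links.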
The best way to know what your competition level is though is to just look through the search results.
Off the start it might be a little hard to read them, but over time as you look through them you will get better at seeing how well optimized they are.
Some signs of a competitive marketplace are when you notice many lead generation type websites, many exceptionally smooth websites, or high bid prices on those keywords in the top pay per click search engines.
You can get an (extremely) rough approximation of the value of a top listing on major search engines for a keyword by looking at the top listings using the Overture view bid tool and looking at their search frequencies with WordTracker.
The Tail of Search:
Many people feel the need to rank for a broad generic term and optimize exclusively for that term.
The problem with this is that around half of all search queries are unique each day.
If you were trying to rank well for "used books" you should cater to a variety of terms around that idea, such as used book store, buy used books, used book search, etc.
Need Help with Keyword Selection?
Meta Tags
When people refer to meta tags they are talking about the meta description and meta keyword tags.
Some search engines may display the meta description as part of the search results, but the meta keyword tags should not appear in search results.
For more on meta tags visit:
http://www.hitwalker.nl/SEO-simple-steps.html
Robots Exclusion Standard:
When primitive robots were first created some of them would crash servers.
A robots exclusion standard was crafted to allow you to tell any robot (or all of them) that you do not want some of your pages indexed or that you do not want your links followed.
You can do this via a meta tag in the page copy <meta name="robots" content="noindex,nofollow"> or by creating a robots.txt file which tells the robots where NOT to go.
The Robots Exclusion Protocol is a method that allows web site administrators to indicate to visiting robots which parts of their site should not be visited by the robot.
In a nutshell, when a robot visits a web site, say http://www.happynewyear.com, it first checks for http://www.happynewyear.com/robots.txt.
If it can find this document, it will analyze its contents for records like:
User-agent: *
Disallow: /
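The record above tells every robot to stay out of the entire site.
A less drastic record (the folder names here are only examples) blocks robots from specific folders while leaving the rest of the site open:
User-agent: *
Disallow: /cgi-bin/
Disallow: /temp/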
Broken Links in Your Site:
Many directory editors and site visitors will quickly grow disinterested with your site if it is full of broken links.
Some directory editors will run a link checker on your site in the background while they review the content.
The Internet is dynamic and ever changing, and some of your links may break from month to month.
I recommend checking your site for broken links before submitting it to any of the major directories.
Xenu Link Sleuth is a free downloadable link checking program which can even help you quickly build a site map.
When Broken Links are OK:
If your site is a clearly dated news site then you do not need to go back to edit all of your links as sites around the web change.
Navigation
Effective navigation should let a user know:
- What site they are on.
- Where they are in that site.
- Where they have been.
Navigation and Search Engines:
Good navigation helps the search engines better understand the site structure as well as helping site users.
Typically your most important documents will have the greatest number of inbound links.
Often people will use tabs or images for their links which have a minimal amount of descriptive text in them.
You can offset this by using descriptive text links in the page footer.
It is common to have one set of navigation that is used by site visitors and another that is used by search engine spiders.
Effective navigation can also increase your keyword density by placing relevant optimized link streams on the page.
Proper navigation also gives you exceptionally keyword rich internal links.
A popular technique for doing this is using breadcrumbs.
Setting up navigation looks professional, helps the user, and improves your rankings.
You can't beat that with a stick!
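As a rough sketch (the site structure is hypothetical), a breadcrumb trail is simply a line of text links mirroring the page's position in the site:
<a href="/">Used Book Store</a> > <a href="/fiction/">Fiction</a> > <a href="/fiction/mystery-books.htm">Mystery Books</a>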
Dynamic Navigation:
Often sites use JavaScript and other client side navigation.
Search engines struggle to follow things that happen on the client side (or in the browser).
You can tell if a site's navigation is client side by viewing the source, or by turning off Java and active scripting and then reloading your document.
If you feel you must use it make sure you add static text links to the bottom of your pages.
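For example (page names are hypothetical), a plain set of static footer links might look like:
<a href="used-book-store.htm">Used Book Store</a> | <a href="buy-used-books.htm">Buy Used Books</a> | <a href="site-map.htm">Site Map</a>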
Site Map:
It is also a good idea to have a site map, linked to from the home page, which links to all major internal pages.
The idea is to give search engine spiders another route through your site, and to give users a basic way to flow through your site if your navigation is broken or confusing.
The site map should be:
- quick loading
- light on graphics
- overtly simplistic
I usually title my site map as "site map."
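A site map can be little more than a plain list of descriptive text links, something like this sketch (the pages are hypothetical):
<ul>
<li><a href="used-book-store.htm">Used Book Store</a></li>
<li><a href="buy-used-books.htm">Buy Used Books</a></li>
<li><a href="used-book-search.htm">Used Book Search</a></li>
</ul>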
Sometimes when people optimize their site map it lists above the other pages in their site since it has so many keyword dense text links on it.
The site map is not the ideal entry place into a web site.
Xenu Link Sleuth checks for broken links and can also help you quickly build a site map.
Optimize Each Page:
One of the most important things to get across is that each page is its own unit and has its own ranking potential and its own relevant keywords.
Usually a home page has more value than the other pages since it is where most other sites will link into your site.
Home pages should generally be optimized for the most competitive keyword phrases in your market.
Interior pages should be optimized for other relevant phrases.
There are a ton of things to optimize on each page.
We already spoke of how to choose your keywords, page titles, etc.
Within each page there is also a ton of content that can be optimized.
On the Page Optimization Only Goes So Far:
When optimizing a page for competitive terms the bulk of the ranking algorithm will be based upon link analysis.
Effective link building has no limit to how much it can help your rankings.
Some people think that more is better and more is better and more is better.
This is usually true with inbound linking, but is not true with on the page keyword density.
The algorithms for grading page copy are based on a bell curve.
After your keyword density rises past a certain point, adding more keywords does not improve your rankings any further and can eventually start to erode your rankings.
Each search engine has its own bell curve, and they do not all align with one another.
Thus the most effective way to improve your rankings on all search engines will be via link building, but proper page structure and on the page optimization still play an important role in gaining targeted inbound traffic (especially for uncompetitive keyword phrases or in search engines that rely heavily on page contents).
Text is Important:
Almost every page is going to have navigation and decoration.
It's impossible to have just one thing (usability, copywriting, SEO, etc.).
Building a page and a site is a balancing act.
The portion of the page that matters most, and that you have the most control over, is the text.
Some places practice so much SEO that the copy reads like rubbish.
Obviously, that is no good. Traffic means nothing if people do not convert.
Use Keywords in Headings:
Use the keywords in headings and subheadings throughout the page - this heading should capture the person's attention and tell them they are in the right place.
<H1>SEO Copywriting - Optimizing Web Pages for Search Engines</H1> would be a classic straight SEO approach.
Depending on competition levels you may wish to use something with a call to action as well.
For non competitive phrases it is easier to add more sales geared information to your headings.
Heading tags go from H1 to H6, with the smallest number being the biggest heading.
You can also change how the text appears using CSS.
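For example (a sketch), you can keep the structural weight of an H1 while toning down its visual size with a CSS rule such as:
h1 { font-size: 130%; }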
Typically think of these headings like you would a heading in a newspaper.
I usually try to get my keyword phrases and similar phrases in my page heading as well as subheadings.
The rest of the page copy is usually written with sales conversion in mind, and I do not pay too much attention to optimizing it for search engines.
Natural writing will cause you to use your keywords throughout the text.
Be Creative:
There are so many creative ways to increase keyword density.
Again, assuming we wanted to target "eat cheddar" we could write the following:
Cheddar is one of my favorite foods to eat.
Cheddar is ...
Notice how the keywords overlap and are in different sentences.
There are many different ways to get your keywords in the content.
Spread Your Keywords Throughout The Page:
Some of the more recent algorithms may have the ability to look for natural language patterns.
In natural language, the different keywords in a keyword phrase will often appear far apart from one another.
To boost your rankings in these algorithms you will want to use the word eat in some spots and cheddar in other spots.
Often your keywords will appear next to each other naturally.
Some phrases like "peanut butter" often occur together, but in general not all of your occurrences of the keywords should be together.
Keywords at the Top of The Page:
Some people strongly believe that keywords at the top of the page and before your navigation enhance search engine rankings.
I honestly have never worried much about this as I assume the effect would be somewhat trivial in most cases.
It can easily be accomplished by writing a sentence above your branding images or through using a floating DIV or other CSS techniques.
When using tables some people use a blank cell technique to make the search engines see the body content before navigation.
If search engines place weighting on where the keywords are on the page then they use the order of the words in the actual page source code and not the visual display of the pages.
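As a rough sketch of the CSS approach (the widths are arbitrary), the body copy can come first in the source code while the navigation still renders on the left:
<div style="float: right; width: 70%;">Body copy with your keywords comes first in the source...</div>
<div style="float: left; width: 25%;">Navigation links...</div>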
Naming Filepaths:
Usually you want to use short file names and folder names so that the data is easy to transmit using various means such as email.
Long file paths may look a bit spammy to search engine editors or searchers looking through search results.
Generally you want to use one to a few keywords in each filename or folder.
Use lower case filepaths because some directories do not handle upper case filenames.
Separate words with a - (hyphen) between each word.
If you leave blank spaces it will look weird in the address bar, and if you use _ search engines will not be able to parse apart the individual words in each file name.
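For example (a hypothetical filename), a page about buying used books might be named buy-used-books.htm rather than buy_used_books.htm or "buy used books.htm".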
File names are not hugely important for SEO.
If your site is already built there is probably little reason to change filenames, but if you are making a new site it is worth the 5 seconds it takes to use keyword rich filenames.
Interacting with Search Engines:
Most pages that get submitted to search engines are junk.
There is no guarantee that your site will get included for free just by submitting it.
The best way to submit your site to search engines is by having them find links into your site from other sites.
There is no need to submit or resubmit your site to search engines over and over.
Yahoo! is the only major search engine to offer a paid inclusion program.
The Yahoo! paid inclusion program powers all Yahoo! Search properties.
In addition to a $49 inclusion fee, they also charge a category based price per click.
Prices for the inclusion programs vary from $30 to $50 on the major search engines.
Usually paid inclusion is not worth it for most sites.
Directories vs. Search Engines:
Search engines are operated by scripts and machine code.
Directories are human compiled lists of sites organized by categories.
Since directories are entirely human edited they take a ton of time and effort to maintain.
Whenever I create a new site I submit it to many directories.
A few of the larger directories are listed in the next section.
In addition here is a relationship chart to show how the largest search engines and directories interact.
When submitting to directories it is worth it to spend the extra time to ensure you are in the correct category and are following the directory guidelines.
Submitting to Search Engines:
You may want to pay to submit your site, but most search engines will list your site free.
The best way to get your site indexed is through having a search engine follow a link from another site.
The two most popular directories are DMOZ and the Yahoo! Directory.
The Open Directory Project:
The Open Directory Project (DMOZ) is free, but sometimes it can take a while to get listed, or you won't get listed at all.
DMOZ editors work free of charge and are under no obligation to list your website.
However, some believe (and have documented) that not everything works as it should at DMOZ: editors doing favors for friends, and editors who benefit a lot from having their own sites listed.
Evidence of this can be found by searching Google.
Ensure you take the time to submit your site to the right category and follow their directory guidelines.
If your site is not in English make sure you submit it to the world category.
Regional sites should be submitted to their respective regional category.
With the ODP you do not need to keep resubmitting over and over.
If for some reason your site cannot get listed after 30 days, ask at the Resource Zone and inquire about your site every six months thereafter.
Be aware that you do not have much freedom of speech on their Resource Zone forum; they may ban you if your questions are too direct.
The Yahoo! Directory:
Your site may list in Yahoo! powered search results even if you do not submit to their directory.
Yahoo! charges a $299 recurring annual fee for commercial sites (double that for adult sites), which is a bit expensive for many smaller sites.
Generally I recommend paying for placement in many second tier directories before paying for placement in the Yahoo! Directory, since most of the second tier directories only charge a one time site submission fee.
Non commercial sites can list in the Yahoo! Directory free, and I can attest to the fact that they have listed multiple sites I own for free.
Second Tier Directories:
Directories such as Gimpsy, GoGuides, JoeANT, BlueFind, Web Beacon and Skaffe all cost less than $50 to submit to.
JoeANT is free if you become an editor, and it only takes a couple minutes to sign up.
Gimpsy is free if you are willing to wait a few months.
Skaffe is free for editors.
GoGuides has a bulk submission discount program.
Wow Directory is another directory which has been providing free site submission.
Industry Specific Directories:
Business.com is a good business directory which costs $99 annually to list your site.
Microsoft also has a small business directory which may be a good deal at under $50 per year.
There are also many industry specific directories you can find by searching for terms such as "my keywords + add URL," "my keywords + submit," or "my keywords + directory."
Try to find directories which have one time submission fees or directories which look as though they are going to be longstanding directories.
It can get a bit out of hand trying to manage recurring fees with dozens and dozens of random small directories.
Tips to Pick Directory Categories:
Your site may fit in many categories.
When choosing a category to submit to in a directory I consider a few different questions:
- Is my site likely to be accepted if I submit to this category?
- Are there reasons this organization or other sites outside of this organization are likely to place extra emphasis on (link into) this category?
- How many links are listed in this category?
- Where does this category fit in the directory structure?
Directory Traffic:
Directories rarely provide much direct traffic.
The bulk of the value of a directory listing is in how search engines will evaluate the links.
Reciprocal Link Required:
Some directories require reciprocal links to be listed in them.
I do not recommend swapping links with most of these types of directories.
Directories are intended to list sites.
Sites are not intended to list directories.
If you like something then feel free to link to it, if not then don't.
The exceptions to this rule are that I am usually willing to reciprocate links with:
- extremely powerful sites that I do not believe are going to get penalized for aggressive link exchange
- directories which are well focused and are defined as an industry hub in the topic of your website
Directory Warnings:
Some sites that pose as directories do not provide static text links and / or their pages do not get indexed.
Many of these databases will never provide any direct traffic or link popularity.
Additionally many directories require reciprocal linking and use their sites to promote aggressive high margin products.
If you link into sites that primarily promote high margin items then you are sharing the business risk that site owner is taking.
Google Search Distribution:
Currently Google is powering around 50% of US search (Google, AOL, and many others).
Google shows up to 10 AdWords ads on their search results, but they keep them separate from the regular (or organic) listings.
There is no direct way to pay Google money to list in their organic search results.
So how does Google work?
Google is primarily link driven.
You want to build various keyword rich links into your site from a variety of websites hosted on a variety of C block IP addresses to improve your Google rankings.
What Pages of My Site are Indexed by Google?
You can check to see what pages of your site are indexed by searching Google for site:www.mysite.com mysite.
How do I Submit My Site to Google?
While Google also offers a free site submit option, the best way to submit your site is by having Google's spider follow links from other web pages.
Where do I Rank in Google for My Keywords?
Tracking various sites helps me determine some of the ways Google may be changing their algorithm.
Google Backlink Check:
Backlinks is another way of saying "links into a page."
When you check backlinks in Google (link:www.whateversite.com), it will show a small sampling of pages linking into the site in question.
Many of the other major search engines show a larger sampling of links when you use their link functions.
To get a more accurate picture of links you will also want to check backlinks in Yahoo!.
Yahoo often shows many backlinks that the Google search will not show.
The code to check Yahoo! backlinks is linkdomain:www.site.com.
Yahoo! Search Distribution:
Yahoo! technology now powers around 40% of US search including Yahoo!, MSN, AllTheWeb, AltaVista, and many other sites that syndicate or use portions of these search indexes.
Yahoo! places up to four Overture ads at the top and bottom of Yahoo! search results, and also places ads on the side of search results.
On some of their partner sites they usually blend these ads so that they look very similar to regular search results.
Yahoo! also has a paid inclusion program which allows them to generate revenue from the regular (or organic) listings.
Editorial Approach to Search:
Unlike Google, Yahoo! believes that a hybrid of human review and mathematics works better than just math alone.
Sites included in the Yahoo! Directory or in the Overture Site Match paid inclusion program are given an editorial review.
Yahoo! has also stated that some of their editors randomly review portions of the web.
It is believed that sites which receive a review may eventually be given a small ranking boost, though Yahoo! has stated that review does not affect relevancy.
What Pages of My Site are Indexed by Yahoo!?
You can check to see what pages of your site are indexed by searching Yahoo! for www.mysite.com.
While Yahoo! also offers a free site submit option (you must be logged in to use it), the best way to submit your site is by having Yahoo!'s spider follow links from other web pages.
Where do I Rank for My Keyword in Yahoo!?
Yahoo! Backlink Check:
Backlinks is another way of saying "links into a page."
When you check backlinks in Yahoo! (linkdomain:www.whateversite.com) it usually shows most of the known links into a site.
Often times Yahoo! counts many links that may not be counted by other search engines.
Yahoo! has also had some trouble with 301 redirects and may show the wrong URL locations for some of the backlinks.
Additionally you can check the backlinks into a specific page by specifying the full URL (linkdomain:www.whateversite.com/index.php) of whatever specific page you are looking at.
If most of a competitor's backlinks come from a single site or a few sites, you can filter those out of the search results using -site:www.thatsite.com -site:thatsite.com.
Teoma Search Distribution:
Ask Jeeves and Teoma have a distribution of around 5% of US search.
On the Ask Jeeves site they show many Google AdWords ads at the beginning of search results on highly commercial terms.
These ads look very similar to other listings on the page.
Ask Jeeves also sells banner based advertising and Kelkoo product ads for some of their more expensive words.
On the Teoma site they only list two ads and keep them separate from the regular results, but most of their traffic comes from their other search properties.
What is Teoma and How Does it Work?
Teoma is the search engine which powers Ask Jeeves.
The core of the Teoma search technology is based upon the idea that society and the internet consist of tiny communities which self organize into hubs and authorities.
Hubs and Authorities:
An authority is a site which is linked to by many sites covering that topic.
A hub links to many sites on a particular topic.
It is said that a good authority has links from many good hubs, and good hubs link to many good authorities.
Effect of Links on Teoma:
Teoma does most of its link work after the user searches, and primarily focuses on local communities, so:
- It is hard to measure link popularity in Teoma.
- Off topic links have extremely little effect on search results.
- On topic links are exceptionally important.
How to do Well in Teoma:
Since the primary focus of Teoma is on local communities it can take a long time to rank well in Teoma.
You will need to find ways to embed yourself in the correct local communities to list well for competitive terms.
Teoma has three sections to its search engine results pages.
The results section is filled with the sites that are considered the authority websites.
They also provide refine and resources sections.
Refine is a list of other terms in the associated term space; for example, cheese may have phrases like cheddar in the refine section.
Resources are the sites that are considered topical hubs.
Since Teoma is focused on local communities, the best way to rank well in Teoma is to link out to related sites.
Teoma and SPAM:
I actually do not know a bunch about their spam policies as it rarely comes up in discussion.
They have some of their policies listed when you apply for paid inclusion, and they provide an email address for reporting spam.
The Problems with Teoma's Technology:
Since Teoma is so focused on local communities it is very easy for people to spoof false topical authority onto a site by creating many sites within a specific theme that commonly link to the spoof site and other authoritative sites on that topic.
Since Teoma does not provide a ton of traffic, and Ask Jeeves throws a ton of ads on the top of their search results, there is less strain on their algorithm from people manipulating search results for profit than there is on Google or Yahoo!.
Avoid the Sure Downfalls:
Off the start you may be a little more desperate for links as you are learning SEO.
A good tip is to stay away from drug, gambling, and porn sites as they obviously destroy your credibility (and prevent others from wanting to exchange links with you).
Exchanging links with other sites that link to those types of networks may also hurt your site (since those sites may eventually be penalized).
Always use your own judgment as you will be the one footing the bill if the idea is wrong.
Using Common Sense:
If the site you are linking to has nothing to do with the topic of your site and no relation to what may interest your viewers then you probably do not want to link to them.
Carpet bomb linking strategies may be successful in the short term, but over time they will grow less and less useful to the point of eventual ineffectiveness.
There is also an indirect effect to linking.
If you link to really weird sites then it lowers the odds that other industry resources will want to link to your site.
Example of a Junk Link Request:
If you send junk link exchange requests like this they will usually be deleted as spam.
By the same token, if you receive a message like this you should delete it as spam.
Hello Sir / Ma'am
I was at your site today.
Great site I was wondering if you would want to exchange links.
I already have a link to you located at http://www.spamsite.com/links/reciprical-links-exchange2/other-sites37.htm
As you well know search engines look at links and give sites credit for their incoming links.
By linking together we make both of our sites stronger.
Please link to me with the following information
<a href="http://www.spamsite.com">Buy bla bla etc...</a>
Eventually Aggressive Promotion Techniques get Penalized:
Even if a bad site has decent PageRank you can usually bet that they will be losing it soon if they keep spamming people to exchange links with all kinds of random sites.
Link exchange works up until a point, and then eventually it becomes a link farm.
Link farm:
A link farm is a site with a bunch of completely unrelated links scattered about in no logical order.
Eventually overtly aggressive sites get penalized, but webmasters using those techniques will usually have already started another site, and you may end up suffering for their greed if you exchange links with an incredibly aggressive site.
Another trick they may use is to have you link into their good site and have them link to you from a different domain of essentially no value.
Since their good site is not going to get penalized (because it does not link out to anyone), it does not matter much to them if their bad site receives a spam penalty for being part of a link farm.
Garbage Links:
Guestbooks and the like are losing their relevancy in search results each day.
Many of the holes in blog software which permitted heavy spamming are also being taken care of (although there are some sophisticated spam scripts on the market).
Some people have stated that Google is even filtering out pages with the file extension of links.htm or other page filepaths that would obviously indicate the page was created for link exchange.
Make sure you do not have a page with the filepath of link or links.
There are many legitimate long term free link opportunities available, though they may take a good bit of work to find.
Submitting Articles:
There are tons of places on the web where you can submit articles.
In addition to submitting them, if you provide an extremely compelling article with reprint rights you will find that it may just end up all over the web.
Learn who the experts in your field are and model some of their actions with your own unique content.
Renting - Buying Links:
This is an advanced SEO technique most webmasters do not need to do.
I would learn and practice SEO for at least a month or two before I jumped right into any type of link rental advertisements.
Make Sure Your Site Works Well First:
Many people aggressively advertise on other sites without fixing internal conversion problems.
If you can double your conversion rate without much additional expense, why would you concentrate on more exposure first?
With that being said you can boost your link popularity by renting a few strong inbound links.
I usually prefer to rent links from related sites as they may also send direct traffic as well as provide a direct boost to link popularity.
Register with Directories First:
Registering your site at many different directories makes your link popularity look like a natural part of the web.
If you only have one or two sources of links and those sources are selling to other sites it may stick out rather easily to search engine algorithms.
Renting links is extremely expensive if you do it incorrectly.
Registering your sites with many tier two directories costs a one time fee of $50 or less each, which continues to pay for itself month after month.
Importance of Anchor Text:
When renting links ensure you use the best anchor text possible and do not rent links based exclusively on PageRank.
When I rent links I make sure I am renting extremely descriptive anchor text (as anchor text is the single most important part of link reputation in Google or Yahoo!).
Related links summary:
Google Sets
Overture Inventory term tool
Google AdWords Keyword Sandbox (adwords.google)
WordTracker
Digitalpoint suggestion tool