shags38
Top Contributor
As you are probably aware, I had an issue with a couple of my sites with respect to keyword density.
Having tried to sort that out, I have become a little bemused about (1) what Google sees as keywords, (2) how it determines keyword density, and most importantly (3) what influence or control a webmaster has in telling GoogleBot which words it should look for.
On the last point (3), I have read lately that telling GoogleBot what keywords to look for is like telling David Beckham how to curve a football, i.e. the Bot reads the page, sorts the frequency and placement of words and phrases, and then determines for itself what the keywords/phrases of the page are. That is not what I (and many others) thought was the case, namely that you enter the keywords you want the bot to look for into your CMS and it then goes looking for them.
Recent reading has suggested that the Bot takes no notice of that keyword input by the webmaster or the CMS. Anyone like to comment on that?
The Keyword Density Checker in my CMS shows results for single words and for two-, three- and four-word phrases, and I can nominate the keyword(s) for it to primarily look for in the summary report. Some other tools on the net break the results down differently, and some do not allow (or ask) you to nominate the keyword(s). However, here is the point that bemuses me.
Let us for one moment assume that the comments regarding point (3) are in fact correct.
Here is the dilemma as I see it. If the main keyword you want to rank well for in SERPs is a three-word phrase (mobile phone plans, home contents insurance, big red apples), then it is more than likely that the two-word phrases within it and the individual words are important in themselves, albeit secondary to the three-word phrase in the desired order. It is also likely that some of the single words will be used in isolation from the main three-word phrase. This means it is possible that a single word will win GoogleBot's vote as a main keyword, or possibly the main keyword, while being used at a rate beyond the acceptable ratio.
Example:
"When choosing a Mobile Phone Plan it is important to look at how often you use your mobile and compare the mobile phone plans available to you in the marketplace. Here at XYZ Mobile Pty Ltd we have a range of phone plans for the occasional mobile user to those that use their mobile phone very regularly" .... etc etc.
The Bot is likely to return keyword counts as follows, or something similar:
mobile - 6
phone - 4
mobile phone - 3
phone plans - 2
mobile phone plan(s) - 1 each
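For what it's worth, here is a minimal sketch of the kind of n-gram counting I assume a density checker does (the tokenising rules are my guess; real tools, let alone Google, will differ). Run on the paragraph above, it reproduces those numbers:

```python
# Minimal sketch of a keyword-density-style n-gram counter.
# Tokenisation (lowercase, letters only) is an assumption, not how
# any particular tool - or Google - actually does it.
import re
from collections import Counter

def ngram_counts(text, max_n=4):
    words = re.findall(r"[a-z]+", text.lower())  # crude tokeniser, drops punctuation
    counts = Counter()
    for n in range(1, max_n + 1):
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts

sample = ("When choosing a Mobile Phone Plan it is important to look at how "
          "often you use your mobile and compare the mobile phone plans "
          "available to you in the marketplace. Here at XYZ Mobile Pty Ltd "
          "we have a range of phone plans for the occasional mobile user to "
          "those that use their mobile phone very regularly")

counts = ngram_counts(sample)
for phrase in ["mobile", "phone", "mobile phone", "phone plans",
               "mobile phone plan", "mobile phone plans"]:
    print(phrase, "-", counts[phrase])
# mobile - 6, phone - 4, mobile phone - 3, phone plans - 2,
# mobile phone plan - 1, mobile phone plans - 1
```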
So how does the Bot determine (a) which keyword phrase is the key one you want to rank for, and (b) which one it will allocate the most SERP credit to, if indeed it takes no notice of your request to look for specific words/phrases in particular ("please rank me for the following search term...")?
In the above example, if the page (we will assume the home page) is to have sufficient content to be regarded as quality and give a good return for a search, then we should work on a minimum of, say, 500 words of text on the page? (This is where I went wrong with my two sites sandboxed for excessive density: it wasn't so much the over-use of the keywords as that the volume of words was very thin. Now I have added some prattle and all is OK.)
If you are aiming at a 3-5% maximum keyword density, that equates to 15-25 repeats of a word or phrase in 500 words (quick calculation below). Using "mobile phone plans" 15 times in 500 words is feasible without sounding too repetitious; 25 is probably pushing it. However, aside from the three-word phrase, it will in this case be impossible to write quality, informative content without also using the phrases mobile phone and phone plans, and additionally using the words mobile, phone and plan in isolation, over and above their usage in the three- and two-word phrases.
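Making the arithmetic explicit (density definitions vary from tool to tool; this assumes the simplest one, occurrences divided by total words):

```python
# Back-of-envelope density check. This is simply occurrences / total
# words as a percentage - other tools may weight phrases differently.
def keyword_density(occurrences, total_words):
    return 100.0 * occurrences / total_words

print(keyword_density(15, 500))  # 3.0 -> lower end of a 3-5% target
print(keyword_density(25, 500))  # 5.0 -> upper end
```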
It is probable that the word phone or mobile could end up with a count of 30-40 plus, including their use in the phrases. Add any use of the words or phrases in link anchor text in the footer and the numbers jump. (After cleaning up one of my penalised sites, the keyword density report showed the highest density for an unrelated word/term that happened to be used in the anchor text of outbound links in my footer... go figure.)
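To illustrate that footer point: if a checker counts anchor text along with the body copy (mine evidently does), a phrase that appears only in footer links still feeds the tally. A rough sketch, with hypothetical footer markup:

```python
# Extract anchor text from (hypothetical) footer markup using the
# stdlib parser; a density checker that counts this text will tally
# "blue widgets" even if it never appears in the body copy.
from html.parser import HTMLParser

class AnchorText(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.anchors = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
    def handle_data(self, data):
        if self.in_a:
            self.anchors.append(data.strip())

footer = ('<a href="/plans">mobile phone plans</a> '
          '<a href="/widgets">blue widgets</a>')
parser = AnchorText()
parser.feed(footer)
print(parser.anchors)  # ['mobile phone plans', 'blue widgets']
```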
So apparently Google applies some contextual logic in its mathematical algorithm (oxymoron?) and makes allowances for the single-word volumes described above, essentially discounting them by some percentage, I suppose... or does it? If it does, how effectively? And what about two-word phrases: does it discount that ratio too? I would say not.
So I find it difficult to abide by what Google wants: to write authoritative, engaging, descriptive copy, staying on topic with good grammar, like a human, for humans, and not for robots running a mathematical algorithm which cannot determine the accuracy or truthfulness of statements, which relies on spellcheck and grammar check (both designed by a Czech), and which wouldn't know a red apple from a green apple if it saw one.
I am perplexed. I am now trying hard to use acceptable white-hat SEO practices and still rank well in SERPs, but I find it hard to please my mistress.
Back to the point: what are your thoughts on how keywords are determined, and on asking GoogleBot to look for specific keyword(s) in your text?
cheers,
Mike
P.S. I must need a nanny nap... I had to vet this post as I initially used the word "who" instead of "which" for a bloody robot!