Content interpretation by Google - Quality vs Original

shags38

Top Contributor
I saw some interesting comments in another thread about Google suggesting it will now be vetting the quality of content as well as its originality, but I decided it was off topic to keep commenting there. One comment in particular - "original doesn't mean quality" - was made about content as it is about to be assessed by Google crawlers following the Panda algorithm update, in support of Google's apparent new stance that it will be assessing the "quality" of content as well as its originality :rolleyes:

It begs the question: even with a thousand Pandas, how will a robot determine the quality of the content of an article, a paragraph, or even a simple sentence? Take a look at the articles on any of the article sites, including EzineArticles, and you will see myriad examples of poorly written articles that ramble on in a pointless charade of keyword nonsense (like most of my posts in here) - many are written by non-native English speakers who are pumping articles out faster than rice bubbles leaving a cereal box in the hands of a two-year-old.

I have over 2000 sites parked at WhyPark and I am continually bitching to them about the second-rate quality of their articles, to no avail of course - they get them from the same sources as everyone else, even though they claim them to be proprietary. The point is that a robot cannot determine the correctness, and hence quality, of the statements in the content, or in many cases the spelling (did you loose something?? - lose / loose is a perfect example, among many, of spelling that is incorrect in context but passes a spell-check robot). How can a robot determine the quality of an argument, a sales pitch, the flowering of specifications, or, say, creative accounting?? I say red is a mixture of seventeen variations of certain colours (colors), someone else says sixteen, and with no mathematically correct answer the robot cannot offer a quality assessment.
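To be fair to the robots, the lose / loose case is less hopeless than it looks in isolation: a simple word-pair (bigram) frequency model can flag a confusable word that is unlikely in its context. This is only a toy sketch - the counts below are hand-made stand-ins for real corpus statistics, and the threshold is invented for illustration:

```python
# Hand-made stand-in counts for how often each word pair appears in a corpus.
BIGRAM_COUNTS = {
    ("you", "lose"): 900,
    ("you", "loose"): 5,
    ("a", "loose"): 400,
    ("a", "lose"): 1,
}

# Pairs of commonly confused words that both pass a dictionary check.
CONFUSION_SETS = {"lose": "loose", "loose": "lose"}

def flag_misused_words(sentence):
    """Return (word, suggestion) pairs where the confusable alternative
    is far more likely given the preceding word."""
    words = sentence.lower().split()
    flagged = []
    for prev, word in zip(words, words[1:]):
        alt = CONFUSION_SETS.get(word)
        if alt is None:
            continue
        seen = BIGRAM_COUNTS.get((prev, word), 0)
        alt_seen = BIGRAM_COUNTS.get((prev, alt), 0)
        # Flag only when the alternative is overwhelmingly more common here.
        if alt_seen > 10 * max(seen, 1):
            flagged.append((word, alt))
    return flagged
```

So "did you loose something" gets flagged while "a loose thread" does not - which suggests context-sensitive spelling, at least, is within a robot's reach.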

Will Googlebot start rating the quality of statements of philosophers both past and present?

Google is an advertising company - it needs to reassure its paying advertisers, the big players behind the billions spent on Google AdWords, that it is continually working for them, driving continual improvement in the quality of search results. In doing so it "indoctrinates" many and scares most into the trains of thought it engenders.

It will take a power of convincing for me to believe that a robot, regardless of the algorithms poured into it, can determine the "quality" of the written word at volume when it struggles with a single word in isolation (the lose / loose example again). Highly educated humans - people with very high intelligence quotients, people who have dedicated their whole lives to writing (and this includes lawyers) - often cannot agree on the interpretation of the written word or verse.

In the eyes of Googlebot and other search engine robots, original = quality; it cannot be any other way until the day computers with true intelligence are developed (not algorithms, which are mathematically based). Even "original" is variable - when experts state that "changing about 30% of an article's content makes the new article original in the eyes of Google", it suggests a blight on the meaning of the word original. Google knows it is almost impossible for all of the web's content on a given subject to be original - there is only so much one can write about an apple before you venture close to, or over, the line of repetition or even plagiarism.

What Panda did apparently suggest was that "longer" content pages would be favored - i.e. more words, not necessarily quality, but more words on a page and a higher ratio of written content versus images / videos / ads etc on a page. (Page, not site - Google ranks pages, not sites.)
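That content-versus-everything-else ratio is one of the few claims here a crawler really can compute mechanically. A rough stdlib-only sketch (the idea that Panda weighs exactly this is the thread's speculation, not anything Google has published):

```python
# Rough sketch of a "written content vs markup/scripts" signal:
# visible text characters divided by total page size.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self._skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text_chars += len(data.strip())

def text_ratio(html):
    """Fraction of the raw page that is visible written content."""
    parser = TextExtractor()
    parser.feed(html)
    return parser.text_chars / max(len(html), 1)
```

A script-heavy thin page scores near zero; a page that is mostly paragraphs scores near one - a crude but measurable proxy for "more words per page".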

But hey, what would I know about this subject? :) I just know that Google has a lot of people bluffed and scared, eating out of its hand - quite an achievement for an advertising company - more power and influence than the President of the USA. Matt Cutts or anyone at Google makes a comment and a ripple becomes a tsunami.

...... and so as the sun begins to set (rise actually) I bring my prose to a close :D

any comments girls?

cheers,
Mike
 

Jonathan

Top Contributor
It begs the question: even with a thousand Pandas, how will a robot determine the quality of the content of an article, a paragraph, or even a simple sentence?

Bounce rate, time spent on page, number of backlinks from established sites, etc. Also, Microsoft Word (or any other modern word processor, for that matter) can recognise spelling errors, grammatically incorrect sentences, etc., so why wouldn't Googlebot be able to do it?
 

Shane

Top Contributor
Bounce rate, time spent on page, number of backlinks from established sites, etc. Also, Microsoft Word (or any other modern word processor, for that matter) can recognise spelling errors, grammatically incorrect sentences, etc., so why wouldn't Googlebot be able to do it?

Bounce rate and time on page were the first two that sprung to mind for me too.

I hadn't thought of a spelling and grammar check, but I guess that would be child's play compared to some of the other things Google can do.

Those factors combined would give a very good indication of a site's quality in my opinion.
 

Ash

Top Contributor
Novice question here... if you don't have Google Analytics installed, would Google still be able to determine bounce rate and time on page?
 

Chris.C

Top Contributor
Novice question here... if you don't have Google Analytics installed, would Google still be able to determine bounce rate and time on page?
It has enough market share with Chrome to get some reasonable feedback on sites with noteworthy amounts of traffic.

Bounce rate, time spent on page, number of link backs from established sites, etc. Also, Microsoft Word (or any other modern word processor for that matter) can recognise spelling errors, grammatically incorrect sentences, etc., so why wouldn't Googlebot be able to do it?
My line of thinking is similar to Jonathan's. It's simple enough to measure a number of content variables. I'd actually be surprised if Google isn't already doing it.

We all read content on the web every day; if you compare content from a low-quality article directory or splog with quality content on a well-maintained blog, there are obvious differences.

Of course, for Google to use something as a ranking factor it will look for signals that hold true in aggregate, and I don't think it's too difficult to imagine signals like good grammar, sentence structure, correct spelling, use of headings, lists and bullet points, short concise paragraphs that aren't keyword-stuffed (rather than long, verbose, adjective-filled, keyword-stuffed ones), and integration of unique images, videos, etc. all being indicators of quality.

I'm not saying I have tested these things extensively, and I'm not implying that if you spell a word wrong your site will be deindexed. But it is no coincidence that sites which include the above are also a lot less likely to have a high bounce rate, and will have longer time on page, than a splog with nothing but article directory articles and ads.

Just think about the sorts of content you like to read after searching for something, and rest assured that Google is trying to tweak its algorithm to get that sort of content to the top.
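The signal list above could be folded into a single score along these lines. The weights and the tiny dictionary below are invented purely for illustration - nothing Google has published - but it shows that aggregating such signals is mechanically trivial:

```python
# Toy aggregate of a few of the signals discussed above: spelling,
# paragraph length, and use of headings. Weights are made up.
import re

# A stand-in dictionary; a real system would use a full word list.
KNOWN_WORDS = {"the", "a", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def quality_score(text, has_headings):
    """Return a 0..1 score from spelling, paragraph length and headings."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    spelled_ok = sum(w in KNOWN_WORDS for w in words) / len(words)
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    avg_len = sum(len(p.split()) for p in paragraphs) / len(paragraphs)
    concise = 1.0 if avg_len <= 150 else 150 / avg_len  # favour short paragraphs
    return 0.5 * spelled_ok + 0.3 * concise + 0.2 * (1.0 if has_headings else 0.0)
```

Well-spelled, well-structured text scores high and keyword gibberish scores low - crude, but in aggregate exactly the sort of thing an algorithm can chew on.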
 
