Google tweaks search to punish 'low-quality' sites

randypendleton.com (Top Member)
Google has tweaked the formulas steering its Internet search engine to take the rubbish out of its results. The overhaul is designed to lower the rankings of what Google deems "low-quality" sites.

That could be a veiled reference to such sites as Demand Media's eHow.com, which critics call online "content farms" — that is, sites producing cheap, abundant, mostly useless content that ranks high in search results.

http://news.yahoo.com/s/ap/20110225/ap_on_hi_te/us_tec_google_search
 
•••
The views expressed on this page by users and staff are their own, not those of NamePros.
What I have trouble figuring out is how exactly something like this is being organised, and whether it will come with a fee to make Google look the other way.
 
•••
What I have trouble figuring out is how exactly something like this is being organised, and whether it will come with a fee to make Google look the other way.

I was thinking the same thing.
G00gle has moved from their "Do no evil" concept to "Do no evil unless we can make money from it".
It is always the little folk who get hammered.
We have no one to blame but ourselves, because we let G00gle become too powerful.
 
•••
I have no proof, but I think this move to (finally) remove "low-quality" sites really has nothing to do with all the spam and MFA sites, but rather with click fraud and conversion fraud. Think about it: if people click on a listing, go to a junk site, click on an ad they are interested in, and find what they are looking for, who is hurt or harmed?

But when you add the factor of fraud you remove the value to the advertiser. That is why my largest client stopped all content network ads last year, and they were spending thousands a month on content ads. The problem is that the fraud was massive and stopped being worth the cost.

This may hurt Google more than help, because in at least one case that I know of, they have suspended the account of a long-time publisher who would never engage in any kind of fraud.

The correct response to the fraud problem, in my opinion, would have been to move to a "flat rate" model that removes any possibility of click fraud. Conversion fraud might still be an issue, but I think much less of one. But Google may never adopt that idea, since they would earn much less from it.
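The economics behind that suggestion can be shown with a toy sketch. All the numbers and function names here are made up for illustration; the point is only that per-click billing passes the cost of fake clicks to the advertiser, while a flat rate does not:

```python
# Toy sketch: why per-click (CPC) billing rewards click fraud while a
# flat monthly rate does not. All numbers are hypothetical.

def cpc_cost(real_clicks, fraud_clicks, price_per_click):
    """Under CPC, the advertiser pays for every click, fraudulent or not."""
    return (real_clicks + fraud_clicks) * price_per_click

def flat_cost(monthly_rate):
    """Under a flat rate, extra (fraudulent) clicks earn the publisher nothing."""
    return monthly_rate

real, fraud, price = 1000, 400, 0.50
print(cpc_cost(real, fraud, price))  # 700.0 -- advertiser pays for 400 fake clicks
print(flat_cost(500.00))             # 500.0 -- fraud adds nothing to the bill
```

Under the flat-rate model the publisher has no incentive to generate fake clicks, which is the whole argument.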

Disclaimer: (see my sig)
 
•••
Does this mean no more auto-content blogs?? Wow.. :red:
 
•••
Yes, no auto-blog, no more spun article sites, no more spam comments.... it is all going away and people will have to start making "real" sites again... :)
 
•••
Well actually you CAN still have autoblog/spun article sites ... if you're willing to pay for advertising to get visitors...

Real sites - what a concept :D!
 
•••
It means less automated stuff, and maybe higher-quality search results.
 
•••
Yes, no auto-blog, no more spun article sites, no more spam comments.... it is all going away and people will have to start making "real" sites again... :)

I won't lie this has made me consider re-entering the web development world.

For the last two years, the web development world went from amazing-quality websites to websites that aren't even up to grade-school level. Spun articles and all the spam killed, no, slaughtered the legitimate web developers who were in it to improve the web and make a living doing so.

I was one of the first users of automated content, but it was always an extra piece of the cake and never the cake itself.
 
•••
The only problem being that G will run into the same issue that SpamAssassin ran into: perfecting the filters. In the beginning, SpamAssassin was blocking way too many hams (legitimate messages).

Also, how does big G know if someone stole an article from Site A? If G spiders Site B, which stole the article, B gets the credit first, right? So Site A just got buried, even though they were the good guy here.
 
•••
Also, how does big G know if someone stole an article from Site A? If G spiders Site B, which stole the article, B gets the credit first, right? So Site A just got buried, even though they were the good guy here.

That's a great question. My only guess is Google will look at the age of the file and give 'ownership' to the older of the two. But even that I'm unsure of.
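The "scraper gets crawled first" problem can be sketched in a few lines. This is a hypothetical toy, not how Google actually attributes content: a crawler fingerprints each page and credits whichever site it saw the content on first, so if the thief (Site B) happens to be spidered before the original (Site A), the thief wins:

```python
import hashlib

# Toy attribution model: credit goes to the first site the crawler saw a
# given piece of content on. Site names are made up for illustration.
first_seen = {}  # content fingerprint -> (site, crawl_order)

def crawl(site, content, order):
    key = hashlib.sha256(content.encode()).hexdigest()
    # Only the first sighting is recorded; later copies are treated as dupes.
    first_seen.setdefault(key, (site, order))
    return first_seen[key][0]  # who gets the credit

article = "An original article actually written by Site A."
print(crawl("site-b.example", article, 1))  # site-b.example (the thief wins)
print(crawl("site-a.example", article, 2))  # site-b.example (original buried)
```

Crawl order is a poor proxy for authorship, which is exactly the worry raised above.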
 
•••
I think that Google should go by the XML sitemaps when indexing, instead of relying on link discovery alone. Then it could better tell whether something is original or copied: when a new link goes up in the sitemap, Google spiders it instantly, and all of that jazz.
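For what it's worth, the standard sitemaps.org format already carries a per-URL `<lastmod>` date that a crawler could compare across sites. A minimal sketch of reading it, using a made-up sitemap (the URL and date are hypothetical):

```python
import xml.etree.ElementTree as ET

# Sketch: read <lastmod> dates from a sitemaps.org-format sitemap, so two
# copies of an article could be compared by claimed age rather than crawl
# order. The sitemap content below is invented for illustration.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://site-a.example/article</loc><lastmod>2011-01-10</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def lastmod_dates(xml_text):
    """Map each URL in the sitemap to its self-reported last-modified date."""
    root = ET.fromstring(xml_text)
    return {u.find("sm:loc", NS).text: u.find("sm:lastmod", NS).text
            for u in root.findall("sm:url", NS)}

print(lastmod_dates(SITEMAP))  # {'http://site-a.example/article': '2011-01-10'}
```

The obvious catch is that `<lastmod>` is self-reported, so a scraper could simply backdate it; the idea helps with discovery speed more than with proof of originality.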
 
•••