
Spidered but not indexed - why?


PAULJEFFERY

Established Member
Hi,

Many of my sites have been spidered, but it's been almost two months and the Google-spidered sites still haven't been indexed. I know it can take a while to get them spidered, but I heard they get indexed very soon after that - so what's the go?

Should I be concerned that a site Google spidered in mid-August has not yet been indexed? Over 30 sites are in that situation right now.

Cheers

Pjay :'(
 
0
•••
Are these parking sites or dev'd?
 
0
•••
Hi Cy

As you can see, these are content sites - Whypark sites. I wanted to be able to present some useful info to my visitors and have the freedom to put in my own affiliate links etc. and optimise my sites, rather than do typical parking. Some sites Google indexes immediately; others have only been spidered, as I mentioned. The funny thing is, when I submit a forex site or a loan site it gets spidered and indexed quite quickly - and these are not exactly niche topics! I also don't have my own content on them yet, but will get around to that - hopefully. At first I thought, "oh yeah, it will take 3 months," then I saw only 2 weeks for those new sites - then saw the others eventually spidered after 10 weeks but not indexed.
What is the normal time between spidering and indexing?

Cheers
 
0
•••
Paul, try creating a quick blog on blogger.com and add links to your site there. Blogger=Google=Fast Indexing. The crawlers are all over the place on blogger. I think that should help. I've got some of my sites indexed in 3 days flat with the same method.
 
1
•••
Yeah,

At first I thought it could be a dupe content issue, but my newest sites - crawled and indexed within 2 weeks - were also Whypark. Also dupe content, and more of it than any of the other sites.

And the Blogger idea - yeah, did that for two of my other sites and you're right - they were listed by the end of the week. Good advice. I know that helps with getting them crawled... but indexed?

cheers,
thanks for the replies

Pjay
 
0
•••
When you check to see if they're indexed with the site: command, make sure you try it with AND without the www. I notice this varies between parking companies.
 
0
•••
I used my Google Blogger account (I forgot I had one), as VARON recommended - the 10 five-month-old sites were spidered and indexed within 48 hours!! Thanks VARON - good idea!! :)

Cheers,

Pjay
 
0
•••
Varon to the rescue again - nice one. I'll also give that a try.
 
0
•••
More advice on this:

Varon said:
Paul, try creating a quick blog on blogger.com and add links to your site there. Blogger=Google=Fast Indexing. The crawlers are all over the place on blogger. I think that should help. I've got some of my sites indexed in 3 days flat with the same method.

I did this today and was wondering whether it's going to cause me issues with Google in the future?

Also, how do I know when my blog has been spidered and indexed? Where would I find this info? I have also added my blog and my parked domains to Google's URL crawler page.

Thanks all!!
 
0
•••
Go into Google and type:


site:www.yourdomain.com

or

site:yourblog.blogspot.com


The site: command is unreliable when it comes to listing the number of pages you have, or whether or not they've gone into the supplemental index, but it will at least tell you whether you're listed.
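Since results can differ between the bare host and the www host (and the /* form comes up later in this thread), here's a small Python sketch - the helper name is my own, not any real tool - that lists the site: query variants worth pasting into Google:

```python
# Hypothetical helper: enumerate the "site:" queries to try for a
# domain, covering bare vs. www hosts and the /* variant.

def site_queries(domain: str) -> list[str]:
    """Return the site: query variants to paste into Google."""
    bare = domain.removeprefix("www.")
    queries = []
    for host in (bare, f"www.{bare}"):
        queries.append(f"site:{host}")
        queries.append(f"site:{host}/*")
    return queries

for q in site_queries("www.yourdomain.com"):
    print(q)
```

Running the four queries side by side makes it obvious when Google has indexed one host variant but not the other.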
 
0
•••
I have noticed that after Googlebot spidered my WhyPark sites in the last few days, the number of indexed pages has dropped drastically. Virtually all my sites now have only 4-12 indexed pages, whereas before they had between 20 and 200.

Netmeg, or anyone else - do you have any idea why this has happened? Would it have anything to do with the fact that I edited all my WhyPark templates (to make them look better)? I also changed some keywords and the meta description on the majority of them.

GIL :)
 
0
•••
What exactly do you get when you run the site: command? Are you seeing a line at the bottom of the last result that says:

In order to show you the most relevant results, we have omitted some entries very similar to the XX already displayed.
If you like, you can repeat the search with the omitted results included.



Do you get the same results when you type:

site:www.mydomain.com

and

site:www.mydomain.com/*

??
 
0
•••
netmeg said:
What exactly do you get when you run the site: command? Are you seeing a line at the bottom of the last result that says:

In order to show you the most relevant results, we have omitted some entries very similar to the XX already displayed.
If you like, you can repeat the search with the omitted results included.



Do you get the same results when you type:

site:www.mydomain.com

and

site:www.mydomain.com/*

??

NO I'm not getting "In order to show you the most relevant results, we have omitted some entries very similar to the XX already displayed.
If you like, you can repeat the search with the omitted results included."

YES I get the same results on both
site:www.mydomain.com

and

site:www.mydomain.com/*

antivirusoftware.info is an example of having close to 200 indexed pages last month and now only has 8.

Thanks GIL
 
0
•••
Ok, your antivirusoftware.info site reports 15 results on my DC, so what you might be looking at is a temporary glitch. There's a lot to tell you, and I'm at work with clients climbing up my heels, so I don't have much time at the moment but I'll come back later and expound - here are some initial thoughts:

1) you might not have actually had 200 results when you looked at it before. Google's site command doesn't give accurate counts, it's a known issue, and it's by design. You can get a better picture of how Google sees your site in Google Webmaster Tools than you can by the site command. How many actual physical urls do you really have on that domain? And how much content is really and truly unique?

2) Specifically, some of those 200 results might have been "Supplemental." This is too big a topic for me to go into in depth at the moment, but supplemental pages include previous copies of current pages, old pages that have been removed, and pages that Google just plain thinks aren't terribly important. Google *used* to display a green "supplemental" indicator when a URL was supplemental, but now they've taken that out. So now, the only way to tell how many pages you have MINUS the supplementals is to append /* after your domain name in the site: command. For example, my fireworks site reports 374 pages with site:, but with the /* appended, I only have 199. That's normal. Supplementals tend to disappear after roughly a year.

3) You definitely have an issue because all your meta description tags are exactly the same. You need to have unique page titles AND meta description tags for every single page. Go through and fix those, and wait a week or ten days, and see if some of your pages come back. Even though Google isn't showing the filtered results message at the end of your listings at the moment, you are definitely going to be filtered for that, if you aren't already. That's another type of duplicate content.

4) I ran copyscape on your site. There's a lot of other sites using the same articles. Google doesn't see any reason to index the same article a gazillion times. Whose site they actually decide to display is a crap shoot, and there's no guarantee that the same site will be showing it from one day to the next. I would start alternating WhyPark content with unique content immediately.

5) There's a lot of Google shake-up these days. One of my top pages dropped completely out of the index on Friday, but it was back at #1 on Monday (and still is). So don't panic.



More tonight.
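Point 3 above can be checked mechanically. Here's a rough Python sketch - all names are illustrative, not part of any real tool - that groups pages by their meta description so duplicates stand out. Pages are supplied as (url, html) pairs:

```python
# Flag pages that share the same meta description tag.
# Pages with no description at all get grouped under None.
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pull the content of <meta name="description"> out of a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

def duplicate_descriptions(pages):
    """Group URLs by meta description; return only groups with >1 URL."""
    groups = defaultdict(list)
    for url, html in pages:
        parser = MetaDescriptionParser()
        parser.feed(html)
        groups[parser.description].append(url)
    return {desc: urls for desc, urls in groups.items() if len(urls) > 1}
```

Any group this returns is a set of pages that will look like duplicate content to a search engine's description check, and is a candidate for rewriting.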
 
Last edited:
0
•••
netmeg said:
Ok, your antivirusoftware.info site reports 15 results on my DC, so what you might be looking at is a temporary glitch. There's a lot to tell you, and I'm at work with clients climbing up my heels, so I don't have much time at the moment but I'll come back later and expound - here are some initial thoughts:

1) you might not have actually had 200 results when you looked at it before. Google's site command doesn't give accurate counts, it's a known issue, and it's by design. You can get a better picture of how Google sees your site in Google Webmaster Tools than you can by the site command. How many actual physical urls do you really have on that domain? And how much content is really and truly unique?

2) Specifically, some of those 200 results might have been "Supplemental." This is too big a topic for me to go into in depth at the moment, but supplemental pages includes previous copies of current pages, old pages that have been removed, and pages that Google just plain thinks aren't terribly important. Google *used* to display a green "supplemental" indicator when an url was supplemental, but now they've taken that out. So now, the only way to tell how many pages you have MINUS the supplementals is to append a /* after your domain name in the site command. For example, my fireworks site reports 374 pages with site, but with the /* appended, I only have 199. That's normal. Supplementals tend to disappear roughly after a year.

3) You definitely have an issue because all your meta description tags are exactly the same. You need to have unique page titles AND meta description tags for every single page. Go through and fix those, and wait a week or ten days, and see if some of your pages come back. Even though Google isn't showing the filtered results message at the end of your listings at the moment, you are definitely going to be filtered for that, if you aren't already. That's another type of duplicate content.

4) I ran copyscape on your site. There's a lot of other sites using the same articles. Google doesn't see any reason to index the same article a gazillion times. Whose site they actually decide to display is a crap shoot, and there's no guarantee that the same site will be showing it from one day to the next. I would start alternating WhyPark content with unique content immediately.

5) There's a lot of Google shake-up these days. One of my top pages dropped completely out of the index on Friday, but it was back at #1 on Monday (and still is) So don't panic.

More tonight.

It's all WhyPark content - no original content. At the moment it has 105 articles. It used to have more, but I had to change all keywords that mentioned "software" (e.g. "antivirus software"), because for some strange reason AdSense was only displaying "cuddly toys" or "soft cuddly bears" type ads.

That change in keywords reduced the number of articles from close to 200 to fewer than 100. That happened last month.

Google Webmaster Tools today shows the same quantity of indexed pages (8) as the site: command, and GWT says that 89 URLs were submitted on Sept 10th, meaning that WhyPark has automatically added another 16 articles in the past month and a half.

Your suggestion to start alternating WhyPark content with unique content immediately sounds like the most logical step to take, because it's becoming more and more obvious that duplicate content is going nowhere.
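As a rough stand-in for running Copyscape by hand, one common way to gauge how much two articles overlap is the Jaccard similarity of their word shingles. A minimal Python sketch - the function names and the 5-word shingle size are my own choices, purely illustrative:

```python
# Estimate how near-duplicate two articles are via the Jaccard
# overlap of their 5-word shingles (consecutive word runs).

def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """0.0 = no shared k-word runs, 1.0 = identical shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

A score near 1.0 between your page and another indexed page is the kind of duplication that leaves Google no reason to index your copy.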

Thanks for your time and great advice Netmeg

GIL :)
 
0
•••
You're welcome, but get those meta description tags fixed too if you can - that's just a shade less important.
 
0
•••