
Protecting ourselves from malicious spiders


gattoplano

Established Member
Hi,
I'm building a directory containing images. I want to prevent malicious spiders from crawling this directory and scraping all of its contents.

At the same time, I want search engines' spiders to crawl it deeply and find all of the images and content.

I was thinking about limiting the number of image views per day for regular users while setting no limit for search engines. But that sounds like a bad solution :\

Should I use some other cloaking technique? Is there any good article around about this issue?
 
The views expressed on this page by users and staff are their own, not those of NamePros.
GoDaddy
Try searching for "robots.txt". It's a way to tell spiders what not to crawl, although if it's a malicious spider I doubt it will respect any limitations, whether you use cloaking or not.
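For what it's worth, a minimal robots.txt along those lines might look like this (assuming the images live under a hypothetical /images/ path; `Allow` is honored by the major search engines, though it wasn't in the original standard):

```text
# robots.txt — advisory only; polite crawlers honor it, scrapers ignore it
User-agent: Googlebot
Allow: /images/

User-agent: *
Disallow: /images/
```

As noted above, this only works on well-behaved bots; it does nothing against a scraper that ignores the file.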
 
You can forbid direct folder access by changing the directory's permissions (chmod) to something more restrictive than 755.

-Steve
 
I can't, as my directory is structured so that regular users must be able to reach any part of it at any moment.

I just want to limit the amount of content any one visitor can pull. It seems safe to assume that no human can open and carefully read more than 10 directories in a minute, or view more than 25 images, so I could throttle content stealers at those rates while leaving access open for search engines.
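That throttling idea can be sketched as a per-IP sliding-window rate limiter. This is only an illustration of the approach described above, not a drop-in solution: the class name, the thresholds, and the `should_serve` helper are all made up for the example, and matching on User-Agent strings alone is not a safe way to recognize search engines (they are trivially spoofed; real deployments verify crawlers via reverse DNS).

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Allow at most max_hits requests per IP within a rolling window."""

    def __init__(self, max_hits, window_seconds):
        self.max_hits = max_hits
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_hits:
            return False  # over the limit: deny without recording a hit
        q.append(now)
        return True

# Illustrative only: UA substrings are spoofable, so a real setup would
# confirm bot identity (e.g. reverse-DNS lookup) before exempting it.
SEARCH_ENGINE_UAS = ("Googlebot", "bingbot")

def should_serve(ip, user_agent, limiter):
    if any(bot in user_agent for bot in SEARCH_ENGINE_UAS):
        return True  # assumption: crawler identity verified elsewhere
    return limiter.allow(ip)
```

With the numbers from the post you would create one limiter with `max_hits=25, window_seconds=60` for image views and another with `max_hits=10` for directory listings, and return an error page (or a CAPTCHA) when `allow` comes back False.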
 