Hi,
I posted a thread about six months ago about paging and got some great replies, but the system I was going to go with won't quite work after all. I just wanted to see what you all think about this alternative, and whether or not search engines would look down on it.
I was originally having my URLs rewritten to be something like this:
hotelsite.com/toronto-1
hotelsite.com/toronto-2
hotelsite.com/toronto-3
etc, etc...
...where the last number is the current page of hotels that the user is viewing. However, we may need to add locations to the database the contain a number at the end of the string, and then this makes it more confusing to deal with when I parse the URL. So I was thinking of making a fake extension which is actually the page number instead, like this:
hotelsite.com/toronto.1
hotelsite.com/toronto.2
hotelsite.com/toronto.3
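Roughly, this is the kind of parsing I have in mind (just a quick sketch in Python to show the idea, not the actual site code, and the example location names are made up):

def parse_city_url(path):
    """Split a path like 'toronto.2' into a location slug and a page number.

    With the dot scheme, the last dot always separates the slug from the
    page, so a location name that happens to end in a number (e.g. a
    hypothetical 'area-51') can't be mistaken for a page suffix.
    """
    slug, sep, page = path.rpartition(".")
    if sep and page.isdigit():
        return slug, int(page)
    # No dot suffix: treat the whole path as the slug, default to page 1.
    return path, 1

print(parse_city_url("toronto.2"))   # ('toronto', 2)
print(parse_city_url("toronto"))     # ('toronto', 1)
print(parse_city_url("area-51.3"))   # ('area-51', 3)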
Does anyone see any reason why URLs like that wouldn't be crawled and indexed by search engines? It would make things so much easier to work with if we did it that way. Could it perhaps even be a better alternative, if the search engines ignore file extensions entirely?
Thanks for the help,
A.