
ASP pages and search engines

Status
Not open for further replies.

song2siren

Programmer
Jun 4, 2003
103
GB
Hello

I'm getting various HTTP 500-100 server errors generated by search engine robots visiting my ASP pages.

The errors occur when the robots open, for example, mypage.asp without the querystring that identifies a particular record in my SQL database, such as mypage.asp?ID=5. I just need some advice on the best way to allow all my ASP pages to be indexed, but prevent any errors if a robot tries to open them without the full URL.

I was thinking <meta name="robots" content="noindex"> on each of the ASP pages would do the trick, but I'm not convinced this would allow the indexing of all the records. Perhaps something in a robots.txt file would be better. Any help or suggestions would be very much appreciated.
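For example, I assume a robots.txt rule along these lines (the path is just an example) would keep compliant robots off the page entirely, though like the meta tag it would also stop the records being indexed:

Code:
# Example only: keeps compliant crawlers away from mypage.asp entirely
User-agent: *
Disallow: /mypage.asp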

Thanks
 
You could always default to a particular record if none is specified.

At the top of your mypage.asp...
Code:
' Guard against missing or non-numeric id values, which would
' make CInt throw a type mismatch (the HTTP 500-100 error)
If IsNumeric(Request.QueryString("id")) Then
  intID = CInt(Request.QueryString("id"))
Else
  intID = 1  ' fall back to a default record
End If

Tony

 
Or make sure that when no querystring is present the page displays some good sales copy. Also, avoid using 'ID' in or as a querystring identifier; it will stop many of the SE spiders from visiting the pages.
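A rough sketch of that fallback, assuming Classic ASP/VBScript (ShowRecord here is a hypothetical display routine, not something from the thread):

Code:
If Not IsNumeric(Request.QueryString("id")) Then
  ' No valid record requested (e.g. a spider hit the bare URL):
  ' serve general sales copy instead of raising an error
  Response.Write "<h1>Welcome</h1><p>Browse our full range of products here.</p>"
Else
  ' Show the requested record as normal
  ShowRecord CInt(Request.QueryString("id"))  ' ShowRecord is hypothetical
End If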

Chris.

Indifference will be the downfall of mankind, but who cares?
 
Many thanks for the suggestions on this - I'll give them a go and see what happens.
 
