song2siren
Programmer
Hello
I'm getting various HTTP 500-100 server errors generated by search engine robots visiting my ASP pages.
The errors occur when a robot opens, for example, mypage.asp without the query string that points to a particular record in my SQL database, such as mypage.asp?ID=5. I'd appreciate some advice on the best way to allow all my ASP pages to be indexed while preventing errors when a robot opens them without the full URL.
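To show what I mean, here's a rough sketch of the kind of guard I'm picturing inside the page itself. I'm assuming the page reads the record ID via Request.QueryString("ID"), and the Response.Redirect target is just a placeholder:

<%
' Rough sketch: check for a missing or invalid ID before touching the database.
Dim id
id = Request.QueryString("ID")

If id = "" Or Not IsNumeric(id) Then
    ' No record requested (e.g. a robot hit mypage.asp directly),
    ' so send the visitor somewhere safe instead of running the query.
    ' "default.asp" is only a placeholder target.
    Response.Redirect "default.asp"
Else
    ' Safe to look up the record; CLng also rejects non-numeric input.
    Dim recordId
    recordId = CLng(id)
    ' ... open the connection and query for recordId here ...
End If
%>

That way the bare URL would at least return a valid page rather than a 500-100 error, though I don't know whether that's the right approach where robots are concerned.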
I was thinking <meta name="robots" content="noindex"> on each of the ASP pages would do the trick, but I suspect that would stop the records from being indexed at all. Perhaps something in a robots.txt file would be better. Any help or suggestions would be much appreciated.
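If robots.txt is the way to go, I'm guessing the entry would look something like this, though as far as I know the $ end-of-URL anchor is only an extension some crawlers (e.g. Google) honour, not part of the original robots.txt standard:

User-agent: *
# Intended to block only the bare page, not mypage.asp?ID=5 and friends.
# A plain "Disallow: /mypage.asp" would block the ?ID= URLs as well,
# since robots.txt rules are prefix matches.
Disallow: /mypage.asp$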
Thanks