I've got a new ASP-based site that contains only about 30 actual ASP pages: about 5 pages in each of 6 sections. The meat of the site is built from a whole bunch of include files. Actually, they're NOT includes, they're EXECUTEs (Server.Execute calls)...
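For anyone unfamiliar with the distinction, here's a minimal sketch of the two approaches in classic ASP (the path /inc/header.asp is just a made-up example):

```asp
<%@ Language="VBScript" %>

<!-- Approach 1: SSI include. The file is merged into the page
     before the ASP engine runs, at a fixed, hard-coded path. -->
<!--#include virtual="/inc/header.asp"-->

<%
  ' Approach 2: Server.Execute. The target runs as its own ASP
  ' page at request time, and the path can be built dynamically.
  Server.Execute "/inc/header.asp"
%>
```

The practical upshot for this question is the same either way: the files under /inc/ are reachable as URLs in their own right, which is why a crawler could request them directly.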
My situation: I don't want robots indexing the include files and presenting raw code snippets to Googlers, so I set my robots.txt with "Disallow: /inc/"... Then I also added the META tag <META NAME="ROBOTS" CONTENT="INDEX,FOLLOW"> to each page (again, via an include), which will (hopefully) direct the 'bots to visit pretty much all the content on the site, but always via the "pages", never the "includes" on their own.
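Concretely, assuming the executed files all live under /inc/, the two pieces would look like this. (Note the attribute names: it's META with NAME and CONTENT, not CONTENTS; a malformed tag would simply be ignored by crawlers.)

```
# robots.txt, at the site root: keep crawlers out of the snippets
User-agent: *
Disallow: /inc/
```

```html
<!-- in the <head> of each page, emitted by the shared include -->
<meta name="robots" content="index,follow">
```

One caveat worth knowing: "index,follow" is already the default behavior for well-behaved crawlers, so that tag mostly serves as documentation; the robots.txt Disallow is what actually keeps them out of /inc/.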
My question: Does that seem like it's gonna work?
My remark: Yes, I'll be checking the logs over the next couple of days, and yes, that'll tell me whether it's working or not... I'm just asking in case it definitely WON'T work, so that perhaps one of you fine folks can save me some time and consternation.