Hi
Password protection. That is what was invented for access control. The other methods only mitigate the problem:
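For password protection on Apache, a minimal sketch using HTTP Basic authentication (the paths and realm name below are just examples, adjust them for your setup):
[code]
# .htaccess in the directory to protect (Apache with mod_authn_file)
AuthType Basic
AuthName "Members only"
AuthUserFile /path/to/.htpasswd
Require valid-user
[/code]
Create the `.htpasswd` file with the `htpasswd` utility and keep it outside the document root.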
[ul]
[li]Disallow access in robots.txt.
Well-behaved downloaders will obey it, but compliant search engine crawlers will then stop indexing those files too.[/li]
[li]Check the HTTP_REFERER header.
Some downloaders will be excluded, but you will also deny access to some legitimate human visitors.[/li]
[li]Check the USER_AGENT header.
Some downloaders will be excluded, but it is common practice to fake it.[/li]
[li]Check the interval between subsequent requests.
Some downloaders will be excluded, but this has a very low probability of working.[/li]
[/ul]
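For the robots.txt, Referer, and User-Agent methods above, minimal sketches (example.com, the /images/ path, and the wget pattern are placeholders, not a recommendation of what to block):
[code]
# robots.txt - only well-behaved crawlers and downloaders honor this
User-agent: *
Disallow: /images/
[/code]
[code]
# .htaccess sketch (Apache with mod_rewrite)
RewriteEngine On

# deny image requests whose Referer is set but not this site
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]

# deny a known downloader User-Agent
RewriteCond %{HTTP_USER_AGENT} wget [NC]
RewriteRule .* - [F]
[/code]
Note that an empty Referer is allowed above, otherwise you would also block visitors whose browsers or proxies suppress the header.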
Anyway, a visitor could just visit your site, then save the images from the browser cache. Or from a proxy cache.
Feherke.