tsyle,
To answer your question fully, I will need some additional information.
1.) Do you need your program to "log in" to the website from which you will be downloading this file?
2.) Will the program be posting data to the site in order to generate these reports?
If you are simply trying to download a file from a website without any interactive step (i.e., without logging in to the website), then you will likely want to look into the System.Net.HttpWebRequest or System.Net.WebClient classes:
[URL unfurl="true"]http://www.csharpfriends.com/Articles/getArticle.aspx?articleID=115[/url]
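For the simple case, a minimal sketch might look like the following (the URL and local file path here are placeholders for illustration, not a real report location):

```csharp
// Minimal sketch: download a file from a static, predictable URL.
using System;
using System.Net;

class ReportDownloader
{
    static void Main()
    {
        // WebClient implements IDisposable, so wrap it in a using block.
        using (var client = new WebClient())
        {
            // DownloadFile blocks until the transfer completes and
            // writes the response body straight to disk.
            client.DownloadFile(
                "http://www.example.com/reports/monthly.csv",
                @"C:\Reports\monthly.csv");
        }
        Console.WriteLine("Download complete.");
    }
}
```

If you need more control over headers, timeouts, or the response stream, drop down to HttpWebRequest instead; WebClient is just a convenience wrapper over it.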
If you need your program to log in to the website to generate the report first, then there are a number of considerations:
1.) Security. How will you maintain security across your program AND the website? Will you use SSL to encrypt the HTTP session? How will the username and password for the website be stored locally? Will the user have to enter their credentials every time they want to generate a report?
2.) Handling of cookies and server-side sessions. HttpWebRequest can handle cookies through its CookieContainer property (WebClient does not do this out of the box), if you're willing to go to the trouble of wiring it up.
3.) Reverse-engineering the website. In order to log in and properly store the authentication cookies (assuming the website does it this way), you first need to know exactly how the website works. Does the website store a session id in a cookie on your computer?
4.) Scraping websites. It is generally not recommended to scrape websites in order to process information. If the website administrator makes a few minor modifications to the website, will your program be robust enough to continue working without recoding? What if the administrator changes the login mechanism?
5.) There are a number of other considerations that may arise depending on the specifics of the implementation.
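To make point 2 concrete, here is a rough sketch of logging in and reusing the session cookie with HttpWebRequest. It assumes a plain forms-based login; the URLs and the "username"/"password" field names are placeholders — you would need to inspect the actual login form to get these right (which is exactly the reverse-engineering work described in point 3):

```csharp
// Sketch: forms-based login, then an authenticated request,
// sharing one CookieContainer so the session cookie is carried over.
using System;
using System.IO;
using System.Net;
using System.Text;

class CookieLoginSketch
{
    static void Main()
    {
        var cookies = new CookieContainer();

        // POST the login form; any Set-Cookie headers from the
        // server (e.g. a session ID) are captured in 'cookies'.
        var login = (HttpWebRequest)WebRequest.Create("https://www.example.com/login");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        login.CookieContainer = cookies;

        byte[] form = Encoding.UTF8.GetBytes("username=me&password=secret");
        using (Stream body = login.GetRequestStream())
            body.Write(form, 0, form.Length);
        login.GetResponse().Close();

        // Reuse the same CookieContainer so the session cookie
        // is sent back with the report request.
        var report = (HttpWebRequest)WebRequest.Create("https://www.example.com/report");
        report.CookieContainer = cookies;
        using (var response = (HttpWebResponse)report.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}
```

Note that this only works when the site uses cookie-based sessions over plain form posts; sites that rely on JavaScript, hidden tokens, or redirects during login will need additional handling.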
In short, simply downloading a file from a predictable or static URL is fairly simple to do using the WebClient and/or HttpWebRequest classes. Constructing a program to log in and scrape a website for information is, however, a considerably more difficult task and is highly prone to scalability and maintainability problems.
You should check with the site administrator to see if they offer a programming API for interfacing with their website (which most websites do not offer, unfortunately). APIs are (supposedly) stable and will not break when the site updates its pages.
Sincerely,
Nathan Davis
Phyrstorm Technologies, Inc.