
Performance with large number of rows


jby1 (Programmer) | Apr 29, 2003 | 403 | GB
I have a (SQL 2000) table that will be growing by around 1.5 million rows per month. The row size is about 650 bytes. We will want to retain rows for either three months or twelve months, depending on feasibility. The table will be accessible by up to 15,000 users, although in reality I would be surprised if 1% of them ever actually access it. There will be a one-off insert of rows every month, by automatic import, and after that all access will be by select queries, for reports run by users.

Does anybody foresee any problems with this? What steps may I take to maximise performance? (Obviously I intend to specify indexes for frequently searched-on columns)
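For illustration, the kind of index I have in mind, with made-up table and column names:

-- Nonclustered index on a column the reports frequently filter on
-- (table and column names are only placeholders)
CREATE NONCLUSTERED INDEX IX_MonthlyData_CustomerID
    ON dbo.MonthlyData (CustomerID)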

Any help gratefully received!
 
Other than the obvious, such as having as much memory as possible on your DB server and running only SQL Server on that machine, I can only think of the following:

(1) Update the statistics after your insert of new rows each month, and possibly again every week, just to help the optimizer choose the best execution plan for the queries (a sketch follows this list).

(2) Are the queries all pre-built, or will users be creating one-off queries of their own? If the users are constantly generating one-off queries, you may want to update statistics a bit more frequently than if all the queries are pre-built.
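For example, something along these lines after the monthly load (the table name is just a placeholder):

-- Refresh optimizer statistics on the big table after the monthly bulk insert
UPDATE STATISTICS dbo.MonthlyData WITH FULLSCAN

-- Or refresh statistics for every user table in the database in one go
EXEC sp_updatestats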

Good Luck.
 
You could also split the data across several tables and recombine them with a view. For instance, instead of a single table mytable, you would create:

mytable_1
mytable_2
mytable_3
mytable_4

You drop mytable, then join the member tables back together in a view (SQL Server 2000 calls these partitioned views).

Try to find this in the documentation (Books Online).

Perhaps it helps.
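A rough sketch of what this looks like in T-SQL, with the table names, columns and partitioning ranges all invented for illustration:

-- Each member table holds one month's rows; the CHECK constraint on the
-- partitioning column lets the optimizer skip tables a query cannot match.
CREATE TABLE dbo.mytable_200301 (
    RowID       int          NOT NULL,
    ImportMonth int          NOT NULL CHECK (ImportMonth = 200301),
    Payload     varchar(650) NOT NULL,
    CONSTRAINT PK_mytable_200301 PRIMARY KEY (ImportMonth, RowID)
)

CREATE TABLE dbo.mytable_200302 (
    RowID       int          NOT NULL,
    ImportMonth int          NOT NULL CHECK (ImportMonth = 200302),
    Payload     varchar(650) NOT NULL,
    CONSTRAINT PK_mytable_200302 PRIMARY KEY (ImportMonth, RowID)
)
GO

-- The view unions the members back together; queries that filter on
-- ImportMonth only touch the relevant month's table.
CREATE VIEW dbo.mytable
AS
SELECT RowID, ImportMonth, Payload FROM dbo.mytable_200301
UNION ALL
SELECT RowID, ImportMonth, Payload FROM dbo.mytable_200302
GO

One nice side effect: dropping an old month for the retention rule then just means dropping one member table and adjusting the view.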




 
Just a word of encouragement: we run a LOT of databases/tables, but my favorite is a table with close to 500,000,000 rows of about the same row length as yours. It is on a very humble server, and it performs quite satisfactorily. The only BIG problem is if you have to do any sort of recovery/restore/reindex. Otherwise it is great.
 
Guys

Thanks very much for your responses. As a result of my investigations, I have concluded that what I thought of as a large table and what actually seems to constitute one are two entirely different things! Therefore I shouldn't have any problems with it. I am about to implement the solution, so I will then know for sure! What is particularly useful for me is that the large number of records is only imported once per month, and the import can be done overnight. Once imported, the data can be immediately condensed for running my reports from. The raw data will still be required for user display, but that is it.
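For what it's worth, the overnight condense step I have in mind is just a roll-up along these lines (all table and column names are invented for illustration):

-- Aggregate the newly imported month into a small summary table
-- that the reports query instead of the raw rows
INSERT INTO dbo.ReportSummary (ImportMonth, CustomerID, NumRows, TotalValue)
SELECT ImportMonth, CustomerID, COUNT(*), SUM(LineValue)
FROM dbo.RawImport
WHERE ImportMonth = 200305
GROUP BY ImportMonth, CustomerID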

Thanks again!
 