
SQL Indexing question 3

Status
Not open for further replies.

jabrony76

Technical User
Apr 23, 2001
125
US
Hi all,

Here's the situation. We house medical claims data and desperately need to speed up transactions because some of the tables are 50+ million rows. One suggestion was to re-visit our indexing.

The data is loaded in monthly batches, and each batch has its own unique ID, which we've indexed. Do we need to re-create or rebuild the index after adding next month's batch?

Also, is there any rule of thumb for the number of possible entries for an index? Would it be advisable to add an index to something like a diagnosis code (there are approximately 15,000 possible codes)?

Any help would be great!
Thanks.
 
Indexes should be added to columns that you are using for filtering.
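For example, a nonclustered index on a commonly filtered column might look like this (the table and column names below are hypothetical, not from the original poster's schema):

```sql
-- Hypothetical schema: Claims(BatchID, ClaimID, DiagnosisCode, ...)
-- Nonclustered index on the column used in WHERE clauses.
CREATE NONCLUSTERED INDEX IX_Claims_BatchID
    ON Claims (BatchID);
```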

Yes, you should rebuild the indexes after each batch load. This should improve performance.
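On SQL Server 2000 (current when this thread was posted), one way to rebuild after a load is DBCC DBREINDEX; the table name and fill factor below are hypothetical examples:

```sql
-- Rebuild every index on the (hypothetical) Claims table with a
-- 90% fill factor after the monthly batch load.
DBCC DBREINDEX ('Claims', '', 90);

-- Then refresh statistics so the optimizer sees the new data.
UPDATE STATISTICS Claims;
```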

Denny

--Anything is possible. All it takes is a little research. (Me)
 
Run Profiler, leave it running for a while, and save the trace to a table (your workload). Then analyze it manually or through the Index Tuning Wizard (up to the recommendation step). This will give you extra information about columns that probably need indexing.

Be careful with clustered indexes.

Also consider playing with filegroups and disks. With this amount of data, physical storage becomes very important.
 
Sounds like indexing may be your first issue, but there is plenty more to performance tuning.

Consider putting your indexes and your transaction logs on separate physical drives from the one(s) your data is on. Indexed views might also help.
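One way to separate indexes from data in SQL Server is a filegroup mapped to a different physical drive; a minimal sketch, with database, file, path, and index names all hypothetical:

```sql
-- Hypothetical filegroup on a separate physical drive for indexes.
ALTER DATABASE ClaimsDW ADD FILEGROUP IndexFG;

ALTER DATABASE ClaimsDW ADD FILE
    (NAME = 'ClaimsDW_Idx1',
     FILENAME = 'E:\SQLData\ClaimsDW_Idx1.ndf',
     SIZE = 10GB)
TO FILEGROUP IndexFG;

-- Place a nonclustered index on that filegroup, away from the data.
CREATE NONCLUSTERED INDEX IX_Claims_DiagnosisCode
    ON Claims (DiagnosisCode)
    ON IndexFG;
```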

This should also speed up processing.

Go through your code and eliminate cursors and select * and use joins instead of subqueries and all sorts of little things you can do to improve performance.
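As an illustration of the subquery-to-join rewrite (tables and columns are hypothetical; the two queries are equivalent here if BatchID is unique in Batches):

```sql
-- Instead of a subquery like this...
SELECT c.ClaimID, c.BatchID
FROM Claims c
WHERE c.BatchID IN
    (SELECT b.BatchID FROM Batches b WHERE b.LoadDate >= '20010401');

-- ...use an explicit join, and name only the columns you need
-- rather than SELECT *.
SELECT c.ClaimID, c.BatchID
FROM Claims c
JOIN Batches b ON b.BatchID = c.BatchID
WHERE b.LoadDate >= '20010401';
```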

You might be best off getting a couple of books on performance tuning to help you when dealing with a database this size. Little things that don't matter a hill of beans when you have 50,000 records can make a big difference at 50 million.

Questions about posting. See faq183-874
 
Thanks all for the suggestions - I will consider them all. Our physical storage is a rather large, IT-supported SAN (10 TB, I believe), so we are able to physically store all the necessary data/logs/indexes.

End-user apps use views, but we are the team that builds the warehouse, not end users, so we find ourselves digging in and transforming data with inserts/updates/deletes.

 
As each batch has a unique ID, if you are accessing via that, then consider loading each batch into a different table and accessing via a partitioned view. It will make things easier to handle - especially if the data is static once loaded.
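A sketch of that approach (all names hypothetical): each month's batch goes into its own table with a CHECK constraint on the batch ID, and a UNION ALL view lets a query filtered on a specific batch touch only the matching table:

```sql
-- One table per monthly batch; the CHECK constraint lets the
-- optimizer skip tables whose batch can't match the filter.
CREATE TABLE Claims_200103 (
    BatchID       INT NOT NULL CHECK (BatchID = 200103),
    ClaimID       INT NOT NULL,
    DiagnosisCode CHAR(5) NOT NULL
);

CREATE TABLE Claims_200104 (
    BatchID       INT NOT NULL CHECK (BatchID = 200104),
    ClaimID       INT NOT NULL,
    DiagnosisCode CHAR(5) NOT NULL
);
GO

CREATE VIEW AllClaims AS
SELECT BatchID, ClaimID, DiagnosisCode FROM Claims_200103
UNION ALL
SELECT BatchID, ClaimID, DiagnosisCode FROM Claims_200104;
GO

-- A query filtered on BatchID only reads the matching table.
SELECT ClaimID FROM AllClaims WHERE BatchID = 200104;
```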


======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
