Optimizing a query


shepherd

Programmer
Nov 8, 2000
48
US
I have a 6 GB log file which I have to import daily. I'm using DTS to import this data into a SQL table, and the process up to that point is streamlined. Afterwards, I need to process the data and insert it into a separate table that will always contain about a billion rows. That table has a clustered index on a date field and two other indexes, each on a single field. Logging for the table is turned off.

The question: can I optimize the insert process further, either through code (playing with the indexes, e.g. manually rebuilding them after inserts), through settings on the table itself, or through my approach to inserting the rows (bulk inserts vs. something else)?
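
Just for context, the target table looks roughly like this (all names here are made up, but the index layout matches what I describe above):

    -- Rough sketch of the target table; all object and column names are hypothetical
    CREATE TABLE LogHistory (
        LogDate  DATETIME     NOT NULL,
        SourceId INT          NOT NULL,
        Message  VARCHAR(255) NOT NULL
    )

    -- Clustered index on the date field
    CREATE CLUSTERED INDEX IX_LogHistory_LogDate ON LogHistory (LogDate)

    -- Two more indexes, each on a single field
    CREATE NONCLUSTERED INDEX IX_LogHistory_SourceId ON LogHistory (SourceId)
    CREATE NONCLUSTERED INDEX IX_LogHistory_Message ON LogHistory (Message)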

Thanks for the help!!
 
Hi there,
Bulk insert is the fastest method for inserting.
You can further optimize the bulk copy with the following steps:
1. Drop the indexes on the table in question.
2. Run the bulk copy command.
3. Re-create the indexes.

I have done this for a smaller amount of data (50 million records), and it really does give an increase in performance, though I have never tried it with a data transfer as large as yours.

You can also increase the performance while copying the data into the separate table by creating proper indexes on the source table.
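
Roughly, the three steps look something like this in T-SQL (table, index, and file names are just placeholders):

    -- 1. Drop the nonclustered indexes on the target table before loading
    DROP INDEX LogHistory.IX_LogHistory_SourceId
    DROP INDEX LogHistory.IX_LogHistory_Message

    -- 2. Bulk copy the new data in (TABLOCK helps when nothing else touches the table)
    BULK INSERT LogHistory
    FROM 'D:\data\daily_log.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

    -- 3. Re-create the indexes once the load is done
    CREATE NONCLUSTERED INDEX IX_LogHistory_SourceId ON LogHistory (SourceId)
    CREATE NONCLUSTERED INDEX IX_LogHistory_Message ON LogHistory (Message)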

 
Thanks for the reply,

Unfortunately, I don't know of a way to do a bulk insert from one table to another table. I am only aware of being able to do this from a data file to a table...
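
For example, the file-to-table form I'm thinking of is a straight BULK INSERT; the only table-to-table copy I know of is a plain INSERT ... SELECT, which is a regular insert rather than a bulk operation (table and file names here are made up):

    -- File to table: the bulk insert form I already use
    BULK INSERT LogStaging
    FROM 'D:\data\daily_log.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

    -- Table to table: as far as I know this has to be an ordinary insert
    INSERT INTO LogHistory (LogDate, SourceId, Message)
    SELECT LogDate, SourceId, Message
    FROM LogStaging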
 