I have a 6 GB log file that I have to import daily. I'm using DTS to import this data into a SQL table, and the process up to that point is streamlined. Afterwards, I need to process the data and insert it into a separate table that will always contain about a billion rows. The table has a clustered index on a date field and two other indexes, each on a single field. Logging for the table is turned off.
The Question: Can I optimize the insert process further, either through code (playing with the indexes, e.g. manually rebuilding them after the inserts), through settings on the table itself, or through my approach to inserting the rows (bulk inserts vs. something else)?
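To make that concrete, here is a rough sketch of the kind of index manipulation I have in mind. The table, column, and index names (dbo.BigTable, dbo.StagingTable, LogDate, FieldA, FieldB, IX_BigTable_FieldA, IX_BigTable_FieldB) are made up for illustration; my real schema differs.

    -- Drop the two nonclustered indexes before the daily load
    -- (hypothetical names; the clustered index on the date field stays in place).
    DROP INDEX BigTable.IX_BigTable_FieldA
    DROP INDEX BigTable.IX_BigTable_FieldB

    -- Load the day's rows from the staging table that the DTS package fills.
    INSERT INTO dbo.BigTable WITH (TABLOCK) (LogDate, FieldA, FieldB)
    SELECT LogDate, FieldA, FieldB
    FROM dbo.StagingTable

    -- Recreate the nonclustered indexes once the load is finished.
    CREATE INDEX IX_BigTable_FieldA ON dbo.BigTable (FieldA)
    CREATE INDEX IX_BigTable_FieldB ON dbo.BigTable (FieldB)

Is this drop/load/recreate pattern worth it on a table this size, or is there a better approach?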
Thanks for the help!!