delphidestructor
Programmer
- Oct 20, 2000
- 67
I have a stored procedure that inserts records into a table that currently holds about 600,000 records. This stored procedure is called repeatedly from a middleware application that reads a text file and calls the procedure once for every line of the file that has valid information. At first, when the table was small, it would execute very quickly regardless of the size of the text data file. Now that the table has grown, it takes about 0.5 seconds per insert. With a file that may have 1000 lines of valid data or more, this takes some time. Would it be worth the time to reprogram the middleware application to write the valid data to another file and then run a bulk insert on that newly created file?
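If the bulk-load route is worth trying, here is a minimal sketch of what the middleware could hand off to SQL Server once it has written the valid lines to a staging file. The file path, table name, and delimiters below are assumptions, not details from the setup described above:

BULK INSERT dbo.MyImportTable          -- hypothetical target table
FROM 'C:\data\staged_rows.txt'         -- file the middleware writes out
WITH (
    FIELDTERMINATOR = '\t',            -- assuming tab-delimited output
    ROWTERMINATOR   = '\n',
    TABLOCK                            -- take a table lock so the load is minimally logged
);

One batch load like this avoids paying the per-call overhead (and per-row index maintenance and logging) a thousand times over.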
I’m at a loss here. The record size is only 32 bytes and there are only two indexes. The only thing unusual about this table is that the non-PK index includes a bit column in addition to the PK fields. I wouldn’t think that should cause much of a problem.
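For reference, a rough sketch of the table layout being described; the column and index names are hypothetical since they aren't given here:

-- Hypothetical structure: ~32-byte rows, a composite PK,
-- and a second index covering the PK columns plus a bit flag
CREATE TABLE dbo.MyImportTable (
    KeyCol1 int NOT NULL,
    KeyCol2 int NOT NULL,
    Flag    bit NOT NULL,
    CONSTRAINT PK_MyImportTable PRIMARY KEY (KeyCol1, KeyCol2)
);

CREATE INDEX IX_MyImportTable_Flag
    ON dbo.MyImportTable (KeyCol1, KeyCol2, Flag);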
Mike