I am writing an application that will need to convert the data from a DBF file into an Access database table.
The file has over 200,000 records in it.
The application reads the field lists from both the Access table and the DBF file and fills two listboxes so the user can map the fields to each other. When the user clicks the Continue button, the application builds the SQL statements needed to insert the values into the Access table.
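In rough outline, the generation loop does something like this per record. This is a minimal sketch, not my actual code: `lstAccess`, `lstDBF`, `tblTarget`, and the DBF path/name are all placeholders.

```vb
Dim db As DAO.Database
Dim rsDBF As DAO.Recordset
Dim fieldList As String, valueList As String
Dim sql As String
Dim i As Integer

Set db = DBEngine.OpenDatabase("C:\Data\Target.mdb")
' Open the DBF through Jet's dBASE ISAM driver (path and table name are placeholders).
Set rsDBF = db.OpenRecordset( _
    "SELECT * FROM [dBASE IV;DATABASE=C:\Data].[Customers]")

Do While Not rsDBF.EOF
    fieldList = "": valueList = ""
    ' lstAccess holds the Access field names, lstDBF the mapped DBF fields.
    For i = 0 To lstAccess.ListCount - 1
        fieldList = fieldList & lstAccess.List(i) & ","
        valueList = valueList & "'" & rsDBF.Fields(lstDBF.List(i)).Value & "',"
    Next i
    ' Trim the trailing commas and assemble the statement.
    sql = "INSERT INTO tblTarget (" & Left$(fieldList, Len(fieldList) - 1) & _
          ") VALUES (" & Left$(valueList, Len(valueList) - 1) & ")"
    ' Not executed yet -- right now I'm only building the strings.
    rsDBF.MoveNext
Loop

rsDBF.Close
```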
The listboxes also let you drop fields from the DBF file and insert Null values in their place when you can't match a field.
During SQL generation, I check each mapping for this Null marker in the listbox. When I find one, the code branches into a conditional that looks up the datatype of the Access table field and creates a default value; otherwise it just uses the value from the DBF file.
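The Null handling inside that loop looks roughly like the following. Again a sketch under the same placeholder names as above; `"<Null>"` stands in for whatever marker string my listbox actually uses, and the defaults shown are illustrative:

```vb
' Inside the per-record, per-field loop: substitute a default when the
' mapping says "<Null>" instead of naming a DBF field.
Dim fld As DAO.Field
Dim val As String

If lstDBF.List(i) = "<Null>" Then
    ' Look up the Access field's datatype and pick a default for it.
    ' Note this lookup currently happens once per field, per record.
    Set fld = db.TableDefs("tblTarget").Fields(lstAccess.List(i))
    Select Case fld.Type
        Case dbText, dbMemo
            val = "''"          ' empty string for text types
        Case dbDate
            val = "Null"        ' no sensible default date
        Case Else
            val = "0"           ' numeric types
    End Select
Else
    ' Use the value straight from the DBF field.
    val = "'" & rsDBF.Fields(lstDBF.List(i)).Value & "'"
End If
valueList = valueList & val & ","
```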
However, my code is apparently **EXTREMELY** inefficient: I'm not even executing the SQL statements yet, and it still takes probably 10 or 15 minutes just to work through the recordset (admittedly on only a 700 MHz CPU).
I'm sure this should be faster, but I can't come up with a better way to do it.
Also, the routine needs to stay generic, since it will be handling multiple different DBF files (though not at the same time).
Any suggestions on how I might improve the performance?