
Database Performance Question 1

Status
Not open for further replies.

CmPaiva (Programmer)
Nov 13, 2002
Hello, all!

I have the following question in mind:
Given a table named <MyTable>, which method is faster:
- using a Recordset object (Open rs, AddNew, assign the fields, then Update),
or
- using INSERT INTO MyTable (fields, ...) VALUES (vals, ...)?

It's mostly a performance question, because the SQL method requires less code (just an .Execute ...).

Thanks for your comments,
Carlos Paiva
 
I haven't actually measured that.
But.
As a general consideration:
Calling Execute involves starting an SQL interpreter - that is probably costly in terms of performance.

And with AddNew you can use UpdateBatch instead of Update (more speed gain?).

 
A straight bulk insertion is faster.
Use a Command object with a Parameter object for each field, set the Prepared property to True so that consecutive calls are fastest, and then run the action query through the Command object's Execute method.
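A minimal sketch of that approach in VB6 with ADO, assuming a Jet .mdb and made-up table, field, and path names:

```vb
' Sketch: prepared, parameterized INSERT via an ADO Command object.
' Connection string, table and field names are illustrative only.
Dim cn As ADODB.Connection
Dim cmd As ADODB.Command
Dim i As Long

Set cn = New ADODB.Connection
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb"

Set cmd = New ADODB.Command
With cmd
    Set .ActiveConnection = cn
    .CommandText = "INSERT INTO MyTable (Field1, Field2) VALUES (?, ?)"
    .Prepared = True    ' provider compiles the statement once and reuses it
    .Parameters.Append .CreateParameter("p1", adVarChar, adParamInput, 50)
    .Parameters.Append .CreateParameter("p2", adInteger, adParamInput)
End With

For i = 1 To 1000
    cmd.Parameters(0).Value = "Name" & i
    cmd.Parameters(1).Value = i
    cmd.Execute , , adExecuteNoRecords  ' no recordset built for an action query
Next
```

Only the parameter values change on each call, so the parsing cost is paid once, on the first Execute.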
 
CCLINT,
could you please give us some example?
I can't quite get what "bulk insertion" is.

Say, I have a table and I need to insert some thousand records.
How do I perform it if not by inserting line by line (via AddNew or INSERT)?

Any hints would be greatly appreciated.
 
Sorry, I mis-spoke by saying 'bulk'. Just remove the word 'bulk' from my statement.

What I meant was an ACTION query using INSERT INTO and the VALUES clause for a single record.

You use a bulk insertion of multiple records using the same action query when the values come from another table/file:

"INSERT INTO SomeTable (Field1, Field2) SELECT Field1, Field2 FROM SomeOtherTable"

You can also leave out the field names if both tables are identical.

You can also use Alias field names if the destination table has different field names.
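A hedged example of running the statement above from VB6, plus the variant where the destination fields have different names (cn is assumed to be an open ADODB.Connection; all names are placeholders):

```vb
' Assumes cn is an open ADODB.Connection; table/field names are made up.
cn.Execute "INSERT INTO SomeTable (Field1, Field2) " & _
           "SELECT Field1, Field2 FROM SomeOtherTable", , adExecuteNoRecords

' Destination fields named differently: list the destination names,
' and select the source names in the same order.
cn.Execute "INSERT INTO DestTable (NewField1, NewField2) " & _
           "SELECT OldField1, OldField2 FROM SourceTable", , adExecuteNoRecords
```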

 
It depends.

Parsing an SQL string is an awful lot of work, so avoiding it will speed up your program significantly. You can avoid it by using parameter queries (on the client) or stored procedures (on the server). Most SQL engines make use of indexes if they are there, so having the right indexes also speeds up the queries.

Recordset objects should be fast, but there again you should choose the right type. In DAO, you should use a table-type recordset and use indexes and the Seek method. I don't know how it is in ADO, but I am pretty sure that different cursors / settings make a hell of a difference.
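In DAO, that table-type/Seek approach looks roughly like this (database path, index name, and field names are assumptions):

```vb
Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = DBEngine.OpenDatabase("C:\data\mydb.mdb")
Set rs = db.OpenRecordset("MyTable", dbOpenTable)   ' table-type recordset

rs.Index = "PrimaryKey"     ' must be the name of an existing index
rs.Seek "=", 12345          ' direct index lookup - no SQL parsing at all
If Not rs.NoMatch Then
    Debug.Print rs!Field1
End If

rs.Close
db.Close
```

Seek only works on table-type recordsets against a local table, which is exactly why it can skip the SQL layer entirely.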

In general, opening resources for each action is bad for performance. Keeping the resources open (connections, parameter queries, recordsets, etc.) will give you much better performance.

Best regards
 
CCLINT,
First, thanks for quick reply.

So this method is for, say, copying from another table.
Fine.
And if we do not have such a table but are generating new records, should we still insert line by line (again, via AddNew or INSERT)?

(I just calculated the worst possible case - and it seems that my program will insert 300,000 records at start-up... So naturally I want to speed it up as much as possible.)

Any more light?
;)
 
> and it seems that my program will insert 300 000 records at start up

Where is the data for the many records coming from?
You could dump them into a (local) text file and then use a bulk update, transferring the records from the text file to the DB.

>Parsing an SQL string is an awful lot of work

It can be. But it does not need to be.
If it is written correctly, and, as I mentioned as well, you are using a Command object and the Prepared property, then it shouldn't matter after the first run.
An AddNew on the Recordset object, however, does just the same:
it uses an INSERT INTO statement to actually pass the update to the provider in this form, because this is how the provider understands and receives the data, and then it updates its records itself.
Once Update is called, ADO and the recordset manager pass the AddNew to the provider via an INSERT INTO statement. This needs to be parsed as well. And on top of that, ADO needs to prepare the statement to pass.
If I have already written it, and it has been prepared using a Command object, then the provider has already partially processed it.
 
Thanks all,for this discussion!

One question arises over what DonQuichote says.
>Parsing an SQL string is an awful lot of work

Well, using a recordset with the AddNew method still has the SQL parser job to be done (the table still has to be opened!), doesn't it? And if your destination is a join, then there is even more parser work to be done. So what's the gain in using the Recordset object for adding, say, a few hundred records from a source table (or tables) instead of its SQL counterpart?
Second, I'm not sure of this, but I think that internally the Recordset object still uses SQL commands to get the job done. Please comment.

Carlos Paiva
 
CCLINT:
>Where is the data for the many records coming from?
It's an output from another program - lots of small files in some hierarchy of folders.

>You could dump them into a (local) text file
But won't that be "double work" for the computer?
BTW all my files (DB too) are local.

>and then use a bulk update, transfering the records from the text file to the db.
probably using the "ADO text driver" mentioned elsewhere?
Well, probably I should try it and see if there is a speed gain. But I'm afraid the ADO text driver will use too-strong-for-the-task methods and will be slow.

>An AddNew on the recordset object however does just the same:
>It uses an INSERT INTO statement ...

I went with Recordset.AddNew because I thought "adding an empty record is simple - so here we have no SQL interpreter involved".
Besides, I use the Recordset.UpdateBatch method - so it updates the table not on each record, but only when I call it.

Using a Command object and the Prepared property, it seems the INSERT statements will go faster. But can I use something like UpdateBatch (so it doesn't write each record to the DB but caches them somehow)?
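For reference, the UpdateBatch pattern mentioned above looks roughly like this in ADO; UpdateBatch needs a client-side cursor and batch-optimistic locking, and all names here are placeholders:

```vb
' Assumes cn is an open ADODB.Connection; table/field names are made up.
Dim rs As ADODB.Recordset
Dim i As Long

Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient
' WHERE 1=0 opens an empty but updatable recordset over the table
rs.Open "SELECT Field1, Field2 FROM MyTable WHERE 1=0", cn, _
        adOpenStatic, adLockBatchOptimistic

For i = 1 To 1000
    rs.AddNew
    rs!Field1 = "Name" & i
    rs!Field2 = i
Next

rs.UpdateBatch   ' all pending rows are sent to the provider in one call
rs.Close
```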
 
>DB but cache it somehow

Use a transaction.
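A sketch of that, assuming cn is an open ADODB.Connection and cmd is a prepared INSERT command as discussed earlier in the thread:

```vb
' Illustrative fragment - error handling labels and record count are made up.
cn.BeginTrans
On Error GoTo Rollback

Dim i As Long
For i = 1 To 300000
    ' ... set cmd's parameter values for record i ...
    cmd.Execute , , adExecuteNoRecords
Next

cn.CommitTrans      ' one commit instead of 300,000 individual writes
Exit Sub

Rollback:
cn.RollbackTrans
```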

Anyway, if the data is all local, and the files are in some consistent ASCII format, then why not just use the JET ISAM text driver to insert the records into the local DB (MDB?)

You set a connection on the DB, and then use an action query on the text file - two lines of code for each file.
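A hedged sketch of those two lines using the JET text ISAM; the folder, file name, and HDR setting are assumptions, and the text file's column layout can additionally be pinned down with a schema.ini in the same folder:

```vb
' Jet treats the folder as a "database" and the text file as a table.
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb"

cn.Execute "INSERT INTO MyTable SELECT * FROM " & _
           "[Text;Database=C:\data\import;HDR=Yes].[records#txt]", _
           , adExecuteNoRecords
```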

But, this is getting off track to CmPaiva's original question. Please start a new thread.
 
CCLINT:
>But, this is getting off track to CmPaiva's original question. Please start a new thread.

I think my questions all answered.

Thank you for your time.
 