
Most efficient use of ADO when looping?


Jasen
May 1, 2002
I have a program, written as a service, that periodically opens all the files in a specific directory, parses them, and sends the results to a stored procedure (SQL Server is localhost) to be processed and stored away. Then it deletes the file and moves on to the next. I've tried it a couple of different ways, and although it works fine, it seems pretty slow to me. Here's a generic example of the way I'm currently doing it; does anyone see any obvious traps?

Code:
Sub Main()

' Dim variables, set up the cnn object, and open the connection here

curFile = Dir(myPath)

Do While curFile <> ""
     ' For every file I open I recreate the command and prm
     ' objects. I suppose I could do this once, globally, but
     ' this part of the process is really not slow.
     Set cmd = New ADODB.Command
        
     cmd.ActiveConnection = cnn
     cmd.CommandText = "spCountPoints"
     cmd.CommandType = adCmdStoredProc
       
     Set prmNode = cmd.CreateParameter("Node", _ 
     adVarChar, adParamInput, 50, node)

     cmd.Parameters.Append prmNode
     '5 more prm's after this one

     fileNum = FreeFile
     Open curFile For Input As #fileNum

     Do While Not EOF(fileNum)
           ' Line Input, and parse out the text here
           prmNode.Value = sNode
           ' 5 more prm values set here
           ' each line parsed in the file gets a SQL call
           cmd.Execute
     Loop ' go to next line
     
     Close #fileNum
     Kill curFile
     Set prmNode = Nothing   ' and the 5 other prm's
     Set cmd = Nothing
     curFile = Dir ' advance to the next file
Loop ' next file

' all files are done
cnn.Close
Set cnn = Nothing

End Sub

An alternative I considered was to open a table and push an entire file to it in batch mode, then fire off the stored procedure after the parsed file was finished. I'm not sure that would really save time, though, and it would require a rewrite of the SP.
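
Roughly what I had in mind (an untested sketch; the TempData table, its Node column, and the spCountPointsBatch procedure are made-up names):

Code:
Sub PushFileInBatch(ByVal cnn As ADODB.Connection, ByVal curFile As String)

    Dim rs As ADODB.Recordset
    Dim fileNum As Integer
    Dim sLine As String

    ' Client-side cursor with batch locking buffers inserts locally
    Set rs = New ADODB.Recordset
    rs.CursorLocation = adUseClient
    ' WHERE 1 = 0 fetches the table structure without any rows
    rs.Open "SELECT Node FROM TempData WHERE 1 = 0", cnn, _
            adOpenStatic, adLockBatchOptimistic

    fileNum = FreeFile
    Open curFile For Input As #fileNum
    Do While Not EOF(fileNum)
        Line Input #fileNum, sLine
        rs.AddNew
        rs!Node = sLine   ' parse the real fields out of sLine here
        rs.Update         ' buffered locally until UpdateBatch
    Loop
    Close #fileNum

    rs.UpdateBatch        ' one batch to the server instead of a call per line
    rs.Close
    Set rs = Nothing

    ' Set-based SP works the whole staging table at once
    cnn.Execute "spCountPointsBatch", , adCmdStoredProc + adExecuteNoRecords

End Sub

That way the server would see one batch per file instead of one call per line.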
 
The most efficient way of doing this would be to skip ADO entirely.

I wrote a routine recently that dumps data into a SQL Server database. It looks like this...

Code:
Private Sub ImportRecordOfType(ByVal FileName As String, ByVal RecordType As String, ByRef Server As SQLDMO.SQLServer)

    Dim BULK As SQLDMO.BulkCopy

    ' Replace the last character of the file name with the record type
    FileName = VBA.Left$(FileName, Len(FileName) - 1) & RecordType

    ' FileExists is a user-defined helper
    If Not FileExists(FileName) Then
        Exit Sub
    End If

    ' Clear the staging table, bulk-load the file into it, run the
    ' stored procedure that moves the rows where they belong, then
    ' clear the staging table again
    Call Server.Databases.Item("<TableName>").ExecuteImmediate("Truncate Table TempData")
    Set BULK = New SQLDMO.BulkCopy
    BULK.DataFilePath = FileName
    BULK.RowDelimiter = vbCrLf
    Call Server.Databases.Item("<DatabaseName>").Tables.Item("TempData").ImportData(BULK)
    Call Server.Databases.Item("<DatabaseName>").ExecuteImmediate("ImportRecordType" & RecordType)
    Call Server.Databases.Item("<DatabaseName>").ExecuteImmediate("Truncate Table TempData")
    
End Sub

Basically, I bulk loaded the data into a temp table, then executed a stored procedure to put it where it belongs.
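
The calling side isn't shown, but wiring it up would look something like this (the server name and file path are placeholders, and I'm assuming Windows authentication; note the routine swaps the last character of the file name for the record type):

Code:
Dim srv As SQLDMO.SQLServer

Set srv = New SQLDMO.SQLServer
srv.LoginSecure = True      ' trusted (Windows) authentication
srv.Connect "(local)"

' Looks for C:\Data\Export.txA, per the Left$ substitution above
Call ImportRecordOfType("C:\Data\Export.txt", "A", srv)

srv.DisConnect
Set srv = Nothing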

-George

Strong and bitter words indicate a weak cause. - Fortune cookie wisdom
 
I tried to hide the database name, but made a mistake. Where you see <TableName>, change it to <DatabaseName>.

-George

Strong and bitter words indicate a weak cause. - Fortune cookie wisdom
 
Interesting method. I'll have to play with that.
 
Back to the ADO thing...

If you just want to execute a stored procedure with 5 input parameters, and it doesn't return a recordset or output parameters, you can just use the .Execute method of your ADO connection object.

cnn.Execute "spCountPoints 'one', 2, 'THREE', " & Four & ", 5"
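
If the procedure returns nothing, you can also pass adExecuteNoRecords so ADO doesn't bother building a recordset around the call, e.g.:

Code:
cnn.Execute "spCountPoints 'one', 2, 'THREE', " & Four & ", 5", , _
            adCmdText + adExecuteNoRecords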

 
Yes, but using the connection's .Execute method isn't any faster than using a command object's .Execute method (OK, maybe you gain a little time by not having to create parameter objects?), and running straight-text SQL statements like that opens you up to possible injection attacks if you're allowing user input anywhere.

My original problem was that the routine was making a SQL call for every line in the file, with several thousand lines per file and several hundred files in all. The app creating the text files was actually producing them faster than my app could process them.
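
The cheap win on the pure-ADO side is what I hinted at in my code comment: create the command and parameter objects once, before the file loop, and only touch the .Value's inside it. Roughly:

Code:
' Once, before the file loop
Set cmd = New ADODB.Command
cmd.ActiveConnection = cnn
cmd.CommandText = "spCountPoints"
cmd.CommandType = adCmdStoredProc
cmd.Parameters.Append cmd.CreateParameter("Node", adVarChar, adParamInput, 50)
' ...append the 5 other parameters once...

' Then, for each parsed line:
cmd.Parameters("Node").Value = sNode
' ...set the other 5 values...
cmd.Execute , , adExecuteNoRecords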

George, I tried out your example, and it was an order of magnitude faster. It bulk imported 130 text files of about 650 KB each in roughly 4 minutes, plus maybe another 5 for the stored procedure to parse the rows in the table and do what I wanted with them. Running a SQL command after each line input had taken about 15-20 minutes for a single file, and hours for all of them. I need more stars.
 
Wait!

130 Files in 4 minutes with my method.
1 file in 15 minutes with the old method.

That's incredible. I knew it would be faster, but I didn't really expect that. I'm glad to have helped.

-George

Strong and bitter words indicate a weak cause. - Fortune cookie wisdom
 
<I need more stars.

I'll give him one. I can use that too!

Bob
 