
Alternative to a cursor


jon24422531

Hi guys
I always get a good response here, so I am confident of an answer to this:
For historic reasons our company creates a new database for each new customer. These consist of several identical tables, two of which store all the pertinent information for that customer. Some of these tables are large, some not so, but they are all identical in structure.
Now we would like to copy all the data from each DB (around 13,000,000 records in total) into one large DB on another server each evening. Our DBA (who is away this week) always approaches these issues the same way: open QA and type "DECLARE CURS CURSOR FOR SELECT...."
We run this from the target server:
Code:
-- @Database holds the current customer database name; @SQL the dynamic INSERT
DECLARE @Database VARCHAR(100), @SQL VARCHAR(1000)

-- one row per customer database that actually exists on SERVER4
DECLARE CURS CURSOR FOR
	SELECT C.CustID
	FROM SERVER4.MAINDB.dbo.CUSTOMER C
	JOIN SERVER4.master.dbo.sysdatabases D ON C.CustID = D.name
	ORDER BY C.CustID
OPEN CURS
FETCH CURS INTO @Database
WHILE @@FETCH_STATUS = 0
	BEGIN
	-- copy this customer's STORAGE table, prefixing each row with the DB name
	SET @SQL = 'INSERT INTO ALLCUSTS..STORAGE SELECT ''' + @Database + ''', * FROM SERVER4.' + @Database + '.dbo.STORAGE'
	EXEC(@SQL)
	FETCH CURS INTO @Database
	END
CLOSE CURS
DEALLOCATE CURS
This often fails and sends me this Email:

JOB RUN: 'MAINDB : All Custs STORAGE' was run on 20/03/2006 at 18:30:01
DURATION: 12 hours, 39 minutes, 24 seconds
STATUS: Failed
MESSAGES: The job failed. The Job was invoked by Schedule 1 (Mon - Sat, 18:30). The last step to run was step 2 (Cursively INSERT From Customer DB's).

Step 1, incidentally, is to truncate the table.
So my question is: is there a better way to run this than using a cursor?

Jon
 
This problem is sequential by nature - get into one database, do the INSERT INTO, proceed with the next database, etc. And the cost of a read-only cursor is irrelevant compared to the INSERT itself (it is massive, right?). So any approach without a cursor won't make it any faster. Even the well (un)known sp_msForeachDB sproc uses a worker cursor for internal handling.
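Just to illustrate (not a recommendation): if the job ran on SERVER4 itself and pushed through a linked server back to the target box - I'm calling that linked server TARGET here, purely an assumption - the loop could be hidden behind the undocumented sp_MSforeachdb. It's still a cursor underneath, so don't expect it to be any faster:
Code:
-- Hypothetical rewrite, run ON SERVER4 (not the target server), assuming a
-- linked server named TARGET that points back at the ALLCUSTS box.
-- sp_MSforeachdb is undocumented; it substitutes each database name for ?.
-- The IF EXISTS check skips system DBs and anything not listed in CUSTOMER,
-- mirroring the join against sysdatabases in the original cursor.
EXEC master.dbo.sp_MSforeachdb
	'IF EXISTS (SELECT 1 FROM MAINDB.dbo.CUSTOMER WHERE CustID = ''?'')
		INSERT INTO TARGET.ALLCUSTS.dbo.STORAGE
		SELECT ''?'', * FROM [?].dbo.STORAGE'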

> Now we would like to copy all the data from each DB (total around 13,000,000 records) to another server on one large DB each evening.

How about doing it selectively - taking only rows inserted/changed since the last "synchronization"?

 
Vongrunt
Thanks for your swift response. I guess you are right that in this instance only a cursor will do. However, I would like to know a little more about synchronization. Is it in BOL and is it fairly easy to understand?

Jon
 
> Is it in BOL...

Yup, see "replication".

> and is it fairly easy to understand?

Once all that terminology is absorbed :) - IMHO, yes. The implementation is sometimes tricky, though.

To oversimplify the concept: imagine the table to export has a timestamp column. Whenever a row gets inserted or updated, it receives a new timestamp value - always greater than the last one (see @@DBTS in BOL). So if during every import you remember the greatest timestamp for each table, next time only rows with greater timestamps need to be transferred.
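A rough sketch of how that high-water-mark idea might look for a single customer database - assuming a timestamp column (call it RV) is added to each customer's STORAGE table and a small tracking table ALLCUSTS.dbo.SyncPoint(DBName, LastRV) is created on the target server; every one of those names is made up. In the real job this would sit inside the existing loop, the TRUNCATE in step 1 would go away, and updated rows would need replacing rather than plain inserting (not shown here):
Code:
-- one customer database, hard-coded here only for illustration
DECLARE @Database SYSNAME, @SQL NVARCHAR(2000), @LastRV BINARY(8), @MaxRV BINARY(8)
SET @Database = 'SomeCustomerDB'	-- in the real job this comes from the cursor

-- high-water mark from the previous run (NULL the first time through)
SELECT @LastRV = LastRV FROM ALLCUSTS.dbo.SyncPoint WHERE DBName = @Database

-- copy only rows versioned after the last run; in practice the column list
-- would be spelled out, since the target can't store the RV value as-is
SET @SQL = N'INSERT INTO ALLCUSTS..STORAGE
	SELECT ''' + @Database + ''', *
	FROM SERVER4.' + @Database + '.dbo.STORAGE
	WHERE @LastRV IS NULL OR RV > @LastRV'
EXEC sp_executesql @SQL, N'@LastRV BINARY(8)', @LastRV

-- remember the new high-water mark for tomorrow's run
-- (the very first run would need an INSERT into SyncPoint instead)
SET @SQL = N'SELECT TOP 1 @MaxRV = RV FROM SERVER4.' + @Database + '.dbo.STORAGE ORDER BY RV DESC'
EXEC sp_executesql @SQL, N'@MaxRV BINARY(8) OUTPUT', @MaxRV OUTPUT
UPDATE ALLCUSTS.dbo.SyncPoint SET LastRV = @MaxRV WHERE DBName = @Database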

 