I pasted the following into a query window, but it seems to insert the same 100,000 records on each batch. Is there an easy way to adjust it so it inserts the next 100,000 until all 4 million rows are inserted?
Thanks.
--Insert rows into MyTable from ImportTable
--where key value doesn't already exist.
--Set rowcount to 100000 to limit number of inserts per batch
Set rowcount 100000

--Declare variable for row count
Declare @rc int
Set @rc = 100000

While @rc = 100000
Begin
    Begin Transaction

    --Use tablockx and holdlock to obtain and hold
    --an immediate exclusive table lock. This usually
    --speeds the insert because only one lock is needed.
    Insert table1 (email, source)
    Select t.field1, 2
    From table2 t
    Where t.dupe is null And t.field1 is not null

    --Get number of rows inserted
    --Process will continue until less than 100000
    Select @rc = @@rowcount

    --Commit the transaction
    Commit
End
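As written, every pass of the loop selects the same first 100,000 rows, because nothing excludes rows that earlier batches already copied. One common fix, assuming `table1.email` can be matched back against `table2.field1`, is to add a `NOT EXISTS` test so each batch only picks up rows not yet in the target table. This is a sketch against the table names in the post, not tested against your actual schema:

```sql
--Limit each batch to 100000 rows
Set rowcount 100000

Declare @rc int
Set @rc = 100000

While @rc = 100000
Begin
    Begin Transaction

    --Skip rows whose email was already copied in an earlier batch
    Insert table1 (email, source)
    Select t.field1, 2
    From table2 t
    Where t.dupe is null
      And t.field1 is not null
      And Not Exists (Select 1 From table1 x Where x.email = t.field1)

    --Loop ends once a batch inserts fewer than 100000 rows
    Select @rc = @@rowcount

    Commit
End

--Restore normal behavior for the connection
Set rowcount 0
```

An index on `table1.email` matters here, since the `NOT EXISTS` probe runs against a growing target table. If `table2` has a key column, an alternative is to delete or flag processed rows each pass, which avoids the probe entirely.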