
sql passthrough 1

Sep 17, 2001
I am not sure if this is possible, but here is what I need. I am using FoxPro 7 as a front end to MS SQL Server 7. I want to take, let's say, 10 records from a cursor and insert them into a table on my SQL Server, but I don't want to post the records until I am ready, so that a power outage during the insert process doesn't leave bad data behind. If possible, can someone demonstrate how to prepare the inserts of the 10 records on the server side, then finalize them when done? Finally, if the user wants to cancel the current transaction, I need to be able to roll back the last set of inserts. Thanks!
 
Hi,
"So in case there is a power outage during the insert process I don't have bad data."

For that you have to use Transactions or Buffered Tables.

BEGIN TRANSACTION
SELECT statement ... INTO CURSOR MyCursor
SELECT MyTable
APPEND FROM DBF('MyCursor')
END TRANSACTION

If a power failure happens in the middle of the above routine, you won't get the appended records.

If you want to use buffered tables:

CURSORSETPROP("BUFFERING", 5, "MYTABLE")

SELECT statement ... INTO CURSOR MyCursor

BEGIN TRANSACTION
SELECT MYTABLE
APPEND FROM DBF('MyCursor')
TABLEUPDATE(1)
END TRANSACTION
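If you want to check whether the update really succeeded, here is a rough sketch of reverting on failure (this assumes MYTABLE lives in a database container, so BEGIN TRANSACTION/ROLLBACK apply; all names are placeholders):

BEGIN TRANSACTION
SELECT MYTABLE
APPEND FROM DBF('MyCursor')
IF TABLEUPDATE(.T., .F., "MYTABLE")     && try to write all buffered rows
    END TRANSACTION                     && success: make the appends permanent
ELSE
    ROLLBACK                            && failure: cancel the transaction...
    = TABLEREVERT(.T., "MYTABLE")       && ...and discard the buffered rows
ENDIF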


HTH

Suggestions Invited...

gchandrujs
 
Rob,

What you want to do is certainly possible. Gchandrujs mentioned transactions. That is the usual approach, although you need to write the code in a stored procedure on the server, not in VFP as he suggested.

There is no point in doing it in VFP, because at the time that you are creating your VFP cursor, the server is not aware of what you are doing.

You need a stored procedure on the server that looks something like this:

BEGIN TRANSACTION
INSERT ..... (first insert)
IF @@ERROR <> 0
BEGIN
    -- something went wrong
    ROLLBACK TRANSACTION
    RETURN
END
-- repeat the above for the remaining inserts
-- finally, commit the transaction
COMMIT TRANSACTION
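Calling it from VFP is then a single SQLEXEC(). A rough sketch, assuming the procedure was saved under the hypothetical name usp_InsertBatch and lnConn is an already-open connection handle:

lnResult = SQLEXEC(lnConn, "EXEC usp_InsertBatch")
IF lnResult < 0
    AERROR(laErr)                && fetch the ODBC/server error details
    MESSAGEBOX("Insert batch failed: " + laErr[1,2])
ENDIF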

A simpler approach is to batch the inserts into one SQLEXEC(). In VFP, do this:

lcCommand = ;
    "INSERT .... (first insert) " + ;
    "INSERT .... (second insert) " + ;
    etc.

SQLEXEC(lnConn, lcCommand)

SQL Server will treat the batch of commands as a single unit; put a BEGIN TRANSACTION at the start of the batch and a COMMIT TRANSACTION at the end and the whole thing becomes one transaction.
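A rough sketch of building such a batch from a source cursor (MyCursor, the authors table and lnConn are placeholders, and the string-building assumes the values contain no embedded quotes):

* Build one command string from the source cursor and send it in a single
* SQLEXEC(); the explicit transaction makes the batch all-or-nothing.
lcCRLF    = CHR(13) + CHR(10)
lcCommand = "SET XACT_ABORT ON" + lcCRLF + "BEGIN TRANSACTION" + lcCRLF
SELECT MyCursor
SCAN
    lcCommand = lcCommand + ;
        "INSERT INTO authors (au_id, au_lname) VALUES ('" + ;
        ALLTRIM(MyCursor.au_id) + "', '" + ALLTRIM(MyCursor.au_lname) + "')" + lcCRLF
ENDSCAN
lcCommand = lcCommand + "COMMIT TRANSACTION"

IF SQLEXEC(lnConn, lcCommand) < 0
    * the batch failed as a unit; XACT_ABORT rolled the transaction back
    AERROR(laErr)
    MESSAGEBOX("Batch insert failed: " + laErr[1,2])
ENDIF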

Neither of these approaches will let you trap for the user cancelling. There's no way of doing that on the server, because the server is not aware of the user interface. Put another way, once you have sent the commands, it will be too late to change your mind.


Mike Lewis
Edinburgh, Scotland
 
Hi,

I am using MySQL, which does not support transactions, and VFP 3, which does not allow more than one query in a SQLEXEC(). Therefore I have to use a simple but effective trick to achieve your requirement.

First of all, insert everything into a temp table on the server. Once the job is done and you are sure you want to insert into the real target, execute an INSERT INTO xxx SELECT yyy FROM temp_table. You can abandon the operation at any time before that final INSERT is sent.
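A minimal sketch of that trick from the VFP side (the staging_authors and authors tables, the columns, and lnConn are all placeholders):

* Load everything into a staging table first.
SQLEXEC(lnConn, "CREATE TABLE staging_authors (au_id char(11), au_lname varchar(40))")
SELECT MyCursor
SCAN
    SQLEXEC(lnConn, "INSERT INTO staging_authors (au_id, au_lname) " + ;
        "VALUES (?MyCursor.au_id, ?MyCursor.au_lname)")
ENDSCAN

* Only this statement touches the real table; skip it to "cancel".
SQLEXEC(lnConn, "INSERT INTO authors (au_id, au_lname) " + ;
    "SELECT au_id, au_lname FROM staging_authors")
SQLEXEC(lnConn, "DROP TABLE staging_authors")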
 
Thanks for the comments. So there is no way, using SQL passthrough, to prepare inserts and then post them? I would think there is a way to buffer the data on the SQL Server side and then pass an execute. Then if there were a power outage in the middle of inserting into the buffer, nothing would commit; but if everything inserted and the commit was executed, I would be set. I guess the key is that I need to insert 1 to 100+ rows into a SQL Server buffer of some kind and then pass a commit-transaction command. SQLPREPARE()? Thanks for any thoughts you all have.
 
I was looking some more and found this, but I'm not sure if it is exactly what I need:

= SQLSETPROP(gnConnHandle, 'Transactions', 2)    && Manual transactions
DO WHILE !EOF()
    = SQLEXEC(gnConnHandle, "INSERT INTO authors (au_id, au_lname) " + ;
        "VALUES ('aupoe', 'Poe')")               && Modify the authors table
    SKIP                                         && move to the next source record
ENDDO
= SQLCOMMIT(gnConnHandle)                        && Commit the changes
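Looking at the help, the counterpart of SQLCOMMIT() appears to be SQLROLLBACK(), which should cover the cancel case from my original question. A rough sketch, assuming llUserCancelled is a hypothetical flag set by the UI:

IF llUserCancelled
    = SQLROLLBACK(gnConnHandle)    && discard every INSERT since the last commit
ELSE
    = SQLCOMMIT(gnConnHandle)      && make the inserts permanent
ENDIF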
 
The code worked. I can insert into some limbo place, and when I am done inserting the 1 to xxx records I execute the commit and, voilà, all the records commit to the table. I am just wondering where this limbo place is and whether it could get filled up with bad data. Any ideas?
 
When I inserted 3 more rows and then did not commit, to simulate a power failure, the table became locked up. I disconnected the handle and then reconnected with a different handle. Any hints on what happened and how to fix this?
 
OK, here it is. I closed my FoxPro session because I could not even query the data in the test table from the server, and the server immediately released the lock. Somehow the server had locked my test table and would not even let me select data from it until I shut FoxPro down. So if this new program I am writing locks up in the middle of an insert and never executes the commit, I guess I am in big trouble? Why would this happen? Will this SQLPREPARE/SQLCOMMIT approach work in a multi-user environment?
 
OK, last word and then I will shut up; hopefully someone has a bright idea. When I execute the insert and then try to select records from the table, it is locked. As soon as I execute the commit, the lock is released. I obviously can't have this happen. Help!
 
Hi again Rob,

You seem to be making some progress -- at least, you are learning about the issues. Good old trial and error.

You asked if there was a way to prepare the statements on the server until you are ready to execute them. There is ... you use a feature called Prepare/Execute. Essentially, you send the command to the server using SQLPREPARE(). Then you keep on calling SQLEXEC() without a second parameter as many times as you like. Each call will re-execute the command. If you can parameterise the command, it will run with different data each time.
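A rough sketch of what that looks like (the table, the columns, MyCursor and lnConn are placeholders):

* Prepare the parameterised INSERT once; the ?pcId / ?pcName variables are
* evaluated each time the prepared command is executed, so define them first.
pcId   = ""
pcName = ""
= SQLPREPARE(lnConn, "INSERT INTO authors (au_id, au_lname) VALUES (?pcId, ?pcName)")
SELECT MyCursor
SCAN
    pcId   = MyCursor.au_id
    pcName = MyCursor.au_lname
    IF SQLEXEC(lnConn) < 0        && re-execute the prepared command with the new values
        EXIT
    ENDIF
ENDSCAN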

However, the aim of Prepare/Execute is to improve performance. It does not solve your problem of power outages, because that can still happen in the middle of the process.

I think my original plan of batching the INSERTs and sending them in a single command is a good one. A similar approach would be to write all the commands to a text file, and send that as a batch.

Your idea of using SQLSETPROP(gnConnHandle, 'Transactions', 2) with SQLCOMMIT() is another possibility. This is essentially transaction processing controlled by the client. However, I read somewhere (the Hacker's Guide I think) that this is an inefficient and unreliable method. It looks like you have discovered that for yourself <g>.

Hope you manage to crack it.

Mike Lewis
Edinburgh, Scotland
 