
Error 7391....unable to begin a distributed transaction


JoeAtWork

Programmer
Jul 31, 2005
2,285
CA
Hello good people of the SQL Programming forum!

I have an SQL 2000 database with a Company table. When a certain field changes from "Prospect" to "Customer", I wish to insert the record into a FoxPro 2.6a table. I have set up a linked server to the FoxPro database.

I have written a stored procedure that does the copying of information from the SQL database to the FoxPro database. This works excellently when called from Query Analyzer.

However, to do this automatically when the status changes from "Prospect" to "Customer", I wrote an update trigger for the Company table that calls the stored procedure. However, I get the following error when I now try to update the Company record:
Server: Msg 7391, Level 16, State 1, Procedure proc_AddCustToScheduler, Line 87
The operation could not be performed because the OLE DB provider 'MSDASQL' was unable to begin a distributed transaction.
[OLE/DB provider returned message: [Microsoft][ODBC Visual FoxPro Driver]Driver not capable]
OLE DB error trace [OLE/DB Provider 'MSDASQL' ITransactionJoin::JoinTransaction returned 0x8004d00a].
I think I understand that the problem is that inside a trigger there is an implicit transaction going on, but FoxPro doesn't support transactions and thus it fails. I think this would work if I could specify not to use a transaction, but I don't know how to override the implicit transaction in the trigger. I don't need it to be a transaction - either the record is added to the FoxPro database or it isn't, so I don't need to do rollbacks, etc.

What I have tried so far:

1. Add SET XACT_ABORT ON at the beginning of the update trigger
2. Tried the various suggestions in Microsoft articles 306212, 329332, 839279

Code:
CREATE TRIGGER trig_CompanyUpdate ON [dbo].[Company] 
FOR UPDATE
AS
DECLARE @OldStatus nchar(40),
        @CurrentStatus nchar(40),
        @CompanyID int,
        @CustID nchar(12)

SET XACT_ABORT ON

SELECT @CurrentStatus = Comp_Type, @CompanyID = Comp_CompanyID, @CustID = Comp_IdCust
 FROM Inserted

SELECT @OldStatus = Comp_Type FROM Deleted

IF (@OldStatus <> @CurrentStatus) AND (LTRIM(RTRIM(@CurrentStatus)) = 'Customer') AND (@CustID IS NOT NULL)
BEGIN
  EXEC proc_AddCustToScheduler @CompanyID, @CustID
END

SET XACT_ABORT OFF

 
First of all, this trigger is written very badly and will not work when two or more rows get updated in a batch.

SELECT @OldStatus = Comp_Type FROM Deleted

Deleted can contain many rows; you will only process one of them, and your FoxPro table will not end up with all the rows (oops...).


Now to the question itself: what if, instead of this, you create a scheduled DTS package that runs every minute and inserts the rows into FoxPro?
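
The data-moving step could be as simple as something like this (just a sketch; the linked server name FOXPROLINK, the FoxPro table customer, and the column cust_id are made-up names, and whatever ID generation your proc does would still need to fit in):
Code:
-- sketch only: FOXPROLINK, customer and cust_id are invented names;
-- any other columns would be added to both lists the same way
INSERT INTO OPENQUERY(FOXPROLINK, 'SELECT cust_id FROM customer')
SELECT c.Comp_IdCust
FROM dbo.Company c
WHERE LTRIM(RTRIM(c.Comp_Type)) = 'Customer'
  AND c.Comp_IdCust NOT IN
      (SELECT cust_id FROM OPENQUERY(FOXPROLINK, 'SELECT cust_id FROM customer'))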

Two more things to try, though I doubt they will work:

What happens when you put SET XACT_ABORT ON inside the proc instead of the trigger?

Is the DTC (Distributed Transaction Coordinator) running?



Denis The SQL Menace
--------------------
SQL Server Code,Tips and Tricks, Performance Tuning
SQLBlog.com, Google Interview Questions
 
Hi SQLDenis. I don't think I need to deal with batch updates; from what I see in the CRM application (the SQL database), users update company records one at a time.

If I do need to process multiple rows in DELETED/INSERTED, I guess I would need to open a cursor to DELETED and loop through it to call the stored procedure for records that meet the criteria.
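
Something like this is what I have in mind (rough sketch, untested; it joins Inserted to Deleted on the company ID so only rows whose status actually changed get processed):
Code:
DECLARE @CompanyID int,
        @CustID nchar(12)

-- one row per company whose status has just changed to 'Customer'
DECLARE curChanged CURSOR LOCAL FAST_FORWARD FOR
  SELECT i.Comp_CompanyID, i.Comp_IdCust
  FROM Inserted i
  JOIN Deleted d ON d.Comp_CompanyID = i.Comp_CompanyID
  WHERE d.Comp_Type <> i.Comp_Type
    AND LTRIM(RTRIM(i.Comp_Type)) = 'Customer'
    AND i.Comp_IdCust IS NOT NULL

OPEN curChanged
FETCH NEXT FROM curChanged INTO @CompanyID, @CustID
WHILE @@FETCH_STATUS = 0
BEGIN
  EXEC proc_AddCustToScheduler @CompanyID, @CustID
  FETCH NEXT FROM curChanged INTO @CompanyID, @CustID
END
CLOSE curChanged
DEALLOCATE curChanged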

I have considered the DTS option and it is my backup plan. The thing I don't like about it is that every time it runs I will need to get all Company records where status = "Customer" and then look for FoxPro records that need to be added. I would think that's going to stress both systems, especially if run once a minute, and both systems are used during the day (if I go with DTS I would probably run it once a day).

What happens when you put SET XACT_ABORT ON inside the proc instead of the trigger?
I tried, didn't make a difference.

Is the DTC (Distributed Transaction Coordinator) running?
Well, if I look at the local services I see "Distributed Transaction Coordinator" and its status is "Started". As well, it is configured according to the Microsoft articles I quoted (i.e. it is logged on as "Network Service").

So, does this mean there is no way to use a trigger without an implicit transaction?

 
JoeAtWork, never, ever assume there will always be only one record at a time inserted. This is one of the main ways in which database integrity is lost. People do update outside the user interface on occasion (think of a mass price increase of ten percent, for instance). Triggers get invoked even when someone updates outside the user interface; that's why all business rules should be enforced in triggers, constraints, or default values, so that when some other method of updating the database is used, data integrity will not suffer.

Second, never consider solving the problem of multiple-record inserts with a cursor. Cursors are one of the worst things you can do to a database for performance. I have taken processes that took hours to run with a cursor and changed them to a set-based process that ran in milliseconds or seconds. I'd advise you to forget you know how to create a cursor.

I know that doesn't solve your problem, but I'm trying to prevent you from having serious problems in the future. You need to start as soon as possible to build good coding habits, even if you think the situation doesn't require them. When the situation changes, or in a different situation where these things have a major impact on performance, you will already know how to do things in a set-based fashion and won't cause your database to come to a screaming halt because you tried to solve a problem inappropriately with a cursor.

In this case, if the trigger will not run a distributed transaction to FoxPro, have you considered having your application send a direct command to do the insert when the situation is appropriate? This still won't handle any mass updates, but it would catch most of your day-to-day changes from the user interface. Then you could run a job once or twice a day to pick up any that happened outside of the user interface.

"NOTHING is more important in a database than integrity." ESquared
 
Doing a cross database insert requires that it be transactional (at least as far as SQL Server is concerned).

For distributed transactions to work correctly, both servers involved in the transaction (the SQL Server machine and the FoxPro machine) must be set up per the Microsoft articles and rebooted.

Before you reboot, you also need to relax the security. In the Component Services administrative tool, expand down to "My Computer", right-click it, and select Properties. On the MS DTC tab, click "Security Configuration", check all the boxes, and select the bottom radio button, "No Authentication Required". Click OK a couple of times and close the window.

This should get distributed transactions working between the machines.

You can test this by forcing a distributed transaction like this.
Code:
begin distributed transaction
select *
from RemoteMachine.database.dbo.table
where 1=2
commit
(You may need to change the syntax so that it works for FoxPro; I'm not sure about its object syntax.)

I'll ditto what's also been said about making sure to process all rows, not just one row.

Denny
MCSA (2003) / MCDBA (SQL 2000)
MCTS (SQL 2005 / Microsoft Windows SharePoint Services 3.0: Configuration / Microsoft Office SharePoint Server 2007: Configuration)
MCITP Database Administrator (SQL 2005) / Database Developer (SQL 2005)

My Blog
 
SQLSister - I've read many of your posts and appreciate your participation in this thread (consider me a fan).

I think I'm going to have to clarify a few points. First of all, neither of these systems is "my" application - I am merely building the bridge so that the older FoxPro app will have new customer records from the CRM. The CRM is from a third-party vendor, and therefore I cannot change the business logic code. Ordinarily, I wouldn't even consider adding a trigger to the CRM (i.e. messing with their database), but their database is pretty simple, and since I'm not updating CRM data, I think it is OK to do it. The worst I foresee is that if there is an upgrade I might lose the trigger, which I would have to add in again.

Second, we are not talking about records inserted into the CRM, it is updates only. The business logic of the CRM is that a company is always added first as a "Prospect". After that, at any point it can be upgraded to "Customer". That is why I was not anticipating batch updates - because there is no way for the user to do that. If one of the users found a way to enter data outside of the interface, they would be wrecking their CRM because they would be bypassing the business logic. I agree that the database should protect its own integrity with table relationships, constraints, etc., but the CRM vendor has not chosen to do so.

However, Denis's post did open my eyes to the fact this may change (perhaps in the next CRM version the interface will have checkboxes to choose which prospects to upgrade - and do them all at once), hence my musings on using cursors.

I know cursors are evil, and I've avoided using them in 99.8% of my SQL code. However, I have not thought of an alternative, and you have not offered one either.

My stored procedure has the signature:

proc_AddCustToScheduler
@CompanyID int,
@CustID nchar(12)

It has logic converting the CRM company record to a format acceptable to the FoxPro database, as well as generating new IDs. So, if I have to consider batch updates, I also have to consider that I need to call this stored procedure for each record. So, what is your suggested alternative to a (read-only, forward-only) cursor?


 
mrdenny said:
Doing a cross database insert requires that it be transactional (at least as far as SQL Server is concerned).
If that is the case, then why am I able to run my stored procedure in Query Analyzer, where it is not in a transaction?

mrdenny said:
For distributed transactions to work correctly, both servers involved in the transaction (the SQL Server machine and the FoxPro machine) must be set up per the Microsoft articles and rebooted.
As I indicated in my original post, I've already changed the settings as recommended in that Microsoft article. This didn't alleviate the problem.

Also, I should point out that the SQL Server and FoxPro database are on the same machine.
 
Have you made the additional changes to the security that I posted? Those aren't covered in the original article.

When you run the procedure manually, you are using an implicit local transaction on both sides. Because you are running the procedure from a trigger, the trigger has to wait for the procedure to complete before the update can be reported as successful. Because of this, a distributed transaction is used within SQL Server so that the update to the local table can be rolled back if the insert/update to the remote system does not complete for some reason.

In the Component Services window, does it show that transactions are getting through, failing, or not showing up at all?

Now that I re-read the initial error message that you posted above, it seems to indicate that FoxPro doesn't support distributed transactions at all.

You may need to log the change to a local table, then have a job which runs a cursor to process the records from that change table into the FoxPro database.
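
Roughly like this (sketch only; the staging table and column names are just examples):
Code:
-- local staging table the trigger writes to (names are examples only)
CREATE TABLE dbo.SchedulerQueue (
    CompanyID int       NOT NULL,
    CustID    nchar(12) NOT NULL,
    Processed bit       NOT NULL DEFAULT 0,
    QueuedOn  datetime  NOT NULL DEFAULT GETDATE()
)

-- in the trigger: a plain local insert, so no distributed transaction is needed
INSERT INTO dbo.SchedulerQueue (CompanyID, CustID)
SELECT i.Comp_CompanyID, i.Comp_IdCust
FROM Inserted i
JOIN Deleted d ON d.Comp_CompanyID = i.Comp_CompanyID
WHERE d.Comp_Type <> i.Comp_Type
  AND LTRIM(RTRIM(i.Comp_Type)) = 'Customer'
  AND i.Comp_IdCust IS NOT NULL
The scheduled job would then cursor through the rows where Processed = 0, call proc_AddCustToScheduler for each one, and flag them as processed.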

Denny
MCSA (2003) / MCDBA (SQL 2000)
MCTS (SQL 2005 / Microsoft Windows SharePoint Services 3.0: Configuration / Microsoft Office SharePoint Server 2007: Configuration)
MCITP Database Administrator (SQL 2005) / Database Developer (SQL 2005)

My Blog
 
MrDenny - I've done all the security updates now and still have the same issue.

MrDenny said:
Now that I re-read the initial error message that you posted above, it seems to indicate that FoxPro doesn't support distributed transactions at all.
That has been my thought all along, and why I was hoping I could override the implicit transaction in the trigger.

MrDenny said:
You may need to log the change to a local table, then have a job which runs a cursor to process the records from that change table into the FoxPro database.
That's a good idea, and may be my solution!

Right now I am waiting to hear back from my client on the question of whether they really need the update to be immediate, or if a daily job will do.


 
Even if daily isn't good enough, if every couple of minutes is, you could write a cursor, run via a job every couple of minutes, which goes through the tables and sends the data.

Denny
MCSA (2003) / MCDBA (SQL 2000)
MCTS (SQL 2005 / Microsoft Windows SharePoint Services 3.0: Configuration / Microsoft Office SharePoint Server 2007: Configuration)
MCITP Database Administrator (SQL 2005) / Database Developer (SQL 2005)

My Blog
 
