Can someone give me an idea of how to figure out how much overhead triggers are adding to a server? I recently added a number of 'audit triggers', coded as follows:
Code:
CREATE TRIGGER [dbo].[UpdateDSPTRUCKAuditFields] ON [dbo].[DSPTRUCK]
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Stamp the audit columns on every row touched by the triggering UPDATE
    UPDATE [dbo].[DSPTRUCK]
    SET   Edited_By   = dbo.OperatorInitials()
        , Edited_Time = dbo.McTimeStamp(GETDATE())
        , Edited_Date = dbo.MCDate(GETDATE())
    WHERE Recnum IN (SELECT Recnum FROM Inserted);
END
The three functions being used probably don't help... OperatorInitials takes the UserId and looks up the user's initials in a table. The two date functions just return the time or date from the system clock.
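For what it's worth, OperatorInitials is just a scalar UDF along these lines (simplified; the Users table and column names here are placeholders, not the real ones):

Code:
CREATE FUNCTION dbo.OperatorInitials ()
RETURNS varchar(10)
AS
BEGIN
    DECLARE @Initials varchar(10);

    -- Look up the initials recorded for the current login
    -- (dbo.Users / UserId / Initials are placeholder names)
    SELECT @Initials = Initials
    FROM dbo.Users
    WHERE UserId = SUSER_SNAME();

    RETURN @Initials;
END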
I had about 15-20 of these triggers in the DB and all seemed to be going well... this morning I added about 50 more and the server was brought to its knees. Deadlocks all over the place...
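To at least see what's deadlocking, I'm planning to turn on trace flag 1222 so the deadlock graphs get written to the error log (if I've read the docs right, the -1 makes it instance-wide):

Code:
-- Write deadlock graph information to the SQL Server error log,
-- instance-wide, until the service restarts
DBCC TRACEON (1222, -1);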
These fields are supposed to be set by our application developers, so their use in the triggers is a 'belt and suspenders' approach at best.
Is there some way of seeing what % of the server load is spent on triggers? Is there another approach that would work better?
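The closest thing I've found so far is the sys.dm_exec_trigger_stats DMV (SQL 2008 and up, I believe). Would something like this show which triggers are eating the most time? My understanding is it only covers plans still in the cache, so it's not a true percentage of server load:

Code:
-- Rank triggers in the current database by total CPU time
-- (times are reported in microseconds, so divide by 1000 for ms)
SELECT OBJECT_NAME(ts.object_id)                         AS trigger_name
     , ts.execution_count
     , ts.total_worker_time / 1000                       AS total_cpu_ms
     , ts.total_elapsed_time / 1000                      AS total_elapsed_ms
     , ts.total_elapsed_time / ts.execution_count / 1000 AS avg_elapsed_ms
FROM sys.dm_exec_trigger_stats AS ts
WHERE ts.database_id = DB_ID()
ORDER BY ts.total_worker_time DESC;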
Paul