Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
MyronCope
Starting Member
46 Posts
Posted - 2010-04-07 : 11:21:05
I am working mostly on one database, and there is an "Authors" table in another database that I need to use when adding new books to the "Books" table. I added the Insert/Update trigger below because when an insert or update is attempted on the "Books" table I have to make certain there is a matching author in the "Authors" table; I cannot add a book unless the author exists in the database.

The trigger I created below works fine if I am trying to insert or update only one record, but it does not work correctly for inserts/updates of more than one record, so this is the problem I am trying to fix.

So when I have the following three updates in this order:

Good Update 1
Bad Update
Good Update 2

my trigger will allow the first update (Good Update 1), but when it gets to the Bad Update it will not let that happen (which is great) and then it will abort the remaining updates and not allow Good Update 2 (which is not so great).

How do I modify the trigger below to allow Good Update 2 to happen (and the same for inserts)? Thanks.

My trigger is below:

CREATE TRIGGER trig_MyTrigger
ON someDatabase.dbo.Books
FOR INSERT, UPDATE
AS
DECLARE @authorID VARCHAR(5)
DECLARE @countAuthorsForThisBook NUMERIC

-- Get authorID from the virtual table "Inserted"
SELECT @authorID = (SELECT AuthID FROM Inserted)

-- Count authors pre-existing in the someDatabase.dbo.Authors table
SELECT @countAuthorsForThisBook = (SELECT COUNT(AuthID) FROM someDatabase.dbo.Authors WHERE AuthID = @authorID)

IF (@countAuthorsForThisBook < 1)
BEGIN
    -- Roll back the transaction if authorID was not found in the Authors table
    ROLLBACK TRAN
    RAISERROR('Error: Insert/Update failed! You cannot insert a book for an author that does not exist in our database!', 16, 1)
END
ELSE
    PRINT 'Insert/Update success!'
GO
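[Editor's note: for illustration, a multi-row statement like the following is what a single-variable trigger cannot handle, since the Inserted pseudo-table then holds several rows at once. The column names (BookID, AuthID, Title) and values are assumptions, not from the original post.]

```sql
-- Hypothetical multi-row insert: all three rows arrive in the trigger's
-- Inserted table together, so SELECT @authorID = (SELECT AuthID FROM Inserted)
-- can no longer resolve to a single value.
INSERT INTO someDatabase.dbo.Books (BookID, AuthID, Title)
SELECT 1, 'A0001', 'Good Book 1' UNION ALL
SELECT 2, 'ZZZZZ', 'Bad Book'    UNION ALL  -- no such author
SELECT 3, 'A0002', 'Good Book 2'
```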
Kristen
Test
22859 Posts
Posted - 2010-04-07 : 12:45:48
You need to use set-based processing to test all the rows in INSERTED, rather than the single row you are currently testing (which will be picked at random) using:

SELECT AuthID FROM Inserted

This may do the trick (the LEFT JOIN finds any inserted row with no matching author):

CREATE TRIGGER trig_MyTrigger
ON someDatabase.dbo.Books
FOR INSERT, UPDATE
AS
IF EXISTS
(
    SELECT I.AuthID
    FROM Inserted AS I
    LEFT JOIN someDatabase.dbo.Authors AS A
        ON A.AuthID = I.AuthID
    WHERE A.AuthID IS NULL
)
BEGIN
    ROLLBACK
    ....

but note that if any of the Books being inserted/updated fails the check, then all the other rows in the transaction will be rolled back too.
 |
MyronCope
Starting Member
46 Posts
Posted - 2010-04-07 : 12:59:28
quote: Originally posted by Kristen

You need to use set-based processing to test all the rows in INSERTED, rather than the single row you are currently testing (which will be picked at random) using:

SELECT AuthID FROM Inserted

This may do the trick (the LEFT JOIN finds any inserted row with no matching author):

CREATE TRIGGER trig_MyTrigger
ON someDatabase.dbo.Books
FOR INSERT, UPDATE
AS
IF EXISTS
(
    SELECT I.AuthID
    FROM Inserted AS I
    LEFT JOIN someDatabase.dbo.Authors AS A
        ON A.AuthID = I.AuthID
    WHERE A.AuthID IS NULL
)
BEGIN
    ROLLBACK
    ....

but note that if any of the Books being inserted/updated fails the check, then all the other rows in the transaction will be rolled back too.

Thanks Kristen. I was thinking it over since my post, and it looks like the best thing for me to do is not to do batch inserts/updates. It looks like ROLLBACK will roll back the entire remaining batch, so I'm going to make each insert/update its own batch, which will also meet the requirements of the system. Thanks.
 |
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2010-04-07 : 13:40:11
What about bringing the Authors table into your database as a replicated copy, and then setting up an FK relationship to it?

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
 |
Kristen
Test
22859 Posts
Posted - 2010-04-07 : 15:04:15
"I was thinking it over since my post and it looks like the best thing for me to do is to not do batch inserts/updates. It looks like rollback will rollback the entire remaining batch so I'm going to make each insert/update be one batch which will also meet the requirements of the system too."

The problem with that is that something COULD update/insert a batch, even if it is only you running some ad-hoc SQL to sort out some data!

And I think it is quite correct that if a single row violates a business rule, the whole batch is aborted (it's annoying for the operator, but it is correct from the database's perspective).

We tend to solve this type of thing by doing the actual INSERT / UPDATE in a stored procedure. That can take care of checking which records are in violation of business rules and "deal with them", e.g. move them to a "Failed" table. As a long-stop, the trigger would then enforce the business rules too, so if there was an error in the programming of the stored procedure (or any other SQL) the trigger would still enforce them.
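[Editor's note: a minimal sketch of the stored-procedure pattern described above. The table and procedure names (dbo.BooksStaging, dbo.BooksFailed, dbo.InsertBooks) and column names are assumptions for illustration, not from the thread.]

```sql
-- Hypothetical procedure: divert rows that violate the business rule to a
-- "Failed" table, then insert the remaining valid rows set-based.
CREATE PROCEDURE dbo.InsertBooks
AS
BEGIN
    SET NOCOUNT ON

    BEGIN TRANSACTION

    -- Move rows whose author does not exist into the Failed table
    INSERT INTO dbo.BooksFailed (BookID, AuthID, Title)
    SELECT S.BookID, S.AuthID, S.Title
    FROM dbo.BooksStaging AS S
    WHERE NOT EXISTS (SELECT * FROM someDatabase.dbo.Authors AS A
                      WHERE A.AuthID = S.AuthID)

    -- Insert the valid rows in a single set-based statement
    INSERT INTO someDatabase.dbo.Books (BookID, AuthID, Title)
    SELECT S.BookID, S.AuthID, S.Title
    FROM dbo.BooksStaging AS S
    WHERE EXISTS (SELECT * FROM someDatabase.dbo.Authors AS A
                  WHERE A.AuthID = S.AuthID)

    COMMIT TRANSACTION
END
GO
```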
 |
MyronCope
Starting Member
46 Posts
Posted - 2010-04-08 : 12:02:18
quote: Originally posted by Kristen

"I was thinking it over since my post and it looks like the best thing for me to do is to not do batch inserts/updates. It looks like rollback will rollback the entire remaining batch so I'm going to make each insert/update be one batch which will also meet the requirements of the system too."

The problem with that is that something COULD update/insert a batch, even if it is only you running some ad-hoc SQL to sort out some data!

And I think it is quite correct that if a single row violates a business rule, the whole batch is aborted (it's annoying for the operator, but it is correct from the database's perspective).

We tend to solve this type of thing by doing the actual INSERT / UPDATE in a stored procedure. That can take care of checking which records are in violation of business rules and "deal with them", e.g. move them to a "Failed" table. As a long-stop, the trigger would then enforce the business rules too, so if there was an error in the programming of the stored procedure (or any other SQL) the trigger would still enforce them.

I am still supposed to insert the good records in a batch, even if some of the records fail. You mentioned using a stored procedure, so now I am thinking that I should call my stored procedure inside the IF statement, something like this:

IF (@countAuthorsForThisBook > 0)
BEGIN
    -- success, so call stored proc here to insert the book
END
ELSE
    PRINT 'FAILURE'
 |
Kristen
Test
22859 Posts
Posted - 2010-04-08 : 12:37:14
That would insert them "one-by-one"; it would be better (i.e. more efficient) to insert the "good" ones in a single INSERT statement, and then report the "bad" ones separately. Presumably most of the time we hope there will be NO bad ones. But that presupposes that you do have them in a "block" ready for inserting?
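[Editor's note: a sketch of that suggestion, assuming the incoming rows sit in a hypothetical holding table dbo.NewBooks; the table and column names are assumptions, not from the thread.]

```sql
-- One set-based INSERT for the rows whose author exists...
INSERT INTO someDatabase.dbo.Books (BookID, AuthID, Title)
SELECT N.BookID, N.AuthID, N.Title
FROM dbo.NewBooks AS N
WHERE EXISTS (SELECT * FROM someDatabase.dbo.Authors AS A
              WHERE A.AuthID = N.AuthID)

-- ...then report the "bad" rows (no matching author) separately
SELECT N.BookID, N.AuthID, N.Title
FROM dbo.NewBooks AS N
WHERE NOT EXISTS (SELECT * FROM someDatabase.dbo.Authors AS A
                  WHERE A.AuthID = N.AuthID)
```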
 |