JAG7777777 | Starting Member | 25 Posts | Posted - 2009-12-04 : 04:13:54
Hi All,

I have a package that fails intermittently when kicked off by the Agent. This is the message I receive from the Agent when it attempts to run the package:

"Transaction (Process ID 51) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction." Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

At the point where the package fails, I have two Execute SQL tasks that each contain a simple, one-line DELETE ... WHERE query. They run in parallel and look for different rows to delete (i.e. different key values: one WHERE X = '17' and one WHERE X = '16'). I put them in parallel inside a sequence container to improve package performance, as they will each delete around 60,000 rows.

The SQL tasks are set up with 'Direct input', 'Bypass prepare' = False and 'Result set' = None. They are basic OLE DB connections to a SQL Server database using Windows authentication.

In terms of the properties that might affect transactional processing, I have changed the IsolationLevel on both tasks to ReadUncommitted. Both still have the default TransactionOption of Supported (as do all their parents, right back to the package level) - is this the problem?

The package fails only once or twice a week (it runs in the early morning); other times it runs fine. When I run it manually it always works, but that is during the day when there are fewer hungry overnight processes running. The failure message is always the same, and it is always the same one of the two DELETE tasks that fails - as if one always starts before the other and locks the resources, preventing the other from starting...?

Please can anyone suggest a solution? I am new to SSIS and don't want to go fumbling about too much without understanding what I am doing. Hoping an expert or two can throw some light on this. Many thanks in advance.

JAG7777777
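For context, a minimal sketch of what the two parallel tasks amount to, assuming a destination table called dbo.DestTable and a key column X (both are placeholder names, not taken from the actual package):

-- Execute SQL task 1 (Direct input); runs in parallel with task 2
DELETE FROM dbo.DestTable WHERE X = '17';

-- Execute SQL task 2 (Direct input); runs in parallel with task 1
DELETE FROM dbo.DestTable WHERE X = '16';

Each statement removes roughly 60,000 rows; with that many row locks SQL Server may try to escalate to a table lock, which is one common way two otherwise non-overlapping deletes end up deadlocking.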
smeeluv | Starting Member | 20 Posts | Posted - 2009-12-04 : 08:42:51
I would remove the parallel scenario for the delete jobs and run them in sequence, and see if that resolves the problem. If it does not, then it seems like something else is locking those tables and not allowing your process access to delete from them. See what happens with them running in sequence first so you can rule that out.
JAG7777777 | Starting Member | 25 Posts | Posted - 2009-12-04 : 10:53:13
Hi smeeluv,

Thanks for the reply. Yes, I did think about that one too; I have changed the package to run the deletes in series and will monitor it. But if we can get them running in parallel it would be quicker (the package takes about 2 hours in total at the moment) - if that is even possible in SSIS. I thought it was possible in a SQL Server database to delete from the same table via two different processes at the same time?

It just seems odd that the package does not always fail, and that we have witnessed these two tasks working together before now. Do you know whether it is simply a case of bad luck that they, from time to time, end up reading the same record that is being deleted?

Kind regards,
JAG7777777
smeeluv | Starting Member | 20 Posts | Posted - 2009-12-08 : 10:47:23
It's definitely possible in my opinion. I could see it working if you have two processes running deletes against the same table that work with separate groups of data. If there's nothing like that set up then, yes, this will happen from time to time. Is there a way to merge the two processes into one delete process?
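A minimal sketch of what merging the two deletes into a single statement might look like, reusing the placeholder names from above (dbo.DestTable and X are assumptions, not the real object names):

-- One statement removes both key values, so there is no second delete to deadlock against
DELETE FROM dbo.DestTable
WHERE X IN ('16', '17');

A single statement also runs as a single transaction, so either both sets of rows are removed or neither is, which may or may not suit the package logic.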
JAG7777777 | Starting Member | 25 Posts | Posted - 2009-12-16 : 03:31:31
Thanks for your reply :-)

It's not really practical to merge, but I can see why you're suggesting it, and I could see how it would work if the situation were a little different. The package business logic is this: the package that is locking checks whether some source data has arrived at each of a number of sites (an OLE DB source). If the data is there, it deletes that site's tuples from a destination table (and later re-populates the destination table with the new tuples that have been proven to exist at the OLE DB source).

We have this check-and-delete data flow going on simultaneously for each site. Sites that didn't have source data (i.e. their previous extract failed) don't delete their tuples from the destination table, so we don't lose all of their data; we are essentially only missing the previous day's data. The destination table is shared, but each site has its own site code, so each process deletes tuples from the same shared table, but only where the site code is theirs.

If SSIS 2005 did incremental loads then none of the above would be required :-(. We are basically re-loading all of a site's data every night, but only if it had a successful extract to an intermediate table via another package. Our host DB is very old, has no timestamp on a lot of the data, and offers no way of applying triggers. We also have a legacy data extraction (to the OLE DB source at each site) which works very well, so we are making use of it to cut out a lot of the donkey work :-)

Hope this makes sense :-)

BTW, we changed the package to delete in series and since then it hasn't locked. It is only minutes slower as well, so this suggests the parallel deletion was perhaps being prohibited somewhere in terms of processing and we weren't actually getting any benefit from the parallel approach...? Maybe the processes don't like sharing the index on the primary keys when they check for deletion...?

Thanks again for your reply.

JAG7777777
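For illustration, a sketch of the per-site delete described above, with hypothetical names throughout (dbo.DestTable, SiteCode and the index name are assumptions, not the real schema). If the only useful index is the primary key, each parallel delete has to scan it and can collide with the others; an index on the site code would let each task seek only its own rows:

-- Hypothetical supporting index so each site's delete seeks only its own rows
CREATE NONCLUSTERED INDEX IX_DestTable_SiteCode ON dbo.DestTable (SiteCode);

-- Each site's Execute SQL task runs the same statement with its own site code
DELETE FROM dbo.DestTable
WHERE SiteCode = '16';   -- '17' in the other task, and so on per site

Even with such an index, each delete still removes around 60,000 rows, so running the tasks in series (as was done here) remains the simplest way to avoid the deadlock.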