Author | Topic
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-20 : 09:33:08
|
Hi, I'm having issues with the compatibility level on my production environment. On the test and development servers I'm running at level 90, while it's set to 100 on the production server. A query that takes about 4 seconds on a significantly slower machine takes about 5 minutes in production.

The query is as follows:

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp ([objectID], [objectType], [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

I'm using this to insert fairly large XML data into the database, and it is the first step of that transaction. If I run the query without the INSERT part, it's fast on both servers. Is there a way to get this query fast on a level 100 machine too?

Kind regards,
Andreas Westling

Things should be as simple as possible, not simpler.
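For anyone who wants to run the snippet standalone: the @xml variable comes from elsewhere in the transaction, so the declaration below is only an assumption with made-up sample data, mirroring the element and attribute names used in the query.

-- Hypothetical setup; @xml and its contents are not part of the original post.
DECLARE @xml xml = N'
<root>
  <rev objectID="1" objectType="10" delete="0"><data>first</data></rev>
  <rev objectID="2" objectType="10" delete="1"><data>second</data></rev>
</root>';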
|
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts |
Posted - 2012-02-20 : 09:51:33
|
Are you sure it's the compat level that's the problem? If you set the dev or test servers to compat level 100, is it slow there?

--
Gail Shaw
SQL Server MVP
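For example, a quick way to confirm the current level and try both settings on a non-production copy (the database name is a placeholder):

-- Check the current compatibility level, then switch it for a timed comparison.
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'TestDB';                                 -- placeholder database name

ALTER DATABASE TestDB SET COMPATIBILITY_LEVEL = 100;   -- rerun the query here
ALTER DATABASE TestDB SET COMPATIBILITY_LEVEL = 90;    -- rerun and compare timings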
 |
|
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-20 : 10:10:28
|
Yes, I'm quite sure of it. However, this query runs smoothly at both levels:

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp ([objectID], [objectType], [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       NULL AS [innerData] --xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

I've been looking at MSDN for differences between the two levels but didn't find anything that looked like it could relate to this problem. :(

Things should be as simple as possible, not simpler.
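Comparing the actual execution plans of the two variants should show where the extra work appears. One way to capture them, as a sketch:

SET STATISTICS XML ON;   -- returns the actual plan as XML alongside each result set
-- run the variant with xmlData.xmlData.query('./*') here
-- run the variant with NULL AS [innerData] here
SET STATISTICS XML OFF;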
 |
|
SwePeso
Patron Saint of Lost Yaks
30421 Posts |
Posted - 2012-02-20 : 10:22:00
|
You can probably speed things up right here.

As I interpret the code, you iterate ALL nodes and fetch certain attributes and their values into the temp table? Do you need all nodes, or only the nodes where the attribute @delete is true?

N 56°04'39.26"E 12°55'05.63"
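If only the flagged rows were needed, the predicate could be pushed into the nodes() path. A sketch, assuming the @delete attribute is serialized as "1"/"0" in the XML:

SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev[@delete = "1"]') [xmlData](xmlData)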
 |
|
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-20 : 11:58:54
|
OK, I've done a bit more testing. All queries were run on the same server, at minimal load. I've tested five queries: two at compatibility level 90 and three at level 100.

Query 1:

ALTER DATABASE [TEST] SET COMPATIBILITY_LEVEL = 90

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp (objectID, objectType, [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

Query 2: Same as query 1, except that I don't fetch innerData in the result.

ALTER DATABASE [TEST] SET COMPATIBILITY_LEVEL = 90

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp (objectID, objectType, [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       NULL AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

Query 3: Compatibility level 100, otherwise same as query 1.

ALTER DATABASE [TEST] SET COMPATIBILITY_LEVEL = 100

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp (objectID, objectType, [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

Query 4: Compatibility level 100, otherwise same as query 2.

ALTER DATABASE [TEST] SET COMPATIBILITY_LEVEL = 100

DECLARE @Temp AS TABLE (objectID int, objectType int, [delete] bit, innerData xml);

INSERT INTO @Temp (objectID, objectType, [delete], [innerData])
SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       NULL AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

Query 5: Just to verify that it's the INSERT that is causing the trouble.

ALTER DATABASE [TEST] SET COMPATIBILITY_LEVEL = 100

SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       xmlData.xmlData.query('./*') AS innerData
FROM @xml.nodes('/root/rev') [xmlData](xmlData)

The client statistics tell me that only the third query is really slow. If the problem were in getting the data out of my XML document, query 5 should have been slow as well. My conclusion from this test is that the xml column in the table I'm inserting into causes the server to do a lot more work at compat level 100.

As for whether I can limit the number of rows: I can't. :( This is only the first part of the transaction, and I need the rows that have the delete flag set to true as well.

Things should be as simple as possible, not simpler.
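To put numbers next to the five variants beyond client statistics, each run could also be wrapped in time statistics (a sketch):

SET STATISTICS TIME ON;
-- run one of the five variants here and note the "CPU time" and "elapsed time" lines
SET STATISTICS TIME OFF;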
 |
|
Jan Novak
Starting Member
6 Posts |
Posted - 2012-02-20 : 14:29:48
|
Try the following:

1) Use SELECT ... INTO #table ... (as sketched below).
2) When you turn on SET STATISTICS IO ON, is the worktable present in all tested scenarios?
3) Do you need all root elements, or only '/root[1]/rev'?
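A sketch of suggestion 1 applied to the query from the first post (the temp table name is a placeholder):

IF OBJECT_ID('tempdb..#Temp') IS NOT NULL
    DROP TABLE #Temp;

SELECT xmlData.value('@objectID', 'int') AS objectID,
       xmlData.value('@objectType', 'int') AS objectType,
       xmlData.value('@delete', 'bit') AS [delete],
       xmlData.xmlData.query('./*') AS innerData
INTO #Temp
FROM @xml.nodes('/root/rev') [xmlData](xmlData);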
 |
|
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts |
Posted - 2012-02-20 : 15:34:44
|
I checked with some people about this; one suggested having a look at two KB articles: KB940943 and KB957205.

--
Gail Shaw
SQL Server MVP
 |
|
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-21 : 03:22:43
|
I've had a look at the two KBs you mentioned, but I don't think they relate to this issue. I'm running SQL Server 2008 R2 with SP1 applied; the version number is 10.50.2500.

I've created a test database with a stored procedure, if anyone is interested:

-- SET THE COMPATIBILITY LEVEL manually here; it seems to act strangely when set in a script.
exec sp_insertTest @useNull = 0

I created this database on my local development machine, and there I have no problem with the query at level 100. However, when I restore this database to my production server for further testing, I get the same result as before: about 4.30 minutes for level 100 and 0.9 seconds for level 90.

[url=http://uploading.com/files/bb3m226d/test.bak.zip/]test.bak.zip - 139.4 KB[/url]

Things should be as simple as possible, not simpler.
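For anyone reproducing the numbers, a minimal timing harness around that call could look like this (only the procedure name and parameter come from the post):

DECLARE @start datetime2 = SYSDATETIME();

EXEC sp_insertTest @useNull = 0;

SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS elapsed_ms;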
 |
|
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-21 : 04:01:16
|
Jan Novak: I've now tried your approach, and the picture looks totally different. See the output from the different queries and cases below:

-- INSERT INTO @Temp ... SELECT, level 90
--Table 'XmlData'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
--Table '#3A4CA8FD'. Scan count 0, logical reads 4249, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

-- INSERT INTO @Temp ... SELECT, level 100
--Table 'XmlData'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
--Table '#3E1D39E1'. Scan count 0, logical reads 4249, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
--Table 'Worktable'. Scan count 1, logical reads 4927598, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

-- SELECT ... INTO #Test, level 90
--Table 'XmlData'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

-- SELECT ... INTO #Test, level 100
--Table 'XmlData'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

Both SELECT ... INTO queries were fast at both level 90 and level 100. However, is this approach "thread-safe"? Can it handle multiple calls at the same time without them interfering with each other?

Things should be as simple as possible, not simpler.
 |
|
Jan Novak
Starting Member
6 Posts |
Posted - 2012-02-21 : 05:13:36
|
Local temporary tables (with one #) are temporary tables that are available only to the session that created them. They are automatically destroyed when the procedure or session that created them terminates. So if you create a local temporary table at session level, you should drop it before your SELECT ... INTO ... statement. Another session will create its own local temp table.

The Worktable indicates that data is being processed in tempdb, which will probably slow things down (logical reads 4927598). IMHO there is a difference in the query memory limit between the two servers; a worktable is added to process the data because the memory available to the query is probably exceeded.
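A sketch of how the SELECT ... INTO pattern might sit inside a stored procedure, to illustrate the per-session scoping described above (the procedure name and #Shred are placeholders):

CREATE PROCEDURE dbo.usp_InsertTest_Sketch
    @xml xml
AS
BEGIN
    SET NOCOUNT ON;

    -- #Shred is private to this session and is dropped automatically when the
    -- procedure ends, so concurrent callers each get their own copy.
    SELECT xmlData.value('@objectID', 'int') AS objectID,
           xmlData.value('@objectType', 'int') AS objectType,
           xmlData.value('@delete', 'bit') AS [delete],
           xmlData.xmlData.query('./*') AS innerData
    INTO #Shred
    FROM @xml.nodes('/root/rev') [xmlData](xmlData);

    -- ... the rest of the transaction would work against #Shred here ...
END;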
 |
|
Drew Westling
Starting Member
8 Posts |
Posted - 2012-02-21 : 09:04:11
|
OK, this is good news. Thanks for the workaround. I'm still intrigued by how this query can be so slow at level 100, but I have to modify my existing SPs to use this syntax anyway, so it doesn't matter that much. I will, however, stay at level 90 until I've completed that refactoring.

Thank you all for your help!

Things should be as simple as possible, not simpler.
 |
|