tomrippity
Starting Member
37 Posts |
Posted - 2012-02-08 : 15:15:35
I have a trigger that runs every time a table is updated (which is not very often). The trigger compiles several pieces of information for comparison and places that data into another table. Because of the nature of the data, I have to completely recreate the table being inserted into each time. I am curious if there is some way I can speed up this process, which currently takes about 15 seconds to process 130K rows. Below is my code:

INSERT INTO tblCustom_MAP_Student_Year_Compare
    (Cur_Year, Cur_Examiner, Cur_School_code, Cur_School_Name, Cur_Content_Area,
     Cur_Grade_Level, Cur_Achievement_Level, Cur_Map_Scale_Score, Cur_MAPA_SCORE,
     Cur_Map_Score_Invalidation, Cur_District_Student_Number, Cur_MOSIS_State_Id,
     Cur_First_Name, Cur_Last_Name, Cur_IEP_Student, Cur_Map_Free_Reduced_Lunch,
     Cur_Race, Cur_Gender, Cur_LEP_ELL_In_US_Less_Than_Year, Cur_Building_Less_Than_Year,
     Prev_Year, Prev_Examiner, Prev_School_code, Prev_School_Name, Prev_Content_Area,
     Prev_Grade_Level, Prev_Achievement_Level, Prev_Map_Scale_Score, Prev_MAPA_SCORE,
     Prev_Map_Score_Invalidation, Prev_District_Student_Number, Prev_MOSIS_State_Id,
     Prev_First_Name, Prev_Last_Name, Prev_IEP_Student, Prev_Map_Free_Reduced_Lunch,
     Prev_Race, Prev_Gender, Prev_LEP_ELL_In_US_Less_Than_Year, Prev_Building_Less_Than_Year,
     County_District)
SELECT
    tblMAP_Year1.Year, tblMAP_Year1.Examiner_Name, tblMAP_Year1.School_Code,
    tblMAP_Year1.School_Name, tblMAP_Year1.Content_Area, tblMAP_Year1.Grade_Level,
    tblMAP_Year1.Achievement_Level, tblMAP_Year1.Map_Scale_Score, tblMAP_Year1.MAPA_SCORE,
    tblMAP_Year1.Map_Score_Invalidation, tblMAP_Year1.District_Student_Number,
    tblMAP_Year1.Mosis_State_ID, tblMAP_Year1.First_Name, tblMAP_Year1.Last_Name,
    tblMAP_Year1.IEP_Student, tblMAP_Year1.Map_Free_Reduced_Lunch, tblMAP_Year1.Race,
    tblMAP_Year1.Gender, tblMAP_Year1.LEP_ELL_IN_US_LESS_THAN_YR,
    tblMAP_Year1.Building_Less_Than_Year,
    tblMAP_Yr_Prior.Year, tblMAP_Yr_Prior.Examiner_Name, tblMAP_Yr_Prior.School_Code,
    tblMAP_Yr_Prior.School_Name, tblMAP_Yr_Prior.Content_Area, tblMAP_Yr_Prior.Grade_Level,
    tblMAP_Yr_Prior.Achievement_Level, tblMAP_Yr_Prior.Map_Scale_Score,
    tblMAP_Yr_Prior.MAPA_SCORE, tblMAP_Yr_Prior.Map_Score_Invalidation,
    tblMAP_Yr_Prior.District_Student_Number, tblMAP_Yr_Prior.Mosis_State_ID,
    tblMAP_Yr_Prior.First_Name, tblMAP_Yr_Prior.Last_Name, tblMAP_Yr_Prior.IEP_Student,
    tblMAP_Yr_Prior.Map_Free_Reduced_Lunch, tblMAP_Yr_Prior.Race, tblMAP_Yr_Prior.Gender,
    tblMAP_Yr_Prior.LEP_ELL_IN_US_LESS_THAN_YR, tblMAP_Yr_Prior.Building_Less_Than_Year,
    tblMap_Year1.County_District
FROM student_test tblMAP_Year1
LEFT JOIN student_test tblMAP_Yr_Prior
       ON tblMAP_Year1.MOSIS_STATE_ID = tblMAP_Yr_Prior.MOSIS_State_ID
      AND tblMAP_Year1.Content_Area = tblMAP_Yr_Prior.Content_Area
      AND tblMAP_Yr_Prior.Grade_Level IN ('03','04','05','06','07','08')
      AND tblMAP_Yr_Prior.Year = (tblMap_Year1.Year - 1)
WHERE tblMAP_Year1.Grade_Level IN ('03','04','05','06','07','08')
  AND LEN(Ltrim(Rtrim(tblMAP_Year1.MOSIS_STATE_ID))) > 0
  AND (tblMap_Year1.County_District = tblMap_Yr_Prior.County_District
       OR tblMAP_Yr_Prior.COUNTY_DISTRICT IS NULL)
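The shape of that query is a LEFT JOIN of the table to itself at (Year - 1): current-year rows on the left, the matching prior-year row for the same student and content area on the right, with students lacking a prior row still kept (their Prev columns come back NULL). The pattern can be sketched in miniature like this (Python + SQLite; the toy `student_test` schema here is invented for illustration, not the real one):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student_test (
    year INTEGER, mosis_state_id TEXT, content_area TEXT, map_scale_score INTEGER
);
INSERT INTO student_test VALUES
    (2011, 'S1', 'Math', 650),
    (2012, 'S1', 'Math', 680),   -- has a prior-year row
    (2012, 'S2', 'Math', 600);   -- no prior-year row
""")

# LEFT JOIN the table to itself: the prior-year side matches on student,
# content area, and year - 1; unmatched students keep NULL prev columns.
rows = conn.execute("""
    SELECT cur.mosis_state_id,
           cur.map_scale_score  AS cur_score,
           prev.map_scale_score AS prev_score
    FROM student_test AS cur
    LEFT JOIN student_test AS prev
           ON prev.mosis_state_id = cur.mosis_state_id
          AND prev.content_area   = cur.content_area
          AND prev.year           = cur.year - 1
    WHERE cur.year = 2012
    ORDER BY cur.mosis_state_id
""").fetchall()
print(rows)  # [('S1', 680, 650), ('S2', 600, None)]
```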
visakh16
Very Important crosS Applying yaK Herder
52326 Posts |
Posted - 2012-02-08 : 15:23:22
Did you check the execution plan for this trigger? What are the costly steps involved?

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-08 : 16:23:30
quote: Originally posted by tkizer

What indexes do you have in place and what does the execution plan look like for this?

There are several indexes, all put in place after the Estimated Execution Plan suggested them. I've made the execution plan's XML available here: http://new.fergflor.k12.mo.us/execution.xml It's a large file, and I'm not exactly sure what I should be looking for. I'm not a beginner with SQL administration, but I'm far from a pro either.

Edit: Roughly 40% of the cost is in "Index Insert", 31% in "Table Insert" and 31% in "Table Spool". Those are high numbers, so I assume those are the best places to look for optimizations; I'm just not sure what it means.

I did some reading, and it seemed like adding a unique index to the tables involved could help? I import them exactly as they are given to me, and there are no unique IDs associated with each row of data.
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-08 : 16:39:47
quote: Originally posted by tkizer

Could you upload the actual execution plan too? The estimated one isn't good enough, as we need to see stat info; plus the estimated plan could possibly not reflect what is actually used.

Actual execution plan: http://new.fergflor.k12.mo.us/actual_exec.xml

The table that this execution plan is built on will get much larger as well. We have only uploaded a fraction of what it will actually need to work through to create the table. I considered using views, but that would be much more resource intensive, correct?
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-08 : 17:27:33
quote: Originally posted by tkizer

I haven't looked at the plan yet... But why would you think a view would be more resource intensive? Or in the reverse, why would you think it would speed things up? And why does this need to happen in a trigger? I don't see any joins to the trigger tables, so I'm not understanding why it's being done there.

I'm not 100% clear on how views work, but if the code that is inserting into a table is intensive but only runs about once a week, wouldn't that be better than having a view based on that code run every time someone accesses the page, which could happen every 20 minutes?
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2012-02-08 : 17:33:05
No, that's not how it works. Views are just a saved query; whether you use a view or not doesn't change the performance here. Now, an indexed view is completely different: still a saved query, but it also involves data.

Let's drop the view concept from this topic, as it doesn't help you here. We need to get this properly indexed. I still haven't been able to take a look at your plan; I need to find some time.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog
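The "views are just a saved query" point can be seen in a tiny demo (Python + SQLite; table and view names invented for the example): the view stores no rows of its own, so its defining query runs fresh at SELECT time and picks up data inserted after the view was created.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE scores (student TEXT, score INTEGER);
INSERT INTO scores VALUES ('a', 700), ('b', 800);
CREATE VIEW high_scores AS
    SELECT student FROM scores WHERE score >= 750;
""")

# The view holds no data: a row inserted *after* CREATE VIEW still appears,
# because the saved query is re-evaluated every time the view is read.
conn.execute("INSERT INTO scores VALUES ('c', 900)")
result = conn.execute(
    "SELECT student FROM high_scores ORDER BY student").fetchall()
print(result)  # [('b',), ('c',)]
```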
Kristen
Test
22859 Posts |
Posted - 2012-02-09 : 07:11:42
"You'll want to confirm if you need all of those indexes on it."

Here is a query to tell you which indexes have not been used (but you will need the system to have been running for "some time" to make sure that the ones that are needed have actually had a chance to be referenced - e.g. if you have an index used by a massive report once a month, the index may still be important, but it will only be used once a month):

http://sqlserverpedia.com/wiki/Find_Indexes_Not_In_Use
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-09 : 11:07:35
quote: Originally posted by Kristen

"You'll want to confirm if you need all of those indexes on it." Query to tell you which indexes have not been used (but you will need the system to have been running for "some time" to make sure that the ones that are needed have actually had a chance to be referenced - e.g. if you have an index used by a massive report once a month, the index may still be important, but it will only be used once a month)

I used the query suggested by the article you linked. All the indexes I have seem to be used, and they have what I assume is a very good "reads per write" ratio - most of them are 7 or more.
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-09 : 12:25:21
quote: Originally posted by tkizer

Looks like you need to update statistics on these tables, as the actual/estimated rows differ quite a bit. Also, the Student_Test table might need an index. Could you post what indexes it has? Also, the insert into the table is slowing you down. You'll want to confirm if you need all of those indexes on it.

I've made some pretty significant changes. I've combined all the triggers I had into one major trigger, which cut down the overall execution time by quite a bit.

Additionally, I used the Database Engine Tuning Advisor to figure out what stats or indexes could be added. Before doing that, I removed every index on any table involved. I then broke out all the different queries to figure out where the largest impact was coming from, and it's definitely the SELECT portion of the statements, with the longest taking about 8 seconds to execute. That seems like a very long time to me, given that the table is only about 109K rows.

There are a lot of CASE instances, which I am sure slows it down a bit. Maybe I could derive that information some other way.

I've attached a select statement that is used to generate the table data in question. This select statement is used 4 times, with a small, but necessary, piece changing at the very beginning: http://new.fergflor.k12.mo.us/code.html

Any way to improve this is greatly appreciated. As mentioned, it takes about 8 seconds to run right now, and given that there are only about 109K rows, I feel like it should only take one or two.

The execution plan is located at the previous link, here: http://new.fergflor.k12.mo.us/actual_exec.xml
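The overall pattern being tuned here - a trigger that fires on a data load and fully rebuilds a derived comparison table - can be sketched in miniature (Python + SQLite, with an invented toy schema; note SQLite triggers fire per row, unlike SQL Server's per-statement triggers, so this is only a shape illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student_test (year INTEGER, score INTEGER);
CREATE TABLE yearly_avg (year INTEGER, avg_score REAL);

-- After each insert into the base table, wipe and repopulate the derived
-- table -- the "completely recreate the table each time" approach from
-- the thread, in per-row (SQLite-style) form.
CREATE TRIGGER trg_rebuild AFTER INSERT ON student_test
BEGIN
    DELETE FROM yearly_avg;
    INSERT INTO yearly_avg
        SELECT year, AVG(score) FROM student_test GROUP BY year;
END;

INSERT INTO student_test VALUES (2012, 600);
INSERT INTO student_test VALUES (2012, 700);
""")

summary = conn.execute("SELECT * FROM yearly_avg").fetchall()
print(summary)  # [(2012, 650.0)]
```

Because the derived table is rebuilt wholesale, every index on it must be maintained during the bulk insert, which is why the plan showed so much cost in "Index Insert".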
Kristen
Test
22859 Posts |
Posted - 2012-02-09 : 13:15:23
quote: Originally posted by tomrippity

I've attached a select statement that is used to generate the table data in question. This select statement is used 4 times, with a small, but necessary, piece changing at the very beginning. http://new.fergflor.k12.mo.us/code.html

Don't you format your code? It's completely unreadable in that format... here it is for anyone else who needs it, although I have only put some cursory formatting in.

SELECT
    Cur_Year,
    Cur_Map_Free_Reduced_Lunch = CASE Cur_Map_Free_Reduced_Lunch
                                     WHEN 'Y' THEN 11
                                     WHEN ''  THEN 12
                                 END,
    Cur_TCHEMPID, Cur_Examiner, Cur_Content_Area, Cur_Grade_Level,
    Cur_School_code, Cur_School_Name,
    SUM(CASE WHEN (Cur_Achievement_Level = 'Below Basic' OR Cur_Achievement_Level = 'Basic'
              OR Cur_Achievement_Level = 'Proficient' OR Cur_Achievement_Level = 'Advanced')
        THEN 1 ELSE 0 END) AS Cur_Nbr_Total,
    SUM(CASE WHEN (Cur_Achievement_Level = 'Below Basic' OR Cur_Achievement_Level = 'Basic')
        THEN 1 ELSE 0 END) AS Cur_Nbr_Bottom_Two,
    SUM(CASE WHEN (Cur_Achievement_Level = 'Proficient' OR Cur_Achievement_Level = 'Advanced')
        THEN 1 ELSE 0 END) AS Cur_Nbr_Top_Two,
    SUM(CASE WHEN Cur_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) AS Cur_Nbr_Bb,
    SUM(CASE WHEN Cur_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) AS Cur_Nbr_B,
    SUM(CASE WHEN Cur_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) AS Cur_Nbr_Prof,
    SUM(CASE WHEN Cur_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) AS Cur_Nbr_Adv,
    Map.dbo.udf_Safe_Divide_By_Zero(
        SUM(Cur_Map_Scale_Score * 1.0),
        (SUM(CASE WHEN (Cur_Map_Scale_Score <> ''
                   AND (Cur_MapA_Score IS NULL OR Cur_MapA_Score = ''))
             THEN 1 ELSE 0 END) * 1.0)) AS Cur_Avg_MAP,
    Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 600.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 700.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 800.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 900.0
      AS Cur_IP,
    SUM(CASE WHEN (Prev_Achievement_Level = 'Below Basic' OR Prev_Achievement_Level = 'Basic'
              OR Prev_Achievement_Level = 'Proficient' OR Prev_Achievement_Level = 'Advanced')
        THEN 1 ELSE 0 END) AS Prev_Nbr_Total,
    SUM(CASE WHEN (Prev_Achievement_Level = 'Below Basic' OR Prev_Achievement_Level = 'Basic')
        THEN 1 ELSE 0 END) AS Prev_Nbr_Bottom_Two,
    SUM(CASE WHEN (Prev_Achievement_Level = 'Proficient' OR Prev_Achievement_Level = 'Advanced')
        THEN 1 ELSE 0 END) AS Prev_Nbr_Top_Two,
    SUM(CASE WHEN Prev_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) AS Prev_Nbr_Bb,
    SUM(CASE WHEN Prev_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) AS Prev_Nbr_B,
    SUM(CASE WHEN Prev_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) AS Prev_Nbr_Prof,
    SUM(CASE WHEN Prev_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) AS Prev_Nbr_Adv,
    Map.dbo.udf_Safe_Divide_By_Zero(
        SUM(CASE WHEN (Cur_Grade_Level <> Prev_Grade_Level)
            THEN Prev_Map_Scale_Score ELSE 0 END * 1.0),
        (SUM(CASE WHEN ((Prev_Map_Scale_Score <> ''
                    OR LEN(Ltrim(Rtrim(Prev_Map_Scale_Score))) > 0)
                   AND (Prev_MapA_Score IS NULL OR Prev_MapA_Score = '')
                   AND Cur_Grade_Level <> Prev_Grade_Level)
             THEN 1 ELSE 0 END) * 1.0)) AS Prev_Avg_MAP,
    Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 600.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 700.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 800.0
    + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 900.0
      AS Prev_IP,
    -- NOTE: 1st check if Previous Average MAP is > 0. If so, then subtract from Current Average MAP.
    CASE WHEN (Map.dbo.udf_Safe_Divide_By_Zero(
                  SUM(CASE WHEN (Cur_Grade_Level <> Prev_Grade_Level)
                      THEN Prev_Map_Scale_Score ELSE 0 END * 1.0),
                  (SUM(CASE WHEN ((Prev_Map_Scale_Score <> ''
                              OR LEN(Ltrim(Rtrim(Prev_Map_Scale_Score))) > 0)
                             AND (Prev_MapA_Score IS NULL OR Prev_MapA_Score = '')
                             AND Cur_Grade_Level <> Prev_Grade_Level)
                       THEN 1 ELSE 0 END) * 1.0))) <> 0
         THEN (Map.dbo.udf_Safe_Divide_By_Zero(
                  SUM(Cur_Map_Scale_Score * 1.0),
                  (SUM(CASE WHEN (Cur_Achievement_Level <> ''
                             AND (Cur_MapA_Score IS NULL OR Cur_MapA_Score = ''))
                       THEN 1 ELSE 0 END) * 1.0)))
              -- Prev_Avg_MAP
              - (Map.dbo.udf_Safe_Divide_By_Zero(
                  SUM(CASE WHEN (Cur_Grade_Level <> Prev_Grade_Level)
                      THEN Prev_Map_Scale_Score ELSE 0 END * 1.0),
                  (SUM(CASE WHEN ((Prev_Map_Scale_Score <> ''
                              OR LEN(Ltrim(Rtrim(Prev_Map_Scale_Score))) > 0)
                             AND (Prev_MapA_Score IS NULL OR Prev_MapA_Score = '')
                             AND Cur_Grade_Level <> Prev_Grade_Level)
                       THEN 1 ELSE 0 END) * 1.0)))
         ELSE NULL
    END AS Avg_MAP_Diff,
    -- Cur_IP
    (Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 600.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 700.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 800.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Cur_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Cur_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 900.0)
    -- Prev_IP
    - (Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Below Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 600.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 700.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Proficient' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 800.0
     + Map.dbo.udf_Safe_Divide_By_Zero(
        (SUM(CASE WHEN Prev_Achievement_Level = 'Advanced' THEN 1 ELSE 0 END) * 1.0),
        (SUM(CASE WHEN Prev_Achievement_Level IS NOT NULL THEN 1 ELSE 0 END) * 1.0)) * 900.0)
      AS IP_Diff,
    -- Below Basic, Basic, Proficient, Advanced
    SUM(CASE WHEN ((Cur_Achievement_Level = 'Advanced'
                    AND (Prev_Achievement_Level = 'Proficient'
                         OR Prev_Achievement_Level = 'Basic'
                         OR Prev_Achievement_Level = 'Below Basic'))
               OR (Cur_Achievement_Level = 'Proficient'
                    AND (Prev_Achievement_Level = 'Basic'
                         OR Prev_Achievement_Level = 'Below Basic'))
               OR (Cur_Achievement_Level = 'Basic'
                    AND Prev_Achievement_Level = 'Below Basic'))
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Improve,
    SUM(CASE WHEN ((Cur_Achievement_Level = 'Advanced' AND Prev_Achievement_Level = 'Advanced')
               OR (Cur_Achievement_Level = 'Proficient' AND Prev_Achievement_Level = 'Proficient')
               OR (Cur_Achievement_Level = 'Basic' AND Prev_Achievement_Level = 'Basic')
               OR (Cur_Achievement_Level = 'Below Basic' AND Prev_Achievement_Level = 'Below Basic'))
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Same,
    SUM(CASE WHEN ((Cur_Achievement_Level = 'Below Basic'
                    AND (Prev_Achievement_Level = 'Advanced'
                         OR Prev_Achievement_Level = 'Proficient'
                         OR Prev_Achievement_Level = 'Basic'))
               OR (Cur_Achievement_Level = 'Basic'
                    AND (Prev_Achievement_Level = 'Advanced'
                         OR Prev_Achievement_Level = 'Proficient'))
               OR (Cur_Achievement_Level = 'Proficient'
                    AND Prev_Achievement_Level = 'Advanced'))
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Decline,
    County_District
FROM tblCustom_MAP_Student_Year_Compare
WHERE (Cur_Examiner IS NOT NULL AND LEN(Ltrim(Rtrim(Cur_Examiner))) > 0)
GROUP BY Cur_Year, Cur_TCHEMPID, Cur_Examiner, Cur_Content_Area, Cur_Grade_Level,
         Cur_School_code, Cur_School_Name, Cur_Map_Free_Reduced_Lunch, County_District

But it's a bit of a dog's breakfast...

Don't use a UDF (Map.dbo.udf_Safe_Divide_By_Zero) to check for safe division; that's got to be adding some inefficiency. Just do:

    SomeColumn / NullIf(MightBeZeroColumn, 0.0)

And what's with all the "* 1.0" stuff? The optimiser might ignore them... but it might not. Why can't

    SUM(CASE WHEN Prev_Achievement_Level = 'Basic' THEN 1 ELSE 0 END) * 1.0

be

    SUM(CASE WHEN Prev_Achievement_Level = 'Basic' THEN 1.0 ELSE 0.0 END)

(if you need the SUM to be a float)?

Personally I would retrieve the values pre-processed in an inner nested query, and then SUM them in the outer query, so you don't have to keep recalculating all the same CASE statements repeatedly, e.g.:

SELECT ...
    -- Was: SUM(CASE WHEN (Cur_Achievement_Level = 'Below Basic' OR Cur_Achievement_Level = 'Basic'
    --            OR Cur_Achievement_Level = 'Proficient' OR Cur_Achievement_Level = 'Advanced')
    --       THEN 1 ELSE 0 END) AS Cur_Nbr_Total,
    -- New:
    SUM(Cur_Achievement_Level_BELOW_BASIC + Cur_Achievement_Level_BASIC
      + Cur_Achievement_Level_PROFICIENT + Cur_Achievement_Level_ADVANCED) AS Cur_Nbr_Total,
    -- Was: SUM(CASE WHEN (Cur_Achievement_Level = 'Below Basic' OR Cur_Achievement_Level = 'Basic')
    --       THEN 1 ELSE 0 END) AS Cur_Nbr_Bottom_Two,
    -- New:
    SUM(Cur_Achievement_Level_BELOW_BASIC + Cur_Achievement_Level_BASIC) AS Cur_Nbr_Bottom_Two,
    ...
FROM
(
    SELECT ... list of all required columns ...,
        CASE WHEN Cur_Achievement_Level = 'Below Basic' THEN 1.0 ELSE 0.0 END AS Cur_Achievement_Level_BELOW_BASIC,
        CASE WHEN Cur_Achievement_Level = 'Basic' THEN 1.0 ELSE 0.0 END AS Cur_Achievement_Level_BASIC,
        CASE WHEN Cur_Achievement_Level = 'Proficient' THEN 1.0 ELSE 0.0 END AS Cur_Achievement_Level_PROFICIENT,
        CASE WHEN Cur_Achievement_Level = 'Advanced' THEN 1.0 ELSE 0.0 END AS Cur_Achievement_Level_ADVANCED,
        ... any other calculated field you need for the outer query ...
    FROM tblCustom_MAP_Student_Year_Compare
    WHERE ...
) AS X
GROUP BY Cur_Year, Cur_TCHEMPID, Cur_Examiner, Cur_Content_Area, Cur_Grade_Level,
         Cur_School_code, Cur_School_Name, Cur_Map_Free_Reduced_Lunch, County_District

And can't this:

    SUM(CASE WHEN ((Cur_Achievement_Level = 'Advanced' AND Prev_Achievement_Level = 'Advanced')
               OR (Cur_Achievement_Level = 'Proficient' AND Prev_Achievement_Level = 'Proficient')
               OR (Cur_Achievement_Level = 'Basic' AND Prev_Achievement_Level = 'Basic')
               OR (Cur_Achievement_Level = 'Below Basic' AND Prev_Achievement_Level = 'Below Basic'))
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Same,

be simplified to:

    SUM(CASE WHEN Cur_Achievement_Level = Prev_Achievement_Level
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Same,

or, if necessary, then:

    SUM(CASE WHEN Cur_Achievement_Level = Prev_Achievement_Level
             AND Cur_Achievement_Level IN ('Advanced', 'Proficient', 'Basic', 'Below Basic')
        THEN 1 ELSE 0 END) AS Nbr_Ach_Level_Same,

And this is inefficient:

    WHERE (Cur_Examiner IS NOT NULL AND LEN(Ltrim(Rtrim(Cur_Examiner))) > 0)

Why have you got values in Cur_Examiner with trailing spaces? Clean up the data and enforce it so that it stays clean!!!! And don't store blank values; store them as NULL (the blanks you are storing indicate UNKNOWN, right?). Then you could just do

    WHERE Cur_Examiner IS NOT NULL

which is a potential candidate for an index, which a WHERE clause with various function() calls will NOT be. (And if you can't do that, you don't need BOTH the LTrim() AND RTrim() - if the thing is JUST spaces, either one will do!)

    AND Rtrim(Cur_Examiner) <> ''

might get you an index hit (but I doubt it, because it contains a function).
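The two main suggestions in that post - replacing the safe-divide UDF with an inline NULLIF, and pre-computing each CASE once in a derived table so the outer query just SUMs flags - can both be seen working in a small demo (Python + SQLite; the toy table `t` and its columns are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE t (grp TEXT, level TEXT);
INSERT INTO t VALUES ('g1', 'Basic'), ('g1', 'Advanced'), ('g2', 'Basic');
""")

# NULLIF(x, 0) turns a zero denominator into NULL, so the division yields
# NULL instead of an error -- the inline replacement for a safe-divide UDF.
safe = conn.execute("SELECT 10.0 / NULLIF(0, 0)").fetchone()
print(safe)  # (None,)

# Compute each CASE once in a derived table, then SUM the flags in the
# outer query instead of repeating the CASE expression in every aggregate.
rows = conn.execute("""
    SELECT grp, SUM(is_basic) AS nbr_basic, SUM(is_adv) AS nbr_adv
    FROM (SELECT grp,
                 CASE WHEN level = 'Basic'    THEN 1.0 ELSE 0.0 END AS is_basic,
                 CASE WHEN level = 'Advanced' THEN 1.0 ELSE 0.0 END AS is_adv
          FROM t) AS x
    GROUP BY grp
    ORDER BY grp
""").fetchall()
print(rows)  # [('g1', 1.0, 1.0), ('g2', 1.0, 0.0)]
```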
tomrippity
Starting Member
37 Posts |
Posted - 2012-02-09 : 14:31:14
quote: Originally posted by Kristen

Don't you format your code? It's completely unreadable in that format... here it is for anyone else who needs it, although I have only put some cursory formatting in.

My code is formatted in my view. I just did a simple copy-paste into an HTML page so that it would be easy for you to view. I thought it seemed pretty legible, but thanks for formatting it for everyone else!

I will start trying out some of your suggestions! Thank you so much for the help.

I am actually redeveloping from a previous application that a former employee set up. I am reusing some of the stored procedures and whatnot that they put in place, such as this one, and hoping to make everything more efficient. Hopefully, with the help I've received from everyone here, I will not only make this and many of the other queries much faster, but also learn enough that I can start helping to answer questions too!
Kristen
Test
22859 Posts |
Posted - 2012-02-09 : 17:41:26
"My code is formatted in my view."

Best to post it here (next time) then; it will retain the formatting if you use [code] tags around it.