Kristen
Test
22859 Posts
Posted - 2012-01-06 : 12:38:33
Ideas on the best way to convert this to Dynamic SQL please?

CREATE PROCEDURE MySproc
    @ParamN int
AS
... DECLARE and populate @TableVarB ...
SELECT A_Col1, A_Col2, B.B_Col1, B.B_Col2
FROM TableA AS A
    JOIN @TableVarB AS B ON B.ID = A.ID_B1
WHERE A.ColN = @ParamN
ORDER BY A_Col1, B.B_Col2

I'm trying to create something like this:

CREATE PROCEDURE MySproc
    @ParamN int
AS
DECLARE @strSQL nvarchar(MAX)
... DECLARE and populate @TableVarB ...
SELECT @strSQL = 'SELECT A_Col1, A_Col2, B.B_Col1, B.B_Col2
    FROM TableA AS A
        JOIN @TableVarB AS B ON B.ID = A.ID_B1
    WHERE 1=1'
    + CASE WHEN @ParamN IS NULL THEN '' ELSE ' AND A.ColN = @ParamN' END
    + ... more conditional criteria ...
    + ' ORDER BY A_Col1, B.B_Col2'
EXEC sp_executesql @strSQL, N'@ParamN int', @ParamN, ...

Using a #TempTable seems a retrograde step ...

Edit: we have hundreds of Sprocs that have @TableVars which are likely to be changed to use Dynamic SQL, so creating a bunch of Table Valued Parameters is probably not going to be viable.
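For what it's worth, here is a minimal sketch of the parameterised sp_executesql shape being aimed at (all table and column names are the illustrative ones from the post above; the @TableVarB join is deliberately left out, because a table variable declared in the outer procedure is not in scope inside the dynamic string, which is what the rest of this thread turns on):

```sql
-- Sketch only; TableA / ColN / @ParamN are the illustrative names from the post.
CREATE PROCEDURE MySproc
    @ParamN int = NULL
AS
BEGIN
    DECLARE @strSQL nvarchar(MAX);

    SET @strSQL = N'SELECT A_Col1, A_Col2
    FROM TableA AS A
    WHERE 1=1'
        -- splice in the predicate TEXT only; the value travels as a parameter
        + CASE WHEN @ParamN IS NULL THEN N'' ELSE N' AND A.ColN = @ParamN' END
        + N' ORDER BY A_Col1';

    EXEC sp_executesql @strSQL, N'@ParamN int', @ParamN = @ParamN;
END
```

Because the value is passed as a parameter rather than concatenated in, the same statement text (and so the same cached plan) is reused for every non-NULL @ParamN.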
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 12:47:49
quote: Originally posted by X002548: You need to declare AND populate the table variable within the dynamic SQL you are creating

Yeah, if it was as simple as that I would do that, but the @TableVars have all sorts of duties outside just this one query, unfortunately.

"what have you done with Kristen?" All his base are mine
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 13:18:58
OK, so it looks like I can:

1) Define a TYPE and then use a Table Valued Parameter (I think I have too many, individually unique, @TableVars and that this is therefore a non-starter)
2) Change my @TableVar to a #TempTable or ##GlobalTempTable

Since SQL 2000 I have converted pretty much all #TempTables to @TableVars. I wonder how much of a performance improvement I really REALLY got? Less recompiling, less locking, less Transaction Log activity. Perhaps some significant in-memory-only activity on @TableVars which would be TEMPDB activity with a #TempTable?

Perhaps I'll just have to bite the bullet and revert to using #TempTables
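For reference, option 1 would look something like the sketch below (all names invented). It is relevant here because, unlike a plain @TableVar, a table-valued parameter CAN be passed into sp_executesql, so the dynamic SQL can join to it directly:

```sql
-- Illustrative names only (dbo.BRows, TableA, ID_B1, etc.).
CREATE TYPE dbo.BRows AS TABLE
(
    ID     int PRIMARY KEY,
    B_Col1 int,
    B_Col2 int
);
GO
CREATE PROCEDURE MySproc
    @ParamN int
AS
BEGIN
    DECLARE @B dbo.BRows;
    -- ... populate @B the way the @TableVar used to be populated ...

    -- A TVP is a legal parameter to sp_executesql (declared READONLY),
    -- so it is visible inside the dynamic batch.
    EXEC sp_executesql
        N'SELECT A.A_Col1, B.B_Col1, B.B_Col2
          FROM TableA AS A JOIN @B AS B ON B.ID = A.ID_B1
          WHERE A.ColN = @ParamN',
        N'@B dbo.BRows READONLY, @ParamN int',
        @B = @B, @ParamN = @ParamN;
END
```

The cost, as noted above, is one CREATE TYPE per distinct table-variable shape, which is why it is a non-starter for hundreds of individually unique @TableVars.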
Lamprey
Master Smack Fu Yak Hacker
4614 Posts
Posted - 2012-01-06 : 13:34:44
Kristen, is this a follow-on to your dynamic select list from yesterday? I hate to say it, but I think you are creating a "house of cards." Unless you want to go all the way and create a real dynamic query generator, I fear you might be creating something that is nearly unmaintainable.

That aside, table variables have some benefits depending on usage, but they act just the same as temp tables as far as TempDB is concerned. The potential problem is that there are no statistics on them, so they tend to perform worse than temp tables.
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 13:39:58
quote: Originally posted by X002548: the same reason you can't use a local table variable in dynamic SQL is the same reason you can't use a local temp table. It must be a global or permanent table if you are going to reach beyond the scope of the dynamic SQL in which the table variable or temp table is referenced

Sorry, I was only meaning to create OUTSIDE the Dynamic SQL and use WITHIN it, rather than creating WITHIN the Dynamic SQL and then expecting it to be in-scope outside (which, I agree, would need a ##GlobalTempTable)
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 13:53:15
quote: Originally posted by Lamprey: Is this a follow-on to your dynamic select list from yesterday?

Yup.

"I hate to say it, but I think you are creating a "house of cards." Unless you want to go all the way and create a real dynamic query generator I fear you might be creating something that is nearly unmaintainable."

Yeah, I've been putting it off for several years ... we have just started a fresh development cycle at the moment, so it's opportune to try some stuff to see what sort of gotchas we encounter.

The real issue, as per my post yesterday, is that we are increasingly adding more-and-more columns to our core Sprocs, and each client only uses some of them, so the unused ones are inefficient to retrieve each time. Plus if a client needs a column that is not included we have to modify the Sproc and issue a patch (which has considerable cost implications, and opportunity to make the change on the wrong base-version for that particular client).

My plan (if I overcome all the hurdles) is to provide a point-and-click interface so that our DEVs (or even the client) can select which columns they want or, even better, to allow them to just use any column they like (from within the tables included in the FROM statement) and then dynamically, just-in-time, adjust the SELECT list depending on what the DEVs/client actually used in the CMS HTML templates (and then cache the SELECT list so that it is not re-calculated every time ...)

We also have a LOT of WHERE clauses along the lines of:

WHERE (@Param1 IS NULL OR MyCol1 = @Param1)
  AND (@Param2 IS NULL OR MyCol2 = @Param2)
  ...

which would obviously be much more efficient if we move to Dynamic SQL.

A key barrier for us, in the past, was the need to have SELECT permission on the underlying tables when using Dynamic SQL in Sprocs, but with EXECUTE AS and CERTIFICATE impersonation in SQL 2005 (or maybe SQL 2008, I forget) those concerns have gone away.

I think I can write the Sprocs to be maintainable, but time will tell.

(My thought is to have the original SELECT list "in the code" but commented out, and in the style of an Sproc call, so I can just EXECUTE that bit, manually, if it changes. That will update the Metabase in the DEV system with the actual syntax of the various permitted columns in the SELECT list, so the "master version" will always be up-to-date, and then the Client/DEV can choose from that master list during the Development / Customisation phase. Hope that makes sense? If not, picture me waving my arms about furiously in order to explain it better
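A sketch of the catch-all-to-dynamic conversion described above, with placeholder table and column names: only the predicates actually needed end up in the statement text, while the values still travel through sp_executesql parameters:

```sql
-- Placeholder names (dbo.MyTable, MyCol1, MyCol2); @Param1/@Param2 as in the post.
DECLARE @Param1 int = 42, @Param2 int = NULL;
DECLARE @sql nvarchar(MAX) = N'SELECT MyCol1, MyCol2 FROM dbo.MyTable WHERE 1=1';

-- Append only the predicates whose parameters were supplied.
IF @Param1 IS NOT NULL SET @sql += N' AND MyCol1 = @Param1';
IF @Param2 IS NOT NULL SET @sql += N' AND MyCol2 = @Param2';

-- Parameters that the final statement does not reference are simply unused.
EXEC sp_executesql @sql,
     N'@Param1 int, @Param2 int',
     @Param1 = @Param1, @Param2 = @Param2;
```

Unlike the `(@Param1 IS NULL OR MyCol1 = @Param1)` form, each predicate that survives can be matched to an index by the optimiser, at the cost of one cached plan per parameter combination.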
Lamprey
Master Smack Fu Yak Hacker
4614 Posts
Posted - 2012-01-06 : 14:28:08
quote: Originally posted by Kristen <snip>I think I can write the Sprocs to be maintainable, but time will tell. (My thought is to have the original SELECT list "in the code" but commented out, and in the style of an SProc call, so I can just EXECUTE that bit, manually, if it changes which will update the Metabase in the DEV system with the actual syntax of the various permitted columns in the SELECT list, so the "master version" will always be up-to-date, and then the Client/DEV can choose from that Master list during the Development / Customisation phase. Hope that makes sense? if not picture me waving my arms about furiously in order to explain it better 
Hehe, yeah, I've been there. :)

I'm not sure if it'd help (and I can't believe I'm about to suggest it), but you might want to investigate how SharePoint does its data access/permissions. I know they have column-level permissions, albeit through a different security mechanism. That might give you some food for thought on possible implementation ideas. Which might lead to a completely data-driven solution, which may or may not be beneficial.
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 14:35:28
"Which might lead to a completely data-driven solution, which may or may not be beneficial."

Rightly or wrongly that sounds very much like where I'm headed, thanks. I'll take a look at Mr SharePoint
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts
Posted - 2012-01-06 : 16:06:17
Create and populate a temp table before the dynamic SQL. It'll be visible to the dynamic SQL (temp table scope is the creating proc and all procs that it calls). No need for global temp tables that I can see.

quote: Less recompiling, less locking, less Transaction Log activity. Perhaps some significant in-memory-only activity on @TableVars which would be TEMPDB activity with #TempTable?

Recompiling, maybe, but temp tables don't cause recompiles on every execution any longer. Less locking, well, maybe, but the temp table is local to you, so who cares? Memory vs disk: no, no, no, no, they're treated exactly the same way by SQL.

If you're doing lots of dynamically generated dynamic SQL, watch the impact on the plan cache; it will grow larger than it would with just procedures.

--Gail Shaw
SQL Server MVP
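A minimal illustration of the scope rule Gail describes (names invented): a #temp table created before the dynamic SQL is visible inside it, whereas one created inside EXEC() disappears when that inner batch ends:

```sql
-- Created in the outer scope, so visible to the dynamic SQL below.
CREATE TABLE #B (ID int PRIMARY KEY, B_Col1 int);
INSERT INTO #B (ID, B_Col1) VALUES (1, 10);

EXEC sp_executesql N'SELECT ID, B_Col1 FROM #B';   -- works: #B is in scope

DROP TABLE #B;
```

This is the mirror image of the failing case discussed later in the thread, where the CREATE TABLE is inside the EXEC() and the outer SELECT then cannot see the table.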
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 17:52:00
quote: Originally posted by GilaMonster: If you're doing lots of dynamically generated dynamic SQL, watch the impact on the plan cache, it will grow larger than it would with just procedures.

You mean if I have lots of different variants of

SELECT @strWhereSQL = 'WHERE 1=1'
    + CASE WHEN @ParamN IS NULL THEN '' ELSE ' AND A.ColN = @ParamN' END
    + ... more conditional criteria ...

each variant combination that gets used will create a different cached plan? Or perhaps something else that I am not aware of?

Thanks for your help
Kristen
Test
22859 Posts
Posted - 2012-01-06 : 17:54:47
quote: Originally posted by X002548: Is this new behavior?

Not that I can remember.

"What am I mistaking this for where this wouldn't work?"

You're probably not thinking of this, but if you are then it won't work:

DECLARE @sql varchar(8000);
SET @sql = 'CREATE TABLE #temp(Col1 int); INSERT INTO #temp(Col1) SELECT 1'
EXEC(@sql)
SELECT * FROM #temp
Kristen
Test
22859 Posts
Posted - 2012-01-07 : 03:39:15
Interesting, thanks. Well, I won't be shy about converting them back then, and that will solve my scope issue on @TableVars. Mine usually only hold a handful of rows, so perhaps no big issue on performance. Never tested them carefully side-by-side for performance, though.
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts
Posted - 2012-01-07 : 05:07:40
quote: Originally posted by Kristen

quote: Originally posted by GilaMonster: If you're doing lots of dynamically generated dynamic SQL, watch the impact on the plan cache, it will grow larger than it would with just procedures.

You mean if I have lots of different variants of SELECT @strWhereSQL = 'WHERE 1=1' + CASE WHEN @ParamN IS NULL THEN '' ELSE ' AND A.ColN = @ParamN' END + ... more conditional criteria ... each variant combination that gets used will create a different cached plan?

Yes. Each different piece of dynamic SQL is considered a different query: compiled and cached, and reused only if an exactly matching piece of dynamic SQL is run.

I used to work on a system that dynamically built and ran queries; we had so much trouble with compiles and compile waits, plan cache size, etc. (mostly on SQL 2000, but also on 2005).

--Gail Shaw
SQL Server MVP
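One way (of several) to watch the effect Gail describes is via the plan-cache DMVs, where each textually distinct piece of dynamic SQL appears as its own cached entry:

```sql
-- Ad-hoc EXEC() strings cache as 'Adhoc'; sp_executesql strings as 'Prepared'.
SELECT TOP (20)
       cp.objtype,
       cp.usecounts,        -- 1 = compiled but never reused
       cp.size_in_bytes,
       st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.objtype IN ('Adhoc', 'Prepared')
ORDER BY cp.size_in_bytes DESC;
```

A long tail of usecounts = 1 entries is the symptom to look for: lots of one-shot compiles bloating the cache.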
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2012-01-07 : 09:25:47
quote: Originally posted by Kristen: Interesting, thanks. Well I won't be shy about converting them back then, and that will solve my scope issue on @TableVars. Mine usually only hold a handful of rows, so perhaps no big issue on performance. Never tested them carefully side-by-side for performance though.

That's what's weird: our most severe issue was with a table variable with only one row in it. It caused a bad execution plan, which caused slowness. Because that stored procedure was executed very frequently, it caused the entire system to slow down. It was a nightmare.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog