| Author | Topic |
ComputerMike
Starting Member
18 Posts |
Posted - 2010-02-02 : 20:08:40
|
| I rewrote a procedure using table variables, hoping to get faster execution. I ran both versions in Query Analyzer together: the old way took 85% of the cost, the new way 15%. But when I actually run them, the old way takes 1 second and the new way 15 seconds, with around 4000 records both ways. I got the exact opposite effect I expected. Can anyone explain? My guess is it must have to do with memory. Does anyone have any tips for making the query run faster, or an explanation for the discrepancy mentioned above? |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-02-03 : 01:23:37
|
| Need to see the code to be able to provide any advice. The actual elapsed runtime may not be much of a benchmark, as the computer could have been waiting on some other event - you can rule that out by repeating the test numerous times, but then you have to be careful that data is not cached in memory etc. |
 |
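For anyone wanting to benchmark the two versions the way Kristen suggests, a minimal sketch (the procedure names here are made up for illustration; only run the DBCC commands on a dev server, since they flush caches server-wide):

```sql
-- Clear cached data pages and compiled plans so each run starts cold
DBCC DROPCLEANBUFFERS;
DBCC FREEPROCCACHE;

-- Report actual CPU time and elapsed time for each statement
SET STATISTICS TIME ON;
EXEC dbo.MyProc_TempTable;  -- hypothetical: the old version
EXEC dbo.MyProc_TableVar;   -- hypothetical: the new version
SET STATISTICS TIME OFF;
```

Repeating the block several times and comparing the STATISTICS TIME output separates genuine plan differences from one-off waits or caching effects.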
|
|
visakh16
Very Important crosS Applying yaK Herder
52326 Posts |
Posted - 2010-02-03 : 03:34:07
|
| What is the amount of data you stored in the table variable? |
 |
|
|
GilaMonster
Master Smack Fu Yak Hacker
4507 Posts |
Posted - 2010-02-03 : 07:27:42
|
| Table variables don't tend to perform well with large rowsets, due to lack of statistics and poor row count estimates. To say more, I'd need to see the code. -- Gail Shaw, SQL Server MVP |
 |
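A minimal sketch of the difference Gail describes (table names are illustrative). Comparing the actual execution plans shows the table variable's estimated row count stuck at 1 regardless of how many rows it holds, while the temp table gets statistics and a realistic estimate:

```sql
-- Table variable: no statistics; the optimizer typically estimates 1 row
DECLARE @t TABLE (id INT PRIMARY KEY);
INSERT INTO @t (id)
SELECT object_id FROM sys.objects;   -- load some rows

SELECT * FROM @t;                    -- plan: estimated rows = 1

-- Temp table: statistics are created, so estimates track reality
CREATE TABLE #t (id INT PRIMARY KEY);
INSERT INTO #t (id)
SELECT object_id FROM sys.objects;

SELECT * FROM #t;                    -- plan: realistic row estimate
DROP TABLE #t;
```

With 4000 rows, a 1-row estimate can push the optimizer toward nested loops and other plan shapes that fall apart at the real row count, which would explain the 1-second vs 15-second gap in the original post.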
|
|
Lumbago
Norsk Yak Master
3271 Posts |
Posted - 2010-02-03 : 07:38:46
|
| I have a script that I use in log shipping to determine automatically which log files I need to restore on the secondary server. When using table variables the script took roughly 1:30 to run, and when changing to temp tables it takes around a second. The query is by no means optimized but it gets me what I need. I have to admit I was quite shocked to see the difference... - Lumbago. "If the facts don't fit the theory, change the facts." - Albert Einstein |
 |
|
|
Transact Charlie
Master Smack Fu Yak Hacker
3451 Posts |
Posted - 2010-02-03 : 07:57:00
|
| I got so disillusioned by table variables that I hardly ever use them any more. About the only scenarios I use them in now are table-valued functions, and the odd case where the recompile time from using a temp table would be an inconvenience. The ability to add indices to temp tables makes them so much more flexible. Charlie =============================================================== Msg 3903, Level 16, State 1, Line 1736 The ROLLBACK TRANSACTION request has no corresponding BEGIN TRANSACTION |
 |
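To illustrate Charlie's point about flexibility (table and column names are made up): on the SQL Server versions current at the time of this thread, a temp table can be indexed after creation, while a table variable can only be indexed via constraints declared inline with it.

```sql
CREATE TABLE #orders (
    order_id   INT,
    cust_id    INT,
    order_date DATETIME
);

-- Indexes can be added to a temp table at any point...
CREATE NONCLUSTERED INDEX ix_cust ON #orders (cust_id);

-- ...whereas a table variable only gets indexes through inline
-- PRIMARY KEY / UNIQUE constraints declared with it:
DECLARE @orders TABLE (
    order_id   INT PRIMARY KEY,    -- PK gives a clustered index
    order_date DATETIME
);

DROP TABLE #orders;
```

That means you can load a temp table first and index it afterwards based on how you intend to query it, which is not an option with a table variable.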
|
|
Kristen
Test
22859 Posts |
Posted - 2010-02-03 : 11:47:58
|
| The fact that the CPU drops with table variables, but the query takes longer, seems odd. My only explanation for that is that memory is being paged out - in which case RAM could be configured better. Lumbago, assuming a modest number of records in your restore script, maybe you had indexes and/or a PK on #TempTable and nothing on @TableVar - we always declare a PK on @TableVar (and usually for temp tables we only need one index, so we make the PK do that job, adding a tie-break field to make it unique if necessary). |
 |
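A sketch of the tie-break trick Kristen describes (column names are illustrative): when the column you want the PK ordered on is not unique, append the real key as a second column so the composite is unique.

```sql
-- We want the "index" ordered on order_date, which isn't unique,
-- so the unique order_id is added as a tie-break column:
DECLARE @recent TABLE (
    order_date DATETIME NOT NULL,
    order_id   INT      NOT NULL,
    PRIMARY KEY (order_date, order_id)  -- order_date first gives the useful sort order
);
```

The PK gives the table variable a clustered index in the desired order while still satisfying the uniqueness requirement.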
|
|
Lumbago
Norsk Yak Master
3271 Posts |
Posted - 2010-02-04 : 02:38:02
|
| @Kristen: I did have primary keys on both table types actually, but the query was kinda nasty and the indexing on the system databases was not in my favor. We were also talking about roughly 900 rows of data in the table-variable/temp-table... I could post the script if you'd like - it is indeed quite handy when doing log shipping of multiple databases. - Lumbago. "If the facts don't fit the theory, change the facts." - Albert Einstein |
 |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-02-04 : 03:25:11
|
| I'm surprised that @TableVar was non-performant on 900 rows - but it just goes to show that things need testing in DEV and not just "chucking together". Trouble is, budget is normally set on the basis of "chuck it together". |
 |
|
|
|