WoodyGuthrie
MIS
I apologize in advance if I'm stepping back too far. Looks like it's been a while for this topic: Temp Tables or Table Variables...
I have a stored procedure that uses 6 tables that are each a unique subset of a master table. The master table is about 600K rows and each of the other tables has between 300K and 500K rows.
The query joins each table to the master table. While each table is a relatively large percentage of the master table, each is only a single column, and a relatively low percentage of the rows is used on any given run. The tables are TRUNCATEd every evening and then re-populated with a selection from the master table.
The obvious advantage for me: depending on the date range selected, the query could scan 6 temp tables/variables of 10-20K rows each instead of 6 tables of 300-500K rows each.
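For illustration, a minimal sketch of what the temp-table version might look like (all object names here — dbo.MasterTable, MasterID, SaleDate, @StartDate/@EndDate — are hypothetical stand-ins for the real schema):

```sql
-- Build one small, date-filtered subset per run instead of keeping
-- a 300-500K-row permanent table around.
CREATE TABLE #Subset1 (MasterID int NOT NULL PRIMARY KEY);

INSERT INTO #Subset1 (MasterID)
SELECT MasterID
FROM   dbo.MasterTable
WHERE  SaleDate BETWEEN @StartDate AND @EndDate;  -- 10-20K rows, not 300-500K

-- Join the small subset back to the master table.
SELECT m.*
FROM   dbo.MasterTable AS m
JOIN   #Subset1 AS s
  ON   s.MasterID = m.MasterID;
```

The same pattern would repeat for the other five subsets, each with its own filter.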
...Change the stored procedure to eliminate these tables and replace them with temp tables or table variables, or... leave it as is?
There are no udf's so that's not an issue.
Clean-up is really not a problem with either method; however, I do wonder how SQL Server handles session uniqueness for #TempTables.
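On the session-uniqueness question: SQL Server scopes a #TempTable to the creating session (and, when created inside a stored procedure, to that procedure's execution), storing it in tempdb under an internally uniquified name, so concurrent callers never see each other's copy. A quick way to observe this (the #Subset name is just an example):

```sql
-- Each session that runs this gets its own private #Subset.
CREATE TABLE #Subset (MasterID int NOT NULL);

-- In tempdb the name carries a per-session suffix, so two sessions
-- creating #Subset at the same time never collide.
SELECT name
FROM   tempdb.sys.objects
WHERE  name LIKE '#Subset%';

DROP TABLE #Subset;
```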
According to BOL: table variables used in stored procedures cause fewer recompilations of the stored procedure than temp tables do. Also from BOL: table variables require less locking because transactions last only for the duration of an update to the variable. Less logging overhead too, presumably?
I know temporary tables support indexes which can help. As an interim measure I've created indexes over the 6 tables that are currently in place and that greatly improved performance.
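For reference, the indexing difference between the two: a temp table accepts a separate CREATE INDEX after it is populated, while a table variable can only get an index through a constraint declared inline in the DECLARE. A sketch with hypothetical names:

```sql
-- Temp table: an index can be added after the load.
CREATE TABLE #Subset (MasterID int NOT NULL);
-- ... INSERT rows here ...
CREATE NONCLUSTERED INDEX IX_Subset_MasterID ON #Subset (MasterID);

-- Table variable: an index only via an inline PRIMARY KEY or UNIQUE constraint.
DECLARE @Subset TABLE (MasterID int NOT NULL PRIMARY KEY);
```

That flexibility (plus the fact that temp tables have statistics and table variables don't) is often what tips large-row-count workloads toward temp tables.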
Thanks for your input.