
read uncommitted / nolock, locks still showing up!


gbrian

IS-IT--Management
Sep 22, 2004
Hello,


I have a stored procedure that contains 30 queries, all combined with UNION ALL (each query returns the same columns, so I could use UNION instead, but anyway...).

It takes FOREVER (5 hours?). This data is input into the database monthly and is then never changed (no insert/update/delete going on). So, I thought I would remove record locking.

I first tried adding:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

to my stored procedure. When I look at the process that is running, it still has MANY locks showing up. I also tried using WITH (NOLOCK) on the queries, and it yields the same result.
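
In case it matters, this is roughly how I am checking the locks (just a sketch -- spid 57 is only a placeholder for whatever sp_who2 reports for the session running the proc):

-- Find the spid of the session running the stored procedure
EXEC sp_who2

-- List the locks held by that spid (Type / Mode / Status columns)
EXEC sp_lock 57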

Here are two example queries from the stored procedure:
/**OVERALL VISITS**/
select count(distinct(cs.sessionID)), ''OVERALL: visits'' as what
from ['+@Country+'_logging_stats].dbo.customlogs as cl WITH (NOLOCK) , ['+@Country+'_logging_stats].dbo.customlogssubmissions as cs WITH (NOLOCK)
where cl.submissionid = cs.submissionid and cl.customlogtypeid=15 and cs.submissiondatetime between '''+@DateMin+''' and '''+@DateMax+'''

union all

/**OVERALL VISITORS**/
select count(distinct(cs.userID)), ''OVERALL: visitors'' as what
from ['+@Country+'_logging_stats].dbo.customlogs as cl WITH (NOLOCK) , ['+@Country+'_logging_stats].dbo.customlogssubmissions as cs WITH (NOLOCK)
where cl.submissionid = cs.submissionid and cl.customlogtypeid=15 and cs.submissiondatetime between '''+@DateMin+''' and '''+@DateMax+'''


All of these queries are within the following:
BEGIN
Exec( ' [ALL QUERIES LIKE ABOVE HERE] ')
END

I mention this just in case there could be some weird type of scope problem happening.
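
One thing I might try (just a sketch -- I have not confirmed it changes anything) is putting the SET statement inside the dynamic SQL string itself, so it is definitely in effect for the batch that Exec() runs:

BEGIN
Exec( ' SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
[ALL QUERIES LIKE ABOVE HERE] ')
END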

Any info would be helpful. Also, if you have any other ideas about how I can optimize this work, that'd be GREAT!

The biggest issue is that we are forced to search with wildcards in a 300-character field for most of the other queries. Example: or querystring like ''%lp%tws%'' where querystring averages about 250 characters. Obviously this implementation is crap, but given that we can't change it at the moment, what are my options for database or query optimization?

I guess there are two questions in this post now...sorry about that :)

Thanks in advance SO much for your responses...this has been driving me up the wall!!!!
 
Why not pull the data for the specified dates into a temp table first and then run the COUNT(DISTINCT ...) part against the temp table, where you are dealing with many fewer records? Same thing with the awful querystring search using LIKE on a 300-character field: pull the data for the dates out separately and then do the inexact search against the much smaller data set.
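
Something along these lines (a rough sketch only -- column names are taken from the queries you posted, and I've left out the ['+@Country+'_logging_stats] prefix and the dynamic SQL quoting for readability):

-- Pull just the rows for the date range once
SELECT cs.sessionID, cs.userID, cl.customlogtypeid, cl.querystring
INTO #monthly
FROM dbo.customlogs AS cl WITH (NOLOCK)
INNER JOIN dbo.customlogssubmissions AS cs WITH (NOLOCK)
ON cl.submissionid = cs.submissionid
WHERE cs.submissiondatetime BETWEEN @DateMin AND @DateMax

-- Then run all 30 counts (and the LIKE searches) against the much smaller temp table
SELECT COUNT(DISTINCT sessionID), 'OVERALL: visits' AS what
FROM #monthly
WHERE customlogtypeid = 15

UNION ALL

SELECT COUNT(DISTINCT userID), 'OVERALL: visitors' AS what
FROM #monthly
WHERE customlogtypeid = 15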

Just looking at your query: yes, your database design is seriously flawed and needs to be rethought, because you can never pull data efficiently if you must rely on dynamic table names and LIKE-type searches on large character fields.

The NOLOCK hint only affects the SELECTs; if you are inserting this data into the tables once a month, then there must be locks in order to do that.

 