
database change causes poor performance


bcraig (IS-IT--Management), May 3, 2001
I have an ASP-based front end accessing a SQL database. There are 600,000 records in a table. I need to create a sub-category field as an extension of a category. My problem is that after I added the sub-category field, the system is almost non-responsive. Processor utilization is up to 100% and my ASP pages time out. This is true for all functions, even those not referencing the modified table. Very little in the ASP code has changed. I can run the new code on the old database without problem. Running the old code on the new database produces the same poor performance. Even querying any table of the database through SQL Enterprise Manager is extremely slow.

I am at a loss, any suggestions would be greatly appreciated.

Brian
 
hi,
run DBCC CHECKDB and DBCC DBREINDEX.
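For example (the database and table names here are just placeholders; substitute your own):

    -- verify the integrity of the whole database
    DBCC CHECKDB ('MyDatabase')

    -- rebuild all indexes on a table; repeat for each table you suspect
    DBCC DBREINDEX ('MyTable')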

This should improve performance.


Does this solve your problem?

bye
miq
 
No luck with DBCC CHECKDB and DBCC DBREINDEX.

Doesn't seem to matter what the query code is. Running a query on any table within the database causes the trouble. For example, I made the change to the tickets table, adding a field called sub_cat of type INT. When I query the users table with Select * from users where lastname='craig', the server chokes. Running the same query on the original database returns the records immediately.
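In SQL terms, roughly (reconstructing the exact statements from the description above, so treat this as a sketch):

    -- the only schema change: a new nullable column on tickets
    ALTER TABLE tickets ADD sub_cat INT NULL

    -- a query against the untouched users table that now chokes
    SELECT * FROM users WHERE lastname = 'craig'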

There are no indexes on the modified table.

Brian
 
The problem is "There are no indexes on the modified table." Add an index on lastname and any other columns that will be used in the WHERE clause.
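Something like this (the index name is just an example):

    -- nonclustered index to support the WHERE clause on lastname
    CREATE NONCLUSTERED INDEX IX_users_lastname ON users (lastname)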

By the way, does the original table have indexes?

Terry L. Broadbent - DBA
 
No indexes on the original table. What puzzles me is that querying a table that had not changed returns results so slowly.
 
What puzzles me is that you found querying a table without any indexes to be fast enough in the first place. I can't imagine how query performance could be acceptable against an unindexed table with 600K rows. Add the indexes to both tables and measure performance.
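To measure, turn on the statistics output before running the query (a quick sketch):

    SET STATISTICS IO ON
    SET STATISTICS TIME ON

    SELECT * FROM users WHERE lastname = 'craig'
    -- compare the logical reads and elapsed time reported in the Messages
    -- output before and after adding the index; an index seek should cut both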

Which version of SQL Server are you running? What is the frequency of queries? How big are the rows in the tables? How much RAM is installed on the server? What is the cache hit ratio for SQL Server?

Terry L. Broadbent - DBA
 
I will try indexing for performance gains.

Running SQL 2000. Frequency of queries and cache hit ratio shouldn't matter, as I am the only one on the server during testing and am only querying one database at a time. 512MB RAM; an additional 512MB will be installed Monday.

Brian
 
Cache hit ratio is important. If the data pages are not in RAM, SQL Server has to perform disk IO. If the pages are in RAM, SQL Server just reads them from memory, which makes the query much faster. You didn't answer my question about row size, so I don't know how much RAM is needed to load the full table.
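On SQL 2000 you can read the counter from master.dbo.sysperfinfo, something like this (the true ratio is the counter value divided by its base counter):

    SELECT counter_name, cntr_value
    FROM master.dbo.sysperfinfo
    WHERE object_name LIKE '%Buffer Manager%'
      AND counter_name LIKE 'Buffer cache hit ratio%'
    -- hit ratio % = 100 * 'Buffer cache hit ratio' / 'Buffer cache hit ratio base'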

If you run a query on a table without an index, SQL Server has to load all the pages into memory. As additional pages are loaded and free RAM is not available, SQL Server will replace pages. With only 512MB of memory on the server and 600K rows to load, it is very likely that the cache hit ratio is low.

Terry L. Broadbent - DBA
 