
ADO vs. Query Analyzer Performance

Status
Not open for further replies.

innmedia

Programmer
Mar 24, 2000
108
US
Hi,
I have a Select Into (creates a new table) SQL statement that uses the fields from 27 other tables. The new table ends up with 500+ fields, with >20,000 records.

In Query Analyzer it takes about 45 seconds. With an ADO connection from an Access front end it times out, even with the timeout set at 120 seconds.

I am not showing the long SQL statement here, but this is how it is executed:

Code:
conn.CommandTimeout = 120
conn.Execute select_ssql


Is something wrong here? I would have thought that this just sends the whole SQL statement to SQL Server, which would process it in the same way. Is this not so? Any ideas how to speed this up from the front end? I can't ask my client to use Query Analyzer.

Thanks!
 
First, why are your users creating a new table? Perhaps a temp table would be a better choice? I personally never allow a user to create an object in the database. It could be that your users don't have object-creation rights.

Have you tried putting this SQL in a stored procedure and running that, rather than sending the query from Access? Access queries have to be processed through the Jet engine, which then translates them for SQL Server, an extra step that takes much more time. I know when we moved some of our more time-consuming queries to the server directly, things were much faster.
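A rough sketch of what that could look like, assuming a made-up procedure name and passing the target table name in as a parameter (the actual SELECT ... INTO column list is elided here, as in the original post):

```sql
-- Hypothetical sketch only; the procedure and parameter names are invented.
CREATE PROCEDURE dbo.usp_BuildExportTable
    @TableName sysname
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @sql varchar(8000)
    -- Build the dynamic SELECT ... INTO string here and run it,
    -- so the whole statement executes on the server.
    SET @sql = 'SELECT ... INTO ' + @TableName + ' FROM ...'
    EXEC (@sql)
END
```

From Access you would then run something like conn.Execute "EXEC dbo.usp_BuildExportTable 'NewTableName'", so only the short EXEC call crosses the wire.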
 
SQLSister,

Thanks. The issue behind all of this is that the SQL string is built with a lot of variables. Previous attempts showed that sending all this to a stored proc was even slower.

There will not be an issue with rights. There are only 2 or 3 users, and they need to create tables, since the next step in their work is to connect to the tables using SPSS (a statistical package). They have to know what the table name is.

Since my last post I have added indexes (yes, I should have done this first). The process no longer times out, but it is still not as swift as Query Analyzer.
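The indexes in question are of this general form; the table and column names below are placeholders, not the real ones:

```sql
-- Placeholder names: an index on the join column used by the big SELECT INTO.
CREATE INDEX IX_Table1_KeyField ON Table1 (KeyField)
```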

Another question, though. I am joining many tables, but the first table is a single field containing the distinct values of the key field. The other tables are joined to it with a LEFT OUTER JOIN, so all values of the key field are present in the new table, but there are a lot of NULLs in the fields coming from the other tables (since each of the other tables covers only some of the key values, not all of them). My question is: is there a way, on the fly, to have zeroes appear instead of NULLs? I can't conceive of a way to do it in the SQL. All I can think of is running a bunch of UPDATE queries after the fact.
 
You can use the IsNull() function.

Code:
Select
   a.Column1,
   IsNull(b.Column4, 0),
   IsNull(c.Column2, 0)
From tablea a
   LEFT JOIN tableb b ON b.column2 = a.column1
   LEFT JOIN tablec c ON c.column2 = b.column3

"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
MDXer,

Great. Never thought of that. I will give it a try!

Thanks.
 
Actually, I like the IsNull statement better; I should have thought of that one myself. You will need CASE, though, if for some reason you have empty strings instead of NULLs.
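For example, something along these lines (the column and table names are made up) treats both NULLs and empty strings as zero:

```sql
-- Hypothetical names; returns 0 when the value is NULL or an empty string.
SELECT CASE
           WHEN SomeColumn IS NULL OR SomeColumn = '' THEN 0
           ELSE SomeColumn
       END AS SomeColumnClean
FROM SomeTable
```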
 
Actually, where do I put the IsNull? The SQL statement is all Sums, for example: Sum(Table1.Field1) As Field1Sum.

Would it be IsNull(Sum(Table1.Field1), 0) As Field1Sum?
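Both placements are legal T-SQL; they just mean slightly different things. A sketch (KeyTable and KeyField are illustrative names, not the real query):

```sql
-- IsNull around the Sum: a NULL total (no matching rows at all) becomes 0.
SELECT IsNull(Sum(Table1.Field1), 0) As Field1Sum
FROM KeyTable
   LEFT JOIN Table1 ON Table1.KeyField = KeyTable.KeyField

-- IsNull inside the Sum: each individual NULL value is treated as 0
-- before summing; with these outer joins the result is usually the same.
-- Sum(IsNull(Table1.Field1, 0)) As Field1Sum
```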
 