glennfishwick
I have a query that was extracted from an application, and I have tuned it a little. However, I now get drastically different performance depending on where I run it (I used DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE to clear the buffer cache and the plan cache). i.e.
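For reference, the cache-clearing and timing I am doing looks roughly like this (a minimal sketch; the SELECT is a placeholder for the actual extracted query, and this should only be run on a test server since the DBCC commands affect the whole instance):

```sql
-- Flush dirty pages first so DROPCLEANBUFFERS can empty the buffer cache.
CHECKPOINT;

-- Clear the buffer cache and the plan cache so the next run is a "cold" run.
DBCC DROPCLEANBUFFERS;
DBCC FREEPROCCACHE;

-- Report parse/compile and execution CPU/elapsed times separately.
SET STATISTICS TIME ON;

-- Placeholder for the actual query extracted from the application.
SELECT ...;

SET STATISTICS TIME OFF;
```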
Run from my PC it takes approx 15 secs. The execution times show:
Compile:
CPU time = 625 ms, elapsed time = 12045 ms
Execution:
CPU time = 16 ms, elapsed time = 513 ms.
Run on the actual server the compile elapsed time is very similar to the CPU time and the query runs in less than a second.
Also, when I don't drop the buffers or free the plan cache, it runs immediately. It is only slow on the first run, and only when run from a client!?
Why is there such a big difference between CPU time and elapsed time? No rows are returned by the query, so you would not expect the network to be an issue. I have seen these big differences between CPU and elapsed times before and was confused then too.