
Help!! Low memory problems with large recordsets


GSMike (Programmer)
Feb 7, 2001
I'm asking a lot of questions about SQL Server today.
I'm used to working with MS Access, where you can open a table with 100k or even 200k records with no problem.
In SQL Server, when I try to open my table containing 100k records, I get the "Low Memory" error message from NT once it has returned about 10k rows.
I've also been watching Performance Monitor as the rows come back (on the counter at the bottom right of the Query Tool window), and my memory disappears as the number of records returned increases.
So I'm wondering: how can you view 10k, 20k, or even 100k records in SQL Server without it killing your client's memory? Is it possible?
Thanks for your help.
-Mike
 
You might want to change your design so that it is more client/server oriented. You shouldn't need to transfer all the records across the network to the front end; after all, how could a user comprehend 10k records at once?
Try to keep the processing on SQL Server.
Either transfer rows one at a time as they are needed, or in small batches.
Using a stored procedure will also help performance, and a WHERE clause will restrict the number of records returned, e.g.
Select * from employees where department = 'IT'
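
For example, a minimal sketch of that approach (the table, column, and procedure names here are just made up for illustration) might wrap the filter in a stored procedure and then pull the rows back in keyed batches instead of all at once:

-- Return only the department the user asked for
CREATE PROCEDURE usp_GetEmployeesByDept
    @Department varchar(50)
AS
    SELECT EmployeeID, LastName, FirstName
    FROM Employees
    WHERE Department = @Department
GO

-- Return the next batch of 1000 rows after the last key the client saw
CREATE PROCEDURE usp_GetEmployeeBatch
    @LastKeySeen int
AS
    SELECT TOP 1000 EmployeeID, LastName, FirstName
    FROM Employees
    WHERE EmployeeID > @LastKeySeen
    ORDER BY EmployeeID
GO

The front end would call the batch procedure repeatedly, passing in the highest EmployeeID it received from the previous call, so only a small set of rows is ever held in client memory at one time.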
 