
Which DataSet to use?


RamHardikar (Programmer) · Feb 16, 2001 · 109 · GB
I have a DataSet on which I make all my changes, and I finally use it to update the changes back in the database. I have a doubt: should I create another temporary DataSet to hold only the rows with a modified RowState, and use that to update the database instead of the original DataSet?

Is there any performance gain in doing so? Could someone comment or suggest links on this topic?
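For what it's worth, ADO.NET has a built-in way to produce exactly that temporary DataSet: DataSet.GetChanges. A minimal sketch, assuming dsOrders is the full DataSet and adapter is a SqlDataAdapter that already has valid update commands (all names here are hypothetical):

```vbnet
' Sketch only - dsOrders, adapter and the "Orders" table name are hypothetical.
' GetChanges copies just the rows whose RowState is Added, Modified or Deleted.
Dim dsChanges As DataSet = dsOrders.GetChanges()

If Not dsChanges Is Nothing Then
    ' Push only the changed rows to the database...
    adapter.Update(dsChanges, "Orders")
    ' ...then clear the pending-change flags on the original DataSet.
    dsOrders.AcceptChanges()
End If
```

Note that DataAdapter.Update already skips rows whose RowState is Unchanged, so the database work is the same either way; GetChanges mainly saves the adapter from scanning every row, at the cost of briefly duplicating the changed rows in memory.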
 
I would stick with your original DataSet, as it will always exist; creating a new one is only going to increase memory usage.

--------------------------------------------------------------------------------------------------------------------------------------------

Need help finding an answer?

Try the search facility or read FAQ222-2244 on how to get better results.
 
Let's say my original DataSet has about 50K records and I am always going to modify only a small percentage of the rows (e.g. 200 rows). Would it be wise to update the database with a temp DataSet which holds only the modified rows? Is it faster that way? Or, no matter which DataSet I use, is the time taken for the update the same?
 
If you already have a DataSet with 50k rows (which I would really recommend against in an ASP.NET application) then it is already held in memory.

If you create a new DataSet and use it to update the database, then you are going to have two DataSets in memory, which seems pointless as you would be duplicating information.

 
Do you mean that for manipulating such huge data, you do not recommend ASP.NET as the solution? Should this be taken as a drawback of ASP.NET?
 
When you have a need for DataSets this big, I wouldn't recommend any programming language; I would suggest a different approach, like paging.

Christiaan Baes
Belgium

If you want to get an answer read this FAQ faq796-2540
There's no such thing as a winnable war - Sting
 
I mean that if you are sending extremely large amounts of data between the server and the client then there are always going to be performance issues.

 
Christiaan -

I am using paging in my application, but to implement paging I don't go to the database every time to get the data for each page; I use a DataSet instead.
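That pattern, paging in memory from a cached DataSet, might look something like this sketch, where dsReport, the "Report" table and the Session key are all hypothetical:

```vbnet
' Sketch only - assumes the full DataSet was filled once and cached per user.
Const PageSize As Integer = 20
Dim ds As DataSet = CType(Session("ReportData"), DataSet)
Dim table As DataTable = ds.Tables("Report")

Dim pageIndex As Integer = 5                ' zero-based page the user asked for
Dim start As Integer = pageIndex * PageSize

' Copy just this page's rows into a small table for binding to the grid.
Dim pageTable As DataTable = table.Clone()
Dim i As Integer
For i = start To Math.Min(start + PageSize, table.Rows.Count) - 1
    pageTable.ImportRow(table.Rows(i))
Next
```

The paging itself is cheap; the cost is that the full DataSet stays in memory for every user who has the page open.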
 
So you mean you keep 50k records in a DataSet for every user who accesses that page?

 
ca8msm-

Yes, I plan to maintain a separate DataSet for each user. But for the report I am handling, in a given time frame there could be only a couple of users accessing it. Do you feel this would still eat up the RAM? Is there a better approach to follow?
 
do you feel this would still eat up the RAM?
Most definitely. Here's an article you may be interested in:


 
Thanks for the link. That article is quite interesting. I still can't believe that going back and forth to fetch data from the database using the DataReader is quicker than doing it with a DataSet. I have not gone through Freak's guide yet (I hope it is available on the net). I am sure it has some interesting analysis.

 
Don't you feel the approach discussed in the article (having objects represent row data in an array) would also eat up RAM? If we talk about response time, as per the article, the DataReader scores over the DataSet. But I feel that, memory-wise, using either a DataSet or a DataReader would not make a difference.

Any comments, please?
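For comparison, here is roughly what the DataReader approach looks like (connection string and query are hypothetical; assumes Imports System.Data.SqlClient). The point is that a DataReader is a forward-only stream over an open connection, so only the current row is buffered, whereas a DataSet holds every row at once:

```vbnet
Dim conn As New SqlConnection("server=...;database=...;")   ' hypothetical
Dim cmd As New SqlCommand("SELECT emp_id, emp_name FROM Employees", conn)

conn.Open()
Dim reader As SqlDataReader = cmd.ExecuteReader()
While reader.Read()
    ' Only the current row is held in memory at any moment.
    Response.Write(reader("emp_name").ToString() & "<br>")
End While
reader.Close()
conn.Close()
```

So memory-wise there is a real difference; the trade-off is that the connection stays open while you read, and you cannot scroll backwards.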
 
I think you should do the paging on the SQL Server side and create a stored procedure that only returns a couple of records at a time (only what they can see on the screen).

Christiaan Baes
Belgium

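A sketch of that server-side paging (Christiaan suggests a stored procedure; the query is inline here for brevity, but it could live in a proc just as well). All names are hypothetical, and ROW_NUMBER needs SQL Server 2005 or later; on SQL Server 2000 the same effect is usually faked with TOP and a subquery:

```vbnet
' Sketch only - fetches one page of rows instead of the whole table.
Dim pageIndex As Integer = 0    ' zero-based
Dim pageSize As Integer = 20

Dim sql As String = _
    "SELECT emp_id, emp_name FROM " & _
    " (SELECT emp_id, emp_name, " & _
    "         ROW_NUMBER() OVER (ORDER BY emp_id) AS RowNum " & _
    "  FROM Employees) AS Numbered " & _
    "WHERE RowNum BETWEEN @start AND @finish"

Dim conn As New SqlConnection("server=...;database=...;")   ' hypothetical
Dim cmd As New SqlCommand(sql, conn)
cmd.Parameters.Add("@start", SqlDbType.Int).Value = pageIndex * pageSize + 1
cmd.Parameters.Add("@finish", SqlDbType.Int).Value = (pageIndex + 1) * pageSize

Dim adapter As New SqlDataAdapter(cmd)
Dim dsPage As New DataSet()
adapter.Fill(dsPage, "ReportPage")   ' only one page of rows reaches the web server
```

The DataSet here only ever holds one page, so per-user memory stays tiny regardless of how large the underlying table is.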
 
But that would mean too many hits to the database. For example, if I have 2K pages to display, I'll have to hit the DB that many times. I'm just wondering what sites like Google might be doing, as they too would have huge amounts of search results to display.
 
Why don't you run your own tests then? I'm pretty confident that if you store your own DataSet containing 50k records and watch the memory usage, you'll be very worried about the amount being used.

You say that there would be too many hits to the database, but are you expecting the users to view every single page? Also, as long as the time taken to get the records is very low then it shouldn't be a problem.

 
I was just thinking... the possibility of a user visiting every single page is very low, since I have thousands of pages. So getting only the desired number of records per request seems to be the better option. I'd better go with that option.

Thank you all for your valuable inputs.
 
Have you ever tried to go to the last page of a Google search that returned 1,000,000 results? I think Google will give up after about a thousand.

BTW, the query will still be in memory the next time you ask for it, so it will be fast, and only one percent of the users will ever want to see all the pages, if that.

And I think it's better to have a thousand 20-record requests than one 20,000-record request.

Glad we could help.

Christiaan Baes
Belgium

 
I have gone away from the days of typing command objects by hand and use the auto-generated DataAdapters, which generate all the command objects for you.
You don't pull down thousands of records; instead you substitute your own command objects into the generic DataAdapter:

DataAdapterEmployees.SelectCommand = cmdGetEmployees
DataAdapterEmployees.SelectCommand.Parameters("@emp_id").Value = 1122

If you favor ad hoc queries, then you can count on an auto-generated DataAdapter never having a WHERE clause, so you can safely append one:

DataAdapterEmployees.SelectCommand.CommandText = _
    DataAdapterEmployees.SelectCommand.CommandText & " WHERE emp_id = " & 1122

(Concatenating the value like this is open to SQL injection, though; a parameterised WHERE clause is safer.)



 