creating objects and collections


goaway1234 (Programmer), Jun 2, 2004
Howdy all. I am working on a project that is nearing release, and while testing it on large data sets I have run into a brick wall. The program does call management, contact management, and scheduling for small businesses. It has a database back-end that can use Access, MSDE, or SQL Server.

I packaged most of the business logic into collections and objects. When the application needs a list of contacts, it creates an object for each record and puts it into a collection. My problem is that when I tested it on a database with 29000+ customer records, creating an object for each record took about 15 minutes, and destroying all of those objects takes about as long.

I've done everything that I can think of to speed this up, and the best solution I've come up with is to only create this collection once and keep it in memory until the program closes. But this still isn't good enough.

Does anybody have any suggestions on how I can speed up creating and destroying such a large number of objects?
 
Hi

I must question why you need to create an object for each record. Also, if you need to create an object per record, do you need all 29000+ records at each moment that the program is running? Or can you get along with a subset (i.e. filtered recordset) at any given moment?

I don't know how to speed up the manipulation of so many objects.

Cassandra
 
The whole point of one object per record was to separate the code into a data layer/tier, an application layer, and an interface layer.

When I do things like populate a list box with all the contact names, I need every single record but can't just read or write the tables directly without violating a whole mess of assumptions that the rest of the program relies on.


nick
 
Still, why on earth would you want to have those 29000+ objects alive at one specific moment in time? Sounds like you've gone a little overboard with encapsulating your db stuff.

Isn't it easier to create the object when it is needed and populate the listbox with just IDs, from which you can then create an object?
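For illustration, a minimal VB6 sketch of that on-demand approach (the names are placeholders: a ListBox called lstContacts, an already-open ADODB.Connection called cn, a Contacts table with ContactID and ContactName columns, and a CContact class with a LoadById method):

Private Sub FillContactList()
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset
    rs.Open "SELECT ContactID, ContactName FROM Contacts ORDER BY ContactName", _
            cn, adOpenForwardOnly, adLockReadOnly

    lstContacts.Clear
    Do While Not rs.EOF
        lstContacts.AddItem rs!ContactName
        lstContacts.ItemData(lstContacts.NewIndex) = rs!ContactID  ' remember the key only
        rs.MoveNext
    Loop
    rs.Close
End Sub

Private Sub lstContacts_Click()
    ' Build one object, on demand, for the row the user actually picked
    Dim oContact As CContact
    Set oContact = New CContact
    oContact.LoadById lstContacts.ItemData(lstContacts.ListIndex)
End Sub

That way only the names and IDs are held in memory, and a full object only ever exists for the record currently being worked on.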

Now for your poor performance: even for 29000+ records, 15 minutes sounds like an enormous amount of time. Can you determine what it is that takes so long in creating the objects? Are you, for instance, making out-of-process COM calls to create them? Are related child objects created along with these objects? Are you doing heavy string manipulation inside these objects, etc.?

On a side track: have you checked the amount of memory your app uses after having created all these objects? How big are these objects?
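If it helps, here is a rough way to see where the time is going, using GetTickCount to time the fetch and the object-creation loop separately (the commented placeholders stand in for whatever your load routine actually does):

Private Declare Function GetTickCount Lib "kernel32" () As Long

Private Sub ProfileLoad()
    Dim t0 As Long, tFetch As Long, tCreate As Long

    t0 = GetTickCount
    ' ... open and fetch the recordset here ...
    tFetch = GetTickCount - t0

    t0 = GetTickCount
    ' ... loop that creates the objects and adds them to the collection ...
    tCreate = GetTickCount - t0

    Debug.Print "Fetch: " & tFetch & " ms, create/add: " & tCreate & " ms"
End Sub

If nearly all the time sits in the second block, the database isn't the bottleneck; the object creation or the Collection.Add calls are.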

Greetings,
Rick
 
I'd humbly suggest some redesign; I can think of no (good) reason why you'd want all 29000 items in a list box at one time...

That being said, if you insist on retaining all 29000 records, I'd advise against using a Collection; whilst reasonably impressive performance-wise with a relatively small number of items, they get very sluggish with a large number of items. If you don't want to change your code too much, then consider a Dictionary object. Alternatively, you might want to look into disconnected recordsets.
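A minimal sketch of the Dictionary alternative (requires a project reference to Microsoft Scripting Runtime; CContact and lContactID are placeholders for your own class and key):

Private Sub DictionaryExample()
    Dim dicContacts As Scripting.Dictionary
    Set dicContacts = New Scripting.Dictionary

    Dim lContactID As Long
    lContactID = 12345

    Dim oContact As CContact
    Set oContact = New CContact
    ' Note the argument order: Dictionary.Add takes (key, item),
    ' the reverse of Collection.Add (item, key)
    dicContacts.Add CStr(lContactID), oContact

    If dicContacts.Exists(CStr(lContactID)) Then      ' Exists() has no Collection equivalent
        Set oContact = dicContacts(CStr(lContactID))  ' keyed lookup stays quick at 29000+ items
    End If
End Sub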

 
Hi nwdurcholz.
I'm new to .net, but not to programming.
What I would do is create an ADO.NET object and keep it in memory. It acts like a dataset plus a collection.
It reads the records without any extra programming, and keeps them safe. Then you could bind your dropdown/list box to it.

A collection was great in the pre-.NET days, but the more I search, the more I find that .NET has many different approaches that are way better.

Question: Is the user really going to search through 29000+ items? [dazed]
 
Sadly we're not in a .NET forum ...

However, ADO's disconnected recordset idea that I mentioned earlier works a lot like ADO.NET's DataSet object.
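For anyone following along, a rough VB6 sketch of building such a disconnected recordset (the connection string and the table/column names are placeholders):

Private Function GetContactsDisconnected() As ADODB.Recordset
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI"

    Set rs = New ADODB.Recordset
    rs.CursorLocation = adUseClient                  ' client-side cursor is required
    rs.Open "SELECT ContactID, ContactName, Phone FROM Contacts", _
            cn, adOpenStatic, adLockBatchOptimistic

    Set rs.ActiveConnection = Nothing                ' disconnect: the data now lives client-side
    cn.Close

    Set GetContactsDisconnected = rs                 ' sort, filter and find without round trips
End Function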
 
Oops! Sorry. I was browsing 3 different forums at the same time in 3 browser sessions.
I guess it had to happen sooner or later.

Sorry 'bout that.
 
Well, the common situation that we have in the program is that a user starts typing a name into a text box. A list-view pops up with the rest of the customer's name, address, etc... and selects the matching name as the user types. Since the point of the UI is to give the user a fast, convenient way to search a very large customer list, I need the whole table.
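One possible way to wire that up against a disconnected recordset held in memory instead of 29000 objects, assuming a module-level recordset m_rsContacts (as in the earlier sketch), a TextBox named txtSearch, and a report-view ListView named lvwContacts with a second column for the phone number:

Private Sub txtSearch_Change()
    If Len(txtSearch.Text) = 0 Then
        m_rsContacts.Filter = adFilterNone
    Else
        ' Client-side Filter supports LIKE with a trailing * wildcard
        m_rsContacts.Filter = "ContactName LIKE '" & _
            Replace(txtSearch.Text, "'", "''") & "*'"
    End If

    lvwContacts.ListItems.Clear
    Do While Not m_rsContacts.EOF
        With lvwContacts.ListItems.Add(, "K" & m_rsContacts!ContactID, m_rsContacts!ContactName)
            .SubItems(1) = m_rsContacts!Phone & ""
        End With
        m_rsContacts.MoveNext
    Loop
End Sub

Re-filtering on each keystroke keeps the list view small while the full table stays cached in the recordset.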

Anyway, I found my problem, and I think that I can use disconnected recordsets to help address it. Thanks to everybody who gave suggestions, but I have yet another question.

Can anybody tell me about the VB6 Collection object? Does anybody have any idea what data structure it uses internally, or anything about the relative speed of getting items by index vs. by key?

Bigfoot: I agree. I wish I could have coded this in .net.
 
A keyword search in this forum should find comments from me on this subject. In summary, VB collections use a hashed index technique, but it is a very inefficient one (whether you use the index or the key).
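A quick way to see this for yourself is to time both access styles on a large Collection; GetTickCount is good enough as a millisecond timer for this kind of rough test:

Private Declare Function GetTickCount Lib "kernel32" () As Long

Private Sub TimeCollectionAccess()
    Dim col As New Collection
    Dim i As Long, t As Long, s As String

    For i = 1 To 29000
        col.Add "value " & i, "K" & i     ' Collection.Add takes (item, key)
    Next

    t = GetTickCount
    For i = 1 To 29000
        s = col(i)                        ' access by numeric index
    Next
    Debug.Print "By index: " & (GetTickCount - t) & " ms"

    t = GetTickCount
    For i = 1 To 29000
        s = col("K" & i)                  ' access by string key
    Next
    Debug.Print "By key:   " & (GetTickCount - t) & " ms"
End Sub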
 
That would explain a lot. I had built a SortedBySortKey property into my collection class. I used a binary search algorithm to insert, but everything, including the sort values, was stored in collections. I assumed that accessing an item by index would be extremely fast, or at least run in constant time.

Anyway, I'll have to come up with some other way to sort the list. Thanks for the help.
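If the data ends up in a disconnected (client-side) recordset anyway, one option is to let it do the sorting rather than a hand-rolled binary insert; m_rsContacts is the placeholder recordset from the sketches above:

m_rsContacts.Sort = "ContactName ASC"    ' only works with CursorLocation = adUseClient
Do While Not m_rsContacts.EOF
    ' ... repaint the list in sorted order ...
    m_rsContacts.MoveNext
Loop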
 