rac2,
the OP's statement was that nested loops are bad... they're not - but, as with any feature in programming, they can be abused...
[tt]for i = 0 to 99999999999999[/tt]
or
[tt]while true[/tt]
or
[tt]for i = 0 to 10
i = i - 1
next[/tt]
and so on... brandishing these things as bad in themselves is spreading FUD, which is why I corrected the OP.
There are circumstances where evaluating 6,000 iterations is 'necessary'... though usually this can be alleviated by better design of other components of the solution.
But the issue at hand is poor use of loops..
snowboardr,
"I don't have time to figure out working with xml"
Then output the HTML (e.g. an unordered list) into a string and cache that - when you load the page just write it out and deal with the state client side. It will be a LOT faster, BUT it will also be a little 'jerky' (the JS will act after the page has loaded, so showing the items will happen after load, not before). It's not very elegant, but quick and dirty is sometimes a necessity, and it is what you are asking for. When you update the database with new categories etc., just output the HTML to memory/cache again. Simple. Doing it the right way may seem like a chore, but it will give you lots of benefits.. up to you. You can do the state server side, but then you get into the same performance issues you will always hit when nesting loops over a database.. trust me... memory is far faster than any file-system based database..
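For what it's worth, a minimal sketch of that caching approach using the Application object (the variable and function names are made up - plug in your own build code):
[tt]
' Rough sketch - names are made up; adjust to your own page.
' Serve the cached menu HTML if we already have it, build it once if not.
Dim sMenu
sMenu = Application("CategoryMenuHtml")

If IsEmpty(sMenu) Or sMenu = "" Then
    sMenu = BuildCategoryMenu()      ' your existing db/loop code wrapped in a function
    Application.Lock
    Application("CategoryMenuHtml") = sMenu
    Application.Unlock
End If

Response.Write sMenu

' ...and wherever the admin adds/edits categories, empty the cache so it rebuilds:
' Application.Lock
' Application("CategoryMenuHtml") = ""
' Application.Unlock
[/tt]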
schase,
Appreciate your sentiment - you're right that GetRows is definitely faster than looping a normal recordset; however, a couple of things:
1. There are not 3500 separate read requests of the database 'over the wire' in your example - there are at most 1400. The default CacheSize for ADO in classic ASP is 1 record (and it can be set higher to speed up this latency-bound part of the process), and each request retrieves the entire record - you then work with that cached copy (of however many records fit in the cache) until the cache needs refilling. (The link you provide does say that he fudged the numbers to make a point, and that changing the CacheSize to 50 would reduce the reads to just 14 - there's a quick sketch of that after this list.) The other (bigger) problem with a recordset is the overhead of creating the recordset object and all its sub-collections/objects, such as Fields.. it's a reasonably large collection full of metadata there to provide functionality such as transactions... which all takes time to build.
2. This example is not using large data sets (20-30 rows is tiny), so the real gains from GetRows will be minimal (but still measurable by a computer).
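On the CacheSize point, a quick sketch of how you would bump it - the connection object and query here are placeholders:
[tt]
Dim rs
Set rs = Server.CreateObject("ADODB.Recordset")
rs.CacheSize = 50          ' fetch records in batches of 50 instead of 1
rs.Open "SELECT catid, catname FROM categories", oConn, 0, 1   ' 0 = adOpenForwardOnly, 1 = adLockReadOnly
[/tt]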
You can speed up beyond GetRows, if you have a fixed output pattern, by using GetString - it is much faster than iterating over an array in VBScript.
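For example, to spit out an unordered list straight from the recordset - a rough sketch, assuming rs is an open recordset returning a single column of category names (and at least one row):
[tt]
Dim sList
' 2 = adClipString; rows get glued together with the row delimiter.
' GetString also appends the delimiter after the LAST row...
sList = rs.GetString(2, , , "</li><li>")
sList = Left(sList, Len(sList) - Len("<li>"))   ' ...so drop the dangling "<li>"
Response.Write "<ul><li>" & sList & "</ul>"
[/tt]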
In fact, even faster (especially for large datasets) is building XML from the recordset with GetString and using XSLT to transform it - XSLT is very powerful at looping, far more so than VBScript, and you have to loop at some point if you want to output the set of data in the array or recordset..
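A rough sketch of just the transform step, assuming sCategoryXml holds the XML you built and categories.xsl is the stylesheet that emits the list (both names are made up):
[tt]
Dim oXml, oXsl
Set oXml = Server.CreateObject("MSXML2.DOMDocument")
oXml.async = False
oXml.loadXML sCategoryXml                    ' the XML string built from the recordset
Set oXsl = Server.CreateObject("MSXML2.DOMDocument")
oXsl.async = False
oXsl.load Server.MapPath("categories.xsl")   ' the stylesheet does the looping/markup
Response.Write oXml.transformNode(oXsl)
[/tt]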
And to speed things up even more, you should use a fast string concatenation class.. string concats in VBScript get progressively slower the more you add to the variable (every '&' re-copies the whole string, so the total cost grows roughly quadratically).. so iterating in a large loop and concatenating to a single string variable is not good... a sketch follows.
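A rough sketch of such a class (one simple way to do it - collect the pieces and Join once at the end rather than re-copying one ever-growing string):
[tt]
Class StringBuilder
    Private m_parts

    Private Sub Class_Initialize()
        ' Dictionary just acts as a growable list of chunks
        Set m_parts = CreateObject("Scripting.Dictionary")
    End Sub

    Public Sub Append(ByVal s)
        m_parts.Add m_parts.Count, s   ' key = running index
    End Sub

    Public Function ToString()
        ToString = Join(m_parts.Items, "")   ' single concatenation pass
    End Function
End Class

' usage:
Dim sb : Set sb = New StringBuilder
Dim i
For i = 1 To 5000
    sb.Append "<li>item " & i & "</li>"
Next
Response.Write "<ul>" & sb.ToString() & "</ul>"
[/tt]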
A good list to help with performance issues:
and Microsoft's:
A smile is worth a thousand kind words. So smile, it's easy! 