Hi Emma Lu,
consider Dan's suggestion. Plus, make a smaller memory allocation the default. It really doesn't matter much how large the data grows on disk; what matters is what the clients retrieve into memory. Retrieving a 1GB cursor for display in a grid depends on network bandwidth and won't be sped up by more RAM, so users will do such a query once, by mistake, and not again. If they get the chance to filter down to what they want before you retrieve the data, that'll be just fine.
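For example, let them set a filter first and only then query; just a sketch with made-up DSN, table, and field names:

lcName = "Smith%"  && whatever the user typed into the filter
lnHandle = SQLCONNECT("MyDSN")  && made-up DSN name
SQLEXEC(lnHandle, "SELECT custid, custname, city FROM customers WHERE custname LIKE ?lcName", "crsCustomers")
* The grid binds to crsCustomers: a few hundred filtered rows instead of the whole table.
SQLDISCONNECT(lnHandle)

That way client RAM barely matters at all.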
If an application runs in a network, it still runs on each single client; only if it runs on a Terminal Server is more RAM on that server indeed a good idea. But even then it only helps if the application doesn't allocate half of the available RAM for each session.
Allocating more RAM will mostly help when it's a single-user desktop app with the database stored locally and no network between the client and the database file/storage server. In all other situations the network is the bottleneck, and you'd rather seldom retrieve much data to each client. If there are processes that need to run over all the data, they should run server side.
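So if you need, say, a sum over all orders, let the server compute it and fetch just the one result row; again only a sketch with invented names:

lnHandle = SQLCONNECT("MyDSN")  && made-up DSN name
SQLEXEC(lnHandle, "SELECT SUM(total) AS alltotal FROM orders", "crsSum")
* crsSum holds a single row; the full order table never crosses the wire.
SQLDISCONNECT(lnHandle)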
I'd still recommend allocating a maximum of 128 MB; that's already plenty of RAM for textual/ASCII data. And if you make it configurable, a power user can try to make it faster with more RAM.
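If this is about VFP's buffer memory, SYS(3050) is the knob for that; here's a sketch, where the ini file name and key are invented for the example:

lnMemMB = 128  && default: already much RAM for textual/ASCII data
IF FILE("app.ini")  && made-up config file the power user can edit
   lnUserMB = VAL(STREXTRACT(FILETOSTR("app.ini"), "MemoryMB=", CHR(13), 1, 2))
   IF lnUserMB >= 1
      lnMemMB = lnUserMB
   ENDIF
ENDIF
SYS(3050, 1, lnMemMB * 1024 * 1024)  && foreground buffer memory in bytes
SYS(3050, 2, lnMemMB * 1024 * 1024)  && background buffer memory in bytes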
Bye, Olaf.