Parsing a Database into Manageable Portions for faster queries

Really Long Question...

I'm currently developing an application with a rather large database, somewhere around 500,000 records divided among 8 different tables. The problem is that the client wants an analysis tool where a selection criterion could look for distinct records (based on an account id) across all 8 tables at once. Hence "select * from [table 1] where not in(select * from ....". This would really chug!
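As an aside, one rewrite that often chugs less in Jet/Access-style SQL is a NOT EXISTS correlated on the account id, rather than an un-indexed NOT IN over whole rows. A minimal sketch in VB (the table and column names are my assumptions, not yours):

```vb
' Hedged sketch: NOT EXISTS keyed on the account id column.
' "Table1", "Table2" and "AccountId" are illustrative names.
Dim sSql As String
sSql = "SELECT t1.AccountId FROM Table1 AS t1 " & _
       "WHERE NOT EXISTS (SELECT 1 FROM Table2 AS t2 " & _
       "WHERE t2.AccountId = t1.AccountId)"
' Pass sSql to db.OpenRecordset(...) or similar.
```

With an index on the account id column, the engine can satisfy the correlated subquery with an index seek per row instead of scanning the whole inner table.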

What I came up with may not be the best approach. The database is relatively static (it updates monthly), so I decided to run some of the queries ahead of time and write the results to a text file.

(the inside of the file looks like)
=4 10000001100000002100000003100000004
=1 10000005
*1 .... etc

where /, [, and * represent different elements that would be queried against, and = marks a result line: i.e. 4, followed by 4 account ids. Using Line Input I can then load in only the account ids that fit the query, but the question is what do I load them into.
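For what it's worth, a minimal sketch of that Line Input loop in VB. The file name, the one-digit count after "=", and the fixed 8-character account-id width are my assumptions about your format:

```vb
' Hedged sketch: read the pre-computed results file line by line.
' Assumes "=" lines carry a one-digit id count, then fixed-width
' 8-character account ids with no separators.
Dim sLine As String
Dim nIds As Integer
Dim i As Integer
Dim sId As String

Open "C:\results.txt" For Input As #1    ' file name is an assumption
Do While Not EOF(1)
    Line Input #1, sLine
    If Left$(sLine, 1) = "=" Then
        nIds = CInt(Mid$(sLine, 2, 1))       ' the count after "="
        For i = 0 To nIds - 1
            sId = Mid$(sLine, 4 + i * 8, 8)  ' slice out each fixed-width id
            ' keep sId here if it matches the current query
        Next i
    End If
Loop
Close #1
```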

I was looking for something temporary and in memory, i.e. db.createquerydef("".., but I am having trouble finding something that I don't have to first commit to the database.

I really do not want to have to create a work table and then delete it after every query.

Any suggestions?

RE: Parsing a Database into Manageable Portions for faster queries

Hi there - what database are you using?

500,000 rows doesn't sound too bad - it's not Access, is it?

Anyway - your question.

My approach would depend upon how many records I was going to read into memory.

If it's not too many (too many defined here as "it would make it go really slow"), you could define a record as a data type and then declare an array of your new data type. How does that sound?
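A bare-bones version of that suggestion in VB. The Type's field name and the fixed 8-character id width are illustrative assumptions:

```vb
' Hedged sketch: a user-defined Type plus a dynamic array to hold
' the account ids read back from the file.
Private Type AccountRec
    AccountId As String * 8   ' fixed-width id; width is an assumption
End Type

Dim recs() As AccountRec
Dim nRecs As Long

' Grow the array as each matching id is read in.
Sub AddAccount(ByVal sId As String)
    nRecs = nRecs + 1
    ReDim Preserve recs(1 To nRecs)
    recs(nRecs).AccountId = sId
End Sub
```

ReDim Preserve copies the array on every call, so for large counts it is cheaper to grow in chunks (say, 1,000 elements at a time) and track the used count separately.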

If that goes too slowly - have you thought about using the Jet engine to create a local .mdb file to hold your data whilst you fiddle with it?
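A rough sketch of the local-.mdb idea using DAO; the file path, table, and column names are assumptions:

```vb
' Hedged sketch: create a scratch .mdb with the Jet engine via DAO,
' load the ids into it, query it, then delete the file when done.
Dim ws As DAO.Workspace
Dim db As DAO.Database

Set ws = DBEngine.Workspaces(0)
Set db = ws.CreateDatabase("C:\scratch.mdb", dbLangGeneral)
db.Execute "CREATE TABLE Scratch (AccountId TEXT(8))"
' ... insert the account ids and run your joins against Scratch ...
db.Close
Kill "C:\scratch.mdb"   ' clean up the scratch file afterwards
```

This keeps the work table out of the production database entirely, which sidesteps the create-and-delete-after-every-query objection above.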



Mike Lacey
