
How many records can a table contain without dying???

Status
Not open for further replies.

Apollo6 · Technical User · Jan 27, 2000 · US
I have been dropped into a project that has a VB application mining data from an Oracle database and writing it to an Access database.
The data is then taken from the Access database and used to write performance reports to Excel spreadsheets.
The table that holds the bulk of the data now has over a million records in it.
Bringing up one of these reports is very cumbersome, to say the least. In addition, the table appears to be getting corrupted and has already had to be rebuilt once, and now the VB application is giving run-time errors when trying to write to the Access table.
Besides performance issues, are there limits to the number of records that Access can handle?
 
I guess that would depend on how much data you have in one record.

I don't know which is correct, but from what I have heard the limit is either 1 GB per database or 1 GB per table.
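Since the practical worry here is the .mdb file creeping up on that 1 GB cap, one cheap safeguard is to check the file size before each bulk write. A minimal sketch in Python (the path and the 1 GB figure are assumptions; adjust the limit for your Access version):

```python
import os

GB = 1024 ** 3
MDB_LIMIT = 1 * GB  # assumed 1 GB cap on the .mdb file; verify for your Access version

def near_limit(mdb_path, threshold=0.9):
    """Return True if the database file has reached `threshold` of the assumed cap."""
    return os.path.getsize(mdb_path) >= threshold * MDB_LIMIT
```

Calling `near_limit(r"C:\data\perf.mdb")` before a batch insert gives you a chance to compact or archive the database instead of corrupting it mid-write.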
 
The stated maximum is 1 GB for tables, but software I'm writing at the moment using Access seems to run into problems with much smaller tables. We had to spec SQL Server to get over the problem. I find that memory on the machine can have a big effect: using an NT workstation with 256 MB I can run certain queries that my end users can't.
 
While the database/table(s) can hold a LOT of data, many other issues will affect actual performance. Memory is certainly a big one: the engine often has to assemble a large composite recordset before the limiting clauses (WHERE et al.) eliminate the records that do not meet the criteria. So, in certain instances, while the final result may be modest, the 'superset' will not fit into the available resource pool (memory).
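One way to keep the client side of that equation bounded is to let the engine apply the WHERE clause and then fetch the result in batches rather than all at once. A small sketch using Python's sqlite3 as a stand-in for the real ODBC/DAO connection (table and column names are made up for illustration):

```python
import sqlite3

# In-memory stand-in for the big Access table; the real app would go
# through ODBC/DAO, but the fetch pattern is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE perf (id INTEGER, region TEXT, value REAL)")
conn.executemany(
    "INSERT INTO perf VALUES (?, ?, ?)",
    [(i, "US" if i % 2 else "EU", i * 0.5) for i in range(10_000)],
)

# Push the WHERE clause to the engine, then pull rows in fixed-size
# batches so the client never materializes the full recordset at once.
cur = conn.execute("SELECT id, value FROM perf WHERE region = ?", ("US",))
total = 0
while True:
    batch = cur.fetchmany(500)  # bounded memory per round trip
    if not batch:
        break
    total += len(batch)
```

The same idea applies in VB with a forward-only recordset: process and discard each batch instead of loading the whole table into the report at once.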

Another real-world problem is often the network setup. Most 'office' networks are set up to accommodate 'normal' office traffic, such as e-mail and a common pool of documents. This often does NOT favor the transmission of datasets over the same network, and may cause time-out errors or REALLY poor performance. An approach to minimizing the impact in these situations is the classic "stored procedure" on the server. This places much of the burden of processing on the server machine and reduces the amount of data transferred over the network.
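The payoff of pushing work to the server can be seen even without a stored procedure: ask the engine for the summary instead of shipping raw rows to the client. A sketch, again using sqlite3 purely as a stand-in for the server (schema and values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE perf (region TEXT, value REAL)")
conn.executemany(
    "INSERT INTO perf VALUES (?, ?)",
    [("US", 1.0)] * 1000 + [("EU", 2.0)] * 1000,
)

# Server-side aggregation: 2,000 rows stay on the server; only two
# summary rows cross the "wire" to the reporting client.
summary = conn.execute(
    "SELECT region, SUM(value) FROM perf GROUP BY region ORDER BY region"
).fetchall()
```

On SQL Server the same GROUP BY would live inside a stored procedure the VB app calls, so the million-row table never travels across the office network at all.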



MichaelRed
mred@duvallgroup.com
There is never time to do it right but there is always time to do it over
 