
Memory problem


lifter10 (MIS) · Dec 3, 2003 · 45 · US
I have another problem. I am running quite a few imports into my database, and the import files are quite large (75,000+ records). Each time I import one of these files, the database file grows by the size of the table I bring in and never releases the space from the deleted data. Is there any way to free up this space, or am I just going to have to write a procedure to compact the database every so often?

Thanks,

Chris
 
Chris,

The only alternative to regular compaction would be to keep the temporary tables permanently, delete the records from them, and then append the new records each time. You might help the processing speed a bit by adding appropriate indexes.

I did an "advanced search" here and came up with some threads. They may or may not be of use to you, but does illustrate how others deal with compaction.

Thread181-692164

Thread705-642227

Thread181-379233


HTH,
Bob [morning]
 
How about linking the text file, running an action-query to append the data into the main tables and then removing the text file link?

This avoids the use of temporary tables and should reduce the amount of database bloat.
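
Something along these lines (a rough sketch only; the file path, link name, and table names are all made up):

Sub ImportQuarterFile()
    'Link the quarter text file as a table (path and names are placeholders)
    DoCmd.TransferText acLinkDelim, , "tblQuarterLink", "C:\Imports\Quarter.txt", True
    'Append straight from the linked file into the main table
    CurrentDb.Execute "INSERT INTO tblMain SELECT * FROM tblQuarterLink", dbFailOnError
    'Drop the link (this removes only the link, not the text file itself)
    DoCmd.DeleteObject acTable, "tblQuarterLink"
End Sub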

Just a thought....

Ed Metcalfe.

Please do not feed the trolls.....
 
Thank you both for the tips. I am trying the linking method, but it seems to be giving me the same problem. Here is my process:

1. Link the current quarter text file.
2. Run a delete query on my main table.
3. Run an append query to add the records from the text file into my main table. (This is where the database gets bigger)

I have to have a "fresh" table because I am constantly running an update query against my main table. Any other tips?

Thanks,

Chris
 
Aha, I understand now.

So you are completely emptying and repopulating your main table on each import?

If this is the case then you will need to compact the database to keep it at a reasonably consistent size. The easiest approach is either to use the Compact on Close option (which I generally don't like), or to split the database into a frontend and a backend; then you can use an additional sub to compact the backend database at the end of each import job.

I don't have the code at home but I'll post it here tomorrow morning if you want it.

Ed Metcalfe.

Please do not feed the trolls.....
 
Just make the compacting part of the import process, something you do right at the end. If you're deleting records from tables, the only way to recover that space is to fill it with new records or to compact the database.

Alternatively, you can just leave the bloated MDB file as it is, uncompacted, as long as it's not harming anything.

--
Find common answers using Google Groups:

Corrupt MDBs FAQ
 
Here is the code to compact a database. As discussed, run this on the backend database from the frontend...

Ed Metcalfe.

Sub CompactDB()
On Error GoTo CompactDB_Err
    'The path must end with a backslash, e.g. "C:\Data\"
    Const conFilePath = "<put your directory path here>"

    'Compact the backend into a temporary file
    DBEngine.CompactDatabase conFilePath & "<put your filename here>", conFilePath & "<put temporary filename here>"

    'Remove any backup left over from the previous run
    If Dir(conFilePath & "<filename without extension>.bak") <> "" Then
        Kill conFilePath & "<filename without extension>.bak"
    End If

    'Keep the uncompacted file as a .bak backup, then move the compacted copy into its place
    Name conFilePath & "<put your filename here>" As conFilePath & "<filename without extension>.bak"
    Name conFilePath & "<put temporary filename here>" As conFilePath & "<put your filename here>"

Exit_CompactDB:
    Exit Sub

CompactDB_Err:
    MsgBox Err.Description
    Resume Exit_CompactDB

End Sub
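
Then just call it as the last step of your import, something like this (the query names are made up, and bear in mind that nobody can have the backend open while CompactDatabase runs):

Sub QuarterlyImport()
    'Run the saved delete and append queries (names are placeholders)
    CurrentDb.Execute "qryDeleteMain", dbFailOnError
    CurrentDb.Execute "qryAppendQuarter", dbFailOnError
    'Compact the backend to reclaim the space freed by the delete
    CompactDB
End Sub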

Please do not feed the trolls.....
 
Thanks Ed and everyone else. Looks like I'll have to include some code to take care of this problem. I appreciate the help.

Thanks,

Chris
 
Hey guys,

New to this website. I'm having an issue trying to catch blank cells. First I import a text file and populate a table. For a certain field, all cells that are null, blank, or empty must be changed to the character "-". I was using the IsNull function and ...![fieldname] = "" to catch these cells, but it was missing some. Then I noticed that several of the cells contain an invisible whitespace value (the tab character, I think). Any ideas on how to catch those cells?
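
I'm wondering if something like this would do it (a rough sketch, with made-up table and field names):

Sub FixBlankCells()
    Dim rs As DAO.Recordset
    'Open only the field to be cleaned (table/field names are placeholders)
    Set rs = CurrentDb.OpenRecordset("SELECT [fieldname] FROM tblImport", dbOpenDynaset)
    Do Until rs.EOF
        'Strip tabs and spaces first, so Null, "", and whitespace-only all test as blank
        If Trim(Replace(Nz(rs![fieldname], ""), vbTab, "")) = "" Then
            rs.Edit
            rs![fieldname] = "-"
            rs.Update
        End If
        rs.MoveNext
    Loop
    rs.Close
End Sub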

Thanks
 
If this is running in a business setting, you might also want to use some third-party software to protect your database. The prospect of losing a database of more than 75K records is somewhat frightening, and having to rely on server backup tapes, with their known failure rates, is not something I would want to do.

I have used the shareware program MDBTask and it works pretty well. You can set it to first create a zip backup of your MDB database (if I remember correctly, it keeps something like 15 incremental zip backups before it starts overwriting the oldest) and then compact/repair the database on a schedule. I used it to zip up several MDB files around 1:00 AM and then compact them individually beginning at 3:00 AM every day. That way, only if all of the zip backups fail do you have to go to the server people and hope they have a recent backup tape that is good.
 