
File size limitation??

Status
Not open for further replies.

jabhawk

Programmer
Apr 23, 2002
71
US
I may have a problem when creating a text file opened with FOPEN and/or FCREATE. Our processing has reached the point where the file size exceeds 2.1 GB; the process suddenly locks up and we see huge memory use.

Are there any file size limitations on the following platform:
- Win2000, NTFS, 50 GB IDE drive, 256 MB RAM, PIII
- VFP 6.0

Open tables:
1 - 986 MB
2 - 986 MB
3 - 986 MB
4 - One low-level file, unbuffered; we append to the file if it exists.

Process: Sequentially grab one record from each table, combine them, and write them out as a formatted CSV line; repeat. At the end of the tables, close the first three, open another three, and repeat; three times in all.

At 2.146 GB the process's CPU use suddenly drops, memory use maxes out, and VFP stops responding.

Any thoughts?
Jon B

Jonthan A Black
 

As far as I know, the limit for any one file (whether the table or the index) is 2.1 GB.

Mike Gagnon

If you want to get the best response to a question, please check out FAQ184-2483 first.
 
You are going to need to split the outgoing CSV file into more than one piece if it is going to exceed 2 GB.

FoxPro has a file size limit of 2 GB on any file it deals with.
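A minimal sketch of that split approach, staying safely under the 2 GB ceiling by checking the file size before each write. The cursor name, the BuildCsvLine() routine, and the output file names are all hypothetical placeholders for your own code:

```foxpro
* Split the CSV output into numbered parts so no single file
* approaches FoxPro's 2 GB low-level-file limit.
#DEFINE MAX_PART_SIZE 2000000000  && stay well below 2^31 - 1 bytes

LOCAL lnPart, lnHandle, lcLine
lnPart = 1
lnHandle = FCREATE("export_" + PADL(lnPart, 3, "0") + ".csv")

SELECT myTable  && hypothetical source cursor
SCAN
    lcLine = BuildCsvLine()  && hypothetical routine that combines the tables
    * FSEEK to end-of-file returns the current size in bytes
    IF FSEEK(lnHandle, 0, 2) + LEN(lcLine) > MAX_PART_SIZE
        = FCLOSE(lnHandle)
        lnPart = lnPart + 1
        lnHandle = FCREATE("export_" + PADL(lnPart, 3, "0") + ".csv")
    ENDIF
    = FPUTS(lnHandle, lcLine)
ENDSCAN
= FCLOSE(lnHandle)
```

The FSEEK() call doubles as a size check here; if that extra seek per record is too slow, you could instead keep a running byte count in a variable and only reset it when you roll to a new part.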

 
The low-level file functions have a 2 GB file limit too. Depending on whether you really need one file versus the exact data you have stored... perhaps there's a way to transform the data as you're writing it out to make it smaller. Are there leading '0's or empty spaces taking up room but adding no information?

Brian
 
Thanks for the confirmations. We broke the export into separate modules to keep the file size under 2 GB, and it is working. Our underlying problem is that we are dealing with a "huge" statistical data source that we are breaking into smaller elements.

The final target is MySQL; I am thinking we may be able to update the MySQL database directly and bypass the CSV file. The process steps are still fuzzy.
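Bypassing the CSV stage could be done with VFP's SQL pass-through functions over ODBC. A hedged sketch, assuming a MySQL ODBC driver is installed; the driver name, server, credentials, table, and column names are all illustrative, not tested:

```foxpro
* Hypothetical sketch: push rows straight into MySQL via SQL pass-through,
* skipping the intermediate CSV file entirely.
LOCAL lnConn
lnConn = SQLSTRINGCONNECT("Driver={MySQL ODBC 3.51 Driver};" + ;
    "Server=myserver;Database=stats;Uid=user;Pwd=secret;")
IF lnConn > 0
    SELECT myTable  && hypothetical combined source cursor
    SCAN
        * ?name parameters are substituted from VFP variables/fields
        = SQLEXEC(lnConn, "INSERT INTO stats_table (col1, col2) " + ;
            "VALUES (?myTable.field1, ?myTable.field2)")
    ENDSCAN
    = SQLDISCONNECT(lnConn)
ENDIF
```

One INSERT per row will be slow over ODBC; batching several rows per SQLEXEC() call, or wrapping blocks of inserts in a transaction on the MySQL side, would likely be worth testing before committing to this route.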

JB

Jonthan A Black
 
