
File count in directory when over 65,000 files

Status
Not open for further replies.

jeisner · Programmer · Jul 23, 2001 · 26 · AU
I want to get a count of files in a directory, which I could get very easily with ADIR(); however, there could occasionally be more than 65,000 files, and that will crash VFP because it exceeds the maximum array size.

Is there another method of getting a file count?
 
I believe you have reached the VFP limit for the number of elements in an array. From the help file:

Variables and Arrays

Default # of variables 1,024
Maximum # of variables 65,000
Maximum # of arrays 65,000
Maximum # of elements per array 65,000
 
Try:

! dir > tdir.txt
CREATE TABLE tdir (text C(40))
APPEND FROM tdir.txt TYPE SDF
c_files = RECCOUNT() - 10   && allow for the DIR header/footer lines

(not tested)
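The same line-counting idea can be sketched in Python, purely as an illustration: stream the captured listing line by line so no array limit ever applies. The `overhead` of 10 header/footer lines is an assumption taken from the snippet above; check it against real DIR output.

```python
import os
import tempfile

def count_listing_lines(listing_path, overhead=10):
    """Count the entries in a captured directory listing (e.g. `dir > tdir.txt`)
    by streaming it line by line, so the total never has to fit in an array.
    `overhead` is the assumed number of DIR header/footer lines; it varies
    by OS version, so verify it against a real listing."""
    with open(listing_path, "r", errors="replace") as fh:
        total = sum(1 for _ in fh)
    return max(total - overhead, 0)

# Demo with a fake listing: 9 header lines, 3 file lines, 1 footer line.
with tempfile.TemporaryDirectory() as tmp:
    listing = os.path.join(tmp, "tdir.txt")
    with open(listing, "w") as fh:
        for line in ["header"] * 9 + ["FILE1.TXT", "FILE2.TXT", "FILE3.TXT", "footer"]:
            fh.write(line + "\n")
    print(count_listing_lines(listing))  # → 3
```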
 
If you just need the count, how about using WSH (Windows Script Host)?

oFSO = CREATEOBJECT('Scripting.FileSystemObject')
? oFSO.GetFolder("c:\").Files.Count   && count files without building an array
oFSO = .NULL.   && release the COM object

Rick
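The FileSystemObject trick works because the count is computed as entries are enumerated, never materializing them in a VFP array. The same idea, sketched in Python with `os.scandir` purely for illustration:

```python
import os
import tempfile

def count_files(path):
    """Count ordinary files in `path` by streaming directory entries,
    so the total never has to fit into a fixed-size array."""
    total = 0
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file():
                total += 1
    return total

# Demo against a throwaway directory: 5 files plus 1 subdirectory.
with tempfile.TemporaryDirectory() as tmp:
    for i in range(5):
        open(os.path.join(tmp, f"file{i}.txt"), "w").close()
    os.mkdir(os.path.join(tmp, "subdir"))
    print(count_files(tmp))  # → 5 (the subdirectory is not counted)
```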
 
Tesar, I do like your idea, as it is very fast. I added two lines to it that leave only file names in tdir, since the subdirectory and size-summary lines in the DIR output do not begin with capital letters:

! dir > tdir.txt
CREATE TABLE tdir (text C(40))
APPEND FROM tdir.txt TYPE SDF

* and now remove everything that is not a file name
* (subdirectory and size lines do not start with capital letters):
DELETE ALL FOR !ISUPPER(SUBSTR(text, 1, 2))
PACK

* tdir should now contain file names only.

Regards from Germany

Klaus
 
This is another approach, which I tested with 700 files (it took 0.3 seconds). Please try it and let me know whether you had problems with more than 65,000 files, and how the performance was.

Regards from Germany
Klaus


*filecount.prg

* Shows how many files are in the current directory and stores
* the result in a string variable (myfilenumber).
* Should work for more than 65,000 files, because the listing
* is stored in a memo field rather than an array.

starttime = SECONDS()

CLOSE DATABASES
CREATE TABLE filecount (mymemo M)
APPEND BLANK
LIST FILES LIKE *.* TO myfiles.txt
APPEND MEMO mymemo FROM myfiles.txt OVERWRITE
CLEAR

myfilenumber = MLINE(mymemo, MEMLINES(mymemo) - 2)

? myfilenumber

? SECONDS() - starttime, " seconds performance"

&& myfilenumber is a text string whose exact wording depends on
&& the language version of the OS, e.g. "xxxxx bytes in yyy files",
&& where yyy is the file count you asked for and can be
&& extracted from the string.

*end of program
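The program above ends with a summary string like "xxxxx bytes in yyy files", from which the count still has to be extracted. A minimal Python sketch of that extraction step, assuming an English-style summary line (the wording varies with the OS language version, so the pattern is only an illustration):

```python
import re

def files_from_summary(line):
    """Extract the file count from a DIR-style summary line such as
    '123456 bytes in 789 files'. The wording varies with the OS
    language version, so this regex is only an illustration."""
    m = re.search(r"([\d.,]+)\s+file", line, re.IGNORECASE)
    if not m:
        return None
    # Strip thousands separators (both "," and ".") before converting.
    return int(m.group(1).replace(",", "").replace(".", ""))

print(files_from_summary("1.234.567 bytes in 65001 files"))  # → 65001
```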
 
This is slightly off your topic, but are you aware that having that many files in one directory is a huge performance hit on most operating systems? In fact, more than 2,000 files tends to be an issue. This was definitely true on NT4, Win98 and earlier; I have not tested it on 2000 or XP.
 
The directory is an archive directory on a dual-Xeon Windows 2000 server, on a separate 15,000 RPM NTFS HDD from the one that holds the database files. We keep track of the directory for the client and warn them when it hits 100,000 files; any more than that and it takes ages to back up to tape.
 
And Yes we know it is a huge performance hit having that many files in the directory :(
 