
NT maximum files in a folder

Status
Not open for further replies.

stephentbell (IS-IT--Management) · Joined Aug 9, 2001 · Messages: 1 · Location: US
Hi.

We have several folders on an NT4/SP6 server (NTFS) that have more than 150,000 files in them. The drive is a RAID 5 array totaling 585 GB (all in one NT partition), of which we're using about 238 GB (yes, GB). Each file is about 15 KB, so each folder is 2-3 GB.

When browsing, it takes 4-5 minutes to open each folder. I assume this is because it takes NT that long to enumerate the files in the directory. (We even have a Windows 95 workstation that hangs when trying to open one of these folders.)

What can I do to improve this? Do I need to split the files out into more subfolders? Do I need to make the array smaller? Do I need to partition the drive into smaller partitions? Increase the cache on the array controller (Adaptec AAA-13xU2)? Sell my soul to the devil?

I can't find any info about this on the Internet. Does anyone know?
 
The sub-folder idea is the way to go. I don't know of any official statistics, but I personally don't like to exceed 1000 files or folders per folder.
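To make the sub-folder idea concrete: the usual trick is to pick each file's sub-folder from a hash of its name, so files spread evenly and any file can be found again without scanning. The thread is about NT4, where you'd do this in a batch script or C program; the sketch below is just a portable Python illustration of the bucketing scheme (the two-hex-digit prefix giving 256 buckets is an assumption, not anything from the thread).

```python
import hashlib
import os
import shutil

def bucket_for(filename):
    """Pick a stable sub-folder name for a file by hashing its name.

    The first two hex digits of the hash give 256 possible buckets,
    so 150,000 files land at roughly 600 per folder.
    """
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    return digest[:2]

def spread_into_subfolders(src_dir):
    """Move every file in src_dir into a hash-named sub-folder of src_dir."""
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue  # skip sub-folders we may already have created
        sub = os.path.join(src_dir, bucket_for(name))
        os.makedirs(sub, exist_ok=True)
        shutil.move(path, os.path.join(sub, name))
```

Because the bucket is computed from the name alone, an application can later open `src_dir/bucket_for(name)/name` directly instead of enumerating the whole tree.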
 
If you know programming, you can also write a simple program using the FindFirstFile/FindNextFile system calls with a wildcard search (*.*) - that way you can measure the real time the OS needs to get the file info. My guess is that it's slow because each call copies the attribute info for these files into a structure in memory, and Windows Explorer most likely uses the same calls.
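The measurement the poster describes - walk the directory once and time it - can be sketched without Win32. The Python below uses `os.scandir`, which wraps the same kind of per-entry enumeration (on Windows it is implemented on top of FindFirstFile/FindNextFile); the function name is my own, not from the thread.

```python
import os
import time

def time_enumeration(path):
    """Enumerate every entry in a directory, returning (count, seconds).

    A long elapsed time with a large count suggests the slowness is in
    directory enumeration itself, as the poster suspects.
    """
    start = time.perf_counter()
    with os.scandir(path) as entries:
        count = sum(1 for _ in entries)
    return count, time.perf_counter() - start
```

Running this against one of the 150,000-file folders would separate enumeration cost from anything Explorer adds on top (icons, sorting, column data).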
 
