
Is there a limit of 13107 items per folder?


loblo


As the title says.

I've been downloading massive directories of HTML files with Webreaper, and it turned out it wouldn't download more than 13107 items per folder. I also can't add anything to those folders with Explorer; I get a popup error message saying the directory or file cannot be created. I can create directories and files elsewhere on that volume, and there is ample free space.

The only Google hit I found that might be related is this one, but it doesn't help:

http://www.mombu.com/microsoft/comp-databases-paradox/t-2-char-field-limit-at-13107-records-122312.html

Any ideas?




A folder can only hold 65,536 directory entries. Long file names take up two or more entries each, depending on their length. Apparently your HTML filenames are taking up an average of 5 directory entries each, and 65,536 / 5 ≈ 13107.


The LFN for each of those files is 41+4, so that's just about 5 times the length of an 8+3 filename, which I guess explains the phenomenon. Thanks.

Not quite.

Long File Name entries hold up to 13 Unicode characters each. Your LFNs are 46 characters (the dot is included), so they require 4 LFN entries. The Short File Name uses one more, for a total of 5 entries per name.
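
To make that arithmetic concrete, here is a minimal sketch (Python purely for illustration; the 46-character name below is a hypothetical stand-in for the OP's filenames):

    import math

    MAX_DIR_ENTRIES = 65536             # FAT cap: 65,536 directory entries of 32 bytes

    def entries_per_file(name):
        # one 8.3 short entry, plus one LFN entry per 13 UTF-16 characters of long name
        return 1 + math.ceil(len(name) / 13)

    name = "x" * 41 + ".html"           # hypothetical 46-character name (41 + dot + 4)
    per_file = entries_per_file(name)
    print(per_file)                     # 5  (4 LFN entries + 1 short entry)
    print(MAX_DIR_ENTRIES // per_file)  # 13107 files max in one folder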


65,536 entries × 32 bytes per entry = 2,097,152 bytes, so the folder itself maxes out at 2 MB.

Is this limitation imposed by IO.SYS, or does it come from the Win9x subsystem?

Would this limitation be patchable, to allow a larger number of files per folder?



According to the Microsoft Extensible Firmware Initiative FAT32 File System Specification (Microsoft Corporation, Version 1.03, December 6, 2000, pages 33-34), aka FATGEN103:

Similarly, a FAT file system driver must not allow a directory (a file that is actually a container for other files) to be larger than 65,536 * 32 (2,097,152) bytes.

NOTE: This limit does not apply to the number of files in the directory. This limit is on the size of the directory itself and has nothing to do with the content of the directory. There are two reasons for this limit:

1. Because FAT directories are not sorted or indexed, it is a bad idea to create huge directories; otherwise, operations like creating a new entry (which requires every allocated directory entry to be checked to verify that the name doesn’t already exist in the directory) become very slow.

2. There are many FAT file system drivers and disk utilities, including Microsoft’s, that expect to be able to count the entries in a directory using a 16-bit WORD variable. For this reason, directories cannot have more than 16-bits worth of entries.

Of course, as we have seen in this thread, the statement: "This limit does not apply to the number of files in the directory. This limit is on the size of the directory itself and has nothing to do with the content of the directory." is misleading. But the rest of the quoted text makes a lot of sense, IMO.
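
As an aside, reason 2 in the quoted text is easy to picture: a 16-bit WORD simply cannot count past 65,535. A tiny illustration (Python's ctypes standing in for the C type; this is just a demonstration, not anything from the spec itself):

    import ctypes

    count = ctypes.c_uint16(65535)  # largest value a 16-bit WORD can hold
    count.value += 1                # one more directory entry to count...
    print(count.value)              # 0 -- the counter silently wraps around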

Would this limitation be patchable, to allow a larger number of files per folder?

I think so, but only systems bearing the patch would be able to find those surplus files, so it probably is not a good idea.


Probably the best visual representation of FAT, with references to Win98, is at http://www.beginningtoseethelight.org/fat16/index.php (also mentioned as a link on the File Allocation Table Wikipedia page). Originally published around March 2002, the information is believed to be still correct. (Webpage author: NullAck.)

-------

In relation to the problem of running out of filename entries in a single folder, due to too many filenames from a whole downloaded website (although the topic originator doesn't explicitly say so, since the 46-character LFN filenames are all the same length it seems he may be taking timed snapshots of one particular website), parcelling the downloaded website into a number of folders is called for. E.g. not

C:\DnLd\WebsiteFolder\ [the whole website in a single folder]

but rather

C:\DnLd\WebsiteFolderA001\ [some of the website, not more than 13107 files]

C:\DnLd\WebsiteFolderA002\ [more of the website, likewise]

C:\DnLd\WebsiteFolderA003\ [yet more of the website, likewise]

and so on, thereby avoiding the poster's current problem of exceeding the maximum possible number of filename entries within a single folder on W98 (13107 files being the ceiling here only because of the originator's specific filename length). A scripted sketch of this parcelling follows below.
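
A rough sketch of that parcelling, done after the fact (Python purely as illustration; the source path is hypothetical, and the entry budget reuses the arithmetic worked out earlier in the thread):

    import math
    import os
    import shutil

    SRC = r"C:\DnLd\WebsiteFolder"      # hypothetical overfull folder
    BUDGET = 65536 - 2                  # leave room for the "." and ".." entries

    def entries_for(name):
        # one 8.3 short entry plus one LFN entry per 13 characters of long name
        return 1 + math.ceil(len(name) / 13)

    used, part, dest = BUDGET, 0, None  # force a new folder on the first file
    for name in sorted(os.listdir(SRC)):
        need = entries_for(name)
        if used + need > BUDGET:
            part += 1
            dest = r"C:\DnLd\WebsiteFolderA%03d" % part
            os.makedirs(dest, exist_ok=True)
            used = 0
        shutil.move(os.path.join(SRC, name), os.path.join(dest, name))
        used += need

Note this only relocates the files; as a reply further down points out, the pages' hyperlinks to one another would still need rewriting, which is the real sticking point.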


On the subject, doesn't Microsoft use a namespace extension to mush together the many actual folders that compose the 'Temporary Internet Files' virtual directory?

If that is the case, it could probably be hacked to permit any folder to do the same kind of thing once it reaches filename capacity.


I don't know.

But there shouldn't be any need to dive so deep into the OS workings. Later versions of Windows PowerShell can count the files in a designated folder, trigger an event when a certain count is reached, and create folders and redirect output into a newly created folder. It therefore seems likely to me that its earlier counterpart available on W98 (WSH) could be arranged to do something similar well BEFORE the maximum is reached, if you already know roughly what the maximum number of files per folder is going to be for your specific downloads, given the paths and filename lengths outlined above; a sketch of the rollover idea follows below.
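
For instance, the rollover can live in whatever script drives the download. A sketch (Python again purely for illustration, not WSH; the 13000 threshold is an arbitrary safety margin under the 13107 figure computed above, and all paths are hypothetical):

    import os

    class RollingFolder:
        # Hands out destination paths, starting a fresh folder
        # before the per-folder file budget is exhausted.
        def __init__(self, base, budget=13000):  # margin under the 13107 ceiling
            self.base, self.budget = base, budget
            self.part, self.count = 0, budget    # force a new folder on first use
            self.folder = None

        def path_for(self, filename):
            if self.count >= self.budget:
                self.part += 1
                self.count = 0
                self.folder = "%sA%03d" % (self.base, self.part)
                os.makedirs(self.folder, exist_ok=True)
            self.count += 1
            return os.path.join(self.folder, filename)

    # usage sketch:
    # dest = RollingFolder(r"C:\DnLd\WebsiteFolder")
    # with open(dest.path_for("page00001.html"), "w") as f:
    #     f.write(html)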



Splitting the download across folders won't work for me: the online folders contain more files than I can accommodate, and I haven't found any website-downloader application with an option to spread files from a single folder into multiple ones. Doing it by hand obviously isn't an option, especially since the files need to remain properly hyperlinked to each other. I guess the only option that might work is to download and keep those files on an NTFS partition (I have the Paragon NTFS filesystem driver installed).

