Arokh Posted July 24, 2008

I have about 6,200 small files (2-10 KB each) in a directory. My program might use any of them, but in most cases a given file will only be used once. Having 6,200 files in one directory looks bad, so I'm looking for another way. One solution I thought of was creating a single file that acts as an archive and contains all the files. I can't use zip compression because when I do need one of the files, I'd like to get it as fast as possible without decompressing the whole archive first. Is there an easy way to pull this off, or do I have to write something like that myself?
snarfblam Posted July 24, 2008

Do you need to compress the files? If not, it is as simple as writing all of the file data into a FileStream, along with a small amount of header data used to find the offset of each embedded file.
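A minimal sketch of that approach in C#, under assumptions not spelled out in the post: the raw file bytes are written first, then an index of (name, offset, length) records, with the index's own offset stored in the final 8 bytes so a reader can seek straight to it. The names PackFile, Create, and Extract are illustrative, not from the thread.

[code]
using System.Collections.Generic;
using System.IO;

static class PackFile
{
    private struct Entry
    {
        public string Name;
        public long Offset;
        public int Length;
    }

    // Write each file's raw bytes first, then the index,
    // then the index's own offset as the last 8 bytes.
    public static void Create(string packPath, string[] sourceFiles)
    {
        using (var writer = new BinaryWriter(File.Create(packPath)))
        {
            var entries = new List<Entry>();
            foreach (string path in sourceFiles)
            {
                byte[] data = File.ReadAllBytes(path);
                entries.Add(new Entry
                {
                    Name = Path.GetFileName(path),
                    Offset = writer.BaseStream.Position,
                    Length = data.Length
                });
                writer.Write(data);
            }

            long indexOffset = writer.BaseStream.Position;
            writer.Write(entries.Count);
            foreach (Entry e in entries)
            {
                writer.Write(e.Name);    // length-prefixed string
                writer.Write(e.Offset);
                writer.Write(e.Length);
            }
            writer.Write(indexOffset);
        }
    }

    // Seek to the index, find the entry, and read only its bytes;
    // the rest of the archive is never touched.
    public static byte[] Extract(string packPath, string name)
    {
        using (var reader = new BinaryReader(File.OpenRead(packPath)))
        {
            reader.BaseStream.Seek(-8, SeekOrigin.End);
            long indexOffset = reader.ReadInt64();
            reader.BaseStream.Seek(indexOffset, SeekOrigin.Begin);

            int count = reader.ReadInt32();
            for (int i = 0; i < count; i++)
            {
                string entryName = reader.ReadString();
                long offset = reader.ReadInt64();
                int length = reader.ReadInt32();
                if (entryName == name)
                {
                    reader.BaseStream.Seek(offset, SeekOrigin.Begin);
                    return reader.ReadBytes(length);
                }
            }
            return null;   // not found
        }
    }
}
[/code]

Putting the index at the end means the data can be streamed out without knowing entry sizes up front; a reader seeks to the index, then seeks directly to the one entry it needs.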
PlausiblyDamp Posted July 28, 2008

You could use a zip file but specify zero compression; that way there is no performance hit from compression/decompression. Also, since zip compresses each file individually, you would not need to decompress 6,000+ files to get at one near the end of the archive.
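A sketch of a stored (uncompressed) zip in C#, for illustration. Note this uses System.IO.Compression's ZipArchive, which shipped in .NET 4.5, well after this thread; code of that era would have needed a third-party library such as SharpZipLib. The file and directory names are made up.

[code]
using System.IO;
using System.IO.Compression;

class StoredZipExample
{
    static void Main()
    {
        // Pack every file in a directory with no compression applied.
        using (var zip = ZipFile.Open("data.pack", ZipArchiveMode.Create))
        {
            foreach (string path in Directory.GetFiles("smallfiles"))
            {
                zip.CreateEntryFromFile(path, Path.GetFileName(path),
                                        CompressionLevel.NoCompression);
            }
        }

        // Pull a single entry back out; only that entry's bytes are read.
        using (var zip = ZipFile.OpenRead("data.pack"))
        {
            ZipArchiveEntry entry = zip.GetEntry("item0042.dat");
            using (Stream source = entry.Open())
            using (FileStream target = File.Create("item0042.dat"))
            {
                source.CopyTo(target);
            }
        }
    }
}
[/code]

Because each entry is stored independently, extracting one file is a seek plus a straight copy, and a normal zip tool can still open the archive for inspection.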
Arokh (author) Posted July 28, 2008

I didn't know you could do that with zip; interesting. I've already done it the way marble_eater suggested, but I guess I'll switch to a zip archive sometime in the future.