Thread: Filesystem type and file size limit

  1. #1
    Registered User
    Join Date
    Dec 2009

    Filesystem type and file size limit

    Hi, I've searched about this topic A LOT, but I can't find anything OS-independent, so I'm asking here.

    First a little context:

    I have a digital video recorder that splits its recording into a new file every 935 MB. It splits them in a special way, so simply concatenating the files won't work; I've written a program to join them for me, and so far it works fine.

    Here comes the problem: the resulting file will sometimes be larger than 4 GB, so I would like to know whether the filesystem supports a file that big before attempting to write it. I've thought of two ways of doing that:

    1.- Detect the disk's filesystem and hard-code the size limits for the most common filesystems into the program.
    2.- Query the filesystem's file size limit directly.

    I would prefer option #2, but I would be happy even just knowing whether the filesystem is a FAT variant, as all the others will most likely support larger files than I need.
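    On POSIX systems, option #2 can be approximated with pathconf() and _PC_FILESIZEBITS, which reports the minimum number of bits needed to represent the maximum file size on the filesystem holding a given path (e.g. 32 on FAT). A sketch, assuming a POSIX platform; the function name filesize_bits is just for illustration:

    ```c
    #include <stdio.h>
    #include <unistd.h>

    /* Return the number of bits a file size can use on the filesystem
     * holding `path`, or -1 if the system can't (or won't) say. */
    long filesize_bits(const char *path)
    {
        /* _PC_FILESIZEBITS: e.g. 32 on FAT, 64 on ext3/4 or NTFS.
         * A return of -1 with errno unchanged means "no explicit limit". */
        return pathconf(path, _PC_FILESIZEBITS);
    }

    int main(void)
    {
        long bits = filesize_bits(".");
        if (bits < 0)
            printf("limit unknown; try writing and check for errors\n");
        else if (bits <= 32)
            printf("filesystem cannot hold files over 4 GB\n");
        else
            printf("filesystem supports files over 4 GB (%ld-bit sizes)\n", bits);
        return 0;
    }
    ```

    Note that a -1 return is ambiguous (it can mean either "no limit" or "unsupported"), so you'd still want to handle write errors at runtime.
    
    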

    Thanks in advance

  2. #2
    Registered User
    Join Date
    Sep 2006
    I would say just check the documentation of the file system you want to use, on the OS you want to use.

    I would stick with 64-bit OSes only. Forget the 32-bit ones.

    To improve matters, I'd defrag the HD and get the most contiguous free space possible before you put the big file onto it. That will prevent massive fragmentation (and possible loss of data) later on. Also, make backups on DVDs, and use only the very best discs for this. Verbatim is a highly recommended brand; some disc media is just crap.

    If you want to split big files later on (say, for DVDs), I recommend HJ-Split. It's free and very good at splitting large files and joining them back together; any file depot should have it.

    If you examine the properties of the HD, you'll know what file system it's set up for.
    Last edited by Adak; 12-22-2009 at 11:03 AM.

  3. #3
    int x = *((int *) NULL); Cactus_Hugger's Avatar
    Join Date
    Jul 2003
    Banks of the River Styx
    There is no portable option. There are OS-specific calls; however, I do not know offhand what they are.

    This is going to depend on where in the filesystem you are writing (as different spots may be on different media with different filesystems). You may also run into quota limits -- one system I work on has plenty of space, and the filesystem can support huge files, but I'm only able to write ~100MB.

    FAT filesystems are becoming rarer and rarer, with the probable exception of thumb drives. ext2/3 and NTFS both support large files.
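    On Linux specifically, one non-portable way to spot a FAT volume is statfs(2) and its f_type magic number (vfat reports the same MSDOS_SUPER_MAGIC as plain msdos). A Linux-only sketch; the helper name is_fat is just for illustration:

    ```c
    #include <stdio.h>
    #include <sys/vfs.h>      /* statfs() and struct statfs */
    #include <linux/magic.h>  /* MSDOS_SUPER_MAGIC (0x4d44) */

    /* Returns 1 if `path` lives on a FAT (msdos/vfat) filesystem,
     * 0 if not, -1 on error. Linux-only; exFAT has its own magic. */
    int is_fat(const char *path)
    {
        struct statfs s;
        if (statfs(path, &s) != 0)
            return -1;
        return s.f_type == MSDOS_SUPER_MAGIC;
    }

    int main(void)
    {
        int r = is_fat(".");
        if (r < 0)
            perror("statfs");
        else
            printf("current directory is %son FAT\n", r ? "" : "not ");
        return 0;
    }
    ```

    On Windows the analogous call would be GetVolumeInformation(), which returns the filesystem's name ("NTFS", "FAT32", ...) as a string.
    
    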

    Quote Originally Posted by Adak View Post
    I would stick with 64-bit OSes only. Forget the 32-bit ones.
    A 32-bit vs 64-bit OS has absolutely nothing to do with large file support.
    long time; /* know C? */
    Unprecedented performance: Nothing ever ran this slow before.
    Any sufficiently advanced bug is indistinguishable from a feature.
    Real Programmers confuse Halloween and Christmas, because dec 25 == oct 31.
    The best way to accelerate an IBM is at 9.8 m/s/s.
    recursion (re - cur' - zhun) n. 1. (see recursion)

  4. #4
    Registered User
    Join Date
    Sep 2006
    64-bit OSes support bigger disk sizes, and larger disks can have larger sectors. That's what you want for holding huge files.

  5. #5
    Join Date
    Oct 2007
    Inside my computer
    ...I don't think so. Even x86 Windows can support terabytes of disk space.
    Memory is another matter.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  6. #6
    Registered User
    Join Date
    Sep 2004
    Keep in mind that this is more than a filesystem issue. For instance, older Linux versions have a 2 GB file size limit no matter what filesystem you are using. This was due to the way inodes were structured in the VFS layer. Also, the user would need a glibc version > 2.2 to get large file support. Lastly, some filesystems come in different versions with different file size limitations.
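    The glibc large-file support mentioned above is normally switched on by defining _FILE_OFFSET_BITS=64 before any include (or compiling with -D_FILE_OFFSET_BITS=64), which makes off_t and the fseeko()/ftello() family 64-bit even on 32-bit platforms. A minimal check, assuming glibc:

    ```c
    #define _FILE_OFFSET_BITS 64  /* must come before any #include */
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        /* With LFS enabled, off_t is 64-bit, so fseeko()/ftello()
         * can address offsets past the old 2 GB boundary. */
        printf("off_t is %zu bits\n", sizeof(off_t) * 8);
        return 0;
    }
    ```
    
    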

    The best way to determine the limit is to just write out the file and, if you get an error, display a message to the user. Going through all the work to determine this beforehand is not worth it.
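    That "just try it" approach amounts to checking every write and reporting EFBIG (file too large) to the user; on POSIX, stdio sets errno when the underlying write fails. A sketch, assuming the joined output is written with stdio; the helper name write_chunk is just for illustration:

    ```c
    #include <errno.h>
    #include <stdio.h>
    #include <string.h>

    /* Append `len` bytes to `out`; returns 0 on success, -1 on failure
     * after printing a message for the user. */
    int write_chunk(FILE *out, const void *buf, size_t len)
    {
        if (fwrite(buf, 1, len, out) != len) {
            if (errno == EFBIG)
                fprintf(stderr, "output file hit the filesystem's size limit\n");
            else
                fprintf(stderr, "write failed: %s\n", strerror(errno));
            return -1;
        }
        return 0;
    }

    int main(void)
    {
        FILE *out = tmpfile();
        if (!out)
            return 1;
        const char data[] = "recorded video bytes";
        int rc = write_chunk(out, data, sizeof data - 1);
        fclose(out);
        return rc == 0 ? 0 : 1;
    }
    ```

    Remember to check fclose() as well, since buffered data is flushed there and can fail in the same way.
    
    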
    bit∙hub [bit-huhb] n. A source and destination for information.

  7. #7
    Registered User
    Join Date
    Dec 2009
    There's lots of info about big files and the C libraries here: Large File Support in Linux

    But that's not what I wanted.

    Well, if there's no portable option, I'll either use a system call or leave it as it is.

    Thanks anyway.
