/* A file can be large, so reserve a fixed maximum of 64 MB. */
int nMaxByteCount = 64000000;
I read files to extract the binary data in them, but I don't know their exact sizes before reading. So I allocate nMaxByteCount bytes up front for large files and use fread's return value to get the exact size:
*nFileSize = fread(caFileContent,sizeof(char),nMaxByteCount,DataFile);
Then I use this *nFileSize, which holds the exact number of bytes read, to fill in my data structures. When I create a char array or a struct containing these variables, I usually get segmentation faults after reading files larger than about 10 MB. I suspect the amount of memory the system grants my program is not large enough for my files and further processing. How much memory (bytes, KB, or MB) is a Linux C program permitted to use on a 64-bit processor? Can I change that limit? Or is the problem something else entirely?