Hi friends,

I'm having a problem writing a file and then reading the same file back in sequence. The file size is 4 GB. I tried fstream for both operations, and as a second option ofstream and ifstream. I only need to read the first 2 lines of the file. If I do the read in a separate program, it works fine. I think the file is so big that when I try to access it, the data isn't on the HD yet.

Can anyone give some suggestions?

Thanks a lot!

You're going to need a 64-bit version of the file I/O functions. For MS-Windows you can use the ReadFile() Win32 API function. I don't know about *nix. I know ReadFile() isn't as convenient as fstreams, but AFAIK the only way to get a 64-bit version of fstream is to use a 64-bit compiler. ReadFile() gives you 64-bit options.
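
For what it's worth, here's a minimal sketch of that approach: the 64-bit part comes from seeking with SetFilePointerEx() before calling ReadFile(). The file name and offset are made up for illustration, and error handling is kept to a minimum.

#include <windows.h>
#include <iostream>

int main()
{
    // Hypothetical file name for illustration only.
    HANDLE hFile = CreateFileA("bigfile.dat", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        std::cerr << "CreateFileA failed: " << GetLastError() << '\n';
        return 1;
    }

    // Seek past the 4 GB mark using a 64-bit offset.
    LARGE_INTEGER offset;
    offset.QuadPart = 4LL * 1024 * 1024 * 1024;  // 4 GB
    if (!SetFilePointerEx(hFile, offset, NULL, FILE_BEGIN))
    {
        std::cerr << "SetFilePointerEx failed: " << GetLastError() << '\n';
        CloseHandle(hFile);
        return 1;
    }

    char buffer[4096];
    DWORD bytesRead = 0;
    if (ReadFile(hFile, buffer, sizeof(buffer), &bytesRead, NULL))
        std::cout << "Read " << bytesRead << " bytes past the 4 GB mark\n";

    CloseHandle(hFile);
    return 0;
}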

You would probably be better off using one of Boost's file I/O libraries. I'm not familiar enough with it to give you more information; you'll just need to read the documentation at the link I gave you.

Actually, fstream on linux (32-bit or 64-bit) should not have a problem with a 4GB+ file. I process them all the time on a 32-bit Ubuntu 9.04 system on my laptop. In any case, seeing the source code for the write+read operations would be helpful.

I also don't think fstream will have a problem with 4 GB+ files; sadsdw's problem is more likely due to memory usage (speaking from experience).

If pseudorandom21 is correct, perhaps random access to the file data would be an alternative to reading the entire file into memory at once.
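
Something like this sketch, using seekg() to jump straight to an offset instead of reading everything before it. The file name and offset are made up; whether offsets past 4 GB work on a 32-bit build depends on the compiler's large-file support.

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    // Hypothetical file name; adjust to the real one.
    std::ifstream in("bigfile.txt", std::ios::binary);
    if (!in)
    {
        std::cerr << "could not open file\n";
        return 1;
    }

    // Jump straight to a known offset instead of reading the whole file.
    // std::streamoff is 64-bit on most modern implementations, but check
    // your compiler's documentation for offsets beyond 4 GB.
    std::streamoff offset = 1000000;
    in.seekg(offset, std::ios::beg);

    std::string line;
    if (std::getline(in, line))
        std::cout << "line at offset " << offset << ": " << line << '\n';

    return 0;
}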

I think the file is so big that when I try to access it, the data isn't on the HD yet.

You might have to call ostream's flush() method to force the OS to write the data to the file.
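
Something along these lines (a trimmed-down sketch with a made-up file name, writing only a couple of lines instead of 4 GB):

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    // Hypothetical file name; the real program writes ~4 GB of data here.
    std::ofstream out("data.txt");
    out << "first line\n";
    out << "second line\n";
    out.flush();   // push buffered data out to the OS before reading it back
    // out.close(); // closing the stream also flushes it

    // Now the first two lines can be read back reliably.
    std::ifstream in("data.txt");
    std::string line;
    for (int i = 0; i < 2 && std::getline(in, line); ++i)
        std::cout << line << '\n';

    return 0;
}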

Thanks Ancient Dragon, Clinton Portis, pseudorandom21 and rubberman. It worked well on Linux using fstream with flush(). Thanks a lot!
