I have a program that dynamically allocates an array and uses it. The thing is that, under certain situations, this array can grow arbitrarily long. As such, I would like to know if/when the array has gotten too long, and then start saving it to a file instead (where length will not be an issue). Here is an example:

unsigned char *myArray = new unsigned char[len];
if (len > MAX_ALLOWED_MEMORY)  // MAX_ALLOWED_MEMORY: whatever limit you choose
{
    FILE *myFile = fopen("myFile.dataFile", "wb+");
    for (int i = 0; i < len; ++i)
    {
        fprintf(myFile, "%c", myOtherData[i]);  // write each byte to the file
    }
    fclose(myFile);
}
else
{
    for (int i = 0; i < len; ++i)
    {
        myArray[i] = myOtherData[i];  // keep the data in memory
    }
}

I was thinking that maybe new[] returns NULL if there is not enough space or something?

Are you sure your implementation of new doesn't already provide virtual memory backed by disk space? Have you run into the issue of new returning NULL (as opposed to throwing bad_alloc)?

How can I check this, and how can I be sure that it will be platform-independent?

I'm pretty sure new throwing an exception on failure is platform-independent. Given that, you can catch the exception and do what you are suggesting (writing to a file). That said, I think you would be hard-pressed to find a system today that doesn't already support this transparently with virtual memory.
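Something like this, for instance (just a sketch; the size is made up and the fallback path is a stub you would fill in with your file-writing code):

#include <cstddef>
#include <iostream>
#include <new>      // std::bad_alloc

int main()
{
    std::size_t len = 100000000;  // hypothetical size
    unsigned char *myArray = 0;
    try
    {
        myArray = new unsigned char[len];  // throws std::bad_alloc on failure
    }
    catch (const std::bad_alloc &)
    {
        // Out of memory: fall back to writing the data to a file here.
        std::cerr << "Allocation failed, falling back to a file.\n";
        return 1;
    }
    // ... use myArray as usual ...
    delete[] myArray;
    return 0;
}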

Wow, I just tested it with this code:

#include <iostream>
using namespace std;

int main()
{
    long long int *test = new long long int[100000000000];
    test[99999999999] = 10;
    cout << test[99999999999];
    return 0;
}

And it performed without a problem. Considering that I only have 4 GB of RAM on my machine, can I assume that my compiler is transparently supporting the whole file-writing thing? (If so, my mass1venum dll, which I wrote as an exercise, will be able to be a lot simpler.)

Indeed, pretty much all modern operating systems have good mechanisms to deal with memory and will use the hard drive if necessary. Of course, there is always a chance that you run out of memory (whether because the OS refuses to devote more disk space to it or because the disk itself fills up). So, there is really no need for you to write code that does this type of thing; the OS will do it much better than you can ever hope to. For instance, the OS will swap pages between RAM and the hard drive such that your program is always working on memory that is in RAM (the chunks not currently in use are basically sleeping on the hard drive). This is the kind of mechanism you would have a really hard time implementing yourself.

As for the new operator, the C++ standard prescribes that it must throw a bad_alloc exception if the allocation fails, so that is entirely platform-independent. If you want the "return NULL" behaviour, you must use the no-throw form of new, as in new (std::nothrow) int[1024]; (which requires the <new> header).
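For example (a minimal sketch):

#include <iostream>
#include <new>      // std::nothrow

int main()
{
    // The no-throw form returns a null pointer instead of
    // throwing std::bad_alloc when the allocation fails.
    int *p = new (std::nothrow) int[1024];
    if (p == 0)
    {
        std::cerr << "Allocation failed.\n";
        return 1;
    }
    delete[] p;
    return 0;
}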

If you are going to be working with large chunks of data, it might be a good idea to consider a container like std::deque, which stores the complete array as a number of big chunks of data, as opposed to one contiguous array. This will generally be less demanding on the OS, because it won't have to make space for one huge block of memory. But, of course, the memory won't be contiguous, so it might not be appropriate for your application.
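A quick sketch of the idea (the element count is arbitrary):

#include <deque>
#include <iostream>

int main()
{
    // std::deque allocates its storage in fixed-size chunks,
    // so it never asks the OS for one huge contiguous block.
    std::deque<unsigned char> data;
    for (int i = 0; i < 1000000; ++i)
        data.push_back(static_cast<unsigned char>(i & 0xFF));

    std::cout << "stored " << data.size() << " bytes\n";
    return 0;
}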

Thank you, that is exactly what I was hoping to hear!
