Hi all!

I'm trying to create a program that produces a memory leak. I have a program that looks like it works, but I want to verify that it actually is causing a memory leak and not just hitting some other type of issue. It's in C++, a language I have very little experience with. I've mostly used C#, but since C# is garbage collected I thought it would be easier to use a different language for this.

My code is as follows:

void memLeak()
{
  int *data = new int;
  *data = 15;
  int *data1 = new int;
  *data1 = 15;
  ///
  /// ... all the way up to twenty of these...
  ///
  int *data20 = new int;
  *data20 = 15;
}

int main(){

    int ch = 1;
    int x = 0;
    while (ch == 1)
    {
        while (x < 1000000)
        {
            memLeak();
            x++;
        }
        x = 0;
    }

return 0;
}

It compiles without any issues, and runs in a blank console. By monitoring it in Task Manager, I can see that it gets up to about 2GB memory usage, but at that point it halts and this message appears:

This application has requested the runtime to terminate it in an unusual way

The process in Task Manager then ends and disappears. My question: is this program causing a memory leak, and then Windows is just terminating it to prevent it from getting to a critical state? Or is my program actually doing something different?

Cheers for any help!

Every time you call new, you allocate some memory on the heap. Your function memLeak does this twenty times. When the function ends, all the pointers to that memory (the pointers named data1 to data20) are destroyed. The memory is still allocated. You have no way to delete it because you don't have pointers to it, so that's twenty memory leaks every time you call memLeak.

Are you doing this just to watch the effects of a memory leak? To avoid memory leaks, everything you allocate with new must be deallocated with delete. Don't lose the pointers to the memory you allocated; you need them for the delete.
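
To make the new/delete pairing concrete, here is a rough sketch of what a leak-free version of one of those allocations would look like (the function name noLeak is just for illustration):

void noLeak()
{
  int *data = new int;   // allocate an int on the heap
  *data = 15;            // use it
  delete data;           // free it before the pointer goes out of scope
}

As long as every new is matched by a delete on the same pointer before you lose track of it, nothing leaks.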

Ah, my apologies, I should have been clearer. You're exactly right, I'm creating a memory leak on purpose.

I had expected the program to run until there was no memory left to allocate, so the computer would freeze or bluescreen or something. Instead, it only runs until it has allocated about 2GB, and then the program terminates. Do you know why it stops at 2GB instead of exhausting all available memory?

The problem of it only going up to 2GB might not be in your code; it might be that the program you're using to write and run the code only allows 2GB to be used (so it won't crash your PC when you have a never-ending loop). Try looking into the settings of the program you're using.

As mat1998x said, the 2GB limit is probably imposed by the compiling or running environment (including the OS). It is possible that compiling in "debug mode" adds a limit like that, on the assumption that undebugged code might have leaks, but I doubt any compiler would impose such a limit in "release mode". It is also possible that the limit comes from the runtime environment: for example, if you ran the program through your IDE's "run" button, it may actually be running in a sandbox with extra protections. Finally, your operating system may impose a limit of its own (probably relative to your available RAM and the amount of RAM the OS itself needs) so that a single user-land application cannot bring down the system by grabbing all the memory.

Another very likely source of the limit is the inherent limits of the system itself. On a 32-bit machine, pointers can only address 4GB of memory, and the operating system splits that virtual address space between the process and the kernel; on 32-bit Windows the default split is 2GB for user space and 2GB for the kernel. So a single 32-bit process simply cannot allocate much more than 2GB of dynamic memory, no matter how much RAM is installed, and the operating system is forced to impose a limit like that. On a 64-bit system there is usually no such practical limit (64-bit pointers can address up to 16 exabytes, about 16 million terabytes).
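
If you want to check which case applies to your build, a quick sketch like this (plain standard C++, nothing compiler-specific) prints the pointer size: 4 bytes means a 32-bit executable (roughly a 2GB usable address space on Windows), 8 bytes means a 64-bit one:

#include <iostream>

int main()
{
    // 4 -> 32-bit build, 8 -> 64-bit build
    std::cout << "Pointer size: " << sizeof(void*) << " bytes\n";
    return 0;
}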

Just so you know, the typical outcome of a memory-leaking program, where the leaks accumulate in an infinite (or nearly infinite) loop with no limit imposed on it, is that it ends up displacing virtually all other processes from main memory (RAM) into "virtual memory" ("swap space" on Unix-like systems), which is just hard-drive space made to look like RAM; in other words, the other programs end up running off the hard drive instead of RAM. This slows the entire system down considerably, usually rendering it unusable, at least until the RAM and virtual memory limits combined are exhausted, at which point the program requesting the memory will most likely crash (abnormal termination, out-of-memory condition, etc.).

Generally, the program that is refused memory when the out-of-memory condition hits is, with high probability, the program causing the leak, since it is the one constantly requesting more and more memory. In that case the leaking program crashes, all its leaked memory is reclaimed by the operating system, and the system gradually comes back to life as the processes that were pushed out to the hard drive migrate back into RAM. There is always the off-chance that, at the exact moment the system runs out of memory, some other program (not the leaking one) requests memory and crashes instead, but that is very unlikely, and it is pretty much the only way a memory leak can cause other programs to fail.

So, normally, the system slows down considerably until the leaking program crashes, and that's it. It can be very frustrating when the system becomes completely unresponsive, but it is usually still alive and working correctly, just very, very slowly. Most system processes, i.e., the things that could really do harm if they crashed suddenly, are generally able to cope with out-of-memory conditions and continue to operate normally (or in a reduced way), at least on a half-decent operating system, so it is very unlikely that the system itself will crash as a result of a leaking program. Obviously, you could write a malicious program that leaks memory until an out-of-memory condition is hit and then stops leaking without crashing, which would leave the system virtually unusable and cause some other applications to crash. In any case, you can always recover from such a situation by manually killing the process that has taken up all the memory (through Task Manager on Windows, or the kill command on Unix-like systems), although killing a process when the system is almost completely unresponsive can be a bit difficult.
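
Incidentally, the "terminate it in an unusual way" message is typically what the Microsoft runtime prints when abort() is called, for example when new throws std::bad_alloc and nothing catches it. If you want to observe the out-of-memory point without the abrupt termination, a sketch like this catches the exception instead (the allocation size, the loop, and the leaked vector are just for illustration):

#include <iostream>
#include <new>      // std::bad_alloc
#include <vector>

int main()
{
    std::vector<int*> leaked;   // pointers are kept but never deleted, so this still leaks on purpose
    try
    {
        while (true)
        {
            leaked.push_back(new int[1000000]);   // roughly 4MB per iteration
        }
    }
    catch (const std::bad_alloc &)
    {
        // new failed: the address space (or commit limit) is exhausted
        std::cout << "Ran out of memory after " << leaked.size()
                  << " allocations of about 4MB each.\n";
    }
    return 0;
}

When the catch block runs, you can note how many allocations succeeded, which tells you roughly where your 2GB (or larger, on 64-bit) ceiling is.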
