Hi,
For the code snippets below, I was wondering:
whether the memory allocation is static or dynamic,
why the upper limits on their sizes are different,
why some of them cause a segmentation fault,
and how you would suggest allocating an array for different needs.
1.
double data[1048000];
for (int i = 0; i < 1048000; i++) {
    data[i] = i;
}
This segfaults at the very first iteration, i.e. i=0. If I change the size to 1047000, it runs fine. Is the size too big? What is the maximum size for it?
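My guess is that the limit here is the process stack size. If so, I suppose it could be queried at runtime with getrlimit (a minimal sketch on my part, assuming a POSIX system; this is not standard C++):

#include <cstdio>
#include <sys/resource.h>

int main() {
    struct rlimit rl;
    // RLIMIT_STACK reports the limits on the stack segment, in bytes
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        std::printf("stack soft limit: %lu bytes\n", (unsigned long)rl.rlim_cur);
    }
    return 0;
}

If the common default of 8 MiB (8388608 bytes) applies, 1048000 doubles take 8384000 bytes, leaving almost no headroom for anything else on the stack, while 1047000 doubles (8376000 bytes) leave about 12 KB. That would match what I see, but I have not verified it.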
2.
int size = 1048000;
double data[size];
for (int i = 0; i < size; i++) {
    data[i] = i;
}
Same error as the previous code. Is the allocation for this array also static?
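As a side note, I wondered what truly static allocation would look like here. My understanding is that an array at file scope has static storage duration and does not live on the stack at all, so it should not hit this limit (a sketch I have not benchmarked):

const int SIZE = 1048000;  // must be a compile-time constant in this form

// file-scope array: static storage duration, not on the stack
static double data[SIZE];

int main() {
    for (int i = 0; i < SIZE; i++) {
        data[i] = i;
    }
    return 0;
}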
3.
int size = atoi(argv[1]);
double data[size];
for (int i = 0; i < size; i++) {
    data[i] = i;
}
Same error as the previous code. Is this static allocation for the array? Here the size is determined dynamically from a command-line argument. This is actually a simplified version of something I ran into for real today, where a double array of size 3241728 caused a segfault somewhere later in the run. I changed the code to the one below and all is fine now.
4.
int size = atoi(argv[1]);
double *data = new double[size];
for (int i = 0; i < size; i++) {
    data[i] = i;
}
delete[] data;
This one, I know, is dynamic allocation. It seems that there is virtually no limit on the size of the array.
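For completeness, I assume the more idiomatic C++ version of case 4 would be std::vector, which also allocates from the heap but releases the memory automatically (a sketch, not what my real code uses):

#include <cstdlib>
#include <vector>

int main(int argc, char* argv[]) {
    if (argc < 2) return 1;
    int size = std::atoi(argv[1]);
    // the buffer lives on the heap; the destructor frees it, so no delete[]
    std::vector<double> data(size);
    for (int i = 0; i < size; i++) {
        data[i] = i;
    }
    return 0;
}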
Thanks in advance for looking at these somewhat tedious things!