
The following occurs on Linux 2.6.32-220.7.1.el6.x86_64 with g++ 4.4.6.

The following code:

#include <iostream>
#include <cstdlib>

int PROB_SIZE   = 10000000;
using namespace std;

int main(int argc, char *argv[])    {

    unsigned int numbers[PROB_SIZE];
    cout << "Generating " << PROB_SIZE << " random numbers... " << flush;

    return 0;
}

Produces the following SIGSEGV under gdb:

(gdb) run
Starting program: /home/cpd20202/sorting/error

Program received signal SIGSEGV, Segmentation fault.
0x000000000040093b in main (argc=1, argv=0x7fffffffe4f8) at error.cpp:13
13      cout << "Generating " << PROB_SIZE << " random numbers... " << flush;
Missing separate debuginfos, use: debuginfo-install glibc-2.12-1.47.el6_2.5.x86_64 libgcc-4.4.6-3.el6.x86_64 libstdc++-4.4.6-3.el6.x86_64
(gdb) where
#0  0x000000000040093b in main (argc=1, argv=0x7fffffffe4f8) at error.cpp:13

I'm really out of ideas.


5 Answers

8

It's because your array is larger than the stack. Your program therefore crashes the moment it tries to push anything new onto the stack during the function call.

The error you get is conceptually the same as a stack overflow, except it's caused by a local variable being incredibly large rather than by nesting too many function calls.

The stack is a small area of memory that functions use for housekeeping and local variables. It's never very big, a few megabytes at most, which is why you need a dynamic allocation to get rid of your problem. Most dynamic allocations come from the heap, which is often limited only by your physical memory.

You will need to allocate the array on the heap. For that, you have several options, the simplest of which is probably a std::vector<int>. It behaves roughly the same as a normal array and its storage is managed automatically, so this shouldn't be a problem.

#include <vector>
#include <iostream>

int PROB_SIZE   = 10000000;
using namespace std;

int main()
{
    vector<int> numbers(PROB_SIZE);
    cout << "Generating " << PROB_SIZE << " random numbers... " << flush;

    return 0;
}

1 Comment

Simplest, and usually most effective*.
5

Your "numbers" array is being allocated on the stack, and is probably too big. You will need to dynamically allocate the array.
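
For completeness, a minimal sketch of what "dynamically allocate" could look like with new[], reusing PROB_SIZE from the question (a std::vector, as in the other answer, would manage the memory for you):

#include <iostream>
using namespace std;

int PROB_SIZE = 10000000;

int main()
{
    // heap allocation: the stack limit no longer applies
    unsigned int *numbers = new unsigned int[PROB_SIZE];
    cout << "Generating " << PROB_SIZE << " random numbers... " << flush;

    delete[] numbers;   // unlike std::vector, this must be released by hand
    return 0;
}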

7 Comments

This machine has 48GB of memory...
@RSFalcon7, that's the heap size. The stack is much, much smaller. Therefore, you need to allocate the memory on the heap instead of the stack.
@RSFalcon7 the amount of memory in the system has nothing to do with the size of the stack, it's usually ~1MB.
@RSFalcon7, the amount of memory is not related to the size of the stack. Processes start with a (small) fixed amount of memory for use by functions and function calls. If you need more, you need to tap in the "heap", which has the potential of all your physical memory.
@RSFalcon7, declaring it globally means it is no longer allocated on the stack. To respond to your previous post, the stack size is machine-dependent. You can increase it if you really want to.
3

Your process does not have enough stack space to allocate ten million integers. That's 40 megabytes (or 80 if your ints are 64 bits), and processes normally start with a stack of about one megabyte.

You have two basic choices:

  • Allocate your array as a global variable (by moving its declaration outside main), as sketched after this list.
  • Allocate your array on the heap using malloc, new, or std::vector.
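
A minimal sketch of the first option, assuming PROB_SIZE is made a compile-time constant so it can size a global array (the heap route is shown in the other answers):

#include <iostream>
using namespace std;

const int PROB_SIZE = 10000000;
unsigned int numbers[PROB_SIZE];   // global: static storage, not on the stack

int main()
{
    cout << "Generating " << PROB_SIZE << " random numbers... " << flush;
    return 0;
}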

2 Comments

Even though that would solve the immediate problem, I would advise against moving variables to the global scope.
...or keeping the local scope while giving it static storage duration by defining it as static (though std::vector is probably the right choice).
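
For illustration, a minimal sketch of that static suggestion, reusing the ten-million figure from the question:

int main()
{
    // static storage duration: lives outside the stack and is zero-initialized,
    // but keeps its local name and scope
    static unsigned int numbers[10000000];
    numbers[0] = 42;

    return 0;
}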
0

It's not cout. You allocate a very large array, numbers, on the stack and blow your stack. The stack is typically 8 MB or so, whereas the array is 40 MB or so.

int v[size];             // stack
int *v = new int[size];  // heap (remember to delete[] v when done)

6 Comments

@chris ulimit -s on my vanilla ubuntu (amd64) machines returns 8192.
From what I've seen from most people, it's 1MB unless they change it. It's bound to differ from person to person, but I'm pretty sure 1MB is most common.
@chris hmm, I just tested five Linux machines (all x64) and the lowest is 8MB; could be a 32/64-bit thing.
I tested this on my x64 Windows and it seems to be just over 2MB.
I know the difference between them ;) What I didn't know is that my stack only has 10MB: $ ulimit -s gives 10240.
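
If you'd rather check the limit from inside a program than with ulimit, a minimal sketch using the POSIX getrlimit call (it reports the soft limit in bytes, so dividing by 1024 gives the same number ulimit -s prints):

#include <cstdio>
#include <sys/resource.h>

int main()
{
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0)
        std::printf("stack soft limit: %llu KB\n",
                    (unsigned long long)(rl.rlim_cur / 1024));
    return 0;
}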
0

You are allocating way too much space for the stack to handle (10 million ints is a very large amount).

If you actually require this much, I suggest you use heap space instead by using:

int *numbers = (int *) malloc(sizeof(int) * 10000000);  // heap allocation; free(numbers) when done

Comments
