
I recently discovered that a lot of places in our code were doing something like this:

int * int_array = new int[1000];

// ... do things with int_array

delete int_array;

The problem, of course, is that it should be using the delete [] operator, not the regular delete operator.

The mystery is: This code has been working for literally years, when compiled by Visual Studio 2003 and 2005 on Windows, and GCC / clang on OS X. Why hasn't this caused things to go terribly wrong before now?

As I understand it, we're telling the compiler to deallocate memory the "wrong" way, and usually if you do that, something terrible happens and your program crashes. Why isn't this happening for us? Do modern compilers automatically "do the right thing" for you, or enough of the right thing that it doesn't matter for basic types, or something else? I can't accept that we simply got lucky, as this code has been in use for years, by thousands of customers under multiple different operating systems.

Note that I'm NOT asking for an excuse to do things wrong, just trying to understand why we aren't in trouble for doing things wrong. :)

2 Comments

  • Undefined behaviour is undefined. Commented Feb 3, 2013 at 20:01
  • Most software out there contains a lot of defects and undefined behavior that never manifest as actual externally observable issues. Commented Feb 3, 2013 at 20:07

5 Answers


This is the nature of undefined behavior -- it might do exactly what you intended it to do. The problem is, with the next version of the compiler, operating system, library, or CPU ... it might do something completely different.

Most likely, you're getting away with it for two reasons:

  1. int doesn't have a destructor. So the failure to correctly destroy each element in the array has no consequences.

  2. On this platform, new and new[] use the same allocator. So you're not returning a block to the wrong allocator.


1 Comment

This is spot on. Having looked at the source for the global new / delete in gcc's libstdc++ and clang's libc++, it would seem that delete [] simply calls delete in these implementations.

Let's run through what happens, step by step (but we'll ignore exceptions):

int *foo = new int[100];

This allocates at least 100 * sizeof(int) bytes of memory and conceptually calls int::int() 100 times, once for each element in the array.

Since int is a builtin type, it has a trivial constructor (i.e, it does nothing).

Now, how about:

delete foo;

This will call int::~int() on the address pointed to by foo, and then delete the memory pointed to by foo. Again, since int is a built-in type, it has a trivial destructor.

Compare this to:

delete [] foo;

which will call int::~int() for each of the items in the array pointed to by foo and then delete the memory pointed to by foo.

The basic difference here is that elements 1..99 don't get destructed. For int that's not a problem, and that's probably why your existing code works. For arrays of objects that have a real destructor, you'll see a lot more misbehavior.

P.S. Most implementations will delete the whole block of memory pointed to by foo if you write delete foo; even if you allocated it with array-new - but you would be foolish to count on that.

1 Comment

I didn't see this answer earlier. It's the clearest explanation so far.

This does not work: deleting an array without [] will leak memory. That will affect performance if your application runs for a long time; for short-lived programs there will be no issue.

Another thing to note: delete will release memory that was allocated by new. If you allocated a single object with new (not new []), you can use delete to destroy it.

1 Comment

@bikeshedder see the second paragraph.

That will not cause the program to crash, but you will leak 999 * sizeof(int) bytes of memory each time you do that.

3 Comments

Why isn't it 1000 * sizeof(int)?
I would assume that is because you destroy the first element in the array, but not the rest.
I don't think you can, in general, make this claim. Many implementations will release the contiguous block of memory, as in a call to free. Otherwise, I think a crash is more likely than a leak.

The regular delete operator may actually free all the memory from the array, depending on the allocator, but the real problem is it won't ever run any destructors on the array elements.

