
What's the correct way to free the memory here? The pointer array contains duplicates!

class HashTable
{
   Bucket<E>** index = new Bucket<E>*[indexSize];
   ...
};

~HashTable()
{
  for (size_t i = 0; i < indexSize; ++i)
     delete index[i];   // deletes the same Bucket twice if index holds duplicates

  delete[] index;
}
  • By duplicates you mean pointers to the same elements? Commented Apr 20, 2012 at 12:04
  • yes, like this index[0] --> 0x123, index[1] --> 0x456, index[2] --> 0x123 Commented Apr 20, 2012 at 12:08
  • Isn't C++ robust to this automatically? Commented Apr 20, 2012 at 12:09
  • @Tomas: No; if you destroy an object, then any pointer or reference to that object becomes a live hand grenade. Deleting an object twice (as this code would do) gives undefined behaviour. Commented Apr 20, 2012 at 12:13
  • Thanks @Mike. Well, it can't be 100% robust, but just by storing some sequence of bytes that is unlikely to happen... I think C++ could be able to be 99.99% robust :-) Commented Apr 20, 2012 at 12:20

6 Answers


I would think hard about whether you want this container to be responsible for deleting the objects; it would be simpler to store them elsewhere, and just use this container to refer to them, not to manage their lifetimes.

Alternatively, you could use std::shared_ptr to manage the objects; then each object will be deleted automatically when the last reference to it is discarded.
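Roughly, that would look like this; a minimal sketch with a stand-in Bucket type, since the question's Bucket<E> isn't shown:

#include <memory>
#include <vector>

struct Bucket { };  // stand-in for the question's Bucket<E>

int main()
{
    // Duplicate entries are harmless: copies of a shared_ptr share one
    // reference count, so the Bucket is destroyed exactly once, when
    // the last copy goes away.
    std::vector<std::shared_ptr<Bucket>> index(3);
    auto b = std::make_shared<Bucket>();
    index[0] = b;   // first entry
    index[2] = b;   // duplicate entry, same control block
    // no destructor code needed; the vector releases every entry safely
}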

If you really want to do it this way, you'll need to remove each pointer's remaining duplicates before deleting it; something like

for (size_t i = 0; i < indexSize; ++i) {
    Bucket<E>* victim = index[i];
    // Shift any later duplicates of 'victim' out of the array and shrink
    // the logical size, so it can't be deleted a second time.
    // (std::remove requires <algorithm>.)
    indexSize = std::remove(index + i + 1, index + indexSize, victim) - index;
    delete victim;
}

[NOTE: this code may well be wrong; I certainly made a couple of mistakes writing it. If you really want to manage dynamic objects the hard way, then you'll need to test it thoroughly]



Sort index, then remove adjacent duplicates with std::unique. That's O(N log N), which is as good as it gets.
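A minimal, self-contained sketch of that approach (the Bucket type and names here are stand-ins, not the question's code):

#include <algorithm>
#include <cstddef>

struct Bucket { };  // stand-in for the question's Bucket<E>

int main()
{
    Bucket* a = new Bucket;
    Bucket* b = new Bucket;
    Bucket* index[] = { a, b, a };   // contains a duplicate
    std::size_t indexSize = 3;

    // Sort so equal pointers become adjacent, then delete each
    // unique pointer exactly once.
    std::sort(index, index + indexSize);
    std::size_t uniqueCount = std::unique(index, index + indexSize) - index;
    for (std::size_t i = 0; i < uniqueCount; ++i)
        delete index[i];
}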

  • Yes, that's more efficient and less error-prone than the dodgy code in my answer.

Use a set to remove duplicates:

#include <set>

// Collect each pointer once; deleting the set's elements then
// frees every Bucket exactly once.
std::set<Bucket<E>*> uniqueBuckets(index, index + indexSize);

for (Bucket<E>* bucket : uniqueBuckets)
    delete bucket;



In your situation it's dangerous to store raw pointers this way. Better to use std::shared_ptr and simply reset the shared pointers; once the last reference to an object has been reset, that object is freed safely.
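A small sketch of the reset idea, again with a stand-in Bucket type:

#include <memory>
#include <vector>

struct Bucket { };  // stand-in for the question's Bucket<E>

int main()
{
    std::vector<std::shared_ptr<Bucket>> index(3);
    auto b = std::make_shared<Bucket>();
    index[0] = b;
    index[2] = b;       // duplicate entry
    b.reset();          // drop the local reference
    for (auto& p : index)
        p.reset();      // the Bucket is freed when its last reference resets
}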



Perhaps like this:

~HashTable()
{
  // Requires <algorithm>. Sort so duplicate pointers become adjacent,
  // move the unique ones to the front, and delete only those.
  std::sort(index, index + indexSize);
  size_t new_end = std::unique(index, index + indexSize) - index;

  for (size_t i = 0; i < new_end; ++i)
     delete index[i];

  delete[] index;
}



Each time you create a pointer, push_back it into a vector. That vector will then hold every pointer you created, each exactly once.

Later, iterate over that vector and delete the pointers; it's like writing your own simple garbage collector.
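For instance, with a hypothetical owned vector recording each allocation (names are illustrative, not from the question):

#include <vector>

struct Bucket { };  // stand-in for the question's Bucket<E>

int main()
{
    std::vector<Bucket*> owned;   // records each allocation exactly once

    Bucket* b = new Bucket;
    owned.push_back(b);           // register at creation time
    // ... b may now appear any number of times in the hash table ...

    for (Bucket* p : owned)
        delete p;                 // each allocation deleted exactly once
}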

  • Alternatively, keep the objects themselves in a list (not a vector, since you want pointers to them to remain valid as more are added). Then you won't need new and delete at all; the list will automatically destroy them for you.
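A sketch of that list-based alternative (stand-in Bucket type; std::list never relocates its elements, so addresses taken from it stay valid):

#include <list>

struct Bucket { };  // stand-in for the question's Bucket<E>

int main()
{
    std::list<Bucket> storage;     // owns the objects
    storage.emplace_back();
    Bucket* p = &storage.back();   // stays valid as the list grows
    storage.emplace_back();        // does not invalidate p

    // index[] can hold p, even duplicated; no delete is needed,
    // the list destroys every Bucket automatically
}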
