#include <climits>  // for CHAR_BIT

char *toBin(unsigned long long num)
{
    // At most one character per bit, plus one for the null terminator.
    const int size = CHAR_BIT * sizeof num + 1;
    char s[size];
    char *temp = new char[size];
    int i, j;
    const int binBase = 2;

    for (i = 0; num != 0; i++)
    {
        s[i] = num % binBase + '0';
        num /= binBase;
    } // get backwards binary

    if (i == 0)        // num was 0: the loop above never ran
        s[i++] = '0';

    for (j = 0; i > 0; j++)
    {
        i--;
        temp[j] = s[i];
    } // reverse binary sequence

    temp[j] = '\0';
    return temp;  // caller must delete[] the result; a delete after
                  // return is never executed
}

If I had "82" in a char array, I first converted the string "82" to the decimal number 82 and passed it to the function above to convert to binary. This only works up to a certain number of digits.

I think my problem is that an unsigned long long can only hold a certain number of digits, so when the input exceeds that limit, a garbage value is sent to the function, which then prints the wrong binary.

The above function converts an unsigned long long num to binary. I realized that a better approach would be to look at each character in the char array, rather than converting it to a number first, and use those values to convert to binary. However, I'm having a hard time putting it into code. Any suggestions or help would be appreciated.

  • Not really related to how to get a number to print in binary, but you're allocating a lot of memory in this function that isn't needed. You need at most sizeof(unsigned long long)*8+1 characters (+1 for the null terminator). Also, delete temp; is never called because the function ends when return is reached. Commented Apr 11, 2014 at 0:17

2 Answers


The GNU Multiple Precision (GMP) library has the functionality to print a number of any size in any base. The following is an example.

#include <iostream>
#include <gmpxx.h>  // GMP C++ interface; link with -lgmpxx -lgmp

int main()
{
    std::string numstr = "1234123412341234123412341234";

    // initialize mpz_class to have numstr in it (given in base 10)
    mpz_class bigint(numstr, 10);

    // convert to binary (base 2)
    std::string binstr = bigint.get_str(2);

    std::cout << numstr << " -> " << binstr << std::endl;
    return 0;
}

Output of the program:

1234123412341234123412341234 -> 111111110011010111110011000011110100100000101001110100011000000101000011000000000111110010


  • Allocating an array of 9999 char on the stack is ill-advised because it might be too big for your platform. Anyway, you can easily calculate the maximum number of digits (+terminator) needed: CHAR_BIT*sizeof num+1.
  • Also, there is no need to allocate an equal amount on the heap.
  • Do not be afraid of pointers:
    • Init a pointer to the last byte
    • Set that to zero
    • Add all digits beginning with the smallest (at least one, even if the number is 0).
    • Return a copy of the finished number.

