
Is there any way to construct a std::bitset from a hexadecimal std::string or QString, and vice versa, without performing binary shift operations? I know how to do it that way, but I was wondering whether it's possible using C++ streams or something similar.

Here's my code so far:

QString data("aabbccddeeff");
QByteArray temp = QByteArray::fromHex(data.simplified().toLatin1());
QBitArray bits(temp.count()*8); 
for(int i=0; i<temp.count(); ++i) {
    for(int b=0; b<8;b++) {
        bits.setBit( i*8+b, temp.at(i)&(1<<(7-b)) );
    }
}
  • What's the max length of your resulting bits? Is it longer than the widest integer on your platform? Commented May 8, 2016 at 14:35
  • @JohnZwinck it's 128 bits or 16 bytes max. Commented May 8, 2016 at 14:35

4 Answers


You could convert the hex string to an integer, and construct a bitset from that.

#include <iostream>
#include <sstream>
#include <bitset>
#include <string>

using namespace std;

int main()
{
    string s = "0xA";
    stringstream ss;
    ss << hex << s; // hex sets the stream's basefield, which persists for extraction
    unsigned n;
    ss >> n;        // so this parses "0xA" as hexadecimal: n == 10
    bitset<32> b(n);
    // outputs "00000000000000000000000000001010"
    cout << b.to_string() << endl;
}

1 Comment

Note that hex sets the stream's basefield flag, which persists for input as well, so ss >> n parses the text as hexadecimal: n ends up as 10, not the ASCII value of the characters.

Based on the answer provided by Trevor, I wrote the following code:

std::vector<std::bitset<8ul> > Bytes2Bits(QByteArray bytes)
{
    std::vector<std::bitset<8ul> > v;
    for(int i = 0 ; i < bytes.size(); ++i)
    {
        QByteArray temp;
        temp.append(bytes.at(i));
        std::string s = temp.toHex().toStdString();
        std::stringstream ss;
        ss << s;
        int n;
        ss >> std::hex >> n; // parse the two hex digits as a byte value
        std::bitset<8ul> b(n);
        v.push_back(b);
    }
    return v;
}

I hope it's useful to others seeking the same solution.



How about this?

std::string str("deadBEEF");
std::bitset<128> bits; // result

char* raw = reinterpret_cast<char*>(&bits) + sizeof(bits) - 1; // last byte                                         
assert(str.size() <= 2 * sizeof(bits));

for (size_t ii = 0; ii < str.size(); ++ii) {
    char ch = str[ii];

    if (ch >= '0' && ch <= '9') {
        ch -= '0';
    } else if (ch >= 'a' && ch <= 'f') {
        ch -= 'a' - 10;
    } else if (ch >= 'A' && ch <= 'F') {
        ch -= 'A' - 10;
    } else {
        throw std::runtime_error("invalid input");
    }

    if (ii % 2 == 0) {
        ch <<= 4; // nibble                                                                                         
    }

    *raw |= ch;

    if (ii % 2) {
        --raw;
    }
}

std::cout << bits << std::endl;

The above assumes that a std::bitset has exactly one data member: an array of integers large enough to hold the templated number of bits. I think this is a fair assumption, but it is certainly not guaranteed to be portable. The good news is that it will be fast, partly because it does no dynamic memory allocation.


QBitArray BitConverter::GetBits(quint64 value, bool lsb)
{
    QString binary;
    binary.setNum(value, 2);

    QBitArray bits(binary.count());
    if (lsb)
    {
        for (int i = 0; i < binary.count(); i++)
        {
            bits[i] = binary[i] == '1';
        }
    }
    else
    {
        for (int i = binary.count() - 1, j = 0; i >= 0; i--, j++)
        {
            bits[j] = binary[i] == '1';
        }
    }
    return bits;
}

2 Comments

Please include an explanation of how and why this solves the problem; it would really help to improve the quality of your post, and would probably result in more up-votes.
Adding some explanation would help us understand the code better.
