
I'm writing a helper function for an iOS Objective-C app that is supposed to encrypt a string with the provided key using the AES-128 algorithm.

I already have a working implementation of this function in Swift, but I'm struggling to translate it into Objective-C. The function doesn't take any initialization vector into account; the IV is hard-coded as an empty string, and this is intended.

Here is the Swift code that works as expected:

@objc func AES128(_ string: String, withKey: String) -> String {
    let data = string.data(using: String.Encoding.utf8)!
    let keyData = withKey.data(using: String.Encoding.utf8)!
    let ivData = "".data(using: String.Encoding.utf8)!
    
    let cryptLength = size_t(data.count + kCCBlockSizeAES128)
    var cryptData = Data(count: cryptLength)
    let keyLength = size_t(kCCKeySizeAES128)
    let options = CCOptions(kCCOptionPKCS7Padding)
    var numBytesEncrypted: size_t = 0

    let cryptStatus = cryptData.withUnsafeMutableBytes { cryptBytes in
        data.withUnsafeBytes { dataBytes in
            ivData.withUnsafeBytes { ivBytes in
                keyData.withUnsafeBytes { keyBytes in
                    CCCrypt(CCOperation(kCCEncrypt), CCAlgorithm(kCCAlgorithmAES), options, keyBytes, keyLength, ivBytes, dataBytes, data.count, cryptBytes, cryptLength, &numBytesEncrypted)
                }
            }
        }
    }

    if UInt32(cryptStatus) == UInt32(kCCSuccess) {
        cryptData.removeSubrange(numBytesEncrypted..<cryptData.count)
    }

    return cryptData.base64EncodedString();
}

This function works like a charm and encrypting the string Hello World with the key ABCDEFGHIJKLMNOP results in a base64 encoded string of 3G4OU62dM9zXhkbXy8pmuA==.

I translated it into Objective-C:

+(NSString *)AES128:(NSString *)string withKey:(NSString *)key {
    NSData *data = [string dataUsingEncoding:NSUTF8StringEncoding];
    NSData *keyData = [key dataUsingEncoding:NSUTF8StringEncoding];
    NSData *ivData = [@"" dataUsingEncoding:NSUTF8StringEncoding];
    
    size_t cryptLength  = data.length + kCCBlockSizeAES128;
    NSMutableData *cryptData = [NSMutableData dataWithLength:cryptLength];
    size_t keyLength = kCCKeySizeAES128;
    CCOptions options = kCCOptionPKCS7Padding;
    size_t numBytesEncrypted = 0;
    
    CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES, options, keyData.bytes, keyLength, ivData.bytes, data.bytes, data.length, cryptData.mutableBytes, cryptLength, &numBytesEncrypted);
    
    if (cryptStatus == kCCSuccess) {
        [cryptData replaceBytesInRange:NSMakeRange(numBytesEncrypted, cryptData.length - numBytesEncrypted) withBytes:NULL length:0];
    }

    return [cryptData base64EncodedStringWithOptions:0];
}

The problem is that the Objective-C version produces wrong, seemingly random results: for the same combination of Hello World + ABCDEFGHIJKLMNOP it yields different encrypted values, such as 7m+HH5NusyA1VAfZ78KYCw== or NY5p8XtYLoAE/4VbCCrPIg==. If you decrypt these strings you get Hello Wold and Hello Worod respectively.

What did I do wrong while translating?

Thank you for your kind help!

  • Put a breakpoint on the call to CCCrypt() in both versions and compare the parameters. If they're the same, then probably the problem occurs after that. If they're different, that'd explain the problem. Also, looking at which parameters are different, if any, may point you toward the problem. Commented Feb 16, 2021 at 19:21

1 Answer


This seems to be an issue with your handling of the initialization vector. Notice that CCCrypt() accepts only a pointer to your IV's byte buffer, with no accompanying length parameter. That means the function assumes the buffer's length is already known.

According to Wikipedia (https://en.wikipedia.org/wiki/Initialization_vector) the length for an IV is "generally the cipher's block size".

So basically, when you are doing

NSData *ivData = [@"" dataUsingEncoding:NSUTF8StringEncoding];

the ivData.bytes pointer refers to a zero-length buffer. In the best-case scenario the memory that follows it happens to contain 0x00 for the full IV length, but very likely it doesn't. CCCrypt() starts reading at the address you pass (a pointer to an empty buffer) and keeps reading for the next [IV length] bytes, using whatever random data it finds there.

To fix this, you need to make sure that you are passing a pointer to a buffer of the correct length whose contents are your intended all zeroes.

I did this simply with

void* ivBuffer = calloc(1, kCCBlockSizeAES128);

which allocates a single block of 128 bits (16 bytes) and initializes it with all zeroes.

Your call to CCCrypt() then becomes

CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES, options, keyData.bytes, keyLength, ivBuffer, data.bytes, data.length, cryptData.mutableBytes, cryptLength, &numBytesEncrypted);

and you just have to remember to call free(ivBuffer) to free the memory that was allocated for this buffer.

Also, when you are handling the success status, you can just do

if (cryptStatus == kCCSuccess)
{
    cryptData.length = numBytesEncrypted;
}

and it will truncate the data to be just the portion you want to use. Up to you, since the result is the same as what you already had.


3 Comments

Thank you @BenW, your proposed edits fixed the issue, I'm going to accept your answer. Just out of curiosity, do you know why the Data initialization with an empty string in Swift produces different results than the NSData in Objective-C? Or is it because in Objective-C we are dealing with pointers rather than the actual data?
It's hard to say. According to the Swift docs, Data is bridged to the NSData class and can be used interchangeably in code that works with ObjC APIs, but there still could be some difference with how initialization happens. Also, we are really talking about the difference between Swift strings and NSStrings, as you are asking the String/NSString object to produce a Data/NSData representation of itself using a certain encoding method, so there could also be an implementation difference there.
I was also able to get it to work by creating the ivData with [[NSData alloc] init], but I wasn't able to prove to myself that this would work every single time. The only way to absolutely ensure it works every time is to be positive about the contents of the memory you are referencing. So I guess what I'm saying is that I can't explain why it works in the Swift version either; there must be some implementation detail that we don't know about. The safer way to do it in Swift would probably be Data(count: kCCBlockSizeAES128), which initializes the new data with zeroes.
