
I currently have some image data in a C array containing RGBA values:

float array[length][4]

I am trying to get this into a UIImage, which it looks like can be initialized from files, NSData, or URLs. Since the other two methods are slow, I am most interested in the NSData approach.

I can get all of these values into an NSArray like so:

for (NSUInteger i = 0; i < image.size.width * image.size.height; i++) {
    UIColor *replace = [UIColor colorWithRed:array[i][0] green:array[i][1] blue:array[i][2] alpha:array[i][3]];
    [output replaceObjectAtIndex:i withObject:replace];
}

So, I have an NSArray full of UIColor objects. I have tried many methods, but how do I convert this to a UIImage?

I think it would be straightforward. Something like imageWithData:data R:0 G:1 B:2 A:3 width:width height:height would be nice, but as far as I can tell no such method exists.

1 Answer


imageWithData: is meant for image data in a standard image file format, e.g. a PNG or JPEG file that you have in memory. It's not suitable for creating images from raw data.

For that, you would typically create a bitmap graphics context, passing your array, pixel format, size, etc. to the CGBitmapContextCreate function. When you've created a bitmap context, you can create an image from it using CGBitmapContextCreateImage, which gives you a CGImageRef that you can pass to the UIImage method imageWithCGImage:.

Here's a basic example that creates a tiny 2×1 pixel image (two pixels wide, one pixel high) with one red pixel and one green pixel. It just uses hard-coded pixel values that are meant to show the order of the color components; normally, you would of course get this data from somewhere else:

size_t width = 2;
size_t height = 1;
size_t bytesPerPixel = 4;
//4 bytes per pixel (R, G, B, A) = 8 bytes for a 2x1 pixel image:
unsigned char rawData[8] = {255, 0, 0, 255,  //red
                            0, 255, 0, 255}; //green

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
size_t bytesPerRow = bytesPerPixel * width;
size_t bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);

CGImageRef cgImage = CGBitmapContextCreateImage(context);
//This is your image:
UIImage *image = [UIImage imageWithCGImage:cgImage];
//Don't forget to clean up:
CGImageRelease(cgImage);
CGColorSpaceRelease(colorSpace);
CGContextRelease(context);
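
If your source data really is the float array from the question (components in the 0.0–1.0 range), one way to feed it into this approach is to convert the floats to 8-bit components first. The following is just an untested sketch along those lines; ImageFromFloatRGBA is a made-up name, and it assumes the float values are already premultiplied (or have alpha 1.0) to match kCGImageAlphaPremultipliedLast:

#import <UIKit/UIKit.h>

// Untested sketch: builds a UIImage from a float RGBA array like the one in
// the question. Assumes each component is in the 0.0-1.0 range and that the
// values are already premultiplied (or alpha is 1.0), to match
// kCGImageAlphaPremultipliedLast. ImageFromFloatRGBA is just an example name.
UIImage *ImageFromFloatRGBA(float (*pixels)[4], size_t width, size_t height)
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Passing NULL lets Quartz allocate (and own) the pixel buffer;
    // a bytesPerRow of 0 means "calculate it for me".
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL) {
        return nil;
    }

    unsigned char *rawData = CGBitmapContextGetData(context);
    size_t bytesPerRow = CGBitmapContextGetBytesPerRow(context);

    // Convert each 0.0-1.0 float component to an 8-bit value (R, G, B, A order).
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            float *src = pixels[y * width + x];
            unsigned char *dst = rawData + y * bytesPerRow + x * 4;
            for (size_t c = 0; c < 4; c++) {
                float v = src[c];
                if (v < 0.0f) v = 0.0f; // clamp out-of-range values
                if (v > 1.0f) v = 1.0f;
                dst[c] = (unsigned char)(v * 255.0f + 0.5f);
            }
        }
    }

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    return image;
}

Letting Quartz allocate the buffer (by passing NULL as the data parameter) also sidesteps any question about who owns the raw pixel memory and when it can be freed.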

3 Comments

Could you give me some more details on that? An example would be extremely helpful, if you want to. From what I understand, I make a CGBitmapContextRef with the CGBitmapContextCreate function, but I cannot find CGBitmapContextRef anywhere. I then pass that creation to CGBitmapContextCreateImage and then pass that to imageWithCGImage:. Correct?
Sorry, CGBitmapContextRef was incorrect; the CGBitmapContextCreate function actually just returns a regular CGContextRef. I'll see if I can find a simple example...
I've added a basic example to the answer.
