I’m trying to create a CCSprite from a UIImage taken with the device camera, but all that’s being displayed is a black sprite.
Here’s the code that I’ve gathered from researching:
CGImageRef imageRef = [currentImage CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *raw = (unsigned char *)calloc(height * width * 4, sizeof(unsigned char));
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(raw, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);
CCTexture2D *pTexture = new CCTexture2D();
pTexture->initWithData(raw, kCCTexture2DPixelFormat_RGB888, width, height, CCSizeMake(width, height));
CCSprite *result = CCSprite::createWithTexture(pTexture);
Is this the proper way to convert a UIImage into a CCSprite?
Additionally, I’m getting this message in the log:
“Snapshotting a view that has not been rendered results in an empty snapshot. Ensure your view has been rendered at least once before snapshotting or snapshot after screen updates”.
I’m not sure this warning is actually relevant, because when I add the same UIImage from the camera directly to a UIView, it displays properly. I’ve tried the solutions I found for that warning, but the message still appears.