Getting pixel data from UIImageView -- works on simulator, not device

Problem description

Based on the responses to a previous question, I've created a category on UIImageView for extracting pixel data. This works fine in the simulator, but not when deployed to the device. I should say not always -- the odd thing is that it does fetch the correct pixel colour if point.x == point.y; otherwise, it gives me pixel data for a pixel on the other side of that line, as if mirrored. (So a tap on a pixel in the lower-right corner of the image gives me the pixel data for a corresponding pixel in the upper-left, but tapping on a pixel in the lower-left corner returns the correct pixel colour). The touch coordinates (CGPoint) are correct.

What am I doing wrong?

Here's my code:

@interface UIImageView (PixelColor)
- (UIColor*)getRGBPixelColorAtPoint:(CGPoint)point;
@end

@implementation UIImageView (PixelColor)

- (UIColor*)getRGBPixelColorAtPoint:(CGPoint)point
{
    UIColor* color = nil;

    CGImageRef cgImage = [self.image CGImage];
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    NSUInteger x = (NSUInteger)floor(point.x);
    NSUInteger y = height - (NSUInteger)floor(point.y);

    if ((x < width) && (y < height))
    {
        CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
        CFDataRef bitmapData = CGDataProviderCopyData(provider);
        const UInt8* data = CFDataGetBytePtr(bitmapData);
        size_t offset = ((width * y) + x) * 4;
        UInt8 red = data[offset];
        UInt8 blue = data[offset+1];
        UInt8 green = data[offset+2];
        UInt8 alpha = data[offset+3];
        CFRelease(bitmapData);
        color = [UIColor colorWithRed:red/255.0f green:green/255.0f blue:blue/255.0f alpha:alpha/255.0f];
    }

    return color;
}

Recommended answer

I think R B G is wrong. You have:

UInt8 red =   data[offset];     
UInt8 blue =  data[offset+1];
UInt8 green = data[offset+2];

But don't you really mean R G B? :

UInt8 red =   data[offset];     
UInt8 green = data[offset+1];
UInt8 blue =  data[offset+2];

But even with that fixed there's still a problem as it turns out Apple byte swaps (great article) the R and B values when on the device, but not when on the simulator.

I had a similar simulator/device issue with a PNG's pixel buffer returned by CFDataGetBytePtr.

This fixed it for me:

#if TARGET_IPHONE_SIMULATOR
        UInt8 red =   data[offset];
        UInt8 green = data[offset + 1];
        UInt8 blue =  data[offset + 2];
#else
        //on device
        UInt8 blue =  data[offset];       //notice red and blue are swapped
        UInt8 green = data[offset + 1];
        UInt8 red =   data[offset + 2];
#endif

Not sure if this will fix your issue, but your misbehaving code looks close to what mine looked like before I fixed it.

One last thing: I believe the simulator will let you access your pixel buffer data[] even after CFRelease(bitmapData) is called. On the device this is not the case in my experience. Your code shouldn't be affected, but in case this helps someone else I thought I'd mention it.
