Photo taken with camera does not contain any ALAsset metadata

Problem Description

The weirdest thing is happening. I have an action sheet that gives the user the choice to either take a photo with the camera or choose one from the camera roll. When the UIImagePicker returns from selection, I use the ALAssetsLibrary to read the GPS information embedded in the photo. Choosing a photo from the camera roll works perfectly and I am able to retrieve the GPS information. However, taking a photo with the camera provides absolutely no GPS information; in fact, I have no metadata at all. Does anyone know what I'm doing wrong here?

The code is as follows:

#import <AssetsLibrary/AssetsLibrary.h>
#import <CoreLocation/CoreLocation.h>
#import <MobileCoreServices/MobileCoreServices.h>

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(__bridge NSString *)kUTTypeImage])
    {
        void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
        {
            // get the image's metadata
            NSDictionary *metadata = asset.defaultRepresentation.metadata;
            NSLog(@"Image Meta Data: %@", metadata);

            // get the coordinates
            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"coordLat: %f , coordLon: %f", location.coordinate.latitude, location.coordinate.longitude);

            // do more here - rest of code snipped to keep this question short
        };

        NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL
                 resultBlock:ALAssetsLibraryAssetForURLResultBlock
                failureBlock:^(NSError *error)
                {
                    // handle error
                }];
    }

    // rest of code snipped to keep this question short
}

As I explained, the following is output when using the camera.

2012-04-15 17:58:28.032 MyApp[511:707] Image Meta Data: (null)
2012-04-15 17:58:28.041 MyApp[511:707] coordLat: 0.000000 , coordLon: 0.000000

However, if I choose an existing photo, or exit the app, take a new photo with the Camera app, then go back into the app and select that photo from the camera roll, I get the following output from NSLog.

2012-04-15 17:57:03.286 MyApp[511:707] Image Meta Data: {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 6;
PixelHeight = 1936;
PixelWidth = 2592;
"{Exif}" =     {
    ApertureValue = "2.970854";
    BrightnessValue = "2.886456";
    ColorSpace = 1;
    ComponentsConfiguration =         (
        1,
        2,
        3,
        0
    );
    DateTimeDigitized = "2012:04:15 17:24:02";
    DateTimeOriginal = "2012:04:15 17:24:02";
    ExifVersion =         (
        2,
        2,
        1
    );
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.06666667";
    FNumber = "2.8";
    Flash = 24;
    FlashPixVersion =         (
        1,
        0
    );
    FocalLength = "3.85";
    ISOSpeedRatings =         (
        80
    );
    MeteringMode = 5;
    PixelXDimension = 2592;
    PixelYDimension = 1936;
    SceneCaptureType = 0;
    SensingMethod = 2;
    Sharpness = 2;
    ShutterSpeedValue = "3.9112";
    SubjectArea =         (
        1295,
        967,
        699,
        696
    );
    WhiteBalance = 0;
};
"{GPS}" =     {
    Altitude = "14.9281";
    AltitudeRef = 0;
    ImgDirection = "107.4554";
    ImgDirectionRef = T;
    Latitude = "32.7366666666667";
    LatitudeRef = N;
    Longitude = "71.679";
    LongitudeRef = W;
    TimeStamp = "21:26:20.00";
};
"{TIFF}" =     {
    DateTime = "2012:04:15 17:24:02";
    Make = Apple;
    Model = "iPhone 4";
    Orientation = 6;
    ResolutionUnit = 2;
    Software = "5.0.1";
    XResolution = 72;
    YResolution = 72;
    "_YCbCrPositioning" = 1;
};
}
2012-04-15 17:57:03.302 MyApp[511:707] coordLat: 32.7366666666667 , coordLon: -71.679
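
Note how the logged coordinates relate to the {GPS} dictionary above: the metadata stores an unsigned Longitude of 71.679 with LongitudeRef = W, while ALAssetPropertyLocation hands back a CLLocation with the sign already applied (-71.679). If you ever need to derive the signed values from the metadata dictionary yourself, here is a minimal sketch (the helper name is hypothetical, not from the question):

    #import <CoreLocation/CoreLocation.h>
    #import <ImageIO/ImageIO.h>

    // Hypothetical helper: convert a {GPS} metadata dictionary into signed degrees.
    static CLLocationCoordinate2D coordinateFromGPSDictionary(NSDictionary *gps)
    {
        CLLocationDegrees lat = [[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitude] doubleValue];
        CLLocationDegrees lon = [[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitude] doubleValue];
        // south latitudes and west longitudes are stored unsigned with a reference letter
        if ([[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitudeRef] isEqualToString:@"S"])
            lat = -lat;
        if ([[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitudeRef] isEqualToString:@"W"])
            lon = -lon;
        return CLLocationCoordinate2DMake(lat, lon); // the log above yields (32.736667, -71.679000)
    }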

PS - I'm using Xcode 4.3 w/ ARC

Recommended Answer

In - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info, try this when saving the photo. A photo taken with the camera has not yet been written to the photo library, so the info dictionary carries no UIImagePickerControllerReferenceURL and there is no ALAsset to look up; write the image to the saved photos album first, passing along the metadata the picker supplies under UIImagePickerControllerMediaMetadata:

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                                 metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                          completionBlock:^(NSURL *assetURL, NSError *error) {
                              // the asset now exists in the library; assetURL identifies it
                              NSLog(@"assetURL %@", assetURL);
                          }];
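
Once the completion block fires with a non-nil assetURL, the photo exists in the library and the assetForURL: lookup from the question succeeds. Here is a minimal sketch of the full round trip, assuming it runs inside didFinishPickingMediaWithInfo: (the error handling is illustrative, not part of the original answer):

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSDictionary *pickerMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                                 metadata:pickerMetadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
                              if (error != nil || assetURL == nil) {
                                  NSLog(@"save failed: %@", error);
                                  return;
                              }
                              // the photo is now in the library, so the lookup that
                              // returned (null) for a fresh camera shot works here
                              [library assetForURL:assetURL
                                       resultBlock:^(ALAsset *asset) {
                                           NSLog(@"Image Meta Data: %@", asset.defaultRepresentation.metadata);
                                       }
                                      failureBlock:^(NSError *lookupError) {
                                          NSLog(@"lookup failed: %@", lookupError);
                                      }];
                          }];

One caveat: UIImagePickerControllerMediaMetadata carries the {Exif} and {TIFF} dictionaries from the camera but no {GPS} entry, because UIImagePickerController does not geotag photos. If you need coordinates in the saved asset, merge a {GPS} dictionary (built from a CLLocationManager fix) into this metadata before writing.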
