Seamless audio recording while flipping camera, using AVCaptureSession & AVAssetWriter
Problem description
I'm looking for a way to maintain a seamless audio track while flipping between the front and back camera. Many apps on the market can do this; one example is SnapChat…
Solutions should use AVCaptureSession and AVAssetWriter. They should explicitly not use AVMutableComposition, since there is currently a bug between AVMutableComposition and AVCaptureSession. Also, I can't afford post-processing time.
Currently, when I change the video input, the audio recording skips and becomes out of sync.
I'm including the code that could be relevant.
Flipping the camera
-(void) updateCameraDirection:(CamDirection)vCameraDirection {
    if(session) {
        AVCaptureDeviceInput* currentInput;
        AVCaptureDeviceInput* newInput;
        BOOL videoMirrored = NO;

        switch (vCameraDirection) {
            case CamDirection_Front:
                currentInput = input_Back;
                newInput = input_Front;
                videoMirrored = NO;
                break;
            case CamDirection_Back:
                currentInput = input_Front;
                newInput = input_Back;
                videoMirrored = YES;
                break;
            default:
                break;
        }

        [session beginConfiguration];
        //disconnect old input
        [session removeInput:currentInput];
        //connect new input
        [session addInput:newInput];
        //get new data connection and config
        dataOutputVideoConnection = [dataOutputVideo connectionWithMediaType:AVMediaTypeVideo];
        dataOutputVideoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
        dataOutputVideoConnection.videoMirrored = videoMirrored;
        //finish
        [session commitConfiguration];
    }
}
Sample buffers
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //not active
    if(!recordingVideo)
        return;

    //start session if not started
    if(!startedSession) {
        startedSession = YES;
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    //Process sample buffers
    if (connection == dataOutputAudioConnection) {
        if([assetWriterInputAudio isReadyForMoreMediaData]) {
            BOOL success = [assetWriterInputAudio appendSampleBuffer:sampleBuffer];
            //…
        }
    } else if (connection == dataOutputVideoConnection) {
        if([assetWriterInputVideo isReadyForMoreMediaData]) {
            BOOL success = [assetWriterInputVideo appendSampleBuffer:sampleBuffer];
            //…
        }
    }
}
Perhaps adjust the audio sample timestamp?
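(As an aside on that idea: one way to re-stamp a sample buffer before appending it is CMSampleBufferCreateCopyWithNewTiming. Below is a minimal sketch in Swift, to match the answer that follows; adjustedPTS is a hypothetical, precomputed CMTime, assetWriterInputAudio is the writer input from the code above, and this is not the approach the accepted answer takes.)

// Sketch (assumption): copy the incoming buffer with a shifted presentation timestamp
// before appending it, so the audio track has no gap after the camera switch.
var timingInfo = CMSampleTimingInfo(duration: CMSampleBufferGetDuration(sampleBuffer),
                                    presentationTimeStamp: adjustedPTS, // hypothetical, precomputed CMTime
                                    decodeTimeStamp: CMSampleBufferGetDecodeTimeStamp(sampleBuffer))
var retimedBuffer: CMSampleBuffer?
let status = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer, 1, &timingInfo, &retimedBuffer)
if status == noErr, let retimedBuffer = retimedBuffer, assetWriterInputAudio.isReadyForMoreMediaData {
    _ = assetWriterInputAudio.append(retimedBuffer)
}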
Recommended answer
Hey, I was facing the same issue and discovered that after switching cameras the next frame was pushed far out of place. This seemed to shift every frame after that, causing the video and audio to be out of sync. My solution was to shift every misplaced frame to its correct position after switching cameras.
Sorry, my answer will be in Swift 4.2.
You'll have to use AVAssetWriterInputPixelBufferAdaptor in order to append the sample buffers at a specified presentation timestamp.
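For context, the adaptor referenced below might be created along these lines (a sketch: assetWriterInputVideo comes from the question's code, the pixel-format attribute is an assumption, and the variable name matches the answer's assetWriterInputPixelBufferAdator):

// Sketch (assumption): attach a pixel buffer adaptor to the video writer input so frames
// can later be appended with an explicit presentation timestamp.
let sourceAttributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
let assetWriterInputPixelBufferAdator = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: assetWriterInputVideo,
    sourcePixelBufferAttributes: sourceAttributes)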
previousPresentationTimeStamp is the presentation timestamp of the previous frame, and currentPresentationTimestamp is, as you guessed, the presentation timestamp of the current one. maxFrameDistance worked very well when testing, but you can change this to your liking.
let currentFramePosition = (Double(self.frameRate) * Double(currentPresentationTimestamp.value)) / Double(currentPresentationTimestamp.timescale)
let previousFramePosition = (Double(self.frameRate) * Double(previousPresentationTimeStamp.value)) / Double(previousPresentationTimeStamp.timescale)
var presentationTimeStamp = currentPresentationTimestamp
let maxFrameDistance = 1.1
let frameDistance = currentFramePosition - previousFramePosition

if frameDistance > maxFrameDistance {
    let expectedFramePosition = previousFramePosition + 1.0
    //print("[mwCamera]: Frame at incorrect position moving from \(currentFramePosition) to \(expectedFramePosition)")

    let newFramePosition = ((expectedFramePosition) * Double(currentPresentationTimestamp.timescale)) / Double(self.frameRate)
    let newPresentationTimeStamp = CMTime.init(value: CMTimeValue(newFramePosition), timescale: currentPresentationTimestamp.timescale)
    presentationTimeStamp = newPresentationTimeStamp
}

let success = assetWriterInputPixelBufferAdator.append(pixelBuffer, withPresentationTime: presentationTimeStamp)
if !success, let error = assetWriter.error {
    fatalError(error.localizedDescription)
}
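For completeness, here is a sketch of how that logic might be wired into the video path of the capture callback; extracting the pixel buffer and updating previousPresentationTimeStamp are not shown in the answer, and adjustedPresentationTimeStamp(for:) is a hypothetical wrapper around the snippet above:

// Sketch (assumption): inside captureOutput(_:didOutput:from:), for the video connection.
guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
      assetWriterInputVideo.isReadyForMoreMediaData else { return }

let currentPresentationTimestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
// Hypothetical wrapper around the frame-position check shown above.
let presentationTimeStamp = adjustedPresentationTimeStamp(for: currentPresentationTimestamp)

if assetWriterInputPixelBufferAdator.append(pixelBuffer, withPresentationTime: presentationTimeStamp) {
    // Remember the (possibly corrected) timestamp so the next frame is measured against it.
    previousPresentationTimeStamp = presentationTimeStamp
}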
Also please note: this worked because I kept the frame rate consistent, so make sure that you have total control of the capture device's frame rate throughout this process.
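One common way to pin the device to a fixed rate is shown below (a minimal sketch, assuming a 30 fps target; videoDevice stands for whichever AVCaptureDevice is currently active, so this would need to be reapplied after every camera switch):

// Sketch (assumption): lock the capture device to a constant frame rate (30 fps here)
// so the one-frame spacing assumed by the retiming math stays valid.
do {
    try videoDevice.lockForConfiguration()
    videoDevice.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 30)
    videoDevice.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 30)
    videoDevice.unlockForConfiguration()
} catch {
    print("Could not lock capture device for configuration: \(error)")
}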
I have a repository here that uses this logic.