Q: Can AVCaptureVideoDataOutput and AVCaptureMovieFileOutput be used at the same time?


I want to record video and grab frames at the same time with my code.

I am using AVCaptureVideoDataOutput for grabbing frames and AVCaptureMovieFileOutput for video recording. Each works on its own, but when both run at the same time I get error code -12780.

I searched for this problem but found no answer. Has anyone had the same experience, or an explanation? It has been bothering me for quite a while.

Thanks.


31
2018-02-09 11:14


Origin


"You can use AVCaptureMovieFileOutput to capture video directly to a file. However, this class has no displayable data and cannot be used simultaneously with AVCaptureVideoDataOutput." Found here: link ..just to clarify the actual cause of the problem - Csharpest


Answers:


I can't answer the specific question asked, but I have successfully recorded video and grabbed frames at the same time using:

  • AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
  • AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write the frames out to an H.264-encoded movie file

That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then pushing them into the pixel buffer adaptor.

EDIT: so my code ends up looking more or less like this; feel free to skim it and ignore the scope issues:

/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480;
AVCaptureDevice *captureDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *inputError = nil;
AVCaptureDeviceInput *deviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&inputError];
[captureSession addInput:deviceInput];

/* output 32BGRA pixel format, with me as the delegate and a
   suitable dispatch queue affixed */
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("frames", NULL)];
[captureSession addOutput:output];

/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:

            [NSNumber numberWithInt:640], AVVideoWidthKey,
            [NSNumber numberWithInt:480], AVVideoHeightKey,
            AVVideoCodecH264, AVVideoCodecKey,

            nil];

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput 
                                   assetWriterInputWithMediaType:AVMediaTypeVideo
                                                  outputSettings:outputSettings];

/* I'm going to push pixel buffers to it, so will need an
   AVAssetWriterInputPixelBufferAdaptor, expecting the same 32BGRA input as I've
   asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
           [[AVAssetWriterInputPixelBufferAdaptor alloc] 
                initWithAssetWriterInput:assetWriterInput 
                sourcePixelBufferAttributes:
                     [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], 
                           kCVPixelBufferPixelFormatTypeKey,
                     nil]];

/* that's going to go somewhere, I imagine you've got the URL for that sorted,
   so create a suitable asset writer; we'll put our H.264 within the normal
   MPEG4 container */
NSError *writerError = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
                                initWithURL:URLFromSomwhere
                                   fileType:AVFileTypeMPEG4
                                      error:&writerError];
/* you need to check writerError here; this example is too lazy to */
[assetWriter addInput:assetWriterInput];

/* we need to warn the input to expect real time data incoming, so that it tries
   to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;

... eventually ...

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];

... elsewhere ...

- (void)        captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
           fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if(assetWriterInput.readyForMoreMediaData)
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    frameNumber++;
}
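If real capture timing matters more than a fixed frame counter, one alternative (my own sketch, not part of the original answer) is to reuse the sample buffer's own timestamp inside the same delegate method, rebased so the movie starts at zero; this stays consistent with the startSessionAtSourceTime:kCMTimeZero call above:

```objc
/* sketch only: derive presentation time from the buffer itself,
   offset so the first frame lands at time zero */
static CMTime firstFrameTime;
static BOOL hasFirstFrame = NO;
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (!hasFirstFrame) { firstFrameTime = timestamp; hasFirstFrame = YES; }

if (assetWriterInput.readyForMoreMediaData)
    [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                     withPresentationTime:CMTimeSubtract(timestamp, firstFrameTime)];
```

This avoids drift when the camera delivers frames at anything other than a steady 25fps.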

... and, to stop, ensuring the output file is finished properly ...

[captureSession stopRunning];
[assetWriter finishWriting];
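On iOS 6 and later, -finishWriting is deprecated in favour of an asynchronous variant; a minimal sketch of the same teardown using it (same assetWriter as above) would be:

```objc
[captureSession stopRunning];
[assetWriter finishWritingWithCompletionHandler:^{
    /* the movie at the URL given to the writer is now safely closed */
}];
```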

50
2018-02-09 12:03



Would you mind posting some sample code on how to do this? Your real-life karma would increase tenfold! :D - SpaceDog
Oh, there's karma in it? Then I've added some extremely basic example code! - Tommy
Thanks for the code, I got it working with images. How do I add sound to the video, any clue? - Imran Raheem
Setting the orientation on the AV-recorded media output: developer.apple.com/library/ios/#qa/qa1744/_index.html#//... - EeKay
Were you able to integrate sound? @ImranRaheem - DivineDesert
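The comments above ask about audio, which the answer leaves uninvestigated. As a sketch only (not from the original answers), the same AVAssetWriter can take a second input fed by an AVCaptureAudioDataOutput's sample buffers; the AAC settings below are illustrative values, not anything the original author specified:

```objc
/* sketch: an AAC audio track alongside the video (all values illustrative) */
AudioChannelLayout layout = {0};
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary *audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVSampleRateKey: @44100.0,
    AVNumberOfChannelsKey: @1,
    AVChannelLayoutKey: [NSData dataWithBytes:&layout length:sizeof(layout)],
    AVEncoderBitRateKey: @64000
};
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES;
[assetWriter addInput:audioInput];

/* then, in the shared delegate callback, when the sample buffer came from
   the audio data output rather than the video one: */
if (audioInput.readyForMoreMediaData)
    [audioInput appendSampleBuffer:sampleBuffer];
```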


Here is the Swift version of Tommy's answer.

 // Set up the Capture Session
 // Add the Inputs
 // Add the Outputs

var outputSettings: [String : Any] = [
    AVVideoWidthKey : Int(640),
    AVVideoHeightKey : Int(480),
    AVVideoCodecKey : AVVideoCodecH264
]

var assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)

var pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_32BGRA)])

var assetWriter = try AVAssetWriter(url: URLFromSomwhere, fileType: AVFileTypeMPEG4) // throws; handle the error in real code
assetWriter.addInput(assetWriterInput)
assetWriterInput.expectsMediaDataInRealTime = true
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: kCMTimeZero)

captureSession.startRunning()

// keep the counter outside the callback so it isn't reset on every frame
var frameNumber: Int64 = 0

func captureOutput(_ captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: CMTimeMake(frameNumber, 25))
    }
    frameNumber += 1
}

// ... and, to stop, ensuring the output file is finished properly ...
captureSession.stopRunning()
assetWriter.finishWriting { }

I don't promise 100% accuracy, as I'm new at this.


1
2018-01-05 11:14