Building a Camera with AVCaptureVideoDataOutput (Part 1)
While building an iOS video-recording app, I ran into an unexpected problem.
Partway through, a requirement came up: check the size of the recorded data in real time while the video is still being recorded.
Damn…
I had been customizing video recording with AVCaptureMovieFileOutput, but AVCaptureMovieFileOutput gives you no way to inspect the data in real time.
Instead, you have to use AVCaptureVideoDataOutput together with AVAssetWriter to get at the data as it is captured.
You also have to implement the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate methods.
Initialization is similar to AVCaptureMovieFileOutput.
As before, you create a session, add the device input and output to it, and start the session.
The difference with AVCaptureVideoDataOutput is that, while the session is running, it delivers the data to you in real time as sample buffers through those delegate methods.
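Since the whole reason for the switch was the real-time size check, here is a minimal sketch of the idea: while recording, ask the file system how big the asset writer's output file has become. This is my own illustration, assuming a _recordingURL property points at the file being written; it is not part of the sample project below.

// Rough sketch (assumption): poll the size of the file the AVAssetWriter is producing,
// e.g. from a timer or from the sample buffer delegate while recording is in progress.
NSError *sizeError = nil;
NSDictionary *attributes = [[NSFileManager defaultManager] attributesOfItemAtPath:_recordingURL.path error:&sizeError];
if (!sizeError) {
    unsigned long long bytesWrittenSoFar = [attributes fileSize]; // bytes flushed to disk so far
    NSLog(@"recorded data so far: %llu bytes", bytesWrittenSoFar);
}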
Below I've organized what I learned into an example.
Class overview
IDCaptureSessionPipelineViewController
- Inherits from UIViewController and conforms to the IDCaptureSessionCoordinatorDelegate protocol (declared by the IDCaptureSessionCoordinator class).
e.g. coordinatorDidBeginRecording(), didFinishRecordingToOutputFileURL()
- Holds an IDCaptureSessionCoordinator as a property: @property (nonatomic, strong) IDCaptureSessionCoordinator *captureSessionCoordinator;
- Controls the overall UI and behavior.
- The IDCaptureSessionCoordinator (parent class) initializer runs first, then the IDCaptureSessionAssetWriterCoordinator (child class) initializer.
- Assigns the delegate (itself) and a callback queue (the main dispatch queue) to the IDCaptureSessionCoordinator.
- Asks the IDCaptureSessionCoordinator for an AVCaptureVideoPreviewLayer with the session attached and receives it as the return value.
- Starts the IDCaptureSessionCoordinator's session: dispatch_sync(_sessionQueue, ^{ [self->_captureSession startRunning]; });
IDCaptureSessionCoordinator (parent)
- Inherits from NSObject.
- Has AVCaptureSession, AVCaptureDevice, dispatch_queue_t (session queue), dispatch_queue_t (callback queue), and AVCaptureVideoPreviewLayer as properties.
- Declares the IDCaptureSessionCoordinatorDelegate protocol.
- Its job is to create the session, create the input and output objects, and add them to the session.
- Creates the session queue when it is initialized.
- On initialization, creates the AVCaptureSession, adds an AVCaptureDeviceInput (AVMediaTypeVideo) to it, and keeps a reference to the AVCaptureDevice.
- Then adds an AVCaptureDeviceInput (AVMediaTypeAudio) to the AVCaptureSession.
IDCaptureSessionAssetWriterCoordinator (child)
- A class that inherits from IDCaptureSessionCoordinator.
- Conforms to the <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, IDAssetWriterCoordinatorDelegate> protocols.
- IDAssetWriterCoordinatorDelegate is declared by the IDAssetWriterCoordinator class:
- (void)writerCoordinatorDidFinishPreparing:(IDAssetWriterCoordinator *)coordinator;
- (void)writerCoordinator:(IDAssetWriterCoordinator *)coordinator didFailWithError:(NSError *)error;
- (void)writerCoordinatorDidFinishRecording:(IDAssetWriterCoordinator *)coordinator;
- Its main job is to create the video and audio data outputs and add them to the session. Once the session is running, it receives the buffers through the corresponding delegate methods.
- Has AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, AVCaptureConnection (video and audio), AVAssetWriter, CMFormatDescriptionRef, IDAssetWriterCoordinator, and NSDictionary *videoCompressionSettings as properties.
- On initialization, creates the videoDataOutputQueue (retargeting it to a high-priority global queue) and creates the audioDataOutputQueue.
- On initialization, creates the videoDataOutput and the audioDataOutput, and passes the video and audio queues created above to setSampleBufferDelegate:queue: along with self as the delegate.
- Adds the video output to the IDCaptureSessionCoordinator's session and creates the videoConnection object.
- Adds the audio output to the IDCaptureSessionCoordinator's session and creates the audioConnection object.
- Uses the video and audio outputs to initialize the videoCompressionSettings and audioCompressionSettings objects.
- Once the session is running, the delegate method (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection: is called continuously.
- There we check whether we are recording or just idle, and if recording, hand the buffers to the AVAssetWriter (see the short sketch below).
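The pattern from the last two bullets, reduced to a skeleton (a sketch only; the actual delegate method in the source below additionally tracks the video and audio format descriptions):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @synchronized (self) {
        // Ignore buffers unless we are actually recording.
        if (_recordingStatus != RecordingStatusRecording) {
            return;
        }
        // Hand the buffer to the asset writer coordinator for the matching connection.
        if (connection == _videoConnection) {
            [_assetWriterCoordinator appendVideoSampleBuffer:sampleBuffer];
        } else if (connection == _audioConnection) {
            [_assetWriterCoordinator appendAudioSampleBuffer:sampleBuffer];
        }
    }
}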
IDAssetWriterCoordinator
(to be covered in Part 2...)
Source code
IDCaptureSessionPipelineViewController
#import <UIKit/UIKit.h>
@interface IDCaptureSessionPipelineViewController : UIViewController
@end
#import "IDFileManager.h"
#import "IDPermissionsManager.h"
#import "IDCaptureSessionPipelineViewController.h"
#import "IDCaptureSessionAssetWriterCoordinator.h" //IDCaptureSessionCoordinator을 상속
// (IDCaptureSessionCoordinator 클래스의 ) IDCaptureSessionCoordinatorDelegate 프로토콜을 따르기 때문에 해당 메소드를 이곳에 구현해 놓았다.
// coordinatorDidBeginRecording() didFinishRecordingToOutputFileURL(),
@interface IDCaptureSessionPipelineViewController () <IDCaptureSessionCoordinatorDelegate>
@property (nonatomic, strong) IDCaptureSessionCoordinator *captureSessionCoordinator; // parent type: the class that creates the session and adds the input devices to it
@property (nonatomic, assign) BOOL recording;
@property (nonatomic, assign) BOOL dismissing;
@property (retain, nonatomic) IBOutlet UIBarButtonItem *recordButton;
@end
@implementation IDCaptureSessionPipelineViewController
- (void)viewDidLoad {
[super viewDidLoad];
NSLog(@"IDCaptureSessionPipelineViewController setupWithPipelineMode IDCaptureSessionAssetWriterCoordinator 객체 생성시작!");
_captureSessionCoordinator = [IDCaptureSessionAssetWriterCoordinator new]; // parent-typed property = [Child new]
// set the delegate and callback queue on the capture session coordinator
[_captureSessionCoordinator setDelegate:self callbackQueue:dispatch_get_main_queue()];
[self configureInterface];
}
- (IBAction)toggleRecording:(id)sender
{
NSLog(@"IDCaptureSessionPipelineViewController - toggleRecording 버튼");
}
- (IBAction)closeCamera:(id)sender
{
NSLog(@"IDCaptureSessionPipelineViewController - closeCamera 버튼");
}
#pragma mark - Private methods
- (void)configureInterface
{
NSLog(@"IDCaptureSessionPipelineViewController - configureInterface 호출 - AVCaptureVideoPreviewLayer 객체 생성");
AVCaptureVideoPreviewLayer *previewLayer = [_captureSessionCoordinator previewLayer];
previewLayer.frame = self.view.bounds;
[self.view.layer insertSublayer:previewLayer atIndex:0];
[_captureSessionCoordinator startRunning];
}
- (void)stopPipelineAndDismiss
{
NSLog(@"IDCaptureSessionPipelineViewController - stopPipelineAndDismiss 호출");
}
- (void)checkPermissions
{
NSLog(@"IDCaptureSessionPipelineViewController - IDPermissionsManager 객체 생성 + 카메라 오디오 권한 체크 시작");
}
#pragma mark - IDCaptureSessionCoordinatorDelegate methods
- (void)coordinatorDidBeginRecording:(IDCaptureSessionCoordinator *)coordinator
{
NSLog(@"IDCaptureSessionPipelineViewController coordinatorDidBeginRecording 호출");
}
// Called when recording finishes (when the stop button is tapped)
- (void)coordinator:(IDCaptureSessionCoordinator *)coordinator didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL error:(NSError *)error
{
NSLog(@"IDCaptureSessionPipelineViewController didFinishRecordingToOutputFileURL 호출");
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end
IDCaptureSessionCoordinator (parent)
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@protocol IDCaptureSessionCoordinatorDelegate;
@interface IDCaptureSessionCoordinator : NSObject
@property (nonatomic, strong) AVCaptureSession *captureSession; // session object
@property (nonatomic, strong) AVCaptureDevice *cameraDevice; // device object
@property (nonatomic, strong) dispatch_queue_t delegateCallbackQueue; // callback queue
@property (nonatomic, weak) id<IDCaptureSessionCoordinatorDelegate> delegate;
- (void)setDelegate:(id<IDCaptureSessionCoordinatorDelegate>)delegate callbackQueue:(dispatch_queue_t)delegateCallbackQueue;
- (BOOL)addInput:(AVCaptureDeviceInput *)input toCaptureSession:(AVCaptureSession *)captureSession; // add an input object
- (BOOL)addOutput:(AVCaptureOutput *)output toCaptureSession:(AVCaptureSession *)captureSession; // add an output object
- (void)startRunning;
- (void)stopRunning;
- (void)startRecording;
- (void)stopRecording;
- (AVCaptureVideoPreviewLayer *)previewLayer; // preview
@end
@protocol IDCaptureSessionCoordinatorDelegate <NSObject>
@required
- (void)coordinatorDidBeginRecording:(IDCaptureSessionCoordinator *)coordinator;
- (void)coordinator:(IDCaptureSessionCoordinator *)coordinator didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL error:(NSError *)error;
@end
#import "IDCaptureSessionCoordinator.h"
@interface IDCaptureSessionCoordinator ()
@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end
@implementation IDCaptureSessionCoordinator
-(instancetype)init{
NSLog(@"IDCaptureSessionCoordinator init 진입");
self = [super init];
if(self){
// create the session queue
_sessionQueue = dispatch_queue_create("com.abc.my.session", DISPATCH_QUEUE_SERIAL);
// create the capture session
_captureSession = [self setupCaptureSession];
}
return self;
}
/*
 When using AVFoundation for video capture, you have to provide your own user interface.
 The key component of any camera UI is a live preview.
 The easiest way to implement it is to add an AVCaptureVideoPreviewLayer object as a sublayer of the camera view.
 */
- (AVCaptureVideoPreviewLayer *)previewLayer
{
if(!_previewLayer && _captureSession){
NSLog(@"IDCaptureSessionCoordinator previewLayer 캡쳐 세션 넣어줌 초기화");
_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
}
return _previewLayer;
}
- (void)setDelegate:(id<IDCaptureSessionCoordinatorDelegate>)delegate callbackQueue:(dispatch_queue_t)delegateCallbackQueue{
NSLog(@"IDCaptureSessionCoordinator setDelegate 진입 - 전역변수에 IDCaptureSessionCoordinatorDelegate 할당, delegateCallbackQueue 할당");
if (delegate && (delegateCallbackQueue == NULL)) {
@throw [NSException exceptionWithName:NSInvalidArgumentException reason:@"호출자는 델리게이트 콜백큐를 제공해야 한다!" userInfo:nil];
}
@synchronized(self)
{
_delegate = delegate;
if (delegateCallbackQueue != _delegateCallbackQueue) {
_delegateCallbackQueue = delegateCallbackQueue;
}
}
}
// start running the session
- (void)startRunning{
NSLog(@"IDCaptureSessionCoordinator startRunning 진입 - dispatch_sync(세션큐, 캡쳐쎄션 스타트러닝!)");
dispatch_sync(_sessionQueue, ^{
[self->_captureSession startRunning];
});
}
// stop running the session
- (void)stopRunning{
NSLog(@"IDCaptureSessionCoordinator stopRunning 진입");
}
// start recording
- (void)startRecording{
NSLog(@"IDCaptureSessionCoordinator startRecording 진입");
}
// stop recording
- (void)stopRecording{
NSLog(@"IDCaptureSessionCoordinator stopRecording 진입");
}
#pragma mark - Capture Session Setup
// session setup
- (AVCaptureSession *)setupCaptureSession
{
NSLog(@"IDCaptureSessionCoordinator setupCaptureSession 진입 ");
AVCaptureSession *captureSession = [AVCaptureSession new];
// create the video input device and check whether it was added to the capture session
if (![self addDefaultCameraInputToCaptureSession:captureSession]) {
NSLog(@"비디오 인풋을 캡쳐세션에 넣기 실패");
}
// create the audio input device and check whether it was added to the capture session
if (![self addDefaultMicInputToCaptureSession:captureSession]) {
NSLog(@"오디오 인풋을 캡쳐세션에 넣기 실패");
}
return captureSession;
}
// To configure video input, create an AVCaptureDeviceInput object for the desired camera device and add it to the capture session.
// Adds the default camera to the capture session.
- (BOOL)addDefaultCameraInputToCaptureSession:(AVCaptureSession *)captureSession
{
NSLog(@"IDCaptureSessionCoordinator addDefaultCameraInputToCaptureSession 진입");
NSError *error;
// create and initialize the video input device object
AVCaptureDeviceInput *cameraDeviceInput = [[AVCaptureDeviceInput alloc]initWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:&error];
if (error) {
NSLog(@"error configuring 비디오 input : %@" , [error localizedDescription]);
return NO;
} else {
// add the camera input to the capture session
BOOL success = [self addInput:cameraDeviceInput toCaptureSession:captureSession];
// keep a reference to the camera device that was added
_cameraDevice = cameraDeviceInput.device;
return success;
}
}
// create the microphone input device object and add it to the capture session
- (BOOL)addDefaultMicInputToCaptureSession:(AVCaptureSession *)captureSession
{
NSLog(@"IDCaptureSessionCoordinator addDefaultMicInputToCaptureSession 진입");
NSError *error;
AVCaptureDeviceInput *micDeviceInput = [[AVCaptureDeviceInput alloc]initWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:&error];
if (error) {
NSLog(@"error configuring 오디오 input : %@" , [error localizedDescription]);
return NO;
} else {
BOOL success = [self addInput:micDeviceInput toCaptureSession:captureSession];
return success;
}
}
// add an input object - shared helper
- (BOOL)addInput:(AVCaptureDeviceInput *)input toCaptureSession:(AVCaptureSession *)captureSession{
NSLog(@"IDCaptureSessionCoordinator addInput toCaptureSession 진입");
// check whether the input device can be added to the capture session
if ([captureSession canAddInput:input]) {
// add the input device to the capture session
[captureSession addInput:input];
NSLog(@"IDCaptureSessionCoordinator addInput toCaptureSession - 세션에 캡쳐디바이스인풋 들어갔어요!");
return YES;
} else {
NSLog(@"세션에 인풋 디바이스를 넣을 수 없습니다 : %@" , [input description]);
return NO;
}
}
// add an output object - shared helper
- (BOOL)addOutput:(AVCaptureOutput *)output toCaptureSession:(AVCaptureSession *)captureSession{
NSLog(@"IDCaptureSessionCoordinator addOutput toCaptureSession 진입");
// add the output object to the session
if ([captureSession canAddOutput:output]) {
[captureSession addOutput:output];
return YES;
}else{
NSLog(@"output을 세션에 넣을 수 없습니다! 설명 : %@" , [output description]);
}
return NO;
}
@end
IDCaptureSessionAssetWriterCoordinator (child)
#import "IDCaptureSessionCoordinator.h"
//@protocol IDCaptureSessionAssetWriterCoordinatorDelegate;
@interface IDCaptureSessionAssetWriterCoordinator : IDCaptureSessionCoordinator
@end
#import "IDCaptureSessionAssetWriterCoordinator.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import "IDAssetWriterCoordinator.h"
#import "IDFileManager.h"
typedef NS_ENUM( NSInteger, RecordingStatus )
{
RecordingStatusIdle = 0,
RecordingStatusStartingRecording,
RecordingStatusRecording,
RecordingStatusStoppingRecording,
}; //internal state machine
@interface IDCaptureSessionAssetWriterCoordinator () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, IDAssetWriterCoordinatorDelegate>
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) dispatch_queue_t audioDataOutputQueue;
// data outputs
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioDataOutput;
// capture connections
@property (nonatomic, strong) AVCaptureConnection *audioConnection;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
// compression settings dictionaries
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, strong) AVAssetWriter *assetWriter;
@property (nonatomic, assign) RecordingStatus recordingStatus; //NS_ENUM
@property (nonatomic, strong) NSURL *recordingURL;
@property(nonatomic, retain) __attribute__((NSObject)) CMFormatDescriptionRef outputVideoFormatDescription;
@property(nonatomic, retain) __attribute__((NSObject)) CMFormatDescriptionRef outputAudioFormatDescription;
@property(nonatomic, retain) IDAssetWriterCoordinator *assetWriterCoordinator;
@end
@implementation IDCaptureSessionAssetWriterCoordinator
- (instancetype)init
{
self = [super init];
if (self)
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - init 진입");
// create the video queue
self.videoDataOutputQueue = dispatch_queue_create("com.example.capturesession.videodata", DISPATCH_QUEUE_SERIAL);
dispatch_set_target_queue(_videoDataOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
// create the audio queue
self.audioDataOutputQueue = dispatch_queue_create("com.example.capturesession.audiodata", DISPATCH_QUEUE_SERIAL);
[self addDataOutputsToCaptureSession:self.captureSession]; // captureSession is owned by the parent class
}
return self;
}
#pragma mark - Recording
// child class overrides
- (void)startRecording
{
}
- (void)stopRecording
{
}
#pragma mark - Private methods
- (void)addDataOutputsToCaptureSession:(AVCaptureSession *)captureSession
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - addDataOutputsToCaptureSession 진입 : 비디오, 오디오 아웃풋 객체 생성 + 설정");
// create and configure the video data output
self.videoDataOutput = [AVCaptureVideoDataOutput new];
_videoDataOutput.videoSettings = nil;
_videoDataOutput.alwaysDiscardsLateVideoFrames = NO;
// set the delegate on the output and assign the video data output queue
[_videoDataOutput setSampleBufferDelegate:self queue:_videoDataOutputQueue];
// create the audio data output and assign the audio data output queue
self.audioDataOutput = [AVCaptureAudioDataOutput new];
[_audioDataOutput setSampleBufferDelegate:self queue:_audioDataOutputQueue];
// add the video data output to the capture session (captureSession itself lives in the parent class, which this class inherits)
[self addOutput:_videoDataOutput toCaptureSession:self.captureSession];
// initialize the video connection
_videoConnection = [_videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
// add the audio data output to the capture session
[self addOutput:_audioDataOutput toCaptureSession:self.captureSession];
// initialize the audio connection
_audioConnection = [_audioDataOutput connectionWithMediaType:AVMediaTypeAudio];
[self setCompressionSettings];
}
- (void)setupVideoPipelineWithInputFormatDescription:(CMFormatDescriptionRef)inputFormatDescription
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - setupVideoPipelineWithInputFormatDescription 진입 :");
self.outputVideoFormatDescription = inputFormatDescription;
}
- (void)teardownVideoPipeline
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - teardownVideoPipeline 진입 :");
}
- (void)setCompressionSettings
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - setCompressionSettings 진입 :");
_videoCompressionSettings = [_videoDataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
_audioCompressionSettings = [_audioDataOutput recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
}
#pragma mark - SampleBufferDelegate methods
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :");
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
if (connection == _videoConnection) {
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = videoConnection");
if (self.outputVideoFormatDescription == nil) {
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription == nil");
// Don't render the first sample buffer.
// This gives us one frame interval (33ms at 30fps) for setupVideoPipelineWithInputFormatDescription: to complete.
// Ideally this would be done asynchronously to ensure frames don't back up on slower devices.
//TODO: outputVideoFormatDescription should be updated whenever video configuration is changed (frame rate, etc.)
//Currently we don't use the outputVideoFormatDescription in IDAssetWriterRecoredSession
[self setupVideoPipelineWithInputFormatDescription:formatDescription];
}else{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription != nil");
self.outputVideoFormatDescription = formatDescription;
@synchronized(self){
if(_recordingStatus == RecordingStatusRecording){
NSLog(@"IDCaptureSessionAssetWriterCoordinator - [_assetWriterCoordinator appendAudioSampleBuffer:sampleBuffer] 호출 :");
// [_assetWriterCoordinator appendAudioSampleBuffer:sampleBuffer];
}
}
}
}else if(connection == _audioConnection){
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = audioConnection");
self.outputAudioFormatDescription = formatDescription;
@synchronized( self ) {
if(_recordingStatus == RecordingStatusRecording){ // status value 2 = recording
[_assetWriterCoordinator appendAudioSampleBuffer:sampleBuffer];
}
}
}
}
#pragma mark - IDAssetWriterCoordinatorDelegate methods
// Called once the writer has finished preparing (after the record button is tapped)!!
- (void)writerCoordinatorDidFinishPreparing:(IDAssetWriterCoordinator *)coordinator
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - writerCoordinatorDidFinishPreparing 진입 :");
}
// Called when an error occurs!!
- (void)writerCoordinator:(IDAssetWriterCoordinator *)recorder didFailWithError:(NSError *)error
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - didFailWithError 진입 :");
}
// Called when the stop button is tapped
- (void)writerCoordinatorDidFinishRecording:(IDAssetWriterCoordinator *)coordinator
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - writerCoordinatorDidFinishRecording 진입 :");
}
#pragma mark - Recording State Machine
// call under @synchronized( self )
- (void)transitionToRecordingStatus:(RecordingStatus)newStatus error:(NSError *)error
{
NSLog(@"IDCaptureSessionAssetWriterCoordinator - transitionToRecordingStatus 진입");
}
@end
IDAssetWriterCoordinator
#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
@protocol IDAssetWriterCoordinatorDelegate;
@interface IDAssetWriterCoordinator : NSObject
- (instancetype)initWithURL:(NSURL *)URL;
- (void)addVideoTrackWithSourceFormatDescription:(CMFormatDescriptionRef)formatDescription settings:(NSDictionary *)videoSettings;
- (void)addAudioTrackWithSourceFormatDescription:(CMFormatDescriptionRef)formatDescription settings:(NSDictionary *)audioSettings;
- (void)setDelegate:(id<IDAssetWriterCoordinatorDelegate>)delegate callbackQueue:(dispatch_queue_t)delegateCallbackQueue;
- (void)prepareToRecord;
- (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
- (void)appendAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer;
- (void)finishRecording;
@end
@protocol IDAssetWriterCoordinatorDelegate <NSObject>
- (void)writerCoordinatorDidFinishPreparing:(IDAssetWriterCoordinator *)coordinator;
- (void)writerCoordinator:(IDAssetWriterCoordinator *)coordinator didFailWithError:(NSError *)error;
- (void)writerCoordinatorDidFinishRecording:(IDAssetWriterCoordinator *)coordinator;
@end
#import "IDAssetWriterCoordinator.h"
typedef NS_ENUM(NSInteger, WriterStatus){
WriterStatusIdle = 0,
WriterStatusPreparingToRecord,
WriterStatusRecording,
WriterStatusFinishingRecordingPart1, // waiting for inflight buffers to be appended
WriterStatusFinishingRecordingPart2, // calling finish writing on the asset writer
WriterStatusFinished, // terminal state
WriterStatusFailed // terminal state
}; // internal state machine
@interface IDAssetWriterCoordinator()
@property (nonatomic, assign) WriterStatus status; //NS_ENUM
@property (nonatomic) dispatch_queue_t writingQueue;
@property (nonatomic) dispatch_queue_t delegateCallbackQueue;
@property (nonatomic) NSURL *URL;
@property (nonatomic) AVAssetWriter *assetWriter;
@property (nonatomic) BOOL haveStartedSession;
// audio format description, settings, input
@property (nonatomic) CMFormatDescriptionRef audioTrackSourceFormatDescription;
@property (nonatomic) NSDictionary *audioTrackSettings;
@property (nonatomic) AVAssetWriterInput *audioInput;
// video format description, settings, input
@property (nonatomic) CMFormatDescriptionRef videoTrackSourceFormatDescription;
@property (nonatomic) CGAffineTransform videoTrackTransform;
@property (nonatomic) NSDictionary *videoTrackSettings;
@property (nonatomic) AVAssetWriterInput *videoInput;
@end
@implementation IDAssetWriterCoordinator
-(instancetype)initWithURL:(NSURL *)URL
{
NSLog(@"IDAssetWriterCoordinator - initWithURL 진입");
self = [super init];
if (self) {
}
return self;
}
- (void)addVideoTrackWithSourceFormatDescription:(CMFormatDescriptionRef)formatDescription settings:(NSDictionary *)videoSettings
{
NSLog(@"IDAssetWriterCoordinator - addVideoTrackWithSourceFormatDescription 진입");
}
- (void)addAudioTrackWithSourceFormatDescription:(CMFormatDescriptionRef)formatDescription settings:(NSDictionary *)audioSettings
{
NSLog(@"IDAssetWriterCoordinator - addAudioTrackWithSourceFormatDescription 진입");
}
- (void)setDelegate:(id<IDAssetWriterCoordinatorDelegate>)delegate callbackQueue:(dispatch_queue_t)delegateCallbackQueue
{
NSLog(@"IDAssetWriterCoordinator - setDelegate <IDAssetWriterCoordinatorDelegate> 진입");
}
- (void)prepareToRecord
{
NSLog(@"IDAssetWriterCoordinator - prepareToRecord 진입");
}
- (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
NSLog(@"IDAssetWriterCoordinator - appendVideoSampleBuffer 진입");
}
- (void)appendAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
NSLog(@"IDAssetWriterCoordinator - appendAudioSampleBuffer 진입");
}
- (void)finishRecording
{
NSLog(@"IDAssetWriterCoordinator - finishRecording 진입");
}
#pragma mark - Private methods -----------------------------------------------------------------------------
// create the audioInput, then add it to the assetWriter
- (BOOL)setupAssetWriterAudioInputWithSourceFormatDescription:(CMFormatDescriptionRef)audioFormatDescription settings:(NSDictionary *)audioSettings error:(NSError **)errorOut
{
NSLog(@"IDAssetWriterCoordinator - setupAssetWriterAudioInputWithSourceFormatDescription 진입");
return YES;
}
// create the video input and add it to the assetWriter
- (BOOL)setupAssetWriterVideoInputWithSourceFormatDescription:(CMFormatDescriptionRef)videoFormatDescription transform:(CGAffineTransform)transform settings:(NSDictionary *)videoSettings error:(NSError **)errorOut
{
NSLog(@"IDAssetWriterCoordinator - setupAssetWriterVideoInputWithSourceFormatDescription 진입");
return YES;
}
- (NSDictionary *)fallbackVideoSettingsForSourceFormatDescription:(CMFormatDescriptionRef)videoFormatDescription
{
NSLog(@"IDAssetWriterCoordinator - fallbackVideoSettingsForSourceFormatDescription 진입");
return nil;
}
// finally, append the buffer
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer ofMediaType:(NSString *)mediaType
{
NSLog(@"IDAssetWriterCoordinator - appendSampleBuffer 진입");
}
// call under @synchronized( self )
- (void)transitionToStatus:(WriterStatus)newStatus error:(NSError *)error
{
NSLog(@"IDAssetWriterCoordinator - transitionToStatus 진입");
}
- (NSError *)cannotSetupInputError
{
NSLog(@"IDAssetWriterCoordinator - cannotSetupInputError 진입");
return nil;
}
@end
Log output when the camera is launched
2019-04-16 13:49:01.701875+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionPipelineViewController setupWithPipelineMode IDCaptureSessionAssetWriterCoordinator 객체 생성시작!
2019-04-16 13:49:01.701993+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator init 진입
2019-04-16 13:49:01.702034+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator setupCaptureSession 진입
2019-04-16 13:49:01.706344+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addDefaultCameraInputToCaptureSession 진입
2019-04-16 13:49:01.719701+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addInput toCaptureSession 진입
2019-04-16 13:49:01.722364+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addInput toCaptureSession - 세션에 캡쳐디바이스인풋 들어갔어요!
2019-04-16 13:49:01.722407+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addDefaultMicInputToCaptureSession 진입
2019-04-16 13:49:01.731848+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addInput toCaptureSession 진입
2019-04-16 13:49:01.733323+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addInput toCaptureSession - 세션에 캡쳐디바이스인풋 들어갔어요!
2019-04-16 13:49:01.733360+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionAssetWriterCoordinator - init 진입
2019-04-16 13:49:01.733400+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionAssetWriterCoordinator - addDataOutputsToCaptureSession 진입 : 비디오, 오디오 아웃풋 객체 생성 + 설정
2019-04-16 13:49:01.733780+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addOutput toCaptureSession 진입
2019-04-16 13:49:01.735467+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator addOutput toCaptureSession 진입
2019-04-16 13:49:01.736292+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionAssetWriterCoordinator - setCompressionSettings 진입 :
2019-04-16 13:49:01.737157+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator setDelegate 진입 - 전역변수에 IDCaptureSessionCoordinatorDelegate 할당, delegateCallbackQueue 할당
2019-04-16 13:49:01.737205+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionPipelineViewController - configureInterface 호출 - AVCaptureVideoPreviewLayer 객체 생성
2019-04-16 13:49:01.737225+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator previewLayer 캡쳐 세션 넣어줌 초기화
2019-04-16 13:49:01.739046+0900 AssertWriterVideoRecorderNew[2591:505792] IDCaptureSessionCoordinator startRunning 진입 - dispatch_sync(세션큐, 캡쳐쎄션 스타트러닝!)
2019-04-16 13:49:02.041482+0900 AssertWriterVideoRecorderNew[2591:505846] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.041546+0900 AssertWriterVideoRecorderNew[2591:505846] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = videoConnection
2019-04-16 13:49:02.041563+0900 AssertWriterVideoRecorderNew[2591:505846] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription == nil
2019-04-16 13:49:02.041582+0900 AssertWriterVideoRecorderNew[2591:505846] IDCaptureSessionAssetWriterCoordinator - setupVideoPipelineWithInputFormatDescription 진입 :
2019-04-16 13:49:02.042227+0900 AssertWriterVideoRecorderNew[2591:505847] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.042250+0900 AssertWriterVideoRecorderNew[2591:505847] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = videoConnection
2019-04-16 13:49:02.042261+0900 AssertWriterVideoRecorderNew[2591:505847] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription != nil
2019-04-16 13:49:02.078642+0900 AssertWriterVideoRecorderNew[2591:505845] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.078711+0900 AssertWriterVideoRecorderNew[2591:505845] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = videoConnection
2019-04-16 13:49:02.078726+0900 AssertWriterVideoRecorderNew[2591:505845] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription != nil
2019-04-16 13:49:02.109463+0900 AssertWriterVideoRecorderNew[2591:505847] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.109554+0900 AssertWriterVideoRecorderNew[2591:505847] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = audioConnection
2019-04-16 13:49:02.111679+0900 AssertWriterVideoRecorderNew[2591:505844] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.111712+0900 AssertWriterVideoRecorderNew[2591:505844] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = videoConnection
2019-04-16 13:49:02.111724+0900 AssertWriterVideoRecorderNew[2591:505844] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : outputVideoFormatDescription != nil
2019-04-16 13:49:02.161513+0900 AssertWriterVideoRecorderNew[2591:505844] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer 진입 :
2019-04-16 13:49:02.161828+0900 AssertWriterVideoRecorderNew[2591:505844] IDCaptureSessionAssetWriterCoordinator - didOutputSampleBuffer : connection = audioConnection
Example file