Capturing Images from the Camera

Introduction

This article introduces the APIs and the basic configuration needed to obtain video data from an iOS device's camera; a minimal sketch of how the pieces fit together follows the API list below.

Main APIs

  • AVCaptureDevice //an input device such as a camera or microphone
  • AVCaptureInput //configures the ports of an input device
  • AVCaptureOutput //manages the output of video or still images
  • AVCaptureSession //coordinates the data flow from inputs to outputs
  • AVCaptureVideoPreviewLayer //shows a live preview of the frames captured by the camera
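
Before configuring each piece individually, here is a minimal sketch of how these classes fit together, assuming a plain capture setup; the input and preview-layer wiring shown here is not repeated later in the article, and previewView is an assumed UIView supplied by the caller.

#import <AVFoundation/AVFoundation.h>

//Create the session that coordinates the data flow
AVCaptureSession *session = [[AVCaptureSession alloc] init];

//Wrap the camera in an input and attach it to the session
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

//Attach a video data output (its detailed configuration is covered below)
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

//Show a live preview; previewView is an assumed UIView
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = previewView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[previewView.layer addSublayer:previewLayer];

//Start the data flow
[session startRunning];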

AVCaptureSession Initialization & Setting the Resolution

Initializing AVCaptureSession simply creates an instance. For testing and typical live-streaming use, AVCaptureSessionPreset640x480 is usually sufficient; the preset can be changed later as requirements change (a fallback sketch follows the preset list below).

//Initialize the session
self.session = [[AVCaptureSession alloc] init];

//Set the camera resolution to 640*480
if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    self.session.sessionPreset = AVCaptureSessionPreset640x480;
}

The available presets include:
  • AVCaptureSessionPresetHigh //Highest recording quality. This varies per device.
  • AVCaptureSessionPresetMedium //Suitable for Wi-Fi sharing. The actual values may change.
  • AVCaptureSessionPresetLow //Suitable for 3G sharing. The actual values may change.
  • AVCaptureSessionPreset640x480 //VGA.
  • AVCaptureSessionPreset1280x720 //720p HD.
  • AVCaptureSessionPresetPhoto //Full photo resolution. This is not supported for video output.
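
If the preferred preset is not available on a particular device, one option is to fall back through a list of candidates; the order below is only an assumption for illustration.

//Try presets from highest to lowest until one is supported
NSArray<NSString *> *presets = @[AVCaptureSessionPreset1280x720,
                                 AVCaptureSessionPreset640x480,
                                 AVCaptureSessionPresetMedium];
for (NSString *preset in presets) {
    if ([self.session canSetSessionPreset:preset]) {
        self.session.sessionPreset = preset;
        break;
    }
}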

AVCaptureDevice Initialization & Autofocus

The device returned by the default method is the back camera. To get the front camera, enumerate all video capture devices and pick the one where [device position] == AVCaptureDevicePositionFront (an iOS 10+ alternative is sketched after the snippet below).

//Get the back camera by default
self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

//or: enumerate all video capture devices to find the front camera
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
    //Found the front camera
    if ([device position] == AVCaptureDevicePositionFront) {
        self.videoDevice = device;
        break;
    }
}
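
Note that devicesWithMediaType: is deprecated as of iOS 10; on newer systems the same lookup can be done with AVCaptureDeviceDiscoverySession. A minimal sketch, assuming iOS 10+ and the built-in wide-angle camera:

//Discover the front-facing built-in wide-angle camera (iOS 10+)
AVCaptureDeviceDiscoverySession *discovery =
    [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                            mediaType:AVMediaTypeVideo
                                                             position:AVCaptureDevicePositionFront];
self.videoDevice = discovery.devices.firstObject;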

When shooting video we normally want the camera to keep adjusting focus on its own, so we use AVCaptureFocusModeContinuousAutoFocus.

Note: to set capture properties on a device you must first acquire a lock on it with lockForConfiguration:. This avoids making changes that could be incompatible with settings made by other applications.

//Continuous autofocus
if ([self.videoDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
    NSError *error;
    BOOL isLocked = [self.videoDevice lockForConfiguration:&error];
    if (isLocked && error == nil) {
        self.videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        [self.videoDevice unlockForConfiguration];
    }
}

Setting the Frame Rate

The frame rate can be changed dynamically. self.videoDevice.activeFormat.videoSupportedFrameRateRanges returns the frame-rate ranges supported by the camera's current format, and the device must again be locked while changing it (a range-check sketch follows the snippet below).

//Check the supported frame-rate range before setting the frame rate
NSLog(@"Supported frame-rate range: %@", [self.videoDevice.activeFormat.videoSupportedFrameRateRanges objectAtIndex:0]);

//Set the frame rate
NSError *error;
BOOL isLocked = [self.videoDevice lockForConfiguration:&error];

if (isLocked && error == nil) {

    if (self.videoDevice.activeFormat.videoSupportedFrameRateRanges) {
        [self.videoDevice setActiveVideoMinFrameDuration:CMTimeMake(1, fps)];
        [self.videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, fps)];
    }

    [self.videoDevice unlockForConfiguration];
}
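
The snippet above assumes fps already falls inside the supported range; a minimal sketch of clamping it against the first AVFrameRateRange (fps is assumed to be an int defined elsewhere):

//Clamp the requested frame rate to the first supported range
AVFrameRateRange *range = self.videoDevice.activeFormat.videoSupportedFrameRateRanges.firstObject;
if (range && (fps < range.minFrameRate || fps > range.maxFrameRate)) {
    fps = (int)range.maxFrameRate;
}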

Other Settings

//Configure the output
self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

//kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange means the raw data format is YUV420
NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
                          [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
                          (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey, nil];
self.videoDataOutput.videoSettings = settings;

//Discard video frames that arrive while the delegate queue is still busy, so capture does not stall
self.videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

//Create the output processing queue
_videoQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);

//Set the sample buffer delegate and the queue it is called on
[self.videoDataOutput setSampleBufferDelegate:self queue:_videoQueue];

if ([self.session canAddOutput:self.videoDataOutput]) {
    [self.session addOutput:self.videoDataOutput];
}

//Get the output connection
self.videoConnection = [self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo];

//Set the orientation of the output image
self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

Handling Captured Frames in the Delegate

Once all parameters are configured, the session is used to start, pause, and stop the camera, and the captured data is received through the AVCaptureVideoDataOutputSampleBufferDelegate methods (a sketch of the corresponding start method follows the stop/pause/resume snippet below).

//Stop
- (void)stopVideoCapture{
    [self.session stopRunning];
    [[UIApplication sharedApplication] setIdleTimerDisabled:NO];
}

//Pause when entering the background
- (void)pauseCameraCapture{
    [self.session stopRunning];
}

//Resume when returning to the foreground
- (void)resumeCameraCapture{
    [self.session startRunning];
}
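
The corresponding start method is not shown in the original snippets; a minimal sketch, assuming a method name startVideoCapture that mirrors stopVideoCapture above:

//Start capturing and keep the screen awake while recording
- (void)startVideoCapture{
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];
    //startRunning is a blocking call, so Apple recommends invoking it off the main queue
    [self.session startRunning];
}
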
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
/*
 CMSampleBufferRef: frame buffer data describing the current frame
 CMSampleBufferGetXXX: accessors for frame buffer information
 CMSampleBufferGetDuration: get the duration of the current frame
 CMSampleBufferGetImageBuffer: get the image data of the current frame
*/
// Receive frame data
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // If the capture session is not strongly referenced, this callback will never fire
    NSLog(@"----- sampleBuffer ----- %@", sampleBuffer);
}
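
To actually process a frame (for encoding, filtering, and so on), the delegate usually extracts the CVPixelBuffer from the sample buffer. A minimal sketch of reading its dimensions inside the delegate method; what happens to the buffer afterwards depends on your pipeline:

//Extract the pixel buffer from the sample buffer and read its dimensions
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSLog(@"frame size: %zu x %zu", width, height);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}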

References

《iOS - 视频采集详解》
《Still and Video Media Capture》
