iOS Gotchas: Taking a Snapshot of AVPlayer

I recently picked up a task: take a snapshot of an AVPlayer that is playing an m3u8 stream.

As we know, -[UIView snapshotViewAfterScreenUpdates:] doesn't work for AVPlayer. It does hand you back a view, but rendering that view into a CGContext produces nothing but transparency.

The solution offered on Stack Overflow doesn't work either: AVAssetImageGenerator refuses m3u8 URLs.
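For reference, the commonly suggested AVAssetImageGenerator approach looks roughly like this (a sketch; hlsURL is a placeholder for your stream URL — this works for file-based assets but fails for HLS):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:hlsURL options:nil];
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;

NSError *error = nil;
CMTime actualTime;
CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(1, 1)
                                       actualTime:&actualTime
                                            error:&error];
if (!cgImage) {
    // With an m3u8 asset you end up here: the generator returns NULL
    // along with an error instead of a frame.
    NSLog(@"Snapshot failed: %@", error);
}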

Searching on, one answer says to add an AVPlayerItemVideoOutput instance as an output of the AVPlayerItem, then tap the frames with [AVPlayerItemVideoOutput copyPixelBufferForItemTime:itemTimeForDisplay:].
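In outline, that answer's approach looks like this (a sketch; playerItem here is assumed to be the AVPlayerItem that is already playing):

// Attach a video output to the playing item...
AVPlayerItemVideoOutput *output =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:nil];
[playerItem addOutput:output];

// ...and later try to grab the current frame.
CMTime time = [output itemTimeForHostTime:CACurrentMediaTime()];
CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:time
                                          itemTimeForDisplay:NULL];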

But that method always returned NULL for me. Someone said to call it in a loop and it would succeed within a hundred tries; I actually did that, and even a thousand iterations got me nothing. Someone else suggested polling every 0.03 seconds, which only sometimes works. What terrible advice!

Want it to just work? Read the documentation.

Following Apple's sample Real-time Video Processing Using AVPlayerItemVideoOutput, the code below takes snapshots of a live stream.

@interface ViewController () {
    AVPlayerItemVideoOutput *snapshotOutput;
    CVPixelBufferRef lastSnapshotPixelBuffer;
}

@end

@implementation ViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self playVideo];
}

- (IBAction)snapshotButtonTouched:(UIButton *)sender {
    UIImage *image = [self snapshotImage];
    if (image) {
        [_imageView setImage:image];
    }
}

- (void)playVideo {
    NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    _playerItem = [AVPlayerItem playerItemWithAsset:asset];
    _player = [AVPlayer playerWithPlayerItem:_playerItem];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
    playerLayer.frame = _playbackView.bounds;
    [_playbackView.layer addSublayer:playerLayer];

    // Check for a new frame on every screen refresh.
    CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

    snapshotOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:nil];
    [_playerItem removeOutput:snapshotOutput];
    [_playerItem addOutput:snapshotOutput];
    [_player replaceCurrentItemWithPlayerItem:_playerItem];

    [_player play];
}

- (void)displayLinkCallback:(CADisplayLink *)sender {
    CMTime time = [snapshotOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([snapshotOutput hasNewPixelBufferForItemTime:time]) {
        // copyPixelBufferForItemTime: returns a +1 retained buffer,
        // so release the previous one to avoid leaking every frame.
        if (lastSnapshotPixelBuffer) {
            CVPixelBufferRelease(lastSnapshotPixelBuffer);
        }
        lastSnapshotPixelBuffer = [snapshotOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    }
}

- (UIImage *)snapshotImage {
    if (lastSnapshotPixelBuffer) {
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:lastSnapshotPixelBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGRect rect = CGRectMake(0,
                                 0,
                                 CVPixelBufferGetWidth(lastSnapshotPixelBuffer),
                                 CVPixelBufferGetHeight(lastSnapshotPixelBuffer));
        CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage); // createCGImage:fromRect: also returns a +1 reference
        return image;
    }
    return nil;
}