WebRTC Audio/Video Calls: RTC Live-Streaming Local Videos and Album Video Files
To live-stream a local video file over RTC, we use AVPlayer together with CADisplayLink and AVPlayerItemVideoOutput.
1. Playing a Local Video with AVPlayer
- What is AVPlayer?
AVPlayer is a class in the AVFoundation framework. It sits close to the underlying media pipeline, is highly flexible, and allows a fully customized playback UI.
- What is AVPlayerLayer?
AVPlayerLayer is the layer that renders the video picture during playback.
- What is CADisplayLink?
CADisplayLink is a timer, like NSTimer, but its callbacks stay locked to the screen's refresh rate, which is why it is often used to measure on-screen frame rate (a minimal sketch follows this list).
- What is AVPlayerItemVideoOutput?
AVPlayerItemVideoOutput is a video output attached to an AVPlayerItem; through it you can pull each frame of the video as a CVPixelBufferRef.
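To illustrate the frame-rate point: a minimal sketch of measuring on-screen FPS with CADisplayLink. The fpsLink property and the method names here are hypothetical helpers, not part of the player class below:

// Hypothetical helper: estimate the on-screen frame rate by counting vsync callbacks.
- (void)startFPSProbe {
    self.fpsLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(onFrame:)];
    [self.fpsLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)onFrame:(CADisplayLink *)link {
    static CFTimeInterval lastTime = 0;
    static NSInteger frameCount = 0;
    if (lastTime == 0) {
        lastTime = link.timestamp;
        return;
    }
    frameCount++;
    CFTimeInterval delta = link.timestamp - lastTime;
    if (delta >= 1.0) {
        NSLog(@"FPS: %.1f", (double)frameCount / delta);  // ~60 on most devices, ~120 on ProMotion
        lastTime = link.timestamp;
        frameCount = 0;
    }
}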
Below we combine local-video playback with WebRTC live streaming against ossrs.
First, point AVPlayer at the local video:
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:videoPath]];
- (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}
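For example, the player can be driven like this (a sketch assuming a demo.mp4 bundled with the app, and a videoLivePlayer property holding an instance of the SDVideoLivePlayer class shown later):

// Hypothetical caller: load a bundled file and start playback.
NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"demo" ofType:@"mp4"];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:videoPath]];
[self.videoLivePlayer reloadPlayItem:playerItem];
[self.videoLivePlayer startPlay];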
Then configure the AVPlayerLayer that displays playback:
- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;
    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}
Monitor the player state via KVO:
/**
 * Monitor the player state via KVO.
 * Note: this is a custom helper invoked from the KVOController block (see addNotifications),
 * not the NSObject observeValueForKeyPath:ofObject:change:context: override.
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        /**
         typedef NS_ENUM(NSInteger, AVPlayerTimeControlStatus) {
             AVPlayerTimeControlStatusPaused = 0,
             AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate = 1,
             AVPlayerTimeControlStatusPlaying = 2
         } API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), watchos(3.0));
         */
        // timeControlStatus tells us whether the player is playing, paused indefinitely,
        // or paused while waiting for suitable network conditions.
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // Paused
                    self.isPlaying = NO;
                }
                    break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // Waiting
                }
                    break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // Playing
                    self.isPlaying = YES;
                }
                    break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}
Set up the key piece, AVPlayerItemVideoOutput:
- (void)initPlayerVideoOutput {
    // Output YUV 420 (NV12) pixel buffers
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    // If suppressesPlayerRendering is YES, the item's video is no longer rendered by AVPlayer itself,
    // while audio, subtitles, and any other media keep playing. We leave it NO so the local preview still shows.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}
Then a CADisplayLink drives periodic pulls from AVPlayerItemVideoOutput, obtaining each video frame as a CVPixelBufferRef:
#pragma mark - DisplayLink
- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    // YYWeakProxy (from YYKit) breaks the retain cycle between CADisplayLink and its target.
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self]
                                                   selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Ask the output for the frame that will be on screen at the next vsync.
    CMTime outputItemTime = kCMTimeInvalid;
    CFTimeInterval nextVSync = ([displayLink timestamp] + [displayLink duration]);
    outputItemTime = [[self playerItemVideoOutput] itemTimeForHostTime:nextVSync];
    if ([[self playerItemVideoOutput] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = NULL;
        pixelBuffer = [[self playerItemVideoOutput] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        // Hand the pixel buffer to the delegate (e.g. to push it into WebRTC).
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}
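The buffer handed to the delegate is NV12 (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange). If you want to inspect or process it before pushing it to WebRTC, here is a sketch of reading its plane layout (not part of the player class):

// Sketch: inspect the NV12 buffer the delegate receives.
void InspectPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // Plane 0 is luma (Y); plane 1 is interleaved chroma (CbCr) at half resolution.
    size_t yStride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t uvStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
    NSLog(@"frame %zux%zu, Y stride %zu, CbCr stride %zu", width, height, yStride, uvStride);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}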
In this way, during playback CADisplayLink delivers a steady stream of CVPixelBufferRefs, which we then broadcast over WebRTC.
The complete player code for obtaining the CVPixelBufferRef during playback is as follows.
SDVideoLivePlayer.h
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@protocol SDVideoLivePlayerDelegate;

@interface SDVideoLivePlayer : NSObject

@property (nonatomic, strong) UIView *superView;
@property (nonatomic, weak) id<SDVideoLivePlayerDelegate> delegate;

- (instancetype)initWithSuperView:(UIView *)superView;

- (void)reloadPlayItem:(AVPlayerItem *)playerItem;

/// Start playback
- (void)startPlay;

/// Stop playback
- (void)stopPlay;

@end

@protocol SDVideoLivePlayerDelegate <NSObject>

- (void)videoLivePlayerPixelBufferRef:(CVPixelBufferRef)pixelBufferRef;

@end
SDVideoLivePlayer.m
#import "SDVideoLivePlayer.h"
@interface SDVideoLivePlayer ()<AVPlayerItemOutputPullDelegate> {
}
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) AVPlayerLayer *playerLayer;
@property (nonatomic, strong) AVPlayerItem *playerItem;
@property (nonatomic, assign) BOOL isPlaying;
@property (nonatomic, strong) AVPlayerItemVideoOutput *playerItemVideoOutput;
@property (nonatomic, strong) CADisplayLink *displayLink;
@end
@implementation SDVideoLivePlayer
- (instancetype)initWithSuperView:(UIView *)superView
{
self = [super init];
if (self) {
self.superView = superView;
self.isPlaying = NO;
[self addNotifications];
[self initPlayerVideoOutput];
}
return self;
}
- (void)initPlayerVideoOutput {
    // Output YUV 420 (NV12) pixel buffers
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    // If suppressesPlayerRendering is YES, the item's video is no longer rendered by AVPlayer itself,
    // while audio, subtitles, and any other media keep playing. We leave it NO so the local preview still shows.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}
- (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;
    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}

- (void)stopPlay {
    self.isPlaying = NO;
    [self.player pause];
    [self stopAnimating];
}
/**
 * Monitor the player state via KVO.
 * Note: this is a custom helper invoked from the KVOController block in addNotifications,
 * not the NSObject observeValueForKeyPath:ofObject:change:context: override.
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        /**
         typedef NS_ENUM(NSInteger, AVPlayerTimeControlStatus) {
             AVPlayerTimeControlStatusPaused = 0,
             AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate = 1,
             AVPlayerTimeControlStatusPlaying = 2
         } API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), watchos(3.0));
         */
        // timeControlStatus tells us whether the player is playing, paused indefinitely,
        // or paused while waiting for suitable network conditions.
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // Paused
                    self.isPlaying = NO;
                }
                    break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // Waiting
                }
                    break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // Playing
                    self.isPlaying = YES;
                }
                    break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}
- (void)audioSessionInterrupted:(NSNotification *)notification {
    // Pause when the interruption begins, resume when it ends.
    NSDictionary *info = notification.userInfo;
    if ([[info objectForKey:AVAudioSessionInterruptionTypeKey] unsignedIntegerValue] == AVAudioSessionInterruptionTypeBegan) {
        [self.player pause];
    } else {
        [self.player play];
    }
}
- (void)startAnimating {
    [self startDisplayLink];
    self.displayLink.paused = NO;
}

- (void)stopAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)pauseAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)resumeAnimating {
    if (!self.displayLink) {
        [self startDisplayLink];
    }
    self.displayLink.paused = NO;
}
#pragma mark - DisplayLink
- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    // YYWeakProxy breaks the retain cycle between CADisplayLink and its target.
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self]
                                                   selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Ask the output for the frame that will be on screen at the next vsync.
    CMTime outputItemTime = kCMTimeInvalid;
    CFTimeInterval nextVSync = ([displayLink timestamp] + [displayLink duration]);
    outputItemTime = [[self playerItemVideoOutput] itemTimeForHostTime:nextVSync];
    if ([[self playerItemVideoOutput] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = NULL;
        pixelBuffer = [[self playerItemVideoOutput] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        // Hand the pixel buffer to the delegate (e.g. to push it into WebRTC).
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}
#pragma mark - AVPlayerItemOutputPullDelegate
- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender {
    [self stopPlay];
}

#pragma mark - Observers
- (void)addNotifications {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(replay:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    // Audio playback interrupted (e.g. by an incoming call)
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioSessionInterrupted:) name:AVAudioSessionInterruptionNotification object:nil];
    __weak typeof(self) weakSelf = self;
    if (@available(iOS 10.0, *)) {
        // Caveat: when this runs from init, self.player is still nil, so the observation attaches
        // to nothing; consider re-registering after reloadPlayItem: creates the player.
        [self.KVOController observe:self.player keyPath:@"timeControlStatus" options:NSKeyValueObservingOptionOld|NSKeyValueObservingOptionNew block:^(id _Nullable observer, id _Nonnull object, NSDictionary<NSString *,id> * _Nonnull change) {
            __strong typeof(weakSelf) strongSelf = weakSelf;
            [strongSelf observeValueForKeyPath:@"timeControlStatus"];
        }];
    }
}

- (void)removeNotifications {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [self.KVOController unobserveAll];
}

- (void)replay:(NSNotification *)notification {
    if (notification.object == self.player.currentItem) {
        [self.player seekToTime:CMTimeMake(0, 1)];
        [self.player play];
    }
}

- (void)dealloc {
    [self removeNotifications];
}

@end
2. Fetching a Video from the Photo Album
The code below fetches an album video as an AVPlayerItem and hands it to - (void)reloadPlayItem:(AVPlayerItem *)playerItem;
- (void)startPlayAlbumVideo:(SDMediaModel *)mediaModel {
    if (!(mediaModel && mediaModel.phasset)) {
        return;
    }
    __weak typeof(self) weakSelf = self;
    [[PhotoKitManager shareInstance] requestPlayerItemForVideo:mediaModel.phasset completion:^(AVPlayerItem *playerItem) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        [strongSelf.videoLivePlayer reloadPlayItem:playerItem];
        [strongSelf.videoLivePlayer startPlay];
    } failure:^{
    }];
}
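PhotoKitManager is the author's own wrapper. For reference, a plausible sketch of what it wraps, using PhotoKit's PHImageManager directly (the options shown are assumptions, not taken from the original code):

#import <Photos/Photos.h>

// Sketch: fetch an AVPlayerItem for a PHAsset straight from PhotoKit.
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.networkAccessAllowed = YES;  // allow downloading from iCloud if needed
options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
[[PHImageManager defaultManager] requestPlayerItemForVideo:mediaModel.phasset
                                                   options:options
                                             resultHandler:^(AVPlayerItem *playerItem, NSDictionary *info) {
    // The result handler may be called on a background queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (playerItem) {
            [self.videoLivePlayer reloadPlayItem:playerItem];
            [self.videoLivePlayer startPlay];
        }
    });
}];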
3. Live-Streaming the Video over WebRTC
With the CVPixelBufferRefs obtained from the CADisplayLink callback, we can stream over WebRTC; in effect we broadcast a pre-recorded video as a live feed.
An earlier article on GPUImage beauty filters for video calls introduced RTCVideoFrame. Here we wrap each CVPixelBufferRef in an RTCVideoFrame and hand it to the RTCVideoSource through its capturer:didCaptureVideoFrame: method to achieve the final result.
The RTCVideoSource is set up as follows:
- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;
    // On the simulator there is no camera, so capture from a file instead.
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:self];
    }
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}
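RTCFileVideoCapturer plays a bundled file as the capture source. A sketch of starting it, assuming a demo.mp4 resource in the app bundle (not shown in the original code):

// Sketch: start file-based capture when running on the simulator.
if (TARGET_IPHONE_SIMULATOR && [self.videoCapturer isKindOfClass:[RTCFileVideoCapturer class]]) {
    [(RTCFileVideoCapturer *)self.videoCapturer startCapturingFromFileNamed:@"demo.mp4"
                                                                    onError:^(NSError *error) {
        NSLog(@"file capture failed: %@", error);
    }];
}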
- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }
    NSString *streamId = @"stream";
    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;
    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];
    // Video
    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
}
For details, see the earlier article on iOS audio/video calls against ossrs (linked below).
Wrap the CVPixelBufferRef in an RTCVideoFrame:
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client videoPixelBufferRef:(CVPixelBufferRef)videoPixelBufferRef {
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:videoPixelBufferRef];
    // No CMSampleBuffer is available in this context, so derive the nanosecond
    // timestamp from the host clock (CACurrentMediaTime, QuartzCore).
    int64_t timeStampNs = (int64_t)(CACurrentMediaTime() * 1000000000);
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];
    return rtcVideoFrame;
}
Then feed the frame to the RTCVideoSource via capturer:didCaptureVideoFrame::
[self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];
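Putting the pieces together, the SDVideoLivePlayerDelegate callback can push every decoded frame into WebRTC. A sketch, assuming the controller owns both webRTCClient and videoLivePlayer and that localVideoSource/videoCapturer are exposed on the client (property names are assumptions):

#pragma mark - SDVideoLivePlayerDelegate
- (void)videoLivePlayerPixelBufferRef:(CVPixelBufferRef)pixelBufferRef {
    // Wrap the buffer and hand it to the video source as if a capturer had produced it.
    RTCVideoFrame *frame = [self webRTCClient:self.webRTCClient videoPixelBufferRef:pixelBufferRef];
    [self.webRTCClient.localVideoSource capturer:self.webRTCClient.videoCapturer
                            didCaptureVideoFrame:frame];
}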
Other references
- Setting up the ossrs service: https://blog.csdn.net/gloryFlow/article/details/132257196
- iOS audio/video calls against ossrs: https://blog.csdn.net/gloryFlow/article/details/132262724
- WebRTC calls showing no picture at high resolutions: https://blog.csdn.net/gloryFlow/article/details/132262724
- Changing the bitrate in the SDP: https://blog.csdn.net/gloryFlow/article/details/132263021
- GPUImage beauty filters for video calls: https://blog.csdn.net/gloryFlow/article/details/132265842
4. Summary
RTC live-streaming of a local video file in a WebRTC call boils down to: AVPlayer plays the video; AVPlayerItemVideoOutput yields each frame as a CVPixelBufferRef; the (optionally processed) CVPixelBufferRef is wrapped into an RTCVideoFrame; and the frame is pushed into WebRTC by calling capturer:didCaptureVideoFrame: on the localVideoSource. This covers a lot of ground, so some descriptions may be imprecise.
Original article: https://blog.csdn.net/gloryFlow/article/details/132267068