WebRTC Audio/Video Calls: RTC Live Streaming of Local and Album Video Files


A screenshot of RTC live-streaming a local video file: (image omitted)

Live-streaming a local video file over RTC relies on AVPlayer and CADisplayLink.

I. Playing a Local Video with AVPlayer

  • What is AVPlayer?

AVPlayer is a class from the AVFoundation framework. It sits close to the lower layers, is highly flexible, and allows a fully customized playback UI.

  • What is AVPlayerLayer?

AVPlayerLayer is the layer that renders the video frames on screen during playback.

  • What is CADisplayLink?

Like NSTimer, CADisplayLink is a timer, but its callbacks always stay in sync with the screen's refresh rate (it is often used to measure on-screen frame rate).
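To make the refresh-rate coupling concrete, here is a small illustrative C sketch (hypothetical helper names, not part of the project code): a display link firing at the screen's refresh rate fires every 1/fps seconds, and the next vsync can be predicted as timestamp + duration, which is the same arithmetic the handleDisplayLink: callback uses further below.

```c
#include <assert.h>

/* Illustrative only: hypothetical helpers mirroring CADisplayLink timing. */

/* A display link synced to the screen fires every 1/fps seconds. */
double display_link_interval(int frames_per_second) {
    return 1.0 / (double)frames_per_second;
}

/* The next vsync host time is the last callback timestamp plus one frame duration. */
double next_vsync(double timestamp, double duration) {
    return timestamp + duration;
}
```

At 60 Hz the interval is about 16.7 ms, so pulling one pixel buffer per callback keeps the output in lockstep with the display.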

  • What is AVPlayerItemVideoOutput?

AVPlayerItemVideoOutput is a video output attached to an AVPlayerItem; through it you can obtain each video frame as a CVPixelBufferRef.

The following combines local video playback with WebRTC live streaming through an ossrs server.

Configuring AVPlayer to play a local video:

 AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:videoPath]];
 - (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
        
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

Next, set up the AVPlayerLayer that displays the video:

- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;

    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}

Monitoring player state via KVO:

/**
 *  Monitor the player's state via KVO
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    
    AVPlayerItem *videoItem = self.playerItem;
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        
        /**
         typedef NS_ENUM(NSInteger, AVPlayerTimeControlStatus) {
             AVPlayerTimeControlStatusPaused = 0,
             AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate = 1,
             AVPlayerTimeControlStatusPlaying = 2
         } API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), watchos(3.0));
         */
        // Observe the player's timeControlStatus: whether playback is in progress, paused indefinitely, or waiting for suitable network conditions
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // Paused
                    self.isPlaying = NO;
                }
                    break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // Waiting
                }
                    break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // Playing
                    self.isPlaying = YES;
                }
                    break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}

Setting up the key AVPlayerItemVideoOutput:

- (void)initPlayerVideoOutput {
    if (!self.playerItem) {
        return;
    }
    // Output NV12 (4:2:0 bi-planar YUV, video range)
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    
    // If an AVPlayerItemVideoOutput whose suppressesPlayerRendering is YES is attached to an AVPlayerItem,
    // the item's video is not rendered by AVPlayer, while its audio, subtitle, and other media types are still rendered.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}

Then a CADisplayLink periodically pulls each frame's CVPixelBufferRef from the AVPlayerItemVideoOutput:

#pragma mark - DisplayLink
- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    // The weak proxy avoids a retain cycle between the display link and self
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self]
                                                   selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Pull the frame that should be on screen at the next vsync
    CMTime outputItemTime = kCMTimeInvalid;
    CFTimeInterval nextVSync = ([displayLink timestamp] + [displayLink duration]);
    outputItemTime = [[self playerItemVideoOutput] itemTimeForHostTime:nextVSync];
    if ([[self playerItemVideoOutput] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = NULL;
        pixelBuffer = [[self playerItemVideoOutput] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        // Hand the pixel buffer to the delegate (which will wrap it into an RTCVideoFrame)
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}

In this way, each CVPixelBufferRef obtained through the CADisplayLink during playback can be fed to WebRTC for live streaming.

The complete player code for extracting CVPixelBufferRef frames during playback follows.

SDVideoLivePlayer.h

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@protocol SDVideoLivePlayerDelegate;
@interface SDVideoLivePlayer : NSObject

@property (nonatomic, strong) UIView *superView;

@property (nonatomic, weak) id<SDVideoLivePlayerDelegate> delegate;

- (instancetype)initWithSuperView:(UIView *)superView;

- (void)reloadPlayItem:(AVPlayerItem *)playerItem;

/// Start playback
- (void)startPlay;

/// Stop playback
- (void)stopPlay;

@end

@protocol SDVideoLivePlayerDelegate <NSObject>

- (void)videoLivePlayerPixelBufferRef:(CVPixelBufferRef)pixelBufferRef;

@end

SDVideoLivePlayer.m

#import "SDVideoLivePlayer.h"

@interface SDVideoLivePlayer ()<AVPlayerItemOutputPullDelegate> {

}

@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) AVPlayerLayer *playerLayer;
@property (nonatomic, strong) AVPlayerItem *playerItem;
@property (nonatomic, assign) BOOL isPlaying;
@property (nonatomic, strong) AVPlayerItemVideoOutput *playerItemVideoOutput;
@property (nonatomic, strong) CADisplayLink *displayLink;

@end

@implementation SDVideoLivePlayer

- (instancetype)initWithSuperView:(UIView *)superView
{
    self = [super init];
    if (self) {
        self.superView = superView;
        self.isPlaying = NO;
        [self addNotifications];
        [self initPlayerVideoOutput];
    }
    return self;
}

- (void)initPlayerVideoOutput {
    if (!self.playerItem) {
        return;
    }
    // Output NV12 (4:2:0 bi-planar YUV, video range)
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    
    // If an AVPlayerItemVideoOutput whose suppressesPlayerRendering is YES is attached to an AVPlayerItem,
    // the item's video is not rendered by AVPlayer, while its audio, subtitle, and other media types are still rendered.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}

- (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
        
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;

    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}

- (void)stopPlay {
    self.isPlaying = NO;
    [self.player pause];
    [self stopAnimating];
}

/**
 *  Monitor the player's state via KVO
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    
    AVPlayerItem *videoItem = self.playerItem;
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        
        /**
         typedef NS_ENUM(NSInteger, AVPlayerTimeControlStatus) {
             AVPlayerTimeControlStatusPaused = 0,
             AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate = 1,
             AVPlayerTimeControlStatusPlaying = 2
         } API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), watchos(3.0));
         */
        // Observe the player's timeControlStatus: whether playback is in progress, paused indefinitely, or waiting for suitable network conditions
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // Paused
                    self.isPlaying = NO;
                }
                    break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // Waiting
                }
                    break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // Playing
                    self.isPlaying = YES;
                }
                    break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}

- (void)audioSessionInterrupted:(NSNotification *)notification {
    // Interruption type: began (pause) or ended (resume)
    NSDictionary *info = notification.userInfo;
    if ([[info objectForKey:AVAudioSessionInterruptionTypeKey] integerValue] == AVAudioSessionInterruptionTypeBegan) {
        [self.player pause];
    } else {
        [self.player play];
    }
}

- (void)startAnimating {
    [self startDisplayLink];
    self.displayLink.paused = NO;
}

- (void)stopAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)pauseAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)resumeAnimating {
    if (!self.displayLink) {
        [self startDisplayLink];
    }
    self.displayLink.paused = NO;
}

#pragma mark - DisplayLink
- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    // The weak proxy avoids a retain cycle between the display link and self
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self]
                                                   selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Pull the frame that should be on screen at the next vsync
    CMTime outputItemTime = kCMTimeInvalid;
    CFTimeInterval nextVSync = ([displayLink timestamp] + [displayLink duration]);
    outputItemTime = [[self playerItemVideoOutput] itemTimeForHostTime:nextVSync];
    if ([[self playerItemVideoOutput] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = NULL;
        pixelBuffer = [[self playerItemVideoOutput] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        // Hand the pixel buffer to the delegate (which will wrap it into an RTCVideoFrame)
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}

#pragma mark - AVPlayerItemOutputPullDelegate
- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender {
    // New media data is about to arrive; resume pulling frames.
    // (This fires only after requestNotificationOfMediaDataChangeWithAdvanceInterval: has been called.)
    [self resumeAnimating];
}

#pragma mark - Observers
- (void)addNotifications {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(replay:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    
    // Audio playback was interrupted
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioSessionInterrupted:) name:AVAudioSessionInterruptionNotification object:nil];
    
    __weak typeof(self) weakSelf = self;
    if (@available(iOS 10.0, *)) {
        [self.KVOController observe:self.player keyPath:@"timeControlStatus" options:NSKeyValueObservingOptionOld|NSKeyValueObservingOptionNew block:^(id  _Nullable observer, id  _Nonnull object, NSDictionary<NSString *,id> * _Nonnull change) {
            __strong typeof(weakSelf) strongSelf = weakSelf;
            [strongSelf observeValueForKeyPath:@"timeControlStatus"];
        }];
    }
}

- (void)removeNotifications {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [self.KVOController unobserveAll];
}

- (void)replay:(NSNotification *)notification {
    if (notification.object == self.player.currentItem) {
        [self.player seekToTime:CMTimeMake(0, 1)];
        [self.player play];
    }
}

- (void)dealloc {
    [self removeNotifications];
}

@end

II. Fetching a Video from the Photo Library

The code below retrieves an album video as an AVPlayerItem and passes it to - (void)reloadPlayItem:(AVPlayerItem *)playerItem;

- (void)startPlayAlbumVideo:(SDMediaModel *)mediaModel {
    if (!(mediaModel && mediaModel.phasset)) {
        return;
    }
    __weak typeof(self) weakSelf = self;
    [[PhotoKitManager shareInstance] requestPlayerItemForVideo:mediaModel.phasset completion:^(AVPlayerItem *playerItem) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        [strongSelf.videoLivePlayer reloadPlayItem:playerItem];
        [strongSelf.videoLivePlayer startPlay];
    } failure:^{
        __strong typeof(weakSelf) strongSelf = weakSelf;
    }];
}

III. Live-Streaming the Video over WebRTC

With the CVPixelBufferRef frames obtained through the CADisplayLink, WebRTC can now live-stream them, which amounts to broadcasting a prerecorded video.
As in the earlier post on GPUImage beauty filters for video calls, this uses RTCVideoFrame: each CVPixelBufferRef is wrapped into an RTCVideoFrame, which is then pushed through the RTCVideoSource's capturer:didCaptureVideoFrame: method.

The RTCVideoSource is set up as follows:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // Running on the simulator
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else{
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:self];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}


- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }
    
    NSString *streamId = @"stream";
    
    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;
    
    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];
    
    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];
    
    // Video
    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
}

For details, see the earlier post on implementing iOS audio/video calls against ossrs.

Wrap each CVPixelBufferRef into an RTCVideoFrame:

- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client videoPixelBufferRef:(CVPixelBufferRef)videoPixelBufferRef {
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:videoPixelBufferRef];
    // RTCVideoFrame expects a nanosecond timestamp; the pixel buffer itself
    // carries no presentation time, so derive one from the host clock.
    int64_t timeStampNs = (int64_t)(CACurrentMediaTime() * 1000000000);
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];
    return rtcVideoFrame;
}
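The timeStampNs value is just a seconds-to-nanoseconds scaling. A minimal C sketch of that conversion (hypothetical helper name, not project code):

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical helper: convert a media time in seconds (e.g. the result of
 * CMTimeGetSeconds or CACurrentMediaTime) into the nanosecond timestamp
 * that RTCVideoFrame's timeStampNs parameter expects. */
int64_t seconds_to_timestamp_ns(double seconds) {
    return (int64_t)(seconds * 1000000000.0);
}
```

Monotonically increasing timestamps let WebRTC pace and order the frames correctly on the receiving side.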

Then feed the frame to the RTCVideoSource via its capturer:didCaptureVideoFrame: method:

[self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];

Related posts
Setting up the ossrs service: https://blog.csdn.net/gloryFlow/article/details/132257196
Implementing iOS audio/video calls against ossrs: https://blog.csdn.net/gloryFlow/article/details/132262724
WebRTC calls not displaying video at high resolutions: https://blog.csdn.net/gloryFlow/article/details/132262724
Modifying the bitrate in the SDP: https://blog.csdn.net/gloryFlow/article/details/132263021
GPUImage beauty filters for video calls: https://blog.csdn.net/gloryFlow/article/details/132265842

IV. Summary

RTC live streaming of a local video file: AVPlayer plays the video, AVPlayerItemVideoOutput produces each frame as a CVPixelBufferRef, the (optionally processed) CVPixelBufferRef is wrapped into an RTCVideoFrame, and the frame is delivered through the capturer:didCaptureVideoFrame: method of the WebRTC localVideoSource.

Original post: https://blog.csdn.net/gloryFlow/article/details/132267068

