iOS Screen Sharing Guide

This document explains in detail how to use screen sharing on iOS 12 and later.

Capturing the screen stream

1. Add a new target to your project

(screenshot: the project's Targets list)

2. Choose Broadcast Upload Extension, then click Next

(screenshot: adding the Broadcast Upload Extension)

3. In the view controller from which you want to start screen sharing, add the following code

```objectivec
#import "ViewController.h"
#import <ReplayKit/ReplayKit.h>

#define TAG_SHARESCREEN 10086

@interface ViewController ()
@property (nonatomic, strong) RPSystemBroadcastPickerView *broadPickerView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    _broadPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(50, 50, 200, 200)];
    // Fill in the bundle ID of the Broadcast Upload Extension you created
    // (not the SetupUI one).
    _broadPickerView.preferredExtension = @"your-broadcast-upload-extension-bundle-id";
    [self.view addSubview:_broadPickerView];
}

@end
```

Run the app and tap the button on screen; the system UI for starting a screen recording appears.

Depending on your needs, you may not want the system-provided button. In that case you can make the following adjustment:
```objectivec
#import "ViewController.h"
#import <ReplayKit/ReplayKit.h>

#define TAG_SHARESCREEN 10086

@interface ViewController ()
@property (nonatomic, strong) RPSystemBroadcastPickerView *broadPickerView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    _broadPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(50, 50, 200, 200)];
    // Fill in the bundle ID of the Broadcast Upload Extension you created
    // (not the SetupUI one).
    _broadPickerView.preferredExtension = @"your-broadcast-upload-extension-bundle-id";

    UIButton *button = [[UIButton alloc] initWithFrame:CGRectMake(50, 50, 300, 40)];
    [button setTitle:@"Tap me" forState:UIControlStateNormal];
    [button addTarget:self action:@selector(clickedOnStartRecordButton:) forControlEvents:UIControlEventTouchUpInside];
    [button setTitleColor:[UIColor redColor] forState:UIControlStateNormal];
    button.tag = TAG_SHARESCREEN;
    [self.view addSubview:button];
}

- (void)clickedOnStartRecordButton:(UIButton *)sender {
    if (sender.tag == TAG_SHARESCREEN) {
        for (UIView *view in _broadPickerView.subviews) {
            if ([view isKindOfClass:[UIButton class]]) {
                // Trigger the recording action. Other articles use
                // UIControlEventTouchDown here; UIControlEventTouchUpInside
                // worked for me, so choose based on your own testing.
                [(UIButton *)view sendActionsForControlEvents:UIControlEventTouchUpInside];
            }
        }
    }
}

@end
```

Debugging:

1. First run the main project (demo) on the device

(screenshot: running the demo scheme)

At this point, recording operations will only hit breakpoints in demo's own code.

2. Debugging the data in IDRS_Demo

To debug the data in IDRS_Demo: run the IDRS_Demo scheme and, when asked which app to run against, choose the demo app above.

(screenshot: choosing the app to attach to)

After that, breakpoints can be hit anywhere in IDRS_Demo's code.

3. Debugging IDRS_DemoSetupUI:

To debug IDRS_DemoSetupUI: run the IDRS_DemoSetupUI scheme and choose the project above (this part is not used here).

Breakpoints can then be hit anywhere in IDRS_DemoSetupUI's code.

Synchronizing the data

Push the screen stream captured by IDRS_Demo to demo (the main app); demo (the main app) then pushes the screen stream on to RTC.

Implementation overview

Socket-based data synchronization:

  1. The code in the Socket and Codec folders is used; drag both folders into your main app.
  2. To use this from SampleHandler, the code loaded by the main app must also be linked into the extension target, as follows:

(screenshot: linking the files to the extension target)
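The IDRSClientSocket / IDRSServerSocket classes themselves are not shown in this document. Purely as an illustrative sketch of what the client side of such a local-socket bridge could look like (the 127.0.0.1 host, the port number, and the function name are all assumptions, not the actual implementation):

```objectivec
#import <Foundation/Foundation.h>
#import <CoreFoundation/CoreFoundation.h>

// Hypothetical sketch: open a write stream to the main app, which is
// assumed to be listening on the loopback interface.
static const UInt32 kLocalPort = 9999; // assumed port

NSOutputStream *OpenLoopbackWriteStream(void) {
    CFReadStreamRef readStream = NULL;
    CFWriteStreamRef writeStream = NULL;
    CFStreamCreatePairWithSocketToHost(kCFAllocatorDefault,
                                       CFSTR("127.0.0.1"),
                                       kLocalPort,
                                       &readStream,
                                       &writeStream);
    NSOutputStream *output = (__bridge_transfer NSOutputStream *)writeStream;
    if (readStream) CFRelease(readStream); // only the write side is needed here
    [output open];
    return output;
}
```

Whatever the actual transport, frames are typically compressed (the Codec folder's role) before crossing the socket, since a Broadcast Upload Extension runs under a tight memory cap (about 50 MB).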

Detailed usage

1. Capturing the screen stream

Implemented in SampleHandler.m; the data is sent over a socket.

```objectivec
#import "SampleHandler.h"
#import "IDRSClientSocket.h"

@interface SampleHandler ()
@property (nonatomic, strong) IDRSClientSocket *clientSocket;
@end
```

When the broadcast starts

```objectivec
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    self.clientSocket = [[IDRSClientSocket alloc] init];
    [self.clientSocket createCliectSocket];
    [self sendStringData:@"初始化"]; // "initialize" marker understood by the receiver
}
```

When the broadcast finishes

```objectivec
- (void)broadcastFinished {
    // User has requested to finish the broadcast.
    [self sendStringData:@"停止"]; // "stop" marker understood by the receiver
    [_clientSocket close];
}
```

Receiving the sample buffers

```objectivec
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    // Handle incoming sample buffers:
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            [self sendData:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            // App audio format: 44100 Hz, stereo, 16-bit
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            // Mic audio format: 48000 Hz, mono, 16-bit
            break;
        default:
            break;
    }
}
```
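The sample rates and channel counts noted in the comments above can be checked at runtime by reading the stream description off the audio sample buffer. A minimal sketch using standard Core Media calls (the helper name is mine):

```objectivec
#import <CoreMedia/CoreMedia.h>

// Log the audio format of an incoming audio sample buffer.
static void LogAudioFormat(CMSampleBufferRef sampleBuffer) {
    CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
    if (!fmt) return;
    const AudioStreamBasicDescription *asbd =
        CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
    if (!asbd) return;
    // e.g. app audio: 44100 Hz / 2 channels / 16 bits,
    //      mic audio: 48000 Hz / 1 channel  / 16 bits
    NSLog(@"sampleRate=%.0f channels=%u bits=%u",
          asbd->mSampleRate,
          (unsigned)asbd->mChannelsPerFrame,
          (unsigned)asbd->mBitsPerChannel);
}
```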

Sending the data

```objectivec
// Push the video stream
- (void)sendData:(CMSampleBufferRef)sampleBuffer {
    [self.clientSocket encodeBuffer:sampleBuffer];
}

// Push a text message
- (void)sendStringData:(NSString *)string {
    [self.clientSocket encodeStringBuffer:string];
}
```

2. Receiving on the socket

```objectivec
#import "ViewController.h"
#import <ReplayKit/ReplayKit.h>
#import "IDRSServerSocket.h"

@interface ViewController () <IDRSServerSocketProtocol>
@property (nonatomic, strong) RPSystemBroadcastPickerView *broadPickerView;
@property (nonatomic, strong) IDRSServerSocket *serverSocket;
@end
```

Initializing the socket

```objectivec
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self.serverSocket createServerSocket];
}

- (IDRSServerSocket *)serverSocket {
    if (!_serverSocket) {
        IDRSServerSocket *socket = [[IDRSServerSocket alloc] init];
        socket.delegate = self;
        _serverSocket = socket;
    }
    return _serverSocket;
}
```

Receiving messages

```objectivec
- (void)didProcessSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Push the data on to AliRTC
    [self screenStreaming:sampleBuffer];
}

- (void)didGetStringBuffer:(NSString *)string {
    if ([string isEqualToString:@"初始化"]) { // "initialize" marker
        // Screen sharing has started.
    } else if ([string isEqualToString:@"停止"]) { // "stop" marker
        // Restore the local preview view and close the socket connection.
        isVideoTrue = YES;
        [self.engine setExternalVideoSource:NO useTexture:NO sourceType:AliRtcVideosourceCameraLargeType renderMode:AliRtcRenderModeAuto];
        [self.engine startPreview];
        [self.serverSocket disconnect];
    }
}
```

3. Pushing the screen stream to AliRTC

AliRTC's external video input interface

```objectivec
- (void)screenStreaming:(CMSampleBufferRef)sampleBuffer {
    if (isVideoTrue) {
        // First frame: switch the engine over to the external video source.
        isVideoTrue = NO;
        [self.engine setLocalViewConfig:nil forTrack:AliRtcVideoTrackCamera];
        [self.engine setExternalVideoSource:YES useTexture:NO sourceType:AliRtcVideosourceCameraLargeType renderMode:AliRtcRenderModeFill];
    }
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    AliRtcVideoDataSample *dataSample = [[AliRtcVideoDataSample alloc] init];
    dataSample.pixelBuffer = pixelBuffer;
    dataSample.type = AliRtcBufferType_CVPixelBuffer;
    int ret = 0;
    ret = [self.engine pushExternalVideoFrame:dataSample sourceType:AliRtcVideosourceCameraLargeType];
}
```
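One caveat: CMSampleBufferGetImageBuffer returns the pixel buffer without retaining it. If the push to the engine is deferred to another queue (whether pushExternalVideoFrame copies the frame synchronously is an assumption to verify against the AliRTC documentation), the buffer should be retained across the call. A hedged sketch; `pushQueue` is a hypothetical dispatch queue:

```objectivec
// Keep the pixel buffer alive across an asynchronous push.
// CVPixelBufferRetain/Release are standard Core Video calls.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferRetain(pixelBuffer);
dispatch_async(self.pushQueue, ^{ // pushQueue is hypothetical
    AliRtcVideoDataSample *dataSample = [[AliRtcVideoDataSample alloc] init];
    dataSample.pixelBuffer = pixelBuffer;
    dataSample.type = AliRtcBufferType_CVPixelBuffer;
    [self.engine pushExternalVideoFrame:dataSample
                             sourceType:AliRtcVideosourceCameraLargeType];
    CVPixelBufferRelease(pixelBuffer);
});
```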