I've just looked a bit around at what steps are needed for Jitsi Meet to implement screen sharing from iOS:

Screen sharing with Jitsi Meet running in a browser (Safari): Safari for iOS lacks the getDisplayMedia API needed to capture the screen's content.

Screen sharing with the Jitsi Meet iOS app: This could work. iOS apps can access the screen content with the ReplayKit framework. For streaming the screen content, not only of the Jitsi Meet app itself but of the whole system, into the Jitsi video call, the ReplayKit2 broadcast functionality could be used. The Jitsi Meet app for iOS is based on React Native, and the only ReplayKit React Native module I've found supports recording, but not the required broadcast functionality. → Jitsi Meet for iOS has to implement the ReplayKit2 broadcast WebRTC adapter in the native part of the app, or extend the React Native module to add it.

On 11 Sept 2021, at 13:48, 夕若 wrote:

I am trying to enable the screen sharing option. This is the flow I have in mind:

1. Create a button on the React Native app UI.
2. On click of this button, stop the camera feed (requires research) and call the implementation of the RTCVideoCapturer API that will be created in the WebRTC React Native module.
3. Add the broadcast upload extension to the app and send the CMSampleBufferRef buffers obtained from the extension to the WebRTC React Native module's implementation of the RTCVideoCapturer API.
4. The RTCVideoCapturer API will handle sending the CMSampleBufferRef obtained from ReplayKit to the other users; all that is required from us is to implement a function similar to publishSampleBuffer, as seen in the RTCFileVideoCapturer.m file. I have made an assumption that just by implementing the RTCVideoCapturer API and calling a function similar to publishSampleBuffer, we have already pushed our video frames for the other users in the conference to view (a sketch of this step follows the list).
5. On click of the button again, get the original camera feed back if the camera was on.

Let me know if this just sounds stupid and I am completely on the wrong track. Please suggest if there are any changes to this flow as well.
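A minimal sketch of step 4, assuming the app links a WebRTC build that exposes the Objective-C API (as the GoogleWebRTC pod does); the ScreenVideoCapturer class name is made up for illustration. Following the pattern in RTCFileVideoCapturer.m, it wraps each CMSampleBuffer in an RTCVideoFrame and hands it to the capturer delegate:

```swift
import CoreMedia
import WebRTC

// Hypothetical capturer that publishes ReplayKit sample buffers into WebRTC.
final class ScreenVideoCapturer: RTCVideoCapturer {

    /// Publishes one screen frame, mirroring RTCFileVideoCapturer's
    /// publishSampleBuffer: wrap the pixel buffer, stamp it, forward it.
    func publishSampleBuffer(_ sampleBuffer: CMSampleBuffer) {
        guard CMSampleBufferIsValid(sampleBuffer),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return // drop invalid or non-video buffers
        }

        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        let timeStampNs = Int64(
            CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                * Double(NSEC_PER_SEC))

        let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                  rotation: ._0,
                                  timeStampNs: timeStampNs)

        // The delegate is the RTCVideoSource the capturer was created with;
        // it forwards the frame to every subscribed video track.
        delegate?.capturer(self, didCapture: frame)
    }
}
```

Wiring it up would look like ScreenVideoCapturer(delegate: factory.videoSource()), with that source backing the local video track in place of the camera capturer while screen sharing is active.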
Thanks a ton for your quick reply. I don't have any code snippets as such, just some concepts. On iOS, according to me, there are two ways to achieve this. Once the broadcast extension is added to the app, we get the CMSampleBuffers; after that, two things can be done:

1. Process the CMSampleBuffer within the extension completely. I couldn't really get too far using this method, but, like saghul said, since the extension has memory limits it would probably not work as well as expected.
2. Pass the CMSampleBuffer to the app and then follow the same flow as the camera feed does. There are a couple of ways I found we could do this (sketches of the first two options follow at the end of the thread):
   - Use … to communicate small back-and-forth information if required (but it cannot be used to send a CMSampleBuffer, at least as far as I know).
   - Use NSFileCoordinator and coordinate reads and writes into the same file from both the extension and the app. This could work, but it might lead to frame drops or lags, since the read/write has to happen at a pretty high speed of at least one read/write per 200-400 ms.
   - Use Core Data (haven't gotten too much into this).

But I hope these concepts help someone looking into this.

Hello, I encountered the same problem as you. I have added the latest iOS SDK 3.5.0 (iOS SDK URL ref: ) in my project and checked for screen sharing, but I cannot see the screen share option. Do I need to do any additional codebase settings? Am I missing anything else to enable the screen share option? Please check the screenshot attached. Note: I am checking on an iOS device: iPhone 11, iOS version 14.0.
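For reference, a minimal sketch of the broadcast upload extension entry point the comments above rely on. SampleHandler and processSampleBuffer come from ReplayKit's RPBroadcastSampleHandler (the class an Xcode Broadcast Upload Extension target generates); the handOffToApp helper is hypothetical and stands in for whichever hand-off option is chosen:

```swift
import CoreMedia
import ReplayKit

// Entry point of a Broadcast Upload Extension: ReplayKit calls
// processSampleBuffer for every captured frame of the whole screen.
class SampleHandler: RPBroadcastSampleHandler {

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Option 1 above: process the buffer entirely in the extension
            // (constrained by the extension's tight memory limit).
            // Option 2 above: hand the buffer off to the main app instead.
            handOffToApp(sampleBuffer)
        case .audioApp, .audioMic:
            break // audio is ignored in this sketch
        @unknown default:
            break
        }
    }

    // Hypothetical hand-off: serialize the frame and move it across the
    // process boundary (shared file, notification, etc.; sketches below).
    private func handOffToApp(_ sampleBuffer: CMSampleBuffer) {
        // ...
    }
}
```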
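The first hand-off option's mechanism name did not survive the scrape; Darwin notifications via CFNotificationCenter are one channel that fits its description (tiny cross-process signals with no payload, so no way to carry a CMSampleBuffer), and this sketch assumes those. The notification name is made up:

```swift
import Foundation

// Hypothetical name shared by the app and the extension.
let frameReadyName = "org.example.screenshare.frameReady" as CFString

// Extension side: post a payload-free signal, e.g. after writing a frame
// to a shared container.
func signalFrameReady() {
    CFNotificationCenterPostNotification(
        CFNotificationCenterGetDarwinNotifyCenter(),
        CFNotificationName(frameReadyName),
        nil, nil, true)
}

// App side: observe the signal and react, e.g. by reading that frame back.
func observeFrameReady() {
    CFNotificationCenterAddObserver(
        CFNotificationCenterGetDarwinNotifyCenter(),
        nil,
        { _, _, _, _, _ in
            // Read the latest frame from the shared container here.
        },
        frameReadyName,
        nil,
        .deliverImmediately)
}
```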
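And a sketch of the NSFileCoordinator option, assuming an app group shared between the app and the extension; the group ID and file name are hypothetical. As the comment notes, this loop has to sustain one write and read every 200-400 ms to avoid dropped frames:

```swift
import Foundation

// Hypothetical app group shared by the app and the broadcast extension.
let groupID = "group.org.example.screenshare"
let frameURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: groupID)!
    .appendingPathComponent("latest-frame.bin")

// Extension side: coordinated write of one encoded frame.
func writeFrame(_ data: Data) {
    var coordinatorError: NSError?
    NSFileCoordinator().coordinate(writingItemAt: frameURL,
                                   options: .forReplacing,
                                   error: &coordinatorError) { url in
        try? data.write(to: url, options: .atomic)
    }
}

// App side: coordinated read of the most recent frame.
func readFrame() -> Data? {
    var coordinatorError: NSError?
    var data: Data?
    NSFileCoordinator().coordinate(readingItemAt: frameURL,
                                   options: [],
                                   error: &coordinatorError) { url in
        data = try? Data(contentsOf: url)
    }
    return data
}
```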