Intel® Collaboration Suite for WebRTC
Community support and discussions on the Intel® Collaboration Suite for WebRTC (Intel® CS for WebRTC).

Does the iOS SDK support screen sharing?

Somnus_Chen
Beginner


[Attachment: webrtc-table-1.png]

Jianjun_Z_Intel
Employee

Hi Somnus,

The iOS SDK does not provide a method to create an ICSLocalStream directly from screen sharing. However, you can create an RTCMediaStream from whatever source you want, then create an ICSLocalStream from that RTCMediaStream.
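A minimal sketch of that flow in Swift. The ICSLocalStream initializer and the ICSStreamSourceInfo property/enum names are assumptions based on the ICS/OWT headers; verify the exact signatures in ICSLocalStream.h for your SDK version:

import WebRTC

// Hypothetical wrapper: build an RTCMediaStream from a custom source
// (e.g. ReplayKit frames) and wrap it in an ICSLocalStream for publishing.
func makeScreenShareStream(factory: RTCPeerConnectionFactory,
                           mediaStream: RTCMediaStream) -> ICSLocalStream {
    // Assumption: ICSStreamSourceInfo lets you mark the video source
    // as a screen cast; check the property and enum names in your headers.
    let sourceInfo = ICSStreamSourceInfo()
    sourceInfo.video = .screenCast
    return ICSLocalStream(mediaStream: mediaStream, source: sourceInfo)
}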

krishna__venkata
Beginner

Hello Jianjun,

We tried this approach and created the local stream from an RTCMediaStream:

func localStream() -> RTCMediaStream {
    let factory = self.connectionFactory
    let localStream = factory.mediaStream(withStreamId: getUniqueID())

    // Create a video source and attach our custom capturer to it.
    videoSource = factory.videoSource()
    print("videoSource: \(String(describing: videoSource))")
    print("isScreenShared true")
    self.samplevideoCapture = ExternalSampleCapture(delegate: videoSource!)

    // Build a video track on top of the source and add it to the stream.
    let videoTrack = factory.videoTrack(with: videoSource!, trackId: getUniqueID())
    videoTrack.isEnabled = true
    localStream.addVideoTrack(videoTrack)
    print("localStream :: \(localStream) :: \(videoTrack)")

    return localStream
}

 

From the Broadcast extension we receive a CMSampleBuffer, which we convert to an RTCVideoFrame with the code below:

guard CMSampleBufferGetNumSamples(samplebuffer) == 1,
      CMSampleBufferIsValid(samplebuffer),
      CMSampleBufferDataIsReady(samplebuffer),
      let pixelBuffer = CMSampleBufferGetImageBuffer(samplebuffer) else {
    return
}

let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
// Presentation time scaled by 100000 (note: a full seconds-to-nanoseconds
// conversion would multiply by 1_000_000_000).
let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(samplebuffer)) * 100000)
let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                  rotation: ._0,
                                  timeStampNs: timeStampNs)

 

We then pass the RTCVideoFrame into the static library, where ExternalSampleCapture (the capturer attached when creating the RTCMediaStream) feeds it to the video source:

self.samplevideoCapture?.addcaptureVideoFrame(videoFrame)
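For reference, ExternalSampleCapture itself is not shown in this thread; a forwarding capturer of this kind is typically just an RTCVideoCapturer subclass that hands frames to its delegate. A sketch, reusing the class and method names from the code above (the body is an assumption, not the actual implementation):

import WebRTC

// Sketch of a forwarding capturer. RTCVideoSource conforms to
// RTCVideoCapturerDelegate, so passing it as the delegate (as done in
// localStream() above) wires frames straight into the WebRTC pipeline.
class ExternalSampleCapture: RTCVideoCapturer {
    func addcaptureVideoFrame(_ videoFrame: RTCVideoFrame) {
        delegate?.capturer(self, didCapture: videoFrame)
    }
}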

 

Every buffer arrives correctly (we verified this by saving frames as images to Photos), but it does not render for the remote participant, who sees only a black screen.

 

We are stuck here. Can you suggest how ICSLocalStream should work with a CMSampleBuffer?

 

Thank you.

krishna__venkata
Beginner

Hello Team,

Any update?

krishna__venkata
Beginner

Hello Team,

Any update would be appreciated.

 

Thank you

Jianjun_Z_Intel
Employee

Hi Venkata,

Did you get your self.connectionFactory from [RTCPeerConnectionFactory sharedInstance]? It is defined in RTCPeerConnectionFactory+OWT.h.
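In Swift, that lookup would presumably bridge as follows (the Swift spelling is an assumption based on the Objective-C category declared in RTCPeerConnectionFactory+OWT.h; verify against your SDK):

import WebRTC

// The ICS/OWT SDK expects tracks to be created from its shared factory,
// not from a factory you allocate yourself, which is likely why the
// question above asks where connectionFactory came from.
let connectionFactory = RTCPeerConnectionFactory.sharedInstance()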
