Intel® Collaboration Suite for WebRTC
Community support and discussions on the Intel® Collaboration Suite for WebRTC (Intel® CS for WebRTC).

Does it support screen share with the iOS SDK?

Somnus_Chen
Beginner


[Attached image: webrtc-table-1.png]

Jianjun_Z_Intel
Employee

Hi Somnus,

The iOS SDK does not provide a method to create an ICSLocalStream from screen sharing directly. However, you can create an RTCMediaStream from whatever source you want, then create an ICSLocalStream from that RTCMediaStream.
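
As a minimal sketch of that flow in Swift (the ICSLocalStream initializer shown here is an assumption, not confirmed; check ICSLocalStream.h in your SDK release for the exact signature):

// Sketch only: the ICSLocalStream initializer below is assumed;
// verify it against ICSLocalStream.h in your SDK version.
let factory = RTCPeerConnectionFactory.sharedInstance() // from RTCPeerConnectionFactory+OWT.h
let mediaStream = factory.mediaStream(withStreamId: "screen") // "screen" is an illustrative id
// ... create a video track from your own capturer/source and add it to mediaStream ...
let localStream = ICSLocalStream(mediaStream: mediaStream) // assumed initializer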

krishna__venkata
Beginner

Hello Jianjun,

We tried this approach: we created the local stream from an RTCMediaStream.

func localStream() -> RTCMediaStream {
    let factory = self.connectionFactory
    let localStream = factory.mediaStream(withStreamId: getUniqueID())

    videoSource = factory.videoSource()
    print("videoSource : \(String(describing: videoSource))")
    print("isScreenShared true")

    // Capturer that forwards externally produced frames into the video source.
    self.samplevideoCapture = ExternalSampleCapture(delegate: videoSource!)

    let videoTrack = factory.videoTrack(with: videoSource!, trackId: getUniqueID())
    videoTrack.isEnabled = true
    localStream.addVideoTrack(videoTrack)
    print("localStream :: \(localStream) :: \(videoTrack)")
    return localStream
}

 

From the Broadcast extension we receive a CMSampleBuffer, and with the code below we convert it to an RTCVideoFrame.

if CMSampleBufferGetNumSamples(samplebuffer) != 1 ||
    !CMSampleBufferIsValid(samplebuffer) ||
    !CMSampleBufferDataIsReady(samplebuffer) {
    return
}

// Wrap the sample buffer's pixel buffer for WebRTC.
let pixelBuffer = CMSampleBufferGetImageBuffer(samplebuffer)
let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer!)
// Presentation timestamp in nanoseconds (seconds * 1_000_000_000).
let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(samplebuffer)) * 1_000_000_000)
let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)

 

Now we pass the RTCVideoFrame to the static library and add it to the RTCVideoCapturer that was used when creating the RTCMediaStream.

self.samplevideoCapture?.addcaptureVideoFrame(videoFrame)
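
For reference, here is a minimal sketch of one way such a forwarding capturer can be written, assuming ExternalSampleCapture subclasses RTCVideoCapturer (our actual class may differ in detail):

import WebRTC

// Sketch, assuming ExternalSampleCapture subclasses RTCVideoCapturer.
// RTCVideoSource implements RTCVideoCapturerDelegate, so frames handed
// to `delegate` land in the video source that the factory created.
class ExternalSampleCapture: RTCVideoCapturer {
    func addcaptureVideoFrame(_ frame: RTCVideoFrame) {
        delegate?.capturer(self, didCapture: frame)
    }
}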

 

Every buffer arrives correctly (we verified by saving frames as images to Photos), but nothing renders for the remote participant; they only get a black screen.

 

We are stuck here. Can you suggest how ICSLocalStream should work with a CMSampleBuffer source?

 

Thank You.

 

 

krishna__venkata
Beginner

Hello Team,

 

Any update?

krishna__venkata
Beginner

Hello Team,

Any update would be appreciated.

 

Thank you

Jianjun_Z_Intel
Employee

Hi venkata,

Did you get your self.connectionFactory from [RTCPeerConnectionFactory sharedInstance]? It is defined in RTCPeerConnectionFactory+OWT.h.
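
If connectionFactory is a factory you created yourself, tracks built with it will not be sent by the SDK's own peer connections, which could produce exactly this kind of black screen. Roughly, in Swift (assuming RTCPeerConnectionFactory+OWT.h is visible to Swift, e.g. through your bridging header; the track id is illustrative):

import WebRTC

// Use the SDK's shared factory rather than allocating a new one.
let factory = RTCPeerConnectionFactory.sharedInstance() // declared in RTCPeerConnectionFactory+OWT.h
let videoSource = factory.videoSource()
let videoTrack = factory.videoTrack(with: videoSource, trackId: "screen-video") // illustrative id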
