Hi team,
I have a question about sharing my screen with others on iOS. My ReplayKit extension receives frames in:
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
}
I first convert the sample buffer into a WebRTC RTCMediaStream, then convert the RTCMediaStream into your OWTLocalStream via the initWithMediaStream method, but it doesn't seem to work. Could you tell me how you do this conversion when processing video, and where I went wrong?
RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:CMSampleBufferGetImageBuffer(buffer)];
// Presentation time converted from seconds to nanoseconds
int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(buffer)) * NSEC_PER_SEC;
RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:RTCVideoRotation_0 timeStampNs:timeStampNs];
Or is the problem in how I convert the sample buffer into an RTCMediaStream?
Hello team,
We are having the same issue:
if CMSampleBufferGetNumSamples(samplebuffer) != 1 || !CMSampleBufferIsValid(samplebuffer) || !CMSampleBufferDataIsReady(samplebuffer) {
    return
}
guard let pixelBuffer = CMSampleBufferGetImageBuffer(samplebuffer) else { return }
let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
// timeStampNs must be in nanoseconds: seconds * NSEC_PER_SEC (1e9), not * 100000
let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(samplebuffer)) * Double(NSEC_PER_SEC))
let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)
Then we hand the frame to an RTCVideoCapturer.
We are stuck here; please suggest a solution as soon as possible.
Thank you
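For reference, here is a minimal sketch of how the ReplayKit callback could deliver frames to a WebRTC video source. This is an assumption about the intended wiring, not the OWT SDK's documented path: the `videoSource` and `capturer` properties are hypothetical and would need to be created elsewhere from an RTCPeerConnectionFactory and attached to the local video track.

```swift
import ReplayKit
import WebRTC

final class ScreenSampleHandler: RPBroadcastSampleHandler {
    // Hypothetical: assumed to be created from an RTCPeerConnectionFactory
    // and used to back the local video track (not shown in this thread).
    var videoSource: RTCVideoSource!
    var capturer: RTCVideoCapturer!

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        guard sampleBufferType == .video,
              CMSampleBufferIsValid(sampleBuffer),
              CMSampleBufferDataIsReady(sampleBuffer),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        // timeStampNs must be in nanoseconds: seconds * NSEC_PER_SEC.
        let timeStampNs = Int64(CMTimeGetSeconds(
            CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * Double(NSEC_PER_SEC))
        let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                  rotation: ._0,
                                  timeStampNs: timeStampNs)
        // RTCVideoSource conforms to RTCVideoCapturerDelegate, so it can
        // receive frames directly and forward them to the attached track.
        videoSource.capturer(capturer, didCapture: frame)
    }
}
```

The key detail is the timestamp unit: `timeStampNs` is expected in nanoseconds, so multiplying seconds by 100000 (as in the snippet above) produces frames with wildly wrong timing.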