I want to integrate video streaming into my game using WebRTC. Does anyone have an idea how to achieve this?
There are three options I can think of right now:
1. Implement it natively, then show the video stream from cocos2d-x
2. Add a webview containing code that can play/stream video (webviews support WebRTC on Android and the web)
3. Dispatch frame data from WebRTC, then render it to a RenderTexture
Options 1 and 2 have these pros/cons:
- Pros: easy to integrate; the video stays on top of all elements
- Con of option 1: difficult to manage from C++ code
- Con of option 2: WebRTC is not supported natively in the iOS webview
The last option:
- Easy to control
- Difficult to implement
- Hard to implement the texture cache on iOS and Android
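For option 3, the core work is converting each frame WebRTC hands you (typically I420 planar YUV) into packed RGBA that you can upload to a texture (e.g. via `Texture2D::updateWithData` or `glTexSubImage2D`). Here is a minimal, self-contained sketch of that conversion using BT.601 integer math; the function name and the assumption that frames arrive as I420 with per-plane strides are mine, not from any specific WebRTC wrapper:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical helper: convert one I420 (planar YUV 4:2:0) frame into
// tightly packed RGBA8888, ready for a texture upload on the GL thread.
// Uses the common BT.601 studio-range integer approximation.
std::vector<uint8_t> I420ToRGBA(const uint8_t* y, const uint8_t* u, const uint8_t* v,
                                int width, int height,
                                int strideY, int strideU, int strideV) {
    std::vector<uint8_t> rgba(static_cast<size_t>(width) * height * 4);
    auto clamp8 = [](int x) {
        return static_cast<uint8_t>(std::min(255, std::max(0, x)));
    };
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            // U and V are subsampled 2x2, hence the halved indices.
            int C = y[row * strideY + col] - 16;
            int D = u[(row / 2) * strideU + col / 2] - 128;
            int E = v[(row / 2) * strideV + col / 2] - 128;
            uint8_t* px = &rgba[(static_cast<size_t>(row) * width + col) * 4];
            px[0] = clamp8((298 * C + 409 * E + 128) >> 8);           // R
            px[1] = clamp8((298 * C - 100 * D - 208 * E + 128) >> 8); // G
            px[2] = clamp8((298 * C + 516 * D + 128) >> 8);           // B
            px[3] = 255;                                              // A
        }
    }
    return rgba;
}
```

In a real integration you would run this (or a SIMD library such as libyuv) on the frame callback, then hand the RGBA buffer to the render thread for the texture update.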
Above are some of my thoughts. Do you have any suggestions or references for achieving this?
Thank you so much.
P.S.: sorry for my English