Imagine a world where your phone, TV, and computer could all communicate on a common platform. Imagine it was easy to add video chat and peer-to-peer data sharing to your web application. That’s the vision of WebRTC. WebRTC is used in various apps like WhatsApp, Facebook Messenger, appear.in and platforms such as TokBox, Addlive, FrozenMountain, etc.
The following components are required to build a complete WebRTC infrastructure solution:
- Signaling Server
- Session Traversal Utilities for NAT (STUN) Server
- Traversal Using Relays around NAT (TURN) Server
- Interactive Connectivity Establishment (ICE) Framework
- Selective Forwarding Unit (SFU) / Multipoint Control Unit (MCU)
- Android SDK & Sample Android Client
- iOS SDK & Sample iOS Client
- JS SDK & Sample Web Client
- Browser Plugins for Screensharing
- System diagnostics to validate hardware
- WebRTC extension plugins for incompatible browsers such as Safari, IE, etc.
- Deployment Configuration Scripts
Supported Platforms: Chrome, Firefox, Opera, Android, iOS.
WebRTC can’t create direct connections between peers without the help of a signaling server. However, the signaling mechanism is not standardized, so there is no single server your application must use: any communication channel that lets peers exchange Session Description Protocol (SDP) data can be used for signaling.
Open-Source Signalling Server Implementations: https://github.com/muaz-khan/WebRTC-Experiment/blob/master/Signaling.md
The recommended approach is to build your own signaling server, for better control over the flow and architecture of the app.
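Since any message channel works for signaling, the server's core job reduces to relaying opaque SDP and candidate messages between addressed peers. Below is a minimal sketch of that routing logic; the message format (`type`, `from`, `to`, `sdp`) and the `SignalingHub` class are illustrative assumptions, not a standard, and the per-peer queue stands in for a real WebSocket.

```python
# Hypothetical signaling-server routing sketch: peers register under an id,
# and offers/answers/candidates are relayed verbatim to the addressed peer.
import json

class SignalingHub:
    def __init__(self):
        self.peers = {}  # peer_id -> outbound message queue (stand-in for a socket)

    def register(self, peer_id):
        self.peers[peer_id] = []

    def route(self, raw):
        """Relay a signaling message to its target; the server never
        inspects or modifies the SDP payload itself."""
        msg = json.loads(raw)
        queue = self.peers.get(msg["to"])
        if queue is None:
            return False  # unknown peer
        queue.append(msg)
        return True

hub = SignalingHub()
hub.register("alice")
hub.register("bob")
offer = json.dumps({"type": "offer", "from": "alice", "to": "bob",
                    "sdp": "v=0\r\n..."})
delivered = hub.route(offer)
```

The key design point is that the server is a dumb relay: all session intelligence stays in the clients, which is why any transport (WebSocket, XHR polling, even e-mail in principle) can serve as the signaling channel.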
A STUN server allows clients to discover their public IP address and the type of NAT they are behind. This information is used to establish a media connection. In most cases, a STUN server is only used during the connection setup and once that session has been established, media will flow directly between clients.
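The discovery step above starts with the client sending a STUN Binding Request; the server replies with the client's address as seen from the public internet. As a sketch, here is how the 20-byte request header from RFC 5389 is built (sending it over UDP and parsing the XOR-MAPPED-ADDRESS response are omitted):

```python
# Sketch: constructing a STUN Binding Request (RFC 5389).
import os
import struct

STUN_BINDING_REQUEST = 0x0001
STUN_MAGIC_COOKIE = 0x2112A442

def build_binding_request():
    transaction_id = os.urandom(12)  # matches this request to its response
    # !HHI = message type (2 bytes), message length (2 bytes, 0: no
    # attributes), magic cookie (4 bytes), all big-endian.
    header = struct.pack("!HHI", STUN_BINDING_REQUEST, 0, STUN_MAGIC_COOKIE)
    return header + transaction_id  # 20-byte header-only message

request = build_binding_request()
```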
If a direct connection cannot be established even with the address discovered via STUN, ICE can turn to TURN.
RTCPeerConnection tries to set up direct communication between peers over UDP. If that fails, RTCPeerConnection resorts to TCP. If that fails, TURN servers can be used as a fallback, relaying data between endpoints. TURN servers have public addresses, so they can be contacted by peers even if the peers are behind firewalls or proxies. TURN servers have a conceptually simple task — to relay a stream — but, unlike STUN servers, they inherently consume a lot of bandwidth. In other words, TURN servers need to be beefier.
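In application code, the STUN and TURN servers described above are handed to `RTCPeerConnection` via its `iceServers` configuration. The sketch below assembles that structure in Python for illustration; the TURN URL and credentials are placeholders, and note that TURN entries require credentials while STUN entries do not.

```python
# Sketch: building an RTCPeerConnection-style ICE server configuration.
# All URLs/credentials except Google's public STUN server are placeholders.

def build_ice_config(stun_urls, turn_url=None, username=None, credential=None):
    servers = [{"urls": url} for url in stun_urls]
    if turn_url:
        # TURN relays actual media, so it is authenticated, unlike STUN.
        servers.append({"urls": turn_url,
                        "username": username,
                        "credential": credential})
    return {"iceServers": servers}

config = build_ice_config(
    ["stun:stun.l.google.com:19302"],
    turn_url="turn:turn.example.com:3478",  # placeholder server
    username="user", credential="secret")   # placeholder credentials
```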
NOTE: Amazon VM Images & Google Public Servers are available for both STUN & TURN
For testing, Google runs a public STUN server, stun.l.google.com:19302, as used by apprtc.appspot.com. For a production STUN/TURN service, we recommend the rfc5766-turn-server; source code for STUN and TURN servers is available from code.google.com/p/rfc5766-turn-server, which also provides links to several sources of information about server installation. A VM image for Amazon Web Services is also available. An alternative TURN server is restund, available as source code and also for AWS.
WebRTC apps can use the ICE framework to overcome the complexities of real-world networking. To enable this to happen, your application must pass ICE server URLs to RTCPeerConnection.
ICE tries to find the best path to connect peers. It tries all possibilities in parallel and chooses the most efficient option that works. ICE first tries to make a connection using the host address obtained from a device’s operating system and network card; if that fails (which it will for devices behind NATs) ICE obtains an external address using a STUN server, and if that fails, traffic is routed via a TURN relay server.
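Each of these paths surfaces as a candidate line whose `typ` field records how it was obtained: `host` (local interface), `srflx` (server-reflexive, via STUN), or `relay` (via TURN). The sketch below classifies candidate lines and picks the most direct option available, mirroring ICE's preference order; the candidate strings are illustrative examples.

```python
# Sketch: classifying ICE candidates by type. ICE gathers all kinds in
# parallel and prefers host over srflx (STUN) over relay (TURN).

def candidate_type(line):
    fields = line.split()
    return fields[fields.index("typ") + 1]  # "host", "srflx", or "relay"

TYPE_PREFERENCE = {"host": 0, "srflx": 1, "relay": 2}  # lower = more direct

candidates = [  # illustrative candidate lines
    "candidate:1 1 udp 2122260223 192.168.1.4 54321 typ host",
    "candidate:2 1 udp 1686052607 203.0.113.7 54321 typ srflx raddr 192.168.1.4 rport 54321",
    "candidate:3 1 udp 41885439 198.51.100.9 3478 typ relay raddr 203.0.113.7 rport 54321",
]
best = min(candidates, key=lambda c: TYPE_PREFERENCE[candidate_type(c)])
```

In practice ICE performs connectivity checks on candidate pairs rather than just ranking single candidates, but the preference order shown is why relayed TURN traffic is only used as a last resort.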
MCU / SFU
This is a server that acts as a bridge to distribute media among a large number of participants. An MCU (Multipoint Control Unit) or SFU (Selective Forwarding Unit) can cope with different resolutions, codecs, and frame rates within a video conference, handle transcoding, do selective stream forwarding, and mix or record audio and video. For multi-party calls there are a number of issues to consider: in particular, how to display multiple video inputs and mix audio from multiple sources.
For example, Licode (previously known as Lynckia) produces an open-source MCU for WebRTC.
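The "selective" part of an SFU can be sketched as a simple per-receiver decision: forward the highest simulcast layer that fits each receiver's estimated bandwidth, instead of decoding and re-encoding as an MCU would. The layer labels and bitrates below are illustrative assumptions, not values from any particular SFU.

```python
# Sketch of an SFU's core forwarding decision with simulcast layers.

LAYERS = [  # (label, approximate bitrate in kbps) - illustrative numbers
    ("high", 1500),
    ("medium", 600),
    ("low", 150),
]

def select_layer(available_kbps):
    """Pick the best simulcast layer that fits the receiver's bandwidth."""
    for label, bitrate in LAYERS:
        if bitrate <= available_kbps:
            return label
    return None  # not enough bandwidth for even the lowest layer

choice_fast = select_layer(2000)
choice_slow = select_layer(200)
```

Because the SFU only forwards already-encoded streams, it scales to many participants far more cheaply than an MCU, at the cost of pushing layout and audio mixing down to the clients.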
To create mobile SDKs, we first have to grab the WebRTC project and compile it manually to obtain the native libraries, then write a wrapper over them for abstraction:
- Grab WebRTC project with Android/iOS specific code base
- Compile for native dependencies
- Create wrapper over native SDK for abstraction
- Although WebRTC is supported natively in web browsers, there are API incompatibilities between browser versions. adapter.js resolves most of these, but a wrapper over the native API is still needed
- Handle namespace conflicts with other WebRTC solutions, and provide session-recording capabilities
Session Description Message
A sample session description message to be transferred through the signaling process.
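As an illustration, here is an abridged, hypothetical SDP offer wrapped in a signaling message, plus a tiny helper that lists the media sections it declares. Real SDP generated by a browser is considerably longer and includes ICE, DTLS, and bandwidth attributes; the origin line and payload types below are made up for the example.

```python
# Illustrative (abridged) SDP offer carried inside a signaling message.
import json

sdp_offer = "\r\n".join([
    "v=0",
    "o=- 4611731400430051336 2 IN IP4 127.0.0.1",   # made-up session origin
    "s=-",
    "t=0 0",
    "m=audio 9 UDP/TLS/RTP/SAVPF 111",
    "a=rtpmap:111 opus/48000/2",
    "m=video 9 UDP/TLS/RTP/SAVPF 96",
    "a=rtpmap:96 VP8/90000",
]) + "\r\n"

message = json.dumps({"type": "offer", "sdp": sdp_offer})

def media_kinds(sdp):
    """Return the media types ('audio', 'video', ...) declared by m= lines."""
    return [line.split()[0][2:] for line in sdp.splitlines()
            if line.startswith("m=")]

kinds = media_kinds(json.loads(message)["sdp"])
```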