WebRTC is a free, open-source project that provides browsers and mobile applications with real-time communication capabilities through a simple API. This article will walk you through the basic concepts and features of WebRTC and guide you through building your own WebRTC live video broadcast using Node.js.

Prerequisites:

  1. JavaScript experience
  2. Basic knowledge of Socket.IO

WebRTC basics

WebRTC enables real-time communication directly between peers, primarily for transferring video and audio data over the network. Before we start coding, let’s take a look at the most important concepts of WebRTC.

  1. Signaling:

WebRTC handles the media traffic between browsers, but you also need a mechanism to coordinate the communication and exchange control messages, a process called signaling.

Signaling is used for the following tasks:

  • Initializing and closing communication
  • Sharing network configuration (IP address, port) with the outside world
  • Reporting connection errors

The signaling method is not specified by WebRTC; the choice is left to the developer (Socket.IO will be used in this tutorial).

  2. STUN and TURN servers:

STUN and TURN servers help WebRTC peers connect when a direct connection is not straightforward. A STUN server is used to discover the public IP address of the computer, while a TURN server acts as a relay when a direct peer connection fails.
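
To make this concrete, here is a rough sketch of how both server types could appear in an RTCPeerConnection configuration. The TURN entry is only an illustration; turn.example.com and its credentials are placeholders, not a real server (the project in this tutorial only uses a STUN server).

const config = {
  iceServers: [
    // Public Google STUN server (also used later in this tutorial)
    { urls: "stun:stun.l.google.com:19302" },
    // Placeholder TURN relay; replace with your own server and credentials
    {
      urls: "turn:turn.example.com:3478",
      username: "user",
      credential: "pass"
    }
  ]
};

const peerConnection = new RTCPeerConnection(config);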

Now that we understand the basic concepts of WebRTC, we can move on to the project itself.

Signaling with Socket.IO

Before we can use WebRTC to send a video broadcast over a peer connection, we first need to instantiate the connection using a signaling method (socket.io in this case).

To do this, we create the project and install the required dependencies using NPM:

mkdir WebSocketsVideoBroadcast && cd WebSocketsVideoBroadcast
npm install express socket.io --save

After that, we create the following folder structure:
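
Based on the files referenced in the rest of this tutorial, the structure looks roughly like this:

WebSocketsVideoBroadcast/
├── node_modules/
├── public/
│   ├── broadcast.html
│   ├── index.html
│   ├── broadcast.js
│   ├── watch.js
│   └── styles.css
└── server.js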

We’ll start with a simple Socket.IO server skeleton in server.js:

const express = require("express");
const app = express();

const port = 4000;

const http = require("http");
const server = http.createServer(app);

// Attach Socket.IO to the HTTP server and serve the frontend from /public
const io = require("socket.io")(server);
app.use(express.static(__dirname + "/public"));

io.sockets.on("error", e => console.log(e));
server.listen(port, () => console.log(`Server is running on port ${port}`));

Then, we need to connect the watchers and the broadcaster to the server. The socket ID of the broadcaster is stored in a variable so that we know later which socket the watchers need to connect to.

let broadcaster

io.sockets.on("connection", socket => {
  socket.on("broadcaster", () => {
    broadcaster = socket.id;
    socket.broadcast.emit("broadcaster");
  });
  socket.on("watcher", () => {
    socket.to(broadcaster).emit("watcher", socket.id);
  });
  socket.on("disconnect", () => {
    socket.to(broadcaster).emit("disconnectPeer", socket.id);
  });
});

After that, we implement the Socket.IO events used to negotiate the WebRTC connection. Both parties will use these events to establish the peer connection. These handlers also go inside the io.sockets.on("connection", ...) callback:

socket.on("offer", (id, message) => {
    socket.to(id).emit("offer", socket.id, message);
});
socket.on("answer", (id, message) => {
  socket.to(id).emit("answer", socket.id, message);
});
socket.on("candidate", (id, message) => {
  socket.to(id).emit("candidate", socket.id, message);
});

That’s all for our Socket.IO server implementation; now we can move on to the layouts and the client-side code for the two parties.

Layouts

Our layout consists of two basic HTML files that contain a video view (which will later show the video stream we are sending) and a CSS file for some basic styles.

index.html

This file contains a video view that displays the video stream from the broadcaster. It also imports the Socket.IO client library and our watch.js file.

<!DOCTYPE html>
<html>
  <head>
    <title>Viewer</title>
    <meta charset="UTF-8" />
    <link href="/styles.css" rel="stylesheet">
  </head>
  <body>
    <video playsinline autoplay></video>
    <script src="/socket.io/socket.io.js"></script>
    <script src="/watch.js"></script>
  </body>
</html>

broadcast.html

This file is very similar to the main layout, but it imports broadcast.js instead of watch.js.

<!DOCTYPE html>
<html>
  <head>
    <title>Broadcaster</title>
    <meta charset="UTF-8" />
    <link href="/styles.css" rel="stylesheet">
  </head>
  <body>
    <video playsinline autoplay muted></video>
    <script src="/socket.io/socket.io.js"></script>
    <script src="/broadcast.js"></script>
  </body>
</html>

I also provided some simple CSS styles for the video view.

html {
  overflow: hidden;
  height: 100%;
}

video {
  width: 100%;
  height: 100%;
  position: absolute;
  display: block;
  top: 0;
  left: 0;
  object-fit: cover;
}

body {
  background-color: black;
  margin: 0;
  height: 100%;
  width: 100%;
  position: fixed;
  top: 0;
  left: 0;
  bottom: 0;
  right: 0;
}

RTCPeerConnection

RTCPeerConnection helps us connect two peers directly to each other. There are a few terms that come up when talking about these connections:

  • ICE - Interactive Connectivity Establishment
  • STUN - Session Traversal of User Datagram Protocol (UDP) through Network Address Translators (NATs)

Because most devices today sit behind NAT routers, they cannot connect to each other directly. That is why the peer connection is established with the help of a STUN server, which returns ICE candidates we can use to connect.
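
As a quick illustration (not part of the project), you can watch ICE gathering happen by running a small snippet like this in the browser console; the candidates logged here are what the two peers later exchange through the signaling server:

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }]
});

// Each discovered candidate (host, server-reflexive, ...) shows up here
pc.onicecandidate = event => {
  if (event.candidate) {
    console.log(event.candidate.candidate);
  }
};

// Adding a data channel and creating an offer starts ICE gathering
pc.createDataChannel("probe");
pc.createOffer().then(offer => pc.setLocalDescription(offer));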

In this guide, we have two different connecting parties. One is the broadcaster, which can hold multiple peer connections to clients and sends its video stream over them. The other is the client (watcher), which has only one connection, to the current broadcaster.

The broadcaster

First, we create configuration objects for the peer connection and camera.

const peerConnections = {};
const config = {
  iceServers: [
    {
      urls: ["stun:stun.l.google.com:19302"]
    }
  ]
};

const socket = io.connect(window.location.origin);
const video = document.querySelector("video");

// Media constraints
const constraints = {
  video: { facingMode: "user" },
  // Uncomment to enable audio
  // audio: true
};

We use the public Google STUN server for the peer connection and configure the camera with media constraints. You can also enable audio by uncommenting the audio line.
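
If you need more control over the captured video, getUserMedia also accepts constraints such as resolution and frame rate. A small sketch, with purely illustrative values:

const constraints = {
  video: {
    facingMode: "user",
    // "ideal" values are hints, not hard requirements
    width: { ideal: 1280 },
    height: { ideal: 720 },
    frameRate: { ideal: 30 }
  },
  audio: true
};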

Before creating a peer-to-peer connection, we first need to get the video from the camera so we can add it to our connection.

navigator.mediaDevices
  .getUserMedia(constraints)
  .then(stream => {
    video.srcObject = stream;
    socket.emit("broadcaster");
  })
  .catch(error => console.error(error));

Next, we will create an RTCPeerConnection using the following code:

socket.on("watcher", id => { const peerConnection = new RTCPeerConnection(config); peerConnections[id] = peerConnection; let stream = video.srcObject; stream.getTracks().forEach(track => peerConnection.addTrack(track, stream)); peerConnection.onicecandidate = event => { if (event.candidate) { socket.emit("candidate", id, event.candidate); }}; peerConnection .createOffer() .then(sdp => peerConnection.setLocalDescription(sdp)) .then(() => { socket.emit("offer", id, peerConnection.localDescription); }); }); socket.on("answer", (id, description) => { peerConnections[id].setRemoteDescription(description); }); socket.on("candidate", (id, candidate) => { peerConnections[id].addIceCandidate(new RTCIceCandidate(candidate)); });Copy the code

Each time a new client joins, we create a new RTCPeerConnection and save it in our peerConnections object.

We then use the addTrack() method to add the tracks of the local stream to the connection, passing both the track and the stream.

The peerConnection.onicecandidate handler fires whenever a new ICE candidate is found, and we send the candidate to our server so it can be forwarded to the watcher.

We create a connection offer by calling peerConnection.createOffer(), configure our end of the connection with peerConnection.setLocalDescription(), and then send the offer to the client.

Closing the connection when the client disconnects is another important part of the application, which we can do with the following code:

socket.on("disconnectPeer", id => {
  peerConnections[id].close();
  delete peerConnections[id];
});

Finally, if the user closes the window, we close the socket connection.

window.onunload = window.onbeforeunload = () => {
  socket.close();
};
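
Optionally, you could also close every open peer connection at this point; a small sketch of that variation (not part of the original code):

window.onunload = window.onbeforeunload = () => {
  socket.close();
  // Optional: tear down all open peer connections as well
  Object.values(peerConnections).forEach(peerConnection => peerConnection.close());
};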

The client

The client (the party watching the video) has almost the same functionality. The only difference is that it opens a single peer connection to the current broadcaster and receives the video instead of sending it.

We also need to create a configuration for RTCPeerConnection.

let peerConnection;
const config = {
  iceServers: [
    {
      urls: ["stun:stun.l.google.com:19302"]
    }
  ]
};

const socket = io.connect(window.location.origin);
const video = document.querySelector("video");

We can then create our RTCPeerConnection and receive the video stream from the broadcaster.

socket.on("offer", (id, description) => { peerConnection = new RTCPeerConnection(config); peerConnection .setRemoteDescription(description) .then(() => peerConnection.createAnswer()) .then(sdp => peerConnection.setLocalDescription(sdp)) .then(() => { socket.emit("answer", id, peerConnection.localDescription); }); peerConnection.ontrack = event => { video.srcObject = event.streams[0]; }; peerConnection.onicecandidate = event => { if (event.candidate) { socket.emit("candidate", id, event.candidate); }}; });Copy the code

Here, we create a new RTCPeerConnection using the same configuration object as above. The only difference is that we call createAnswer() to send a connection answer back in response to the broadcaster's offer.

Once the connection is established, we receive the video stream via the ontrack event listener of the peerConnection object.

We also need to implement some additional lifecycle handlers for the peer connection, which take care of opening and closing connections.

socket.on("candidate", (id, candidate) => {
  peerConnection
    .addIceCandidate(new RTCIceCandidate(candidate))
    .catch(e => console.error(e));
});

socket.on("connect", () => {
  socket.emit("watcher");
});

socket.on("broadcaster", () => {
  socket.emit("watcher");
});

window.onunload = window.onbeforeunload = () => {
  socket.close();
  peerConnection.close();
};

At this point, the application is complete and you can continue to test it in the browser.

Testing the application

Now that we’ve finished with the application, it’s time to test it and see if it works.

We can start the application with the following command:

node server.js

The application should now be running on localhost:4000, and you can start a new broadcast by opening localhost:4000/broadcast.html.

After that, simply open localhost:4000 to connect to the server as a watcher, and you should receive the video stream from the broadcaster.

Conclusion

I hope this article has helped you understand the basics of WebRTC and how to use it to stream live video.

EasyRTC Video conferencing cloud service

Based on WebRTC technology, EasyRTC is developed by the TSINGSEE video team, drawing on years of expertise in audio and video. It is a real-time audio and video development platform with global coverage that supports one-to-one and one-to-many video calls.

EasyRTC offers both MCU and SFU architectures and is a pure H5 online video conferencing system that requires no client or plug-in installation. It supports access via WeChat mini programs, H5 pages, mobile apps, PC clients and more, covering scenarios such as voice and video social apps, online education and training, video conferencing, and telemedicine.

With the rapid development of the mobile Internet and the arrival of emerging technologies such as AI and 5G, WebRTC will give rise to even more application scenarios, changing everyday life in areas such as clothing, food, housing, and transportation.