If you are having problems developing your WebRTC application, you are welcome to talk to more experienced colleagues in the RTC developer community.

WebRTC is a technology that lets you build real-time communication between two endpoints directly from JavaScript in the browser.

We can use it to enable direct communication between two or more browsers, without a central server relaying the data.

A server is only needed during the initial connection, so that each client can find out how to reach the other.

What kind of App can we create with this? Direct webcam connections, peer-to-peer calls, file sharing, and more.

In this tutorial I’m going to walk you through an App that will amaze you the first time you use it: a webcam communication App.

We will not use the raw WebRTC API directly, however, because it would require us to take care of a lot of details. That’s what libraries are for: they provide a nice abstraction so people can focus on building apps instead of spending their energy on the underlying APIs.

One such library is PeerJS, which makes real-time communication very simple. It puts WebRTC in a nutshell: the abstraction lets you get results faster, and you can dig into how it works under the hood later.

Tip: when you build an App, use Bit to share reusable components in a collection and synchronize them across all your projects. Try it; it will speed up your work.

The back-end

First we need to create the back end. Although the communication will be direct peer-to-peer, the initial handshake and coordination require a central server.

Once the handshake is complete, the peers communicate directly with each other without relying on the back end.

PeerJS provides us with such a server, and it is simple to install and run.

In a folder, initialize an npm project using the npm init command, install PeerJS using npm install peer, and then run the server with npx:

npx peerjs --port 9000

Use npx peerjs --help to see more options.

This is your back end.
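
If you prefer to start the signaling server from a Node script instead of the command line, the peer package also exposes a PeerServer factory. A minimal sketch, assuming you keep the default path (the file name server.js is just an example):

// server.js: start a PeerJS signaling server on port 9000
const { PeerServer } = require('peer')

const server = PeerServer({ port: 9000, path: '/' })
console.log('PeerJS server listening on port 9000')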

Now we can create the simplest App possible: a receiver and a sender.

The front end

First we create the receiver: it connects to the PeerJS server and waits for incoming data. The first argument of new Peer() is the name of our endpoint; we call it receiver to make its role clear. Start by importing the PeerJS client:

<script src="https://cdnjs.cloudflare.com/ajax/libs/peerjs/0.3.16/peer.min.js"></script>

Then we initialize the Peer object. The connection event fires when another endpoint connects to us, and the data event fires when a message is received:

const peer = new Peer('receiver', { host: 'localhost', port: 9000, path: '/' })
// 'connection' fires when another endpoint connects to us
peer.on('connection', (conn) => {
  // 'data' fires every time that endpoint sends us something
  conn.on('data', (data) => { console.log(data) })
})
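
Putting the import and the script together, a minimal receiver page could look like this (the file name receiver.html is just a suggestion):

<!-- receiver.html: waits for a connection and logs whatever arrives -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/peerjs/0.3.16/peer.min.js"></script>
<script>
  const peer = new Peer('receiver', { host: 'localhost', port: 9000, path: '/' })
  peer.on('connection', (conn) => {
    conn.on('data', (data) => { console.log(data) })
  })
</script>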

Let’s create the other side of the communication. We call it the sender, because it connects and sends a message to the receiver.

Initialize the Peer object, then ask it to connect to the receiver endpoint we just created. Once the connection is established, the open event fires and we can call send() to deliver a message to the receiver:

const peer = new Peer('sender', { host: 'localhost', port: 9000, path: '/' })
// open a connection to the endpoint named 'receiver'
const conn = peer.connect('receiver')
// 'open' fires once the connection is ready to carry data
conn.on('open', () => {
  conn.send('hi!')
})
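
The matching sender page, again only a minimal sketch (sender.html is just a suggested name):

<!-- sender.html: connects to the 'receiver' peer and sends a greeting -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/peerjs/0.3.16/peer.min.js"></script>
<script>
  const peer = new Peer('sender', { host: 'localhost', port: 9000, path: '/' })
  const conn = peer.connect('receiver')
  conn.on('open', () => { conn.send('hi!') })
</script>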

That’s the simplest example.

First open the receiver page, then the sender page. The receiver gets the message directly from the sender, without going through a central server. The server is only needed to exchange the initial information and make sure the two ends can find each other; after that, it takes no part in the communication between them.

This is a very basic exchange of data.

Next, instead of sending messages, we let both ends share a webcam stream with each other.

On the sender side, instead of using peer.connect(), we use peer.call():

const call = peer.call('receiver', localStream)

On the receiving end, when the call event fires, we must answer it:

peer.on('call', call => {
  call.answer(localStream)
})

Just like with phone calls, we don’t automatically answer every incoming call; we have to answer it explicitly.

What is localStream in each of these calls? It is the stream produced by the webcam, and we get it by calling navigator.mediaDevices.getUserMedia(), a browser API.

The call is asynchronous, so we use async/await and wrap it in an async function first:

const startChat = async () => {
  const localStream = await navigator.mediaDevices.getUserMedia({
    video: true
  })
}
startChat()
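
getUserMedia() asks the user for permission and can fail, for example when no camera is available or access is denied, so in a real App you would want to guard the call. A minimal sketch of that, with the error handling shown here being just one possible approach:

const startChat = async () => {
  try {
    const localStream = await navigator.mediaDevices.getUserMedia({
      video: true
      // add audio: true here if you also want to send sound
    })
    // ...use localStream as shown in the rest of the tutorial
  } catch (err) {
    // e.g. NotAllowedError (permission denied) or NotFoundError (no camera)
    console.error('Could not access the webcam:', err)
  }
}
startChat()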

Once we have the localStream object, we can assign it to a video element in an HTML page. We can create local and remote video elements:

<video id="local" autoplay></video>
<video id="remote" autoplay></video>

Assign the stream to the video#local element:

document.querySelector('video#local').srcObject = localStream

Then call the receiver, passing the localStream object:

const call = peer.call('receiver', localStream)

The receiver code is as follows:

peer.on('call', call => {
  call.answer(localStream)
})

We also need to get the media stream on this side. The code is very similar to the sender’s, except that we wrap everything inside the call event callback:

peer.on('call', call => {
  const startChat = async () => {
    const localStream = await navigator.mediaDevices.getUserMedia({
      video: true
    })
    document.querySelector('video#local').srcObject = localStream
    call.answer(localStream)
  }
  startChat()
})

Showing the remote stream

We still need to add one last piece to both the sender and the receiver.

Once we get the remote stream from the stream event of the call object, we need to attach it to the video#remote element:

call.on('stream', remoteStream => {
  document.querySelector('video#remote').srcObject = remoteStream
})

The receiver code is as follows:

peer.on('call', call => {
  const startChat = async () => {
    const localStream = await navigator.mediaDevices.getUserMedia({
      video: true
    })
    document.querySelector('video#local').srcObject = localStream
    call.answer(localStream)
    call.on('stream', remoteStream => {
      document.querySelector('video#remote').srcObject = remoteStream
    })
  }
  startChat()
})

The sender code is as follows:

const startChat = async () => {
  const localStream = await navigator.mediaDevices.getUserMedia({
    video: true
  })
  document.querySelector('video#local').srcObject = localStream
  const call = peer.call('receiver', localStream)
  call.on('stream', remoteStream => {
    document.querySelector('video#remote').srcObject = remoteStream
  })
}
startChat()

When one end closes the connection, by navigating to a new page or closing the browser tab, the other end stops receiving data and the remote video stream stops.
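
If you want to react to that explicitly, the call object also emits a close event when the connection ends. A minimal sketch, where clearing the remote video element is just one possible reaction:

call.on('close', () => {
  // the other end went away, so stop showing the last remote frame
  document.querySelector('video#remote').srcObject = null
})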

Conclusion

We created a very simple webcam communication App using WebRTC. We used two separate files to handle the two ends of the communication, but that is not strictly necessary: you can build a user interface that lets users decide whether to start a call and, more importantly, whom to talk to, for example by letting them enter a user name or pick one from a list.