Recently, Agora released the first full-featured version of the Agora Unity SDK, based on the Native SDK 2.9.1 full-platform interface. It can be used for games, education, AR, and VR projects developed in the Unity environment, regardless of package size. In this tutorial, we'll show you how to use the Agora Video SDK for Unity to build cross-platform live video chat scenarios in Unity.

The Agora Video SDK for Unity differs a lot from the Agora Interactive Gaming SDK for Unity.

That said, if you are developing a Unity app/game that is sensitive to package size, it is recommended to use the Agora Interactive Gaming SDK for Unity, whose package can be as small as 1 MB. If your app/game is less sensitive to package size, or needs more customized audio and video streams, you can use the Agora Video SDK for Unity, which is featured in this tutorial. Okay, let's get started.

Prerequisites

  • Unity Editor

  • Familiarity with the Unity Editor, GameObjects, Unity scripts, and how to publish Unity applications to mobile devices

  • A basic understanding of C#

  • An Agora.io developer account

Overview

Before getting into the subject, let's take a moment to go over all the steps we're going to complete:

  1. Create a new project and import the Agora Video SDK for Unity
  2. Create the scenes
  3. Handle button events
  4. Integrate the Agora SDK
  5. Build and test on the device (iOS/Android/Windows/macOS)

Creating a new project

First, let's open Unity and create a blank new project called Agora Video Demo. After completing the Unity project setup, we navigate to the Unity Asset Store and search for "Agora Video SDK". Next, we import the Agora Video SDK for Unity into our project. When prompted, make sure all Assets in the list are selected.

Create a Scene

Now, create a new Scene, name it WelcomeScene, and double-click it to open it in the Editor view. WelcomeScene is the first screen the user sees when the application loads. We first remove the existing Camera and Lights from the scene, then add a Canvas GameObject and add a Camera as a child of the Canvas. Since this screen is displayed in 2D, we switch the Editor to 2D mode. We also switch the Editor view to Game mode so that we can visualize positions relative to the Camera's perspective. We need a button to trigger the "JoinChannel" action, so we add a button, name it JoinButton, and give it the appropriate tag. We also need to let users enter the name of their channel, so we create a text input field named ChannelName.

Next, we create a new scene and name it ChatScene; this is the screen that plays the local and remote video streams. To test some features of the Agora Video SDK, we add a few 3D GameObjects to the scene. First, add a Cube. Since the scene now contains 3D objects, exit 2D mode in the Editor. We want the Cube to render the local camera's video stream as a texture, so we add VideoSurface.cs as a component of the Cube. Then add a Cylinder GameObject and move it up so it isn't blocked by the Cube; adjust the Cylinder's position and add VideoSurface.cs to it as well. Finally, we add a Canvas to the scene so that we can add a 2D button to exit the chat, which we name LeaveButton and assign the appropriate tag. We also add a text box named VersionText, place it in the upper-right corner so it doesn't block the video, and set its color to white for easy viewing.
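VersionText is there to show the SDK version at runtime. A minimal sketch of how it might be filled in, assuming a small helper script (the class name ShowSdkVersion is illustrative) attached to any GameObject in ChatScene and the agora_gaming_rtc namespace from the imported SDK:

using UnityEngine;
using UnityEngine.UI;
using agora_gaming_rtc;

public class ShowSdkVersion : MonoBehaviour
{
    void Start()
    {
        // GetSdkVersion() returns the version string of the underlying native SDK
        Text versionText = GameObject.Find("VersionText").GetComponent<Text>();
        versionText.text = "Agora SDK version: " + IRtcEngine.GetSdkVersion();
    }
}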

Handle button events

While setting up the scene, we created two buttons (JoinButton and LeaveButton), and now we need to create a script that maps some On Click Events to these two buttons.

First create a new C# script in Assets, name it ButtonHandler.cs, and double-click it to open the file in Visual Studio. Then we add a function OnButtonClick(), and for simplicity we add a Debug.Log call to the function body.

Come back to WelcomeScene in Unity and attach the script and function to the button. First select JoinButton and add ButtonHandler.cs as a component. Next add an On Click event and map the button click to OnButtonClick(). We then repeat the process for the LeaveButton in ChatScene.

Also note that we need to import UnityEngine.UI before we can get the InputField component from the ChannelName GameObject.

using UnityEngine;
using UnityEngine.UI;

public class ButtonHandler : MonoBehaviour
{
    public void OnButtonClick()
    {
        Debug.Log("Button Clicked: " + name);

        // determine which button was clicked by the GameObject's name
        if (name.CompareTo("JoinButton") == 0)
        {
            // join chat
            OnJoinButtonClicked();
        }
        else if (name.CompareTo("LeaveButton") == 0)
        {
            // leave chat
            OnLeaveButtonClicked();
        }
    }

    private void OnJoinButtonClicked()
    {
        Debug.Log("Join button clicked");

        // get the channel name from the text input
        GameObject go = GameObject.Find("ChannelName");
        InputField input = go.GetComponent<InputField>();
        Debug.Log("Channel name: " + input.text);
    }

    private void OnLeaveButtonClicked()
    {
        Debug.Log("Leave button clicked");
    }
}

Integrated Agora SDK

First, we create a new C# script in Assets, call it AgoraInterface.cs, and open the file in Visual Studio. The first variable we create is appId, which holds our Agora App ID. Copy your App ID from the Agora dashboard and paste it in as the value of appId. We also need to create a variable to hold the UID of the remote video stream.

// your Agora App ID from the dashboard
private static string appId = "Agora App ID";

// reference to the Agora RTC engine
public IRtcEngine mRtcEngine;

// uid of the remote video stream
public uint mRemotePeer;

Let's create a function to initialize the Agora mRtcEngine, which we'll call LoadEngine(). We make sure the engine is initialized only once: we use an if statement to check whether the mRtcEngine reference is null, and if it is, we initialize the engine by passing our Agora App ID to IRtcEngine.GetEngine().
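A minimal sketch of what LoadEngine() might look like, assuming using agora_gaming_rtc; at the top of the file and the member variables declared above:

private void LoadEngine(string appId)
{
    Debug.Log("initializing engine");

    if (mRtcEngine != null)
    {
        Debug.Log("Engine already exists. Please unload it first!");
        return;
    }

    // GetEngine() creates the engine on first call and returns the existing instance afterwards
    mRtcEngine = IRtcEngine.GetEngine(appId);
}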

Next, we declare the JoinChannel() function. We first check that mRtcEngine exists, then call EnableVideo() and EnableVideoObserver(), and finally call JoinChannel() on the engine.
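A sketch of JoinChannel() under the same assumptions; the channel name is the string taken from the ChannelName input field:

public void JoinChannel(string channelName)
{
    Debug.Log("calling join (channel = " + channelName + ")");

    if (mRtcEngine == null)
        return;

    // enable the video module and the video observer before joining
    mRtcEngine.EnableVideo();
    mRtcEngine.EnableVideoObserver();

    // join the channel; passing uid 0 lets Agora assign one automatically
    mRtcEngine.JoinChannel(channelName, null, 0);
}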

So far, we have initialized the Agora mRtcEngine and used it to support the "join channel" feature. Now we need to give users a way to leave the channel and to unload the engine. We start with LeaveChannel(): we again check that the engine exists, then call LeaveChannel() and DisableVideoObserver() on it. Finally, to unload the engine, we call IRtcEngine.Destroy() and set our local reference to null.
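Sketches of LeaveChannel() and UnloadEngine() under the same assumptions:

public void LeaveChannel()
{
    Debug.Log("calling leave");

    if (mRtcEngine == null)
        return;

    // leave the channel and stop rendering video frames
    mRtcEngine.LeaveChannel();
    mRtcEngine.DisableVideoObserver();
}

public void UnloadEngine()
{
    Debug.Log("calling unloadEngine");

    if (mRtcEngine != null)
    {
        // Destroy() releases the native engine; clear our reference as well
        IRtcEngine.Destroy();
        mRtcEngine = null;
    }
}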

We are going to add callback functions that are called on various predefined events. We focus on three callbacks: OnJoinChannelSuccess, OnUserJoined, and OnUserOffline. OnJoinChannelSuccess is called when the local device successfully joins the channel; OnUserJoined is called whenever a remote stream joins the channel; OnUserOffline is called whenever a remote stream leaves the channel. After calling JoinChannel, we add a callback to the sceneLoaded listener of SceneManager in OnJoinButtonClicked and name the function OnSceneFinishedLoading. Since that callback fires every time a scene is loaded, we need an intermediate function, as sketched below.
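A hedged sketch of wiring up and handling the three callbacks, assuming the 2.9.x delegate signatures; the handler names are illustrative:

// wire up the callbacks right after the engine has been loaded
mRtcEngine.OnJoinChannelSuccess = OnJoinChannelSuccessHandler;
mRtcEngine.OnUserJoined = OnUserJoinedHandler;
mRtcEngine.OnUserOffline = OnUserOfflineHandler;

private void OnJoinChannelSuccessHandler(string channelName, uint uid, int elapsed)
{
    Debug.Log("Joined channel " + channelName + " as uid " + uid);
}

private void OnUserJoinedHandler(uint uid, int elapsed)
{
    // remember the remote uid so the remote stream can be bound to a VideoSurface (e.g. the Cylinder)
    mRemotePeer = uid;
}

private void OnUserOfflineHandler(uint uid, USER_OFFLINE_REASON reason)
{
    Debug.Log("Remote user " + uid + " went offline: " + reason);
}

And the intermediate function registered on SceneManager.sceneLoaded in OnJoinButtonClicked might look like this (it needs using UnityEngine.SceneManagement; and unsubscribes itself once ChatScene has loaded):

// in OnJoinButtonClicked(), before switching scenes
SceneManager.sceneLoaded += OnSceneFinishedLoading;
SceneManager.LoadScene("ChatScene", LoadSceneMode.Single);

// called for every scene load, so filter on the scene name
private void OnSceneFinishedLoading(Scene scene, LoadSceneMode mode)
{
    if (scene.name == "ChatScene")
    {
        SceneManager.sceneLoaded -= OnSceneFinishedLoading;
        // ChatScene is ready; safe to bind VideoSurfaces and finish joining here
    }
}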

Before we can test our work on a device, we need to add some permission requests for Android. Starting with Unity 2018.3, permissions are no longer added automatically, so we need to add some if-else statements to check and request permissions for the microphone and camera.
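A sketch of such a permission check using Unity's UnityEngine.Android.Permission API (available from 2018.3 onward); the method name is illustrative, and the matching using UnityEngine.Android; directive should be wrapped in the same #if at the top of the file:

private void CheckAndRequestPermissions()
{
#if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID)
    // ask for the microphone first; the camera is requested on a subsequent call
    if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
    {
        Permission.RequestUserPermission(Permission.Microphone);
    }
    else if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
    {
        Permission.RequestUserPermission(Permission.Camera);
    }
#endif
}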

Build and test on the device

We're finally ready to test our application! Let's go back to Unity and open "Build Settings". First, drag WelcomeScene and ChatScene into the "Scenes In Build" list of the Build Settings dialog.

Before we can build and deploy our application, we need to make some adjustments to the Player Settings for each platform. We need to update the Bundle Identifier and provide usage descriptions for the camera and microphone (to enable the permission prompts).

Now we're ready to build our application! In the "Build Settings" dialog, click the "Build" button and Unity will prompt you for a location to save your build. Once Unity has finished building the iOS app, a Finder window will appear that contains Unity-iPhone.xcodeproj. Double-click this file to open Xcode. In Xcode, select the project from the file navigator on the left, enable "Automatically manage signing" and select your signing certificate. Finally, make sure the test device is connected, and then click the Play button.

Similarly, our version also supports Windows and macOS, as shown below:

You can see that WelcomeScene loads first, we enter ChatScene when we join the channel, and we return to WelcomeScene when we click the "Leave" button. The only thing missing is the remote stream; to see remote video, we need a second device. The SDK also supports Android: select "Android" from the "Platform" list, open "Player Settings" and provide a Package Name. Make sure your Android device is connected, then click "Build And Run".

Done!

If this tutorial helped you, give it a thumbs up! If you have any new ideas or any questions, please click "Read the original article" to talk to us in the RTC developer community.

Other resources

  • We have an open source code sample for the Unity SDK on GitHub, which can be found at github.com/AgoraIO/Ago…

  • The Agora.io Video SDK for Unity is available on the Unity Asset Store

  • Complete API documentation can be found in the Agora documentation center: docs.agora.io/cn/Video/re…

  • If you encounter problems with integration, you can ask questions in the RTC developer community

Notes

  1. You may ask why we call GetEngine() rather than new IRtcEngine(). This has to do with the fact that the Agora mRtcEngine runs as a singleton: GetEngine() checks whether an instance already exists and creates a new one only if it does not. GetEngine() can also be called later to get a reference to the engine from other parts of the code.

  2. All callbacks are visible in AgoraGameRtcEngine.cs (Assets/Scripts/AgoraGamingSDK).

Note: Each callback passes a specific set of parameters, so be sure to check the AgoraGameRtcEngine.cs file if you plan to add any other callbacks.