Zhang Ganze is an engineer at Agora

If you are interested in our Flutter plugin development process, or if you have any questions related to real-time audio and video development, please visit the Agora Q&A section and post your questions to our engineers.

It has been a while since the release of Flutter 1.0. After the Spring Festival, Agora released the Agora Flutter SDK in the form of a Flutter plugin, which helps Flutter developers quickly implement video calling applications.

Now let’s see how to quickly build a simple mobile cross-platform video calling application using the Agora Flutter SDK.

Preparing the environment

The tutorial on setting up a development environment on the Flutter Chinese website is fairly complete, so the IDE and environment configuration process is not covered in this article. If there is a problem with your Flutter installation, you can run flutter doctor to check the configuration.

This article uses VS Code on macOS as the main development environment.

The goal

We want to implement a simple video calling application using Flutter and the Agora Flutter SDK. This video calling application should include the following functions:

  • Joining a call room
  • Video calling
  • Switching between front and rear cameras
  • Muting/unmuting the local audio

Agora video calls are separated by room: users in the same room can communicate with each other. To make rooms easy to distinguish, the demo uses a simple form page where the user submits the name of the room to join. A room can hold up to 4 users, so we also need to display different layouts depending on how many users are present.

Got it all clear? Then let's start with the code.

Project creation

First open VS Code and select View -> Command Palette (or press Cmd + Shift + P) to bring up the Command Palette, type flutter, and select Flutter: New Project. Name the new Flutter project agora_flutter_quickstart and wait for creation to complete.

Now select Debug -> Start Debugging (or press F5), and you should see the simple counter app.

It looks like we are off to a good start 🙂 Next we need to do some simple configuration so that our new project can reference and use the Agora Flutter SDK.

Open pubspec.yaml and add the agora_rtc_engine dependency:

dependencies:
  flutter:
    sdk: flutter

  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.2
  # add agora rtc sdk
  agora_rtc_engine: ^0.9.0

dev_dependencies:
  flutter_test:
    sdk: flutter

After saving, VS Code automatically executes flutter packages get to update the dependencies.

The home page

After the project configuration is complete, we can start development. First we need to create a page file to replace the MyHomePage class from the default example code: create a pages directory under lib/src and add an index.dart file there.

If you have completed the official tutorial Write Your First Flutter App, the following code should be easy to understand.

class IndexPage extends StatefulWidget {
  @override
  State<StatefulWidget> createState() {
    return new IndexState();
  }
}

class IndexState extends State<IndexPage> {
  @override
  Widget build(BuildContext context) {
    // UI
  }

  onJoin() {
    // TODO
  }
}
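
For reference, the default template's MyApp also needs to point its home at the new page. A minimal sketch of what main.dart might look like after the swap (this follows the default template naming and is not spelled out in the article itself; adjust it to your project):

import 'package:flutter/material.dart';

import './src/pages/index.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    // use the new IndexPage as the home instead of the template's MyHomePage
    return MaterialApp(
      title: 'Agora Flutter QuickStart',
      home: IndexPage(),
    );
  }
}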

Now we can start constructing the home page UI in the build method.

After breaking down the UI as shown above, we can modify our home page code as follows:

@override
Widget build(BuildContext context) {
return Scaffold(
    appBar: AppBar(
      title: Text('Agora Flutter QuickStart'),
    ),
    body: Center(
      child: Container(
          padding: EdgeInsets.symmetric(horizontal: 20),
          height: 400,
          child: Column(
            children: <Widget>[
              Row(children: <Widget>[]),
              Row(children: <Widget>[
                Expanded(
                    child: TextField(
                  decoration: InputDecoration(
                      border: UnderlineInputBorder(
                          borderSide: BorderSide(width: 1)),
                      hintText: 'Channel name'),
                ))
              ]),
              Padding(
                  padding: EdgeInsets.symmetric(vertical: 20),
                  child: Row(
                    children: <Widget>[
                      Expanded(
                        child: RaisedButton(
                          onPressed: () => onJoin(),
                          child: Text("Join"),
                          color: Colors.blueAccent,
                          textColor: Colors.white,
                        ),
                      )
                    ],
                  ))
            ],
          )),
    ));
}

Press F5 to run it, and you should see something like the following:

Looks good! But it only looks good: the UI does not interact with anything yet. We want to implement the following functions on top of the current UI:

  1. Navigate to the call page when the Join button is pressed
  2. Validate the channel name: if it is empty when the user tries to join, display an error on the TextField

TextField input validation

TextField provides a decoration property: we can supply an InputDecoration object to define the TextField's decoration style. The errorText property of InputDecoration is exactly what we need here, and we use a TextEditingController to read the TextField's value and decide whether the error should be shown. After a few simple modifications, our TextField code looks like this:

	final _channelController = TextEditingController();

	/// if channel textfield is validated to have error
	bool _validateError = false;

	@override
	void dispose() {
		// dispose input controller
		_channelController.dispose();
		super.dispose();
	}

	@override
	Widget build(BuildContext context) {
		...
		TextField(
		  controller: _channelController,
		  decoration: InputDecoration(
		      errorText: _validateError
		          ? "Channel name is mandatory"
		          : null,
		      border: UnderlineInputBorder(
		          borderSide: BorderSide(width: 1)),
		      hintText: 'Channel name'))
		...
	}

	onJoin() {
		// update input validation
		setState(() {
		  _channelController.text.isEmpty
		      ? _validateError = true
		      : _validateError = false;
		});
	}

The onJoin callback is triggered when the Join button is clicked. It first updates the TextField's error state via setState so that the component is redrawn.

Note: don't forget to override the dispose method and release the _channelController at the end of the component's life cycle.

Go to the Call page

Finally, we create a MaterialPageRoute in onJoin to navigate the user to the call page, passing the channel name we obtained as a constructor parameter to the next page, CallPage.

import './call.dart';

class IndexState extends State<IndexPage> {
	...
	onJoin() {
		// update input validation
		setState(() {
		  _channelController.text.isEmpty
		      ? _validateError = true
		      : _validateError = false;
		});
		if (_channelController.text.isNotEmpty) {
		  // push video page with given channel name
		  Navigator.push(
		      context,
		      MaterialPageRoute(
		          builder: (context) => new CallPage(
		                channelName: _channelController.text,
		              )));
		}
	}
}

The call page

Also in the lib/src/pages directory, we need to create a call.dart file, where we will implement the most important part: the live video call logic. First, we create the CallPage class. As you may remember from the IndexPage implementation, CallPage takes a constructor parameter for the channel name.

class CallPage extends StatefulWidget {
	/// non-modifiable channel name of the page
	final String channelName;

	/// Creates a call page with given channel name.
	const CallPage({Key key, this.channelName}) : super(key: key);

	@override
	_CallPageState createState() {
		return new _CallPageState();
	}
}

class _CallPageState extends State<CallPage> {
	@override
	Widget build(BuildContext context) {
		return Scaffold(
		    appBar: AppBar(
		      title: Text(widget.channelName),
		    ),
		    backgroundColor: Colors.black,
		    body: Center(
		        child: Stack(
		      children: <Widget>[],
		    )));
	}
}

It is worth noting that we do not need to pass the parameter into the state instance when creating it: State can access widget.channelName directly to read the component's properties.

Importing the Agora SDK

Since we added the agora_rtc_engine dependency to pubspec.yaml earlier, we can now import the Agora SDK directly:

import 'package:agora_rtc_engine/agora_rtc_engine.dart';

After importing, we can create an Agora media engine instance. Before making a video call with the Agora SDK, we need to do the following initialization. Initialization should happen only once in the page's life cycle, so we override the initState method and perform the initialization there.

class _CallPageState extends State<CallPage> {
	@override
	void initState() {
		super.initState();
		initialize();
	}

	void initialize() {
		_initAgoraRtcEngine();
		_addAgoraEventHandlers();
	}

	/// Create agora sdk instance and initialize
	void _initAgoraRtcEngine() {
		AgoraRtcEngine.create(APP_ID);
		AgoraRtcEngine.enableVideo();
	}

	/// Add agora event handlers
	void _addAgoraEventHandlers() {
		AgoraRtcEngine.onError = (int code) {
		  // sdk error
		};

		AgoraRtcEngine.onJoinChannelSuccess =
		    (String channel, int uid, int elapsed) {
		  // join channel success
		};

		AgoraRtcEngine.onUserJoined = (int uid, int elapsed) {
		  // there's a new user joining this channel
		};

		AgoraRtcEngine.onUserOffline = (int uid, int reason) {
		  // there's an existing user leaving this channel
		};
	}
}

Note: please refer to the official documentation for details on how to obtain an App ID.
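
The code above assumes an APP_ID constant is in scope. One simple way to provide it is a small settings file (the file name and location here are just an illustration, not part of the demo):

	// lib/src/utils/settings.dart (hypothetical location)
	// Paste the App ID from your Agora dashboard here.
	const String APP_ID = '<your agora app id>';

Then import it in call.dart alongside the SDK import.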

In the code above, we created the Agora media engine instance and registered listeners for its key events; next we will start processing the video streams.

In a typical video call there are two kinds of video streams on the local device: local and remote. The former is captured by the local camera, rendered locally, and sent out; the latter is received from remote users and then rendered. We need to dynamically render the video streams of up to 4 users on the call page.

We will render the call page with something like this structure.

Unlike the home page, the toolbar holding the call action buttons is overlaid on top of the video, so here we use the Stack component to layer them.

To keep the UI construction clear, we split the video layout and the toolbar into two separate build methods.

Local stream creation and rendering

To render the local stream, we create a container for the video stream after SDK initialization completes, then use the SDK to render the local stream onto that container. The Agora SDK provides the createNativeView method for creating containers. Once the container is obtained and the stream renders successfully into its view, we can use the SDK to join a channel and communicate with other clients.

	void initialize() {
		_initAgoraRtcEngine();
		_addAgoraEventHandlers();
		// use _addRenderView everytime a native video view is needed
		_addRenderView(0, (viewId) {
			// local view setup & preview
			AgoraRtcEngine.setupLocalVideo(viewId, 1);
			AgoraRtcEngine.startPreview();
			// state can access widget directly
			AgoraRtcEngine.joinChannel(null, widget.channelName, null, 0);
		});
	}

	/// Create a native view and add a new video session object
	/// The native viewId can be used to set up local/remote view
	void _addRenderView(int uid, Function(int viewId) finished) {
		Widget view = AgoraRtcEngine.createNativeView(uid, (viewId) {
		  setState(() {
		    _getVideoSession(uid).viewId = viewId;
		    if (finished != null) {
		      finished(viewId);
		    }
		  });
		});
		VideoSession session = VideoSession(uid, view);
		_sessions.add(session);
	}

Note: the code finally creates a VideoSession object with the uid and container information and adds it to _sessions, mainly for video layout purposes; this is covered in more detail later.
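
The VideoSession class, the _sessions list, and the _getVideoSession helper are not shown in this article. A minimal sketch that is consistent with how they are used here (the actual implementation in the demo repository may differ):

	class VideoSession {
		int uid;     // 0 for the local stream, otherwise the remote user's uid
		Widget view; // the native render view created by the SDK
		int viewId;  // the native view id, filled in once the view is ready

		VideoSession(this.uid, this.view);
	}

	class _CallPageState extends State<CallPage> {
		/// video sessions (local + remote) in the current channel
		final List<VideoSession> _sessions = [];

		/// Look up the session belonging to a given uid, or null if absent
		VideoSession _getVideoSession(int uid) {
			return _sessions.firstWhere(
			  (session) => session.uid == uid,
			  orElse: () => null,
			);
		}
		...
	}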

Remote stream listening and rendering

We already touched on remote stream monitoring in the initialization code earlier. By listening to the onUserJoined and onUserOffline callbacks provided by the SDK, we can tell when other users enter or leave the current channel. If a new user joins the channel, we create a rendering container for them and render their stream accordingly; if a user leaves the channel, we remove their render container.

	AgoraRtcEngine.onUserJoined = (int uid, int elapsed) {
	  setState(() {
	    _addRenderView(uid, (viewId) {
	      AgoraRtcEngine.setupRemoteVideo(viewId, 1, uid);
	    });
	  });
	};

	AgoraRtcEngine.onUserOffline = (int uid, int reason) {
	  setState(() {
	    _removeRenderView(uid);
	  });
	};

	/// Remove a native view and remove an existing video session object
	void _removeRenderView(int uid) {
		VideoSession session = _getVideoSession(uid);
		if (session != null) {
		  _sessions.remove(session);
		  // release the native view only when a matching session exists
		  AgoraRtcEngine.removeNativeView(session.viewId);
		}
	}

Note: _sessions keeps a local list of the video streams in the current channel. When a user joins, the corresponding VideoSession object is created and added to _sessions; when a user leaves, the corresponding VideoSession instance is removed.

Video stream layout

Once we have the _sessions array, with a native render container for each local/remote stream, we can start laying out the video streams.

	/// Helper function to get list of native views
	List<Widget> _getRenderViews() {
		return _sessions.map((session) => session.view).toList();
	}

	/// Video view wrapper
	Widget _videoView(view) {
		return Expanded(child: Container(child: view));
	}

	/// Video view row wrapper
	Widget _expandedVideoRow(List<Widget> views) {
		List<Widget> wrappedViews =
		    views.map((Widget view) => _videoView(view)).toList();
		return Expanded(
		    child: Row(
		  children: wrappedViews,
		));
	}

	/// Video layout wrapper
	Widget _viewRows() {
		List<Widget> views = _getRenderViews();
		switch (views.length) {
		  case 1:
		    return Container(
		        child: Column(
		      children: <Widget>[_videoView(views[0])],
		    ));
		  case 2:
		    return Container(
		        child: Column(
		      children: <Widget>[
		        _expandedVideoRow([views[0]]),
		        _expandedVideoRow([views[1]])
		      ],
		    ));
		  case 3:
		    return Container(
		        child: Column(
		      children: <Widget>[
		        _expandedVideoRow(views.sublist(0, 2)),
		        _expandedVideoRow(views.sublist(2, 3))
		      ],
		    ));
		  case 4:
		    return Container(
		        child: Column(
		      children: <Widget>[
		        _expandedVideoRow(views.sublist(0, 2)),
		        _expandedVideoRow(views.sublist(2, 4))
		      ],
		    ));
		  default:
		}
		return Container();
	}

Toolbar (Hang up, mute, switch camera)

After implementing the video stream layout, let's implement the call action toolbar. The toolbar has three buttons, corresponding in order to mute, hang up, and switch camera, laid out with a simple Row.
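
Note that the toolbar code below reads a muted flag that has not appeared in the snippets so far. It is simply a boolean field on the state class, along these lines:

	class _CallPageState extends State<CallPage> {
		/// whether the local audio stream is currently muted
		bool muted = false;
		...
	}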

	/// Toolbar layout
	Widget _toolbar() {
		return Container(
		  alignment: Alignment.bottomCenter,
		  padding: EdgeInsets.symmetric(vertical: 48),
		  child: Row(
		    mainAxisAlignment: MainAxisAlignment.center,
		    children: <Widget>[
		      RawMaterialButton(
		        onPressed: () => _onToggleMute(),
		        child: new Icon(
		          muted ? Icons.mic : Icons.mic_off,
		          color: muted ? Colors.white : Colors.blueAccent,
		          size: 20.0,
		        ),
		        shape: new CircleBorder(),
		        elevation: 2.0,
		        fillColor: muted ? Colors.blueAccent : Colors.white,
		        padding: const EdgeInsets.all(12.0),
		      ),
		      RawMaterialButton(
		        onPressed: () => _onCallEnd(context),
		        child: new Icon(
		          Icons.call_end,
		          color: Colors.white,
		          size: 35.0,
		        ),
		        shape: new CircleBorder(),
		        elevation: 2.0,
		        fillColor: Colors.redAccent,
		        padding: const EdgeInsets.all(15.0),
		      ),
		      RawMaterialButton(
		        onPressed: () => _onSwitchCamera(),
		        child: new Icon(
		          Icons.switch_camera,
		          color: Colors.blueAccent,
		          size: 20.0,
		        ),
		        shape: new CircleBorder(),
		        elevation: 2.0,
		        fillColor: Colors.white,
		        padding: const EdgeInsets.all(12.0),
		      ),
		    ],
		  ),
		);
	}

	void _onCallEnd(BuildContext context) {
		Navigator.pop(context);
	}

	void _onToggleMute() {
		setState(() {
		  muted = !muted;
		});
		AgoraRtcEngine.muteLocalAudioStream(muted);
	}

	void _onSwitchCamera() {
		AgoraRtcEngine.switchCamera();
	}

The final integration

Now that both parts of the UI are complete, we stack the two components together with a Stack widget.

	@override
	Widget build(BuildContext context) {
		return Scaffold(
		    appBar: AppBar(
		      title: Text(widget.channelName),
		    ),
		    backgroundColor: Colors.black,
		    body: Center(
		        child: Stack(
		      children: <Widget>[_viewRows(), _toolbar()],
		    )));
	}

Cleaning up

If you only use the Agora SDK on the current page, you need to call the destroy method to destroy the SDK instance before leaving the page. If you need the SDK across pages, it is recommended to make the SDK instance a singleton that different pages can access (a sketch of this follows the cleanup code below). We should also remember to release the native render containers, which can be done with the removeNativeView method.

	@override
	void dispose() {
		// clean up native views & destroy sdk
		_sessions.forEach((session) {
		  AgoraRtcEngine.removeNativeView(session.viewId);
		});
		_sessions.clear();
		AgoraRtcEngine.destroy();
		super.dispose();
	}
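
For the cross-page case mentioned above, the following is an illustrative sketch of the singleton approach; the AgoraEngine wrapper class is our own construction for this example, not part of the Agora SDK:

	/// Illustrative wrapper so every page shares one engine instance.
	class AgoraEngine {
		static final AgoraEngine _instance = AgoraEngine._internal();
		factory AgoraEngine() => _instance;
		AgoraEngine._internal();

		bool _initialized = false;

		/// Create and configure the underlying engine exactly once.
		void ensureInitialized(String appId) {
			if (_initialized) return;
			AgoraRtcEngine.create(appId);
			AgoraRtcEngine.enableVideo();
			_initialized = true;
		}

		/// Destroy the engine when the whole app no longer needs it.
		void dispose() {
			if (!_initialized) return;
			AgoraRtcEngine.destroy();
			_initialized = false;
		}
	}

With this in place, each page calls AgoraEngine().ensureInitialized(APP_ID) instead of creating the engine itself, and only the app, not each page, calls dispose().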

End result:

Conclusion

As a new technology, Flutter inevitably has some rough edges, but its progress so far shows great potential. In my experience, developing cross-platform applications with Flutter is comfortable as long as there are sufficient community resources. The Flutter SDK provided by Agora already covers most of the methods offered by the native SDKs, and the development experience is essentially the same as with the native SDKs. This article was written in a spirit of learning, in the hope that it helps anyone who wants to build RTC applications with Flutter.

The full code described in this article can be found on GitHub.

If you encounter problems during development, please visit the RTC developer community, where the author of this article will reply promptly.