Why Flutter?
Flutter is Google's mobile UI framework for quickly building high-quality native user interfaces on iOS and Android. Flutter works with existing code, is used by a growing number of developers and organizations around the world, and is completely free and open source.
What makes Flutter different
1. Beautiful – Flutter gives you control over every pixel on the screen, so design no longer has to be compromised by implementation.
2. Fast – What counts as a lag-free application? You might say 16 ms per frame, or 60 fps, which is enough for desktop and mobile applications; but in the vast AR/VR field, 60 fps can still be a bottleneck that leaves the human brain dizzy. Flutter aims for much more than 60 fps: with Dart's AOT compilation and Skia for drawing, Flutter runs fast.
3. Productive – Hot reload is not new to software development in general, but it is new to mobile development. Flutter provides stateful hot reload and lets a single codebase run on multiple platforms. Other features, such as JIT compilation during development and AOT compilation for release builds, make developers more efficient.
4. Open – Dart, Skia, and the Flutter framework are all open source. The Flutter and Dart teams are also open to a range of technologies, including the web, as long as they serve the project well. On the ecosystem side, the speed at which the Flutter team responds to GitHub issues is remarkable (the average time to close an issue is 0.29 days).
Having already supported APICloud, Unity3D, and React Native, why build an RTSP/RTMP player for Flutter?
First, Flutter runs on iOS and Android on top of the Flutter Engine, with which developers interact through the Flutter framework and its APIs. The Flutter Engine is written in C/C++ and offers low-latency input and high frame rates. Unlike our Unity3D integration, where we call back YUV/RGB data and draw it inside Unity3D, the Flutter plugin calls the native SDK directly, which is more efficient.
Second, customers and developers asked for it. To date, Flutter has no decent RTSP or RTMP player. A good live player is not just an interface with a start and a stop button; it makes real demands on functionality and performance, particularly stability and low latency. Without false modesty, this may be the first genuinely powerful and useful RTSP/RTMP live playback SDK for Flutter.
RTSP/RTMP playback on Android and iOS phones
1. Video playback demo:
www.iqiyi.com/w_19s8dv6yh…
2. Screenshots:
RTMP live player features:
- [Playback protocol] High stability, ultra-low latency (under one second; few players in the industry achieve a comparable result), an industry-leading RTMP live player SDK;
- [Multi-instance playback] Supports multiple player instances;
- [Event callbacks] Supports callbacks for network status and buffer status;
- [Video format] Supports H.264 and the RTMP H.265 extension;
- [Audio format] Supports AAC/PCMA/PCMU/Speex;
- [H.264/H.265 software decoding] Supports H.264/H.265 software decoding;
- [H.264 hardware decoding] Android/iOS support H.264 hardware decoding;
- [H.265 hardware decoding] Android/iOS support H.265 hardware decoding;
- [Hardware decoding modes] Android supports both Surface-mode and normal-mode hardware decoding;
- [Buffer time setting] Supports configuring the buffer time;
- [Fast first frame] Supports a fast-startup mode that renders the first frame as soon as possible;
- [Low-latency mode] Supports an ultra-low-latency mode (200–400 ms on the public network), comparable to interactive live-broadcast schemes;
- [Complex network handling] Automatically adapts to network events such as disconnection and reconnection;
- [Fast URL switching] Supports quickly switching to another URL during playback;
- [Rendering back ends] On Android, video: SurfaceView/OpenGL ES; audio: AudioTrack/OpenSL ES;
- [Real-time mute] Supports muting and unmuting a player in real time;
- [Real-time snapshot] Supports capturing the current frame during playback;
- [Rendering angle] Supports 0°, 90°, 180°, and 270° video rendering angles;
- [Image mirroring] Supports horizontal and vertical flip settings;
- [Download speed reporting] Supports real-time callbacks of the current download speed (with a configurable callback interval);
- [Encoded video callback] Supports callbacks of H.264/H.265 data before decoding;
- [Decoded video callback] Supports callbacks of decoded YUV/RGB data;
- [Audio data callback] Supports callbacks of AAC/PCMA/PCMU/Speex data;
- [Audio/video adaptation] Adapts automatically when audio or video parameters change during playback;
- [Recording integration] Works seamlessly with the video SDK (supports recording RTMP H.265 extension streams, transcoding PCMA/PCMU/Speex to AAC while recording, and recording audio only or video only).
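The download-speed callback in the list above delivers a value in bytes per second; for display it is usually converted to kbps and KB/s, as the demo code later in this article does. The conversion, extracted as a small standalone sketch (the helper name `formatSpeed` is an illustration, not part of the SDK):

```dart
// Convert a download speed in bytes/second into the "kbps, KB/s" text
// shown in the demo UI (same arithmetic as the event handler below).
String formatSpeed(double bytesPerSecond) {
  final kbps = (bytesPerSecond * 8 / 1000).toStringAsFixed(0); // bits/s -> kbps
  final kBps = (bytesPerSecond / 1024).toStringAsFixed(0); // bytes/s -> KB/s
  return '$kbps kbps, $kBps KB/s';
}

void main() {
  print(formatSpeed(128000)); // 1024 kbps, 125 KB/s
}
```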
RTSP live player features:
- [Playback protocol] High stability, ultra-low latency, an industry-leading RTSP live player SDK;
- [Multi-instance playback] Supports multiple player instances;
- [Event callbacks] Supports callbacks for network status and buffer status;
- [Video format] Supports H.265 and H.264; in addition, the Windows/Android platforms support RTSP MJPEG playback;
- [Audio format] Supports AAC/PCMA/PCMU;
- [H.264/H.265 software decoding] Supports H.264/H.265 software decoding;
- [H.264 hardware decoding] Android/iOS support H.264 hardware decoding;
- [H.265 hardware decoding] Android/iOS support H.265 hardware decoding;
- [Hardware decoding modes] Android supports both Surface-mode and normal-mode hardware decoding;
- [RTSP mode setting] Supports choosing RTSP over TCP or UDP;
- [Automatic TCP/UDP switchover] Supports automatically switching between RTSP over TCP and over UDP;
- [RTSP timeout setting] Supports setting the RTSP timeout, in seconds;
- [RTSP 401 handling] Reports RTSP 401 events and automatically handles authentication information carried in the URL;
- [Buffer time setting] Supports configuring the buffer time;
- [Fast first frame] Supports a fast-startup mode that renders the first frame as soon as possible;
- [Complex network handling] Automatically adapts to network events such as disconnection and reconnection;
- [Fast URL switching] Supports quickly switching to another URL during playback;
- [Rendering back ends] On Android, video: SurfaceView/OpenGL ES; audio: AudioTrack/OpenSL ES;
- [Real-time mute] Supports muting and unmuting a player in real time;
- [Real-time snapshot] Supports capturing the current frame during playback;
- [Rendering angle] Supports 0°, 90°, 180°, and 270° video rendering angles;
- [Image mirroring] Supports horizontal and vertical flip settings;
- [Download speed reporting] Supports real-time callbacks of the current download speed (with a configurable callback interval);
- [Encoded video callback] Supports callbacks of H.264/H.265 data before decoding;
- [Decoded video callback] Supports callbacks of decoded YUV/RGB data;
- [Audio data callback] Supports callbacks of AAC/PCMA/PCMU/Speex data;
- [Audio/video adaptation] Adapts automatically when audio or video parameters change during playback;
- [Recording integration] Works seamlessly with the video SDK (supports recording RTSP H.265 streams, transcoding PCMA/PCMU to AAC while recording, and recording audio only or video only).
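The automatic RTSP TCP/UDP switchover listed above works as a fallback: some RTSP servers only accept RTP over UDP, others only RTP over TCP, so if one transport fails the player retries on the other. The sketch below only illustrates that alternating behavior; `nextTransport` is a hypothetical helper, not the SDK's internal code:

```dart
// Hypothetical sketch of the RTSP transport fallback: if the stream
// cannot be played over the current transport, try the other one.
String nextTransport(String current) {
  return current == 'udp' ? 'tcp' : 'udp';
}

void main() {
  var transport = 'udp'; // initial attempt
  // First attempt fails -> switch to TCP.
  transport = nextTransport(transport);
  print(transport); // tcp
  // TCP also fails -> switch back to UDP, and so on.
  transport = nextTransport(transport);
  print(transport); // udp
}
```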
The interface (smartplayer.dart):
//
// smartplayer.dart
// smartplayer
//
// GitHub: https://github.com/daniulive/SmarterStreaming
// website: https://www.daniulive.com
//
// Created by daniulive on 2019/02/25.
// Copyright © 2014~2019 Daniulive. All rights reserved
//
import 'dart:async';
import 'dart:convert';
import 'package:flutter/services.dart';
class EVENTID {
  static const EVENT_DANIULIVE_COMMON_SDK = 0x00000000;
  static const EVENT_DANIULIVE_PLAYER_SDK = 0x01000000;
  static const EVENT_DANIULIVE_PUBLISHER_SDK = 0x02000000;

  static const EVENT_DANIULIVE_ERC_PLAYER_STARTED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x1;
  static const EVENT_DANIULIVE_ERC_PLAYER_CONNECTING =
      EVENT_DANIULIVE_PLAYER_SDK | 0x2;
  static const EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x3;
  static const EVENT_DANIULIVE_ERC_PLAYER_CONNECTED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x4;
  static const EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x5;
  static const EVENT_DANIULIVE_ERC_PLAYER_STOP =
      EVENT_DANIULIVE_PLAYER_SDK | 0x6;
  static const EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO =
      EVENT_DANIULIVE_PLAYER_SDK | 0x7;
  static const EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x8;
  static const EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL =
      EVENT_DANIULIVE_PLAYER_SDK | 0x9;
  static const EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE =
      EVENT_DANIULIVE_PLAYER_SDK | 0xA;
  static const EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE =
      EVENT_DANIULIVE_PLAYER_SDK | 0x21; /* Recording starts a new file */
  static const EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x22; /* One recorded file is finished */
  static const EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING =
      EVENT_DANIULIVE_PLAYER_SDK | 0x81;
  static const EVENT_DANIULIVE_ERC_PLAYER_BUFFERING =
      EVENT_DANIULIVE_PLAYER_SDK | 0x82;
  static const EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING =
      EVENT_DANIULIVE_PLAYER_SDK | 0x83;
  static const EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED =
      EVENT_DANIULIVE_PLAYER_SDK | 0x91;
}
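As the constants show, each event code packs a module ID in the high-order byte (`0x01000000` for the player SDK) and an event number in the low-order bits. A small standalone sketch of how a handler can classify an incoming code by masking (the mask and helper are inferred from the constants above, not SDK-provided):

```dart
// Module IDs live in the high byte of each event code.
const int playerSdkMask = 0x01000000; // EVENT_DANIULIVE_PLAYER_SDK

bool isPlayerEvent(int code) => (code & 0xFF000000) == playerSdkMask;

void main() {
  const started = playerSdkMask | 0x1; // EVENT_DANIULIVE_ERC_PLAYER_STARTED
  print(isPlayerEvent(started)); // true
  print(isPlayerEvent(0x02000001)); // false: a publisher-SDK event
}
```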
typedef SmartEventCallback = void Function(int, String, String, String);
class SmartPlayerController {
  MethodChannel _channel;
  EventChannel _eventChannel;
  SmartEventCallback _eventCallback;

  void init(int id) {
    _channel = MethodChannel('smartplayer_plugin_$id');
    _eventChannel = EventChannel('smartplayer_event_$id');
    _eventChannel.receiveBroadcastStream().listen(_onEvent, onError: _onError);
  }

  void setEventCallback(SmartEventCallback callback) {
    _eventCallback = callback;
  }

  void _onEvent(Object event) {
    if (event != null) {
      Map valueMap = json.decode(event);
      String param = valueMap['param'];
      onSmartEvent(param);
    }
  }

  void _onError(Object error) {
    //print('error:' + error);
  }

  Future<dynamic> _smartPlayerCall(String funcName) async {
    var ret = await _channel.invokeMethod(funcName);
    return ret;
  }

  Future<dynamic> _smartPlayerCallInt(String funcName, int param) async {
    var ret = await _channel.invokeMethod(funcName, {
      'intParam': param,
    });
    return ret;
  }

  Future<dynamic> _smartPlayerCallIntInt(
      String funcName, int param1, int param2) async {
    var ret = await _channel.invokeMethod(funcName, {
      'intParam': param1,
      'intParam2': param2,
    });
    return ret;
  }

  Future<dynamic> _smartPlayerCallString(String funcName, String param) async {
    var ret = await _channel.invokeMethod(funcName, {
      'strParam': param,
    });
    return ret;
  }
  /// <summary>
  /// Set the video decoder mode: 0: software decoder; 1: hardware decoder. Default: 0 (software)
  /// </summary>
  /// <param name="isHwDecoder"></param>
  Future<dynamic> setVideoDecoderMode(int isHwDecoder) async {
    return _smartPlayerCallInt('setVideoDecoderMode', isHwDecoder);
  }
  /// <summary>
  /// Set the audio output mode: 0: automatic selection; 1: AudioTrack mode. This interface is only available on the Android platform
  /// </summary>
  /// <param name="useAudiotrack"></param>
  Future<dynamic> setAudioOutputType(int useAudiotrack) async {
    return _smartPlayerCallInt('setAudioOutputType', useAudiotrack);
  }

  /// <summary>
  /// Set the buffer time of the player, in milliseconds. Default: 200 ms
  /// </summary>
  /// <param name="buffer"></param>
  Future<dynamic> setBuffer(int buffer) async {
    return _smartPlayerCallInt('setBuffer', buffer);
  }

  /// <summary>
  /// Can be called in real time: set whether to mute, 1: mute; 0: unmute
  /// </summary>
  /// <param name="isMute"></param>
  Future<dynamic> setMute(int isMute) async {
    return _smartPlayerCallInt('setMute', isMute);
  }

  /// <summary>
  /// Set the RTSP mode, 1: TCP; 0: UDP
  /// </summary>
  /// <param name="isUsingTcp"></param>
  Future<dynamic> setRTSPTcpMode(int isUsingTcp) async {
    return _smartPlayerCallInt('setRTSPTcpMode', isUsingTcp);
  }

  /// <summary>
  /// Set the RTSP timeout, in seconds; must be greater than 0
  /// </summary>
  /// <param name="timeout"></param>
  Future<dynamic> setRTSPTimeout(int timeout) async {
    return _smartPlayerCallInt('setRTSPTimeout', timeout);
  }
  /// <summary>
  /// Set automatic RTSP TCP/UDP switchover.
  /// Some RTSP servers only support RTP over UDP, others only RTP over TCP.
  /// If UDP playback fails, the SDK automatically tries TCP; if TCP playback fails, it automatically tries UDP.
  /// </summary>
  /// <param name="isAutoSwitchTcpUdp"></param>
  Future<dynamic> setRTSPAutoSwitchTcpUdp(int isAutoSwitchTcpUdp) async {
    return _smartPlayerCallInt('setRTSPAutoSwitchTcpUdp', isAutoSwitchTcpUdp);
  }
  /// <summary>
  /// Set fast-startup mode (fast first frame)
  /// </summary>
  /// <param name="isFastStartup"></param>
  Future<dynamic> setFastStartup(int isFastStartup) async {
    return _smartPlayerCallInt('setFastStartup', isFastStartup);
  }

  /// <summary>
  /// Set ultra-low-latency mode, 0: disable; 1: enable. Default: 0
  /// </summary>
  /// <param name="mode"></param>
  Future<dynamic> setPlayerLowLatencyMode(int mode) async {
    return _smartPlayerCallInt('setPlayerLowLatencyMode', mode);
  }

  /// <summary>
  /// Set vertical flip of the video
  /// </summary>
  /// <param name="isFlip"></param>
  Future<dynamic> setFlipVertical(int isFlip) async {
    return _smartPlayerCallInt('setFlipVertical', isFlip);
  }

  /// <summary>
  /// Set horizontal flip of the video
  /// </summary>
  /// <param name="isFlip"></param>
  Future<dynamic> setFlipHorizontal(int isFlip) async {
    return _smartPlayerCallInt('setFlipHorizontal', isFlip);
  }

  /// <summary>
  /// Set clockwise rotation; note that any angle other than 0 degrees consumes extra performance
  /// degrees: supports 0°, 90°, 180°, and 270° rotation
  /// </summary>
  /// <param name="degrees"></param>
  Future<dynamic> setRotation(int degrees) async {
    return _smartPlayerCallInt('setRotation', degrees);
  }
  /// <summary>
  /// Set whether to report the download speed
  /// isReport: 1: report the download speed; 0: do not report.
  /// reportInterval: reporting interval, in seconds, > 0.
  /// </summary>
  /// <param name="isReport"></param>
  /// <param name="reportInterval"></param>
  Future<dynamic> setReportDownloadSpeed(
      int isReport, int reportInterval) async {
    return _smartPlayerCallIntInt(
        'setReportDownloadSpeed', isReport, reportInterval);
  }

  /// <summary>
  /// Set the playback orientation; this interface is only available on the Android platform
  /// surOrg: current orientation, 1: portrait; 2: landscape
  /// </summary>
  /// <param name="surOrg"></param>
  Future<dynamic> setOrientation(int surOrg) async {
    return _smartPlayerCallInt('setOrientation', surOrg);
  }
  /// <summary>
  /// Set whether snapshots may be taken during playback or recording
  /// </summary>
  /// <param name="isSaveImage"></param>
  Future<dynamic> setSaveImageFlag(int isSaveImage) async {
    return _smartPlayerCallInt('setSaveImageFlag', isSaveImage);
  }

  /// <summary>
  /// Take a snapshot during playback or recording
  /// </summary>
  /// <param name="imageName"></param>
  Future<dynamic> saveCurImage(String imageName) async {
    return _smartPlayerCallString('saveCurImage', imageName);
  }

  /// <summary>
  /// Quickly switch URLs during playback or recording
  /// </summary>
  /// <param name="uri"></param>
  Future<dynamic> switchPlaybackUrl(String uri) async {
    return _smartPlayerCallString('switchPlaybackUrl', uri);
  }

  /// <summary>
  /// Create a directory for recorded video files
  /// </summary>
  /// <param name="path"></param>
  Future<dynamic> createFileDirectory(String path) async {
    return _smartPlayerCallString('createFileDirectory', path);
  }

  /// <summary>
  /// Set the directory for recorded video files
  /// </summary>
  /// <param name="path"></param>
  Future<dynamic> setRecorderDirectory(String path) async {
    return _smartPlayerCallString('setRecorderDirectory', path);
  }

  /// <summary>
  /// Set the maximum size of a single recorded video file
  /// </summary>
  /// <param name="size"></param>
  Future<dynamic> setRecorderFileMaxSize(int size) async {
    return _smartPlayerCallInt('setRecorderFileMaxSize', size);
  }

  /// <summary>
  /// Set whether to transcode audio to AAC when recording.
  /// AAC is more widely supported, so the SDK can convert other audio encodings (such as Speex, PCMU, PCMA) to AAC.
  /// isTranscode: if set to 1, audio that is not AAC is converted to AAC (AAC streams are left as-is); if set to 0, no conversion is done. Default: 0.
  /// </summary>
  /// <param name="isTranscode"></param>
  Future<dynamic> setRecorderAudioTranscodeAAC(int isTranscode) async {
    return _smartPlayerCallInt('setRecorderAudioTranscodeAAC', isTranscode);
  }
  /// <summary>
  /// Set the playback URL
  /// </summary>
  Future<dynamic> setUrl(String url) async {
    return _smartPlayerCallString('setUrl', url);
  }

  /// <summary>
  /// Start playing
  /// </summary>
  Future<dynamic> startPlay() async {
    return _smartPlayerCall('startPlay');
  }

  /// <summary>
  /// Stop playing
  /// </summary>
  Future<dynamic> stopPlay() async {
    return _smartPlayerCall('stopPlay');
  }

  /// <summary>
  /// Start recording
  /// </summary>
  Future<dynamic> startRecorder() async {
    return _smartPlayerCall('startRecorder');
  }

  /// <summary>
  /// Stop recording
  /// </summary>
  Future<dynamic> stopRecorder() async {
    return _smartPlayerCall('stopRecorder');
  }

  /// <summary>
  /// Close the player and release resources
  /// </summary>
  Future<dynamic> dispose() async {
    return await _channel.invokeMethod('dispose');
  }
  void onSmartEvent(String param) {
    if (!param.contains(",")) {
      print("[onNTSmartEvent] Android passed a malformed parameter");
      return;
    }
    List<String> strs = param.split(',');
    String code = strs[1];
    String param1 = strs[2];
    String param2 = strs[3];
    String param3 = strs[4];
    String param4 = strs[5];
    int evCode = int.parse(code);
    var p1, p2, p3;
    switch (evCode) {
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STARTED:
        print("Started...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTING:
        print("Connecting...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED:
        print("Connection failed.");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTED:
        print("Connection successful...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED:
        print("Connection down...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STOP:
        print("Stopped playing...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO:
        print("Width: " + param1 + ", height: " + param2);
        p1 = param1;
        p2 = param2;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED:
        print("No media data received; the URL may be wrong.");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL:
        print("Switching playback URL...");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE:
        print("Snapshot: " + param1 + ", path: " + param3);
        if (int.parse(param1) == 0) {
          print("Snapshot saved successfully.");
        } else {
          print("Snapshot failed.");
        }
        p1 = param1;
        p2 = param3;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE:
        print("[Record] Started a new video file: " + param3);
        p3 = param3;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED:
        print("[Record] Finished a video file: " + param3);
        p3 = param3;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING:
        print("Start_Buffering");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_BUFFERING:
        print("Buffering: " + param1 + "%");
        p1 = param1;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING:
        print("Stop_Buffering");
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED:
        print("download_speed: " +
            (double.parse(param1) * 8 / 1000).toStringAsFixed(0) +
            " kbps, " +
            (double.parse(param1) / 1024).toStringAsFixed(0) +
            " KB/s");
        p1 = param1;
        break;
    }
    if (_eventCallback != null) {
      _eventCallback(evCode, p1, p2, p3);
    }
  }
}
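The Android layer delivers each event as one comma-separated string, which `onSmartEvent` splits into a sender tag, an event code, and four parameters. A minimal standalone sketch of that parsing step (the field order is inferred from the indexing in `onSmartEvent`; the `SmartEvent` class and the sample payload are illustrative only):

```dart
// A tiny holder for a parsed event; not part of the SDK.
class SmartEvent {
  final int code;
  final List<String> params;
  SmartEvent(this.code, this.params);
}

SmartEvent parseEvent(String raw) {
  final parts = raw.split(',');
  // parts[0] is the sender tag; parts[1] is the numeric event code;
  // the remaining fields are param1..param4.
  return SmartEvent(int.parse(parts[1]), parts.sublist(2));
}

void main() {
  // 16777223 == 0x01000007 == EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO
  final ev = parseEvent('player,16777223,1920,1080,,');
  print(ev.code == 0x01000007); // true
  print('width=${ev.params[0]}, height=${ev.params[1]}');
}
```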
Example call (main.dart):
//
// main.dart
// main
//
// GitHub: https://github.com/daniulive/SmarterStreaming
// website: https://www.daniulive.com
//
// Created by daniulive on 2019/02/25.
// Copyright © 2014~2019 Daniulive. All rights reserved
//
import 'dart:io';
import 'package:flutter/services.dart';
import 'package:flutter/material.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/foundation.dart';
import 'package:smartplayer_native_view/smartplayer.dart';
import 'package:smartplayer_native_view/smartplayer_plugin.dart';
void main() {
  ///
  /// force portrait
  ///
  SystemChrome.setPreferredOrientations(
      [DeviceOrientation.portraitUp, DeviceOrientation.portraitDown]);
  runApp(new MyApp());
}
class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => new _MyAppState();
}
class _MyAppState extends State<MyApp> {
  SmartPlayerController player;
  double aspectRatio = 4.0 / 3.0;

  // Text field for the RTMP/RTSP URL to play
  TextEditingController playback_url_controller_ = TextEditingController();

  // Text field that displays event callbacks
  TextEditingController event_controller_ = TextEditingController();

  bool is_playing_ = false;
  bool is_mute_ = false;
  var rotate_degrees_ = 0;

  Widget smartPlayerView() {
    return SmartPlayerWidget(
      onSmartPlayerCreated: onSmartPlayerCreated,
    );
  }
  @override
  void initState() {
    print("initState called..");
    super.initState();
  }

  @override
  void didChangeDependencies() {
    print('didChangeDependencies called..');
    super.didChangeDependencies();
  }

  @override
  void deactivate() {
    print('deactivate called..');
    super.deactivate();
  }

  @override
  void dispose() {
    print("dispose called..");
    player.dispose();
    super.dispose();
  }
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
        home: Scaffold(
      appBar: AppBar(
        title: const Text('Flutter SmartPlayer Demo'),
      ),
      body: new SingleChildScrollView(
        child: new Column(
          children: <Widget>[
            new Container(
              color: Colors.black,
              child: AspectRatio(
                child: smartPlayerView(),
                aspectRatio: aspectRatio,
              ),
            ),
            new TextField(
              controller: playback_url_controller_,
              keyboardType: TextInputType.text,
              decoration: InputDecoration(
                contentPadding: EdgeInsets.all(10.0),
                icon: Icon(Icons.link),
                labelText: 'Please enter an RTSP/RTMP URL',
              ),
              autofocus: false,
            ),
            new Row(
              children: [
                new RaisedButton(
                    onPressed: this.onSmartPlayerStartPlay,
                    child: new Text("Start playing")),
                new Container(width: 20),
                new RaisedButton(
                    onPressed: this.onSmartPlayerStopPlay,
                    child: new Text("Stop playing")),
                new Container(width: 20),
                new RaisedButton(
                    onPressed: this.onSmartPlayerMute,
                    child: new Text("Mute")),
              ],
            ),
            new Row(
              children: [
                new RaisedButton(
                    onPressed: this.onSmartPlayerSwitchUrl,
                    child: new Text("Switch URL in real time")),
                new Container(width: 20),
                new RaisedButton(
                    onPressed: this.onSmartPlayerSetRotation,
                    child: new Text("Rotate view in real time")),
              ],
            ),
            new TextField(
              controller: event_controller_,
              keyboardType: TextInputType.text,
              decoration: InputDecoration(
                contentPadding: EdgeInsets.all(10.0),
                icon: Icon(Icons.event_note),
                labelText: 'Event state callback',
              ),
              autofocus: false,
            ),
          ],
        ),
      ),
    ));
  }

  void _eventCallback(int code, String param1, String param2, String param3) {
    String event_str;
    switch (code) {
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STARTED:
        event_str = "Started...";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTING:
        event_str = "Connecting...";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED:
        event_str = "Connection failed..";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTED:
        event_str = "Connection successful..";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED:
        event_str = "Connection down...";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STOP:
        event_str = "Stopped playing...";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO:
        event_str = "Width: " + param1 + ", height: " + param2;
        setState(() {
          aspectRatio = double.parse(param1) / double.parse(param2);
          print('change aspectRatio: $aspectRatio');
        });
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED:
        event_str = "No media data received; the URL may be wrong..";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL:
        event_str = "Switching playback URL..";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE:
        event_str = "Snapshot: " + param1 + ", path: " + param3;
        if (int.parse(param1) == 0) {
          print("Snapshot saved successfully.");
        } else {
          print("Snapshot failed.");
        }
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE:
        event_str = "[record] new file: " + param3;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED:
        event_str = "[record] recording finished: " + param3;
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING:
        //event_str = "Start buffering";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_BUFFERING:
        event_str = "Buffering: " + param1 + "%";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING:
        //event_str = "Stop buffering";
        break;
      case EVENTID.EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED:
        event_str = "download_speed: " +
            (double.parse(param1) * 8 / 1000).toStringAsFixed(0) +
            " kbps, " +
            (double.parse(param1) / 1024).toStringAsFixed(0) +
            " KB/s";
        break;
    }
    event_controller_.text = event_str;
  }
  void onSmartPlayerCreated(SmartPlayerController controller) async {
    player = controller;
    player.setEventCallback(_eventCallback);
    var ret = -1;

    // Set the video decoder mode (0: software, 1: hardware)
    var is_video_hw_decoder = 0;
    if (defaultTargetPlatform == TargetPlatform.android) {
      ret = await player.setVideoDecoderMode(is_video_hw_decoder);
    } else if (defaultTargetPlatform == TargetPlatform.iOS) {
      is_video_hw_decoder = 1;
      ret = await player.setVideoDecoderMode(is_video_hw_decoder);
    }

    // Set the buffer time
    var play_buffer = 100;
    ret = await player.setBuffer(play_buffer);

    // Enable fast startup
    var is_fast_startup = 1;
    ret = await player.setFastStartup(is_fast_startup);

    // Whether to enable low-latency mode
    var is_low_latency_mode = 0;
    ret = await player.setPlayerLowLatencyMode(is_low_latency_mode);

    // Report the download speed (here every 2 seconds; the interval is adjustable)
    ret = await player.setReportDownloadSpeed(1, 2);

    // Set the RTSP timeout
    var rtsp_timeout = 10;
    ret = await player.setRTSPTimeout(rtsp_timeout);

    // Enable automatic RTSP TCP/UDP switchover
    var is_auto_switch_tcp_udp = 1;
    ret = await player.setRTSPAutoSwitchTcpUdp(is_auto_switch_tcp_udp);

    // Force RTSP over TCP
    //ret = await player.setRTSPTcpMode(1);

    // Fill in an initial URL on first startup to make testing easier
    playback_url_controller_.text = "rtmp://live.hkstv.hk.lxdns.com/live/hks2";
  }
  Future<void> onSmartPlayerStartPlay() async {
    var ret = -1;
    if (playback_url_controller_.text.length < 8) {
      playback_url_controller_.text =
          "rtmp://live.hkstv.hk.lxdns.com/live/hks1"; // Fall back to an initial URL
    }

    // Real-time mute setting
    ret = await player.setMute(is_mute_ ? 1 : 0);

    if (!is_playing_) {
      ret = await player.setUrl(playback_url_controller_.text);
      ret = await player.startPlay();
      if (ret == 0) {
        is_playing_ = true;
      }
    }
  }
  Future<void> onSmartPlayerStopPlay() async {
    if (is_playing_) {
      await player.stopPlay();
      playback_url_controller_.clear();
      is_playing_ = false;
      is_mute_ = false;
    }
  }

  Future<void> onSmartPlayerMute() async {
    if (is_playing_) {
      is_mute_ = !is_mute_;
      await player.setMute(is_mute_ ? 1 : 0);
    }
  }
  Future<void> onSmartPlayerSwitchUrl() async {
    if (is_playing_) {
      if (playback_url_controller_.text.length < 8) {
        playback_url_controller_.text =
            "rtmp://live.hkstv.hk.lxdns.com/live/hks1";
      }
      await player.switchPlaybackUrl(playback_url_controller_.text);
    }
  }
  Future<void> onSmartPlayerSetRotation() async {
    if (is_playing_) {
      rotate_degrees_ += 90;
      rotate_degrees_ = rotate_degrees_ % 360;
      if (0 == rotate_degrees_) {
        print("No rotation");
      } else if (90 == rotate_degrees_) {
        print("Rotate 90 degrees");
      } else if (180 == rotate_degrees_) {
        print("Rotate 180 degrees");
      } else if (270 == rotate_degrees_) {
        print("Rotate 270 degrees");
      }
      await player.setRotation(rotate_degrees_);
    }
  }
}
In our tests, RTMP and RTSP playback in the Flutter environment delivers the same excellent playback experience as the native SDKs.