Although this article implements Baidu speech-to-text for Flutter, the feature is actually implemented on the Android side: data is passed between Flutter and Android (as shown below) to provide the functionality Flutter needs. Baidu speech-to-text can also be implemented on iOS, but since my computer runs Windows I cannot develop for iOS with Flutter, so this tutorial targets Android.
Before getting into the principle, let's look at the effect first. As the example shows, Baidu speech-to-text generally appends a period to the end of the recognized Chinese text.
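Since that trailing period comes back as part of the recognized string, app code often strips it before display. A tiny illustrative helper, not part of the original tutorial:

```java
class TrimPeriod {
    // Baidu ASR typically appends a Chinese full stop ("。") to the final
    // recognition result; strip it if you want the bare text.
    static String stripTrailingPeriod(String text) {
        if (text != null && text.endsWith("。")) {
            return text.substring(0, text.length() - 1);
        }
        return text;
    }

    public static void main(String[] args) {
        System.out.println(stripTrailingPeriod("故宫门票。")); // prints "故宫门票"
    }
}
```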
1. Create a Flutter project to implement the View side.
Do not check the red circle below
2. Create an Android Module
1. Open the android folder inside the Flutter project
Find the android folder inside the Flutter project, open it with Android Studio, and locate MainActivity. No need to worry here; the first build may be a little slow.
Opened successfully.
1. Create a Module
Remember to choose Android Library.
Name it whatever you like; I'll call it asr_plugin. Created successfully.
3. Configure the Baidu Voice to Text SDK
1. Download the SDK
Baidu SDK download address: ai.baidu.com/sdk
2. Configure the SDK in asr_plugin
1. Find the core folder
2. Place the following files into the libs folder of the asr_plugin module
Don't forget to load the JAR package
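If you prefer wiring the jars in Gradle rather than through the IDE, a common sketch for the module's build.gradle (assuming the jars sit in asr_plugin/libs; your Baidu jar names may differ):

```groovy
dependencies {
    // Pick up every jar dropped into asr_plugin/libs,
    // including the Baidu speech core jar
    implementation fileTree(dir: 'libs', include: ['*.jar'])
}
```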
3. Paste jniLibs into your src folder
As shown in the figure: delete the unused packages, leaving only the .so files shown.
4. Configure permissions in your AndroidManifest.xml
The code is as follows:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.asr_plugin">

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application>
        <!-- <service android:name="com.baidu.speech.VoiceRecognitionService" android:exported="false" /> -->
        <meta-data
            android:name="com.baidu.speech.APP_ID"
            android:value="15519387" />
        <meta-data
            android:name="com.baidu.speech.API_KEY"
            android:value="Ywyh8ErHPQXBSapdQZqVBtdl" />
        <meta-data
            android:name="com.baidu.speech.SECRET_KEY"
            android:value="DFPpDWnOGrNGzxfX07D5GFLryh3d3Nne" />
    </application>
</manifest>
Some readers may want to replace the APP_ID, API_KEY, and SECRET_KEY with values from their own application. However, when I swapped in my own, I got an error code (I forget which). You can use the values above directly; if you want to configure your own, go ahead and try, and look up any error code on the reference site: ai.baidu.com/ai-doc/SPEE…
Load the asr_plugin library into the build.gradle file of the Flutter project's android folder.
This step loads asr_plugin into the Flutter app so the two sides can exchange data later. There is one more step to remember here as well.
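For reference, wiring a local library module into the app is standard Gradle configuration; a sketch of what it typically involves (module name asr_plugin as above; file paths assume the default Flutter android folder layout):

```groovy
// android/settings.gradle — make Gradle aware of the module
include ':app', ':asr_plugin'

// android/app/build.gradle — depend on it from the Flutter app
dependencies {
    implementation project(':asr_plugin')
}
```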
4. Build Bridges between Flutter and Android
1. Configure a MethodChannel on the Flutter side
Create an asr_manager.dart file and communicate through a MethodChannel. Flutter offers three kinds of platform channels; what we need here is method invocation. There are three methods: start recording, stop recording, and cancel recording. The code is as follows:
import 'package:flutter/services.dart';

class AsrManager {
  static const MethodChannel _channel = const MethodChannel('asr_plugin');

  /// Start recording
  static Future<String> start({Map params}) async {
    return await _channel.invokeMethod('start', params ?? {});
  }

  /// Stop recording
  static Future<String> stop() async {
    return await _channel.invokeMethod('stop');
  }

  /// Cancel recording
  static Future<String> cancel() async {
    return await _channel.invokeMethod('cancel');
  }
}
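Under the hood, a MethodChannel call is just name-based dispatch: the Dart side sends a method name ("start", "stop", "cancel") over the channel, and the platform side switches on it. A dependency-free Java sketch of that dispatch pattern (plain strings stand in for the real MethodChannel types; the handler return strings are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Minimal stand-in for MethodChannel dispatch: method names map to handlers,
// unknown names fall through to "notImplemented" (like result.notImplemented()).
class ChannelDispatchSketch {
    private final Map<String, Supplier<String>> handlers = new HashMap<>();

    ChannelDispatchSketch() {
        handlers.put("start", () -> "recording started");
        handlers.put("stop", () -> "recording stopped");
        handlers.put("cancel", () -> "recording cancelled");
    }

    String onMethodCall(String method) {
        Supplier<String> handler = handlers.get(method);
        return handler != null ? handler.get() : "notImplemented";
    }

    public static void main(String[] args) {
        ChannelDispatchSketch channel = new ChannelDispatchSketch();
        System.out.println(channel.onMethodCall("start"));   // prints "recording started"
        System.out.println(channel.onMethodCall("unknown")); // prints "notImplemented"
    }
}
```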
2. Configure the MethodChannel on the Android side
All of these files except AsrPlugin can be found in the Baidu SDK you downloaded, though with some modifications. You can download them from my Baidu netdisk below; they should work as-is:
Pan.baidu.com/s/1ODBeSEBf… Extraction code: GNKQ
For the AsrPlugin file to work, you need to configure the Flutter environment on the Android side by adding the following code to the module's build.gradle file:
def localProperties = new Properties()
def localPropertiesFile = rootProject.file('local.properties')
if (localPropertiesFile.exists()) {
localPropertiesFile.withReader('UTF-8') { reader ->
localProperties.load(reader)
}
}
def flutterRoot = localProperties.getProperty('flutter.sdk')
apply from: "$flutterRoot/packages/flutter_tools/gradle/flutter.gradle"
flutter {
    source '../..'
}
This configuration lets the Android module use Flutter classes. If the environment is not configured properly, the Flutter side will not be able to communicate with the Android side. The following code implements the start, stop, and cancel operations called from Flutter:
import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.util.Log;
import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import java.util.ArrayList;
import java.util.Map;
import io.flutter.plugin.common.BinaryMessenger;
import io.flutter.plugin.common.MethodCall;
import io.flutter.plugin.common.MethodChannel;
public class AsrPlugin implements MethodChannel.MethodCallHandler {
    private final static String TAG = "AsrPlugin";
    private final Activity activity;
    private ResultStateful resultStateful;
    private AsrManager asrManager;

    public static void registerWith(Activity activity, BinaryMessenger messenger) {
        MethodChannel channel = new MethodChannel(messenger, "asr_plugin");
        AsrPlugin instance = new AsrPlugin(activity);
        channel.setMethodCallHandler(instance);
    }

    public AsrPlugin(Activity activity) {
        this.activity = activity;
    }

    @Override
    public void onMethodCall(MethodCall methodCall, MethodChannel.Result result) {
        initPermission();
        switch (methodCall.method) {
            case "start":
                resultStateful = ResultStateful.of(result);
                start(methodCall, resultStateful);
                break;
            case "stop":
                stop(methodCall, result);
                break;
            case "cancel":
                cancel(methodCall, result);
                break;
            default:
                result.notImplemented();
        }
    }

    private void start(MethodCall call, ResultStateful result) {
        if (activity == null) {
            Log.e(TAG, "Ignored start, current activity is null.");
            result.error("Ignored start, current activity is null.", null, null);
            return;
        }
        if (getAsrManager() != null) {
            getAsrManager().start(call.arguments instanceof Map ? (Map) call.arguments : null);
        } else {
            Log.e(TAG, "Ignored start, current getAsrManager is null.");
            result.error("Ignored start, current getAsrManager is null.", null, null);
        }
    }

    private void stop(MethodCall call, MethodChannel.Result result) {
        if (asrManager != null) {
            asrManager.stop();
        }
    }

    private void cancel(MethodCall call, MethodChannel.Result result) {
        if (asrManager != null) {
            asrManager.cancel();
        }
    }

    @Nullable
    private AsrManager getAsrManager() {
        if (asrManager == null) {
            if (activity != null && !activity.isFinishing()) {
                asrManager = new AsrManager(activity, onAsrListener);
            }
        }
        return asrManager;
    }

    /**
     * Android 6.0+ requires requesting permissions dynamically at runtime.
     */
    private void initPermission() {
        String[] permissions = {Manifest.permission.RECORD_AUDIO,
                Manifest.permission.ACCESS_NETWORK_STATE,
                Manifest.permission.INTERNET,
                Manifest.permission.READ_PHONE_STATE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE
        };
        ArrayList<String> toApplyList = new ArrayList<String>();
        for (String perm : permissions) {
            if (PackageManager.PERMISSION_GRANTED != ContextCompat.checkSelfPermission(activity, perm)) {
                toApplyList.add(perm); // permission not yet granted
            }
        }
        String[] tmpList = new String[toApplyList.size()];
        if (!toApplyList.isEmpty()) {
            ActivityCompat.requestPermissions(activity, toApplyList.toArray(tmpList), 123);
        }
    }

    private OnAsrListener onAsrListener = new OnAsrListener() {
        @Override
        public void onAsrReady() {}

        @Override
        public void onAsrBegin() {}

        @Override
        public void onAsrEnd() {}

        @Override
        public void onAsrPartialResult(String[] results, RecogResult recogResult) {}

        @Override
        public void onAsrOnlineNluResult(String nluResult) {}

        @Override
        public void onAsrFinalResult(String[] results, RecogResult recogResult) {
            if (resultStateful != null) {
                resultStateful.success(results[0]);
            }
        }

        @Override
        public void onAsrFinish(RecogResult recogResult) {}

        @Override
        public void onAsrFinishError(int errorCode, int subErrorCode, String descMessage, RecogResult recogResult) {
            if (resultStateful != null) {
                resultStateful.error(descMessage, null, null);
            }
        }

        @Override
        public void onAsrLongFinish() {}

        @Override
        public void onAsrVolume(int volumePercent, int volume) {}

        @Override
        public void onAsrAudio(byte[] data, int offset, int length) {}

        @Override
        public void onAsrExit() {}

        @Override
        public void onOfflineLoaded() {}

        @Override
        public void onOfflineUnLoaded() {}
    };
}
3. Configure the packaging environment in the Flutter project's android folder
To avoid feature failures caused by environment conflicts, configure it as shown in the figure below. The code is as follows:
ndk {
    /* Package only the ABIs Flutter supports; Flutter has no armeabi .so files.
       x86/x86_64 are included for emulator compatibility. For release builds
       you can keep just "armeabi-v7a". */
    abiFilters "armeabi-v7a", "arm64-v8a", "x86_64", "x86"
}
packagingOptions {
    /* Ensure that app and asr_plugin both depending on libflutter.so / libapp.so
       does not cause a merge conflict.
       See https://github.com/card-io/card.io-Android-SDK/issues/186#issuecomment-427552552 */
    pickFirst 'lib/x86_64/libflutter.so'
    pickFirst 'lib/x86_64/libapp.so'
    pickFirst 'lib/x86/libflutter.so'
    pickFirst 'lib/arm64-v8a/libflutter.so'
    pickFirst 'lib/arm64-v8a/libapp.so'
    pickFirst 'lib/armeabi-v7a/libapp.so'
}
5. Register plugin to realize communication function
In the Flutter project's android folder, we call the AsrPlugin communication class from the Android module we created above, and use it to carry the data communication that achieves the desired effect.
import android.os.Bundle;
import androidx.annotation.NonNull;
import com.example.asr_plugin.AsrPlugin;
import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugins.GeneratedPluginRegistrant;
public class MainActivity extends FlutterActivity {
@Override
public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
GeneratedPluginRegistrant.registerWith(flutterEngine);
// Register the custom plugin with flutter SDK >= v1.17.0
AsrPlugin.registerWith(this, flutterEngine.getDartExecutor().getBinaryMessenger());
}
@Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
    }
}
6. Finally, write a Dart UI to show the effect
The code looks like this:
import 'package:flutter/material.dart';
import 'asr_manager.dart';

void main() {
  runApp(MaterialApp(
    home: SpeakPage(),
  ));
}

// Speech recognition page
class SpeakPage extends StatefulWidget {
  @override
  _SpeakPageState createState() => _SpeakPageState();
}

class _SpeakPageState extends State<SpeakPage>
    with SingleTickerProviderStateMixin {
  String speakTips = 'Long press to speak';
  String speakResult = '';
  String result = 'Palace Museum ticket\nBeijing one-day tour\nDisneyland';
  Animation animation;
  AnimationController controller;

  @override
  void initState() {
    controller = AnimationController(
        vsync: this, duration: Duration(milliseconds: 1000));
    animation = CurvedAnimation(parent: controller, curve: Curves.easeIn)
      ..addStatusListener((status) {
        if (status == AnimationStatus.completed) {
          controller.reverse();
        } else if (status == AnimationStatus.dismissed) {
          controller.forward();
        }
      });
    super.initState();
  }

  @override
  void dispose() {
    controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Container(
        padding: EdgeInsets.all(30),
        child: Center(
          child: Column(
            mainAxisAlignment: MainAxisAlignment.spaceBetween,
            children: <Widget>[_topItem(), _bottomItem()],
          ),
        ),
      ),
    );
  }

  _speakStart() {
    controller.forward();
    setState(() {
      speakTips = '- Recognizing -';
    });
    AsrManager.start().then((text) {
      if (text != null && text.length > 0) {
        setState(() {
          speakResult = text;
          result = speakResult;
        });
        // Navigator.pop(context);
        // Navigator.push(context, MaterialPageRoute(
        //     builder: (context) => SearchPage(keyword: speakResult)));
        print(" error " + text);
      }
    }).catchError((e) {
      print("----------123" + e.toString());
    });
  }

  _speakStop() {
    setState(() {
      speakTips = 'Long press to speak';
    });
    controller.reset();
    controller.stop();
    AsrManager.stop();
  }

  _topItem() {
    return Column(
      children: <Widget>[
        Padding(
            padding: EdgeInsets.fromLTRB(0, 30, 0, 30),
            child: Text('You could say',
                style: TextStyle(fontSize: 16, color: Colors.black54))),
        Text(result,
            textAlign: TextAlign.center,
            style: TextStyle(
              fontSize: 15,
              color: Colors.grey,
            )),
        Padding(
          padding: EdgeInsets.all(20),
          child: Text(
            speakResult,
            style: TextStyle(color: Colors.blue),
          ),
        )
      ],
    );
  }

  _bottomItem() {
    return FractionallySizedBox(
      widthFactor: 1,
      child: Stack(
        children: <Widget>[
          GestureDetector(
            onTapDown: (e) {
              _speakStart();
            },
            onTapUp: (e) {
              _speakStop();
            },
            onTapCancel: () {
              _speakStop();
            },
            child: Center(
              child: Column(
                children: <Widget>[
                  Padding(
                    padding: EdgeInsets.all(10),
                    child: Text(
                      speakTips,
                      style: TextStyle(color: Colors.blue, fontSize: 12),
                    ),
                  ),
                  Stack(
                    children: <Widget>[
                      Container(
                        // Placeholder to keep the parent layout size stable
                        height: MIC_SIZE,
                        width: MIC_SIZE,
                      ),
                      Center(
                        child: AnimatedMic(
                          animation: animation,
                        ),
                      )
                    ],
                  )
                ],
              ),
            ),
          ),
          Positioned(
            right: 0,
            bottom: 20,
            child: GestureDetector(
              onTap: () {
                Navigator.pop(context);
              },
              child: Icon(
                Icons.close,
                size: 30,
                color: Colors.grey,
              ),
            ),
          )
        ],
      ),
    );
  }
}

const double MIC_SIZE = 80;

class AnimatedMic extends AnimatedWidget {
  static final _opacityTween = Tween<double>(begin: 1.0, end: 0.5);
  static final _sizeTween = Tween<double>(begin: MIC_SIZE, end: MIC_SIZE - 20);

  AnimatedMic({Key key, Animation animation})
      : super(key: key, listenable: animation);

  @override
  Widget build(BuildContext context) {
    final Animation animation = listenable;
    return Opacity(
      opacity: _opacityTween.evaluate(animation),
      child: Container(
        height: _sizeTween.evaluate(animation),
        width: _sizeTween.evaluate(animation),
        decoration: BoxDecoration(
          color: Colors.blue,
          borderRadius: BorderRadius.circular(MIC_SIZE / 2),
        ),
        child: Icon(
          Icons.mic,
          color: Colors.white,
          size: 30,
        ),
      ),
    );
  }
}
Some of the steps above may be redundant, and I may have missed things, so feel free to ask if you have any questions. The implementation mainly comes from the teacher of the Flutter video course I studied; the teacher's blog: blog.csdn.net/fengyuzheng…