Using the iFlytek SDK in Unity
Preparation: 1. basic knowledge of speech recognition; 2. Android Studio, Unity, and the iFlytek voice SDK.
Working in Android Studio
1. Create the project
Create a new project in AS (the name doesn't matter), then create a new module via File -> New -> New Module.
2. Create folders
Then create two new packages under the java folder, one to manage speech recognition and one for wake-up, and create the asrPort class in the recognition package.
3. Import Unity's classes.jar package
Copy classes.jar from the Unity installation path (Unity\Editor\Data\PlaybackEngines\AndroidPlayer\Variations\mono\Release\Classes) into the module's libs folder.
4. Import the iFlytek classes.jar package
Copy all the files under the libs folder of the iFlytek SDK into the module's libs folder, next to Unity's classes.jar.
5. Associate two classes.jar packages
Right-click each .jar file in the libs folder and choose Add As Library…. Alternatively, right-click the iFlytekVoice module, choose Open Module Settings, and add the .jar files manually there. Click Apply when done. Either way, this just registers the jars as dependencies in the module's build.gradle.
6. Add libmsc.so
Create a jniLibs folder under src/main and put libmsc.so in it. libmsc.so is found under libs\armeabi-v7a in the iFlytek SDK folder; it is best to copy the whole armeabi-v7a folder, so that you end up with src/main/jniLibs/armeabi-v7a/libmsc.so.
7. Modify the AndroidManifest file
Copy the <activity> block (the blue content in the original screenshot) from the app module's AndroidManifest into our module's AndroidManifest. One thing to watch: the activity's android:name must be changed to the class created earlier, i.e. .asr.asrPort.
Then add permission declarations to the AndroidManifest, such as microphone and network access:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
The AndroidManifest is now configured.
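One pitfall worth flagging (my addition, not a step from the original write-up): on Android 6.0 and above, RECORD_AUDIO is a dangerous permission, so the manifest entry alone is not enough; the app must also request it at runtime, or the recognizer records silence. A minimal sketch, assuming it lives inside an Activity such as the asrPort class written in the next step:

import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Build;

// Call from onCreate(), after super.onCreate(savedInstanceState).
// The request code (1) is arbitrary.
private void ensureMicPermission() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M
            && checkSelfPermission(Manifest.permission.RECORD_AUDIO)
                    != PackageManager.PERMISSION_GRANTED) {
        requestPermissions(new String[]{Manifest.permission.RECORD_AUDIO}, 1);
    }
}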
8. Write the SDK interface
At this point the iFlytek SDK is almost configured, and the next step is to write the interface code. Speech recognition comes first, written in the asrPort class created earlier:
package com.example.iflytekvoice.asr;
import android.os.Bundle;
import com.example.iflytekvoice.JsonParser;
import com.iflytek.cloud.RecognizerListener;
import com.iflytek.cloud.RecognizerResult;
import com.iflytek.cloud.SpeechConstant;
import com.iflytek.cloud.SpeechError;
import com.iflytek.cloud.SpeechRecognizer;
import com.iflytek.cloud.SpeechUtility;
import com.unity3d.player.UnityPlayer;
import com.unity3d.player.UnityPlayerActivity;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.HashMap;
import java.util.LinkedHashMap;
public class asrPort extends UnityPlayerActivity {
private SpeechRecognizer mIat;
private HashMap<String, String> mIatResults = new LinkedHashMap<String, String>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Initialize the SDK; replace 60307482 with your own APPID from the iFlytek open platform
SpeechUtility.createUtility(this, SpeechConstant.APPID + "=60307482");
mIat = SpeechRecognizer.createRecognizer(this, null);
// Set mIat parameters
// Which service: iat is speech-to-text dictation
mIat.setParameter(SpeechConstant.DOMAIN, "iat");
// Set the language
mIat.setParameter(SpeechConstant.LANGUAGE, "zh_cn");
// Set the accent
mIat.setParameter(SpeechConstant.ACCENT, "mandarin");
// Which engine: the cloud engine
mIat.setParameter(SpeechConstant.ENGINE_TYPE, SpeechConstant.TYPE_CLOUD);
}
RecognizerListener mRecognizerLis = new RecognizerListener() {
@Override
public void onVolumeChanged(int i, byte[] bytes) {
}
@Override
public void onBeginOfSpeech() {
}
@Override
public void onEndOfSpeech() {
}
@Override
public void onResult(RecognizerResult recognizerResult, boolean b) {
printResult(recognizerResult);
}
@Override
public void onError(SpeechError speechError) {
}
@Override
public void onEvent(int i, int i1, int i2, Bundle bundle) {}
};
// How to parse the JSON result
// Taken from the printResult method in the SDK demo (speechDemo -> java -> voiceDemo -> IatDemo)
private void printResult(RecognizerResult results) {
String text = JsonParser.parseIatResult(results.getResultString());
String sn = null;
// Read the SN field in the JSON result
try {
JSONObject resultJson = new JSONObject(results.getResultString());
sn = resultJson.optString("sn");
} catch (JSONException e) {
e.printStackTrace();
}
mIatResults.put(sn, text);
StringBuffer resultBuffer = new StringBuffer();
for (String key : mIatResults.keySet()) {
resultBuffer.append(mIatResults.get(key));
}
// Send the message to the OnResult method on the iFlytekASRController object in the Unity scene
UnityPlayer.UnitySendMessage("iFlytekASRController", "OnResult", resultBuffer.toString());
}
public void beginListen(){
// Start identification
mIat.startListening(mRecognizerLis);
}
public void connected(){
UnityPlayer.UnitySendMessage("iFlytekASRController", "tryConnected", "The connection worked.");
}
public int beginTest(int a, int b){
// Interactive testing
return a + b;
}
}
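The class above starts recognition but never stops or releases it. Here is a hedged sketch of companion methods, based on the stopListening()/cancel()/destroy() calls used in iFlytek's own demo (the method names stopListen and releaseRecognizer are my own additions):

public void stopListen() {
    if (mIat != null) {
        // Stop recording; results already captured are still delivered to the listener
        mIat.stopListening();
    }
}

public void releaseRecognizer() {
    if (mIat != null) {
        mIat.cancel();   // abandon any session in progress
        mIat.destroy();  // free the engine resources
    }
}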
A class for parsing the JSON results is also needed; create JsonParser directly in the java folder. Each iat result is a JSON object whose ws (words) array holds one entry per recognized word; each entry's cw (candidate words) array lists candidates, with the text in the w field, which is exactly what parseIatResult walks below.
package com.example.iflytekvoice;
import org.json.JSONArray;
import org.json.JSONObject;
import org.json.JSONTokener;
/**
 * JSON result parsing class
 */
public class JsonParser {
public static String parseIatResult(String json) {
StringBuffer ret = new StringBuffer();
try {
JSONTokener tokener = new JSONTokener(json);
JSONObject joResult = new JSONObject(tokener);
JSONArray words = joResult.getJSONArray("ws");
for (int i = 0; i < words.length(); i++) {
// Recognized word; by default use the first (best) candidate
JSONArray items = words.getJSONObject(i).getJSONArray("cw");
JSONObject obj = items.getJSONObject(0);
ret.append(obj.getString("w"));
// If multiple candidate results are required, parse the other fields of the array
// for(int j = 0; j < items.length(); j++)
// {
// JSONObject obj = items.getJSONObject(j);
// ret.append(obj.getString("w"));
// }
}
} catch (Exception e) {
e.printStackTrace();
}
return ret.toString();
}
public static String parseGrammarResult(String json) {
StringBuffer ret = new StringBuffer();
try {
JSONTokener tokener = new JSONTokener(json);
JSONObject joResult = new JSONObject(tokener);
JSONArray words = joResult.getJSONArray("ws");
for (int i = 0; i < words.length(); i++) {
JSONArray items = words.getJSONObject(i).getJSONArray("cw");
for(int j = 0; j < items.length(); j++)
{
JSONObject obj = items.getJSONObject(j);
if(obj.getString("w").contains("nomatch"))
{
ret.append("No match.");
return ret.toString();
}
ret.append("[Result]" + obj.getString("w"));
ret.append("[confidence]" + obj.getInt("sc"));
ret.append("\n"); }}}catch (Exception e) {
e.printStackTrace();
ret.append("No match.");
}
return ret.toString();
}
public static String parseLocalGrammarResult(String json) {
StringBuffer ret = new StringBuffer();
try {
JSONTokener tokener = new JSONTokener(json);
JSONObject joResult = new JSONObject(tokener);
JSONArray words = joResult.getJSONArray("ws");
for (int i = 0; i < words.length(); i++) {
JSONArray items = words.getJSONObject(i).getJSONArray("cw");
for(int j = 0; j < items.length(); j++)
{
JSONObject obj = items.getJSONObject(j);
if(obj.getString("w").contains("nomatch"))
{
ret.append("No match.");
return ret.toString();
}
ret.append("[Result]" + obj.getString("w"));
ret.append("\n");
}
}
ret.append("[confidence]" + joResult.optInt("sc"));
} catch (Exception e) {
e.printStackTrace();
ret.append("No match.");
}
return ret.toString();
}
public static String parseTransResult(String json,String key) {
StringBuffer ret = new StringBuffer();
try {
JSONTokener tokener = new JSONTokener(json);
JSONObject joResult = new JSONObject(tokener);
String errorCode = joResult.optString("ret");
if (!errorCode.equals("0")) {
return joResult.optString("errmsg");
}
JSONObject transResult = joResult.optJSONObject("trans_result");
ret.append(transResult.optString(key));
/* JSONArray words = joResult.getJSONArray("results");
for (int i = 0; i < words.length(); i++) {
JSONObject obj = words.getJSONObject(i);
ret.append(obj.getString(key));
} */
} catch (Exception e) {
e.printStackTrace();
}
return ret.toString();
}
}
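To make the ws/cw/w walk concrete, here is a hedged usage example; the JSON below is an illustrative shape inferred from the parser itself, not captured SDK output, and the demo class is my own (runs on Android, or anywhere with the org.json library on the classpath):

public class JsonParserDemo {
    public static void main(String[] args) {
        // Two recognized words, one candidate each
        String sample = "{\"sn\":1,\"ws\":["
                + "{\"cw\":[{\"sc\":0,\"w\":\"今天\"}]},"
                + "{\"cw\":[{\"sc\":0,\"w\":\"天气\"}]}]}";
        // Concatenates the first candidate of each word: prints 今天天气
        System.out.println(JsonParser.parseIatResult(sample));
    }
}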
The public methods added to asrPort are what Unity will call later. With the speech recognition class written in asrPort, the next step is to package everything as an AAR for Unity to import.
9. Package the AAR
To let Unity use the project written above in AS, select the module, right-click it, choose "Make Module 'iFlytekVoice'", and wait a moment; the resulting AAR appears under the module's build/outputs/aar directory.
Copy the AAR package and the AndroidManifest into a folder of your own.
10. Modify AAR and AndroidManifest
First, handle the files inside the AAR (it is just an archive): go into its libs folder, delete the classes.jar there (the "fake" one in the original screenshot), and move the classes.jar from the AAR's root directory (the "real" one, containing our compiled code) into libs, so that only the real one remains.
Then edit the AndroidManifest inside the AAR and delete the android:label line (typically android:label="@string/app_name"). That label is the packaged APK's name; if it is not deleted, it conflicts with the AndroidManifest outside the AAR. (I had to wade through a lot of bugs to get this right…)
For the AndroidManifest outside the AAR, only the package needs changing. It does not have to match the package inside the AAR, but the Package Name set in Unity's PlayerSettings must be exactly this one, otherwise Unity cannot call the methods written in AS.
At this point the work in AS is finished and the AAR package has been produced; everything from here on happens in Unity.
Working in Unity
1. Import the AAR package
Create a Plugins/Android folder under Unity's Assets folder and place the AAR package and the AndroidManifest file inside it.
2. Build a simple UI
Here is a simple example: a Text component displays the recognized speech, and a Button starts recognition when clicked. The Try button below checks that the AAR package is hooked up correctly in Unity; clicking it calls the test methods written in AS.
3. Script hanging on the iFlytekASRController object
The code is as follows: just two simple click events plus calls to the methods in the AAR package.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
public class SpeechManager : MonoBehaviour
{
public Text ASRmsg;
public Text tryTex;
public Text aad;
private Button TryBtn;
private Button ASR_Btn;
private AndroidJavaObject jo;
private void Awake()
{
ASR_Btn = GameObject.Find("Speech/ASR_Btn").GetComponent<Button>();
TryBtn = GameObject.Find("Test/TryBtn").GetComponent<Button>();
AndroidJavaClass jc = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
jo = jc.GetStatic<AndroidJavaObject>("currentActivity");
}
void Start()
{
ASR_Btn.onClick.AddListener(() => {
jo.Call("beginListen");
});
TryBtn.onClick.AddListener(tryConnect);
}
public void OnResult(string msg)
{
ASRmsg.text = msg;
}
/// <summary>
/// Click the Try button
/// </summary>
public void tryConnect()
{
int aaa;
aaa=jo.Call<int>("beginTest".2.3);
aad.text = aaa.ToString();
jo.Call("connected");
}
public void tryConnected(string tryMsg)
{
tryTex.text = tryMsg;
Color ramColor = ColorRandom();
tryTex.color = ramColor;
}
public Color ColorRandom()
{
float r = Random.Range(0f,1f);
float g = Random.Range(0f, 1f);
float b = Random.Range(0f, 1f);
Color color = new Color(r, g, b);
return color;
}
}
Remember to attach the script to the iFlytekASRController object: that object name is hard-coded on the AS side (the first argument of UnitySendMessage), so if you rename the object, change the name in AS as well.
4. Call a method
The calls follow a fixed pattern; the Unity side is otherwise free-form, as long as the method names exactly match those written in Android Studio, otherwise the calls fail.
(1) Unity calling methods in Android Studio: this is the AndroidJavaClass/AndroidJavaObject pattern already used in Awake() and Start() above, i.e. get the com.unity3d.player.UnityPlayer class, read its static currentActivity field, then invoke jo.Call("methodName", args).
(2) Android Studio calling methods in Unity: this is done with UnityPlayer.UnitySendMessage, as in the printResult method earlier.
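A minimal sketch of the pattern (the GameObject and method names are the ones used in this project; the wrapper class and method are my own framing):

import com.unity3d.player.UnityPlayer;

public class UnityBridgeExample {
    // Push a string to a public method on a named GameObject in the active scene.
    // All three arguments are plain strings.
    static void notifyUnity(String text) {
        UnityPlayer.UnitySendMessage(
                "iFlytekASRController",  // GameObject name in the Unity scene
                "OnResult",              // method on a script attached to that object
                text);                   // single string parameter
    }
}

If the target object or method does not exist, the message is quietly dropped, which is why the names must match exactly.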
5. Modify the PlayerSettings in Unity
The Package Name must be identical to the package in the AndroidManifest outside the AAR; otherwise the methods written in the AAR package cannot be called.
6. Package the APK and test on a real device
Summary: that is the whole process of integrating a simple iFlytek speech recognition SDK into Unity; how you use it from there is up to you. The article isn't short, but the truly core parts are few. It looks complicated, yet it doesn't actually do much. I don't know Android well, and I'm not even very skilled with Android Studio; I fumbled around online for about a week before learning how to package an AAR, do simple Android and Unity interaction, integrate the iFlytek SDK, and so on. This article mainly records the whole process of hooking the iFlytek speech recognition SDK into Unity. The core is small, but no detail can go wrong, because I dug all of this out of a pile of bugs; Android is simply too far from my usual territory.
I originally wanted to write one article covering speech recognition plus voice wake-up, but it would have been too long to read, so this one covers only speech recognition; the next article will cover voice wake-up (only the wake-up interface). If anything doesn't work, or I've got something wrong, let me know in the comments section and I'll reply when I have time.
You can download the complete resource file: iFlytek voice recognition + wake-up demo (.zip).