As 3D technology continues to evolve, more and more museums are holding online exhibitions so that a wider audience can appreciate the beauty of history. By simulating different environments and lighting, and letting visitors rotate and zoom exhibits through 360°, these exhibitions deliver an immersive experience. On top of that, background music or voice narration can be added to each exhibit to explain its background in detail, making the scene feel even more lifelike.


Would you like to know how such a realistic display is achieved?

This can be done in an Android Studio Kotlin project, which supports 3D scene building, object display, and sound playback.

1. Prepare 3D models

Huawei Mobile Services' latest 3D Modeling Kit makes modeling easy. By taking photos of an object from different angles with a phone camera, it can automatically generate the object's 3D geometric model and textures, providing 3D model building, preview, and other capabilities for applications. For detailed instructions, see "5 minutes to build a 3D model of goods, how do I do it?"

2. Create a 3D object view

Next, we will prepare the 3D models of the exhibits and create an interactive 3D object view through the Huawei graphics engine service.


Integrating the Huawei graphics engine service

Software requirements (see the build.gradle sketch after this list):

• JDK 1.7 or later

• minSdkVersion: Set this parameter to 19 or later

• targetSdkVersion: Set this parameter to 19 or later

• compileSdkVersion: Set this parameter to 19 or later

• Gradle 3.5 or above
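
For reference, here is a minimal sketch of where these settings live in the app-level build.gradle file. The concrete numbers below are just example values that satisfy the minimums above, not part of the original demo; adjust them to your project:

android {
    compileSdkVersion 29      // must be 19 or later

    defaultConfig {
        minSdkVersion 19      // must be 19 or later
        targetSdkVersion 29   // must be 19 or later
        ...
    }
    ...
}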

In the project-level build.gradle file, configure the Huawei Maven repository as follows:

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
    ...
}

allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Configure the following in the application-level build.gradle file:

dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.1.0.300'
}

The sample project uses Kotlin's viewBinding feature to avoid view initialization boilerplate code. To enable viewBinding, add the following to the same build.gradle file:

android {
    ...
    buildFeatures {
        viewBinding true
    }
    ...
}

Once the build.gradle files are synchronized, you can use the graphics engine service in your project.

In this article, we only need the service to display 3D models of objects and interact with them. If you need other functions, see the official Huawei graphics engine service documentation.

Creating a 3D view

The purpose of creating a custom view is simply to ensure that the first model is loaded automatically as soon as the view is initialized. We do this by extending the default SceneView and implementing the model loading ourselves, as shown below:

import android.content.Context
import android.util.AttributeSet
import android.view.SurfaceHolder
import com.huawei.hms.scene.sdk.SceneView

class CustomSceneView : SceneView {
    constructor(context: Context?) : super(context)

    constructor(
        context: Context?,
        attributeSet: AttributeSet?
    ) : super(context, attributeSet)

    override fun surfaceCreated(holder: SurfaceHolder) {
        super.surfaceCreated(holder)
        loadScene("qinghuaci/scene.gltf")
        loadSpecularEnvTexture("qinghuaci/specularEnvTexture.dds")
        loadDiffuseEnvTexture("qinghuaci/diffuseEnvTexture.dds")
    }
}

To display the exhibits, we need to add the related model files. Open the project folder, create an "assets" folder under the "src/main" path, and place the 3D model files in it.
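
A possible layout of the assets folder, inferred purely from the asset paths this demo passes to loadScene() and the texture loaders (your folder and file names may differ), with one subfolder per exhibit (the second exhibit, tangyong, is added in the switching step below):

app/src/main/assets/
    qinghuaci/
        scene.gltf
        specularEnvTexture.dds
        diffuseEnvTexture.dds
    tangyong/
        scene.gltf
        specularEnvTexture.dds
        diffuseEnvTexture.dds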

The loadScene(), loadSpecularEnvTexture(), and loadDiffuseEnvTexture() methods in surfaceCreated() are used to load the exhibits. Once the surface is created, the first exhibit is loaded into it. Next, open the XML file that displays the view of the 3D model, in this case activity_main.xml, and add the CustomSceneView you just built. The code below also uses arrow images to switch between the different exhibit models.

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <com.example.sceneaudiodemo.CustomSceneView
        android:id="@+id/csv_main"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <ImageView
        android:id="@+id/iv_rightArrow"
        android:layout_width="32dp"
        android:layout_height="32dp"
        android:layout_margin="12dp"
        android:src="@drawable/ic_arrow"
        android:tint="@color/white"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <ImageView
        android:id="@+id/iv_leftArrow"
        android:layout_width="32dp"
        android:layout_height="32dp"
        android:layout_margin="12dp"
        android:rotation="180"
        android:src="@drawable/ic_arrow"
        android:tint="@color/white"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

When everything is ready, the application opens to reveal the first exhibit: a blue and white porcelain vase.

Add a switching function

Now, we can view multiple 3D exhibit models through a switching function. In MainActivity, add the following:

private lateinit var binding: ActivityMainBinding
private var selectedId = 0

private val modelSceneList = arrayListOf(
    "qinghuaci/scene.gltf",
    "tangyong/scene.gltf",
)
private val modelSpecularList = arrayListOf(
    "qinghuaci/specularEnvTexture.dds",
    "tangyong/specularEnvTexture.dds",
)
private val modelDiffList = arrayListOf(
    "qinghuaci/diffuseEnvTexture.dds",
    "tangyong/diffuseEnvTexture.dds",
)

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    binding = ActivityMainBinding.inflate(layoutInflater)
    val view = binding.root
    setContentView(view)

    binding.ivRightArrow.setOnClickListener {
        if (modelSceneList.size == 0) return@setOnClickListener
        selectedId = (selectedId + 1) % modelSceneList.size // Keep the ID within the range of the model list.
        loadImage()
    }
    binding.ivLeftArrow.setOnClickListener {
        if (modelSceneList.size == 0) return@setOnClickListener
        if (selectedId == 0)
            selectedId = modelSceneList.size - 1 // Keep the ID within the range of the model list.
        else
            selectedId -= 1
        loadImage()
    }
}

private fun loadImage() {
    binding.csvMain.loadScene(modelSceneList[selectedId])
    binding.csvMain.loadSpecularEnvTexture(modelSpecularList[selectedId])
    binding.csvMain.loadDiffuseEnvTexture(modelDiffList[selectedId])
}

In onCreate(), simple logic is created to switch to the next or previous model. selectedId is the ID of the exhibit model currently being displayed, and SceneView then displays the corresponding 3D model. The exhibit file paths are stored as hard-coded string lists; you can modify this logic to manage the models more dynamically, as sketched below.
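
One way to make this more dynamic, for example, is to group each exhibit's files into a single list entry instead of three parallel lists. A minimal sketch (the Exhibit class, exhibits list, and loadExhibit() are hypothetical names, not part of the original demo):

data class Exhibit(
    val scene: String,
    val specularEnvTexture: String,
    val diffuseEnvTexture: String
)

private val exhibits = listOf(
    Exhibit("qinghuaci/scene.gltf", "qinghuaci/specularEnvTexture.dds", "qinghuaci/diffuseEnvTexture.dds"),
    Exhibit("tangyong/scene.gltf", "tangyong/specularEnvTexture.dds", "tangyong/diffuseEnvTexture.dds"),
)

private fun loadExhibit(index: Int) {
    val exhibit = exhibits[index]
    binding.csvMain.loadScene(exhibit.scene)
    binding.csvMain.loadSpecularEnvTexture(exhibit.specularEnvTexture)
    binding.csvMain.loadDiffuseEnvTexture(exhibit.diffuseEnvTexture)
}

Adding a new exhibit then only requires one new list entry, and the index arithmetic in the click listeners stays the same.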

3. Add audio commentary for the exhibits

When loading different 3D models, we can play the corresponding audio commentary for each exhibit through the Huawei audio service, giving users a detailed introduction to the exhibits.

Integrating the Huawei audio service

Software requirements:

• JDK 1.8.211 or later

• minSdkVersion: Set this parameter to 21

• targetSdkVersion: Set this parameter to 29

• compileSdkVersion: Set this parameter to 29

• Gradle 4.6 or later

As you can see, the audio service has stricter software requirements than the graphics engine service, so make sure your project meets them.

First, open the application-level build.gradle file and add the configuration related to the audio service.

dependencies {
    ...
    implementation 'com.huawei.hms:audiokit-player:1.1.0.300'
    ...
}

When configuring the graphics engine service, you already added the necessary Maven repository, so there is no need to change the project-level build.gradle file. In the activity_main.xml file, add a simple Play button:

<Button
    android:id="@+id/btn_playSound"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Play"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent" />

This button can be used to play sounds for items on display. Next, add the following configuration to MainActivity:

private var mHwAudioManager: HwAudioManager? = null
private var mHwAudioPlayerManager: HwAudioPlayerManager? = null

override fun onCreate(savedInstanceState: Bundle?) {
    ...
    initPlayer(this)
    binding.btnPlaySound.setOnClickListener {
        // selectedId: index of the track to play in the playlist.
        mHwAudioPlayerManager?.play(selectedId)
    }
    ...
}

private fun initPlayer(context: Context) {
    val hwAudioPlayerConfig = HwAudioPlayerConfig(context)
    HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig,
        object : HwAudioConfigCallBack {
            override fun onSuccess(hwAudioManager: HwAudioManager?) {
                try {
                    mHwAudioManager = hwAudioManager
                    mHwAudioPlayerManager = hwAudioManager?.playerManager
                    mHwAudioPlayerManager?.playList(getPlaylist(), 0, 0)
                } catch (ex: Exception) {
                    ex.printStackTrace()
                }
            }

            override fun onError(p0: Int) {
                Log.e("init:onError: ", "$p0")
            }
        })
}

fun getPlaylist(): List<HwAudioPlayItem>? {
    val playItemList: MutableList<HwAudioPlayItem> = ArrayList()

    val audioPlayItem1 = HwAudioPlayItem()
    // Example of a local resource path (not used in this online demo); soundfilename does not include the file name extension.
    val sound = Uri.parse("android.resource://yourpackagename/raw/soundfilename").toString()
    audioPlayItem1.audioId = "1000"
    audioPlayItem1.singer = "Taoge"
    audioPlayItem1.onlinePath = "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-chengshilvren.mp3" // The demo uses songs here.
    audioPlayItem1.setOnline(1)
    audioPlayItem1.audioTitle = "chengshilvren"
    playItemList.add(audioPlayItem1)

    val audioPlayItem2 = HwAudioPlayItem()
    audioPlayItem2.audioId = "1001"
    audioPlayItem2.singer = "Taoge"
    audioPlayItem2.onlinePath = "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-dayu.mp3" // The demo uses songs here.
    audioPlayItem2.setOnline(1)
    audioPlayItem2.audioTitle = "dayu"
    playItemList.add(audioPlayItem2)

    return playItemList
}

After the above configuration is added, the commentary can be played for the exhibits. This project uses online audio resources; to play local audio, see the official website. In this way, you can import audio files and play a sound for each exhibit.
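
The Play button already plays the track whose index matches the exhibit being shown (play(selectedId)). If you want the commentary to start automatically whenever the user switches exhibits, one option (not part of the original demo) is to trigger the same call from loadImage():

private fun loadImage() {
    binding.csvMain.loadScene(modelSceneList[selectedId])
    binding.csvMain.loadSpecularEnvTexture(modelSpecularList[selectedId])
    binding.csvMain.loadDiffuseEnvTexture(modelDiffList[selectedId])
    // Play the commentary whose playlist index matches the exhibit index.
    mHwAudioPlayerManager?.play(selectedId)
}

The Play button can then be kept simply for replaying the current exhibit's commentary.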

At this point, we have created an exhibit display that can be rotated through 360°, zoomed in and out, and accompanied by audio commentary.

Finally, in addition to 3D exhibitions of cultural relics, we can apply these capabilities in many related industries, for example:

• Online social networking: cute face filters, video emojis, and virtual video backgrounds

• E-commerce: 3D product display, home decoration scene rendering, and AR fitting

• Audio and video: 3D unlock screensavers and phone themes, 3D special effects rendering, and live emojis

• Education: 3D teaching, 3D books, and VR distance learning

For more details, please refer to:

Huawei 3D modeling service and open-source repository

Huawei graphics engine service official website and open-source repository

Huawei audio service official website and open-source repository

Huawei HMS Core official forum

To solve integration problems, go to Stack Overflow

Click here to learn about the latest HMS Core technology