⚠️ Note: there is a surprise at the end, so be sure to read all the way through!

Recently I saw the news that a 1:1 life-size Gundam has been installed in Jinqiao, Shanghai.

As a Gundam fan, I have always wanted to see it in person and feel the overwhelming presence of a full-size Gundam, but I had no chance to go to Shanghai. That didn't stop me, though: I decided to build my own 1:1 Gundam instead.

With the help of AR technology, I finally fulfilled this wish. See for yourself: the effect is hardly inferior to the one in Shanghai Jinqiao~❗

What is Augmented Reality (AR)?

AR (augmented reality) is an emerging technology of recent years. It overlays virtual elements, such as 3D models, onto the real world, so that virtual and real information complement each other and thereby "augment" reality.

Many people confuse AR (augmented reality) with VR (virtual reality). The difference between the two lies in the proportion of virtual elements:

  • VR: everything you see, scenes and characters alike, is virtual; it places your consciousness inside a virtual world.
  • AR: the scenes and characters are partly real and partly virtual; it brings virtual information into the real world.

VR has been researched for more than 30 years, while AR is much younger and has gradually become known with the spread of smartphones and smart wearables. Compared with VR, the barrier to AR development is much lower: as long as you have a smartphone, you can build your own AR application with ARCore, provided by Google.

ARCore is Google's AR solution. It provides APIs that let developers sense the surrounding environment on Android, iOS and other phone platforms and create immersive AR experiences.

ARCore provides developers with three major capabilities:

(Schematic diagram of ARCore's capabilities)

  • Motion tracking: by recognizing visual feature points in the camera image, ARCore tracks the phone's position and works out how virtual elements should move relative to it.
  • Environmental understanding: by detecting clusters of feature points on common horizontal or vertical surfaces, such as tables or walls, ARCore finds planes and their boundaries, on which virtual objects can be placed.
  • Light estimation: ARCore estimates the lighting conditions of the current scene and uses that information to illuminate virtual AR objects.

ARCore gives an app the ability to perceive its surroundings, but a complete AR application also has to render 3D models, which traditionally required OpenGL ES and came with a steep learning curve. Google was aware of this problem and, following the launch of ARCore in 2017, unveiled Sceneform, a 3D rendering library for Android, at Google I/O 2018.

Common 3D model file formats such as .obj, .fbx or .gltf can be opened in mainstream 3D software, but on Android we could previously only render them with OpenGL code. Sceneform converts model files in these formats, together with their dependent resource files (.mtl, .bin, .png, etc.), into .sfa and .sfb files. The latter is the binary model file that Sceneform renders; the former is a human-readable description of the latter.
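To get a feel for what the .obj source format describes (vertex and face records in plain text), here is a toy sketch of my own; it has nothing to do with Sceneform's actual converter and only pulls vertex positions out of .obj text:

```kotlin
// Toy sketch: extract vertex positions from Wavefront .obj text.
// A real converter (like Sceneform's) handles many more record types
// (normals "vn", texture coords "vt", faces "f", materials, ...).
data class Vec3(val x: Float, val y: Float, val z: Float)

fun parseObjVertices(objText: String): List<Vec3> =
    objText.lineSequence()
        .map { it.trim() }
        .filter { it.startsWith("v ") }          // "v x y z" = vertex position
        .map { line ->
            val (x, y, z) = line.removePrefix("v ").trim()
                .split(Regex("\\s+")).map(String::toFloat)
            Vec3(x, y, z)
        }
        .toList()

fun main() {
    val obj = """
        # a single triangle
        v 0.0 0.0 0.0
        v 1.0 0.0 0.0
        v 0.0 1.0 0.0
        f 1 2 3
    """.trimIndent()
    println(parseObjVertices(obj).size)  // 3 vertices
}
```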

Sceneform is much easier to use than OpenGL, and with the Android Studio plugin provided by Sceneform you can even preview .sfb models directly in the IDE.

Next, I will use Sceneform and ARCore to build my 1:1 Gundam.

1. Add Gradle dependencies

Create a new Android Studio project and add the Sceneform plugin to the root build.gradle:

dependencies {
    classpath 'com.google.ar.sceneform:plugin:1.15.0'
}

Then add the ARCore and Sceneform AARs to your app module's build.gradle:

dependencies {
    ...
    implementation 'com.google.ar:core:1.15.0'
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
    implementation 'com.google.ar.sceneform:core:1.15.0'
}

2. Declare permissions in the manifest


<uses-permission android:name="android.permission.CAMERA"/>

<!-- On Google Play, this app will only be visible on devices that support ARCore -->
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>

<application ... >
  <!-- If ARCore is not installed on the device, Google Play will install it automatically -->
  <meta-data android:name="com.google.ar.core" android:value="required" />

</application>

3. Layout file

An ArFragment hosts the AR scene and responds to user interaction. The simplest way to display virtual elements on Android is to add an ArFragment to your layout:


      
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>

4. Create an SFB model file

3D models are usually created with professional software such as Maya or 3ds Max, and many modeling enthusiasts upload their work to design websites for free or paid download.

We can download 3D model files in common formats from such websites. Taking .obj as an example: the .obj file describes the vertices and faces of the polygons, while other information, such as color and material, is stored in the matching .mtl file. We copy the downloaded model files (.obj/.mtl/.png) into a directory outside assets, so that the source files are not packaged into the APK.

For example app/sampledata

Then, in build.gradle, we call sceneform.asset(...) to convert the .obj into an .sfb, as follows:

sceneform.asset('sampledata/msz-006_zeta_gundam/scene.obj',
        'default',
        'sampledata/msz-006_zeta_gundam/scene.sfa',
        'src/main/assets/scene')

sampledata/msz-006_zeta_gundam/scene.obj is the location of the .obj source file, and src/main/assets/scene is the target path for the generated .sfb. We generate the target file under assets/ so that it is packaged into the APK and easy to load at run time.

After Gradle is configured, sync and build the project. During the build, an .sfb file with the same name will be generated under assets/.

5. Load and render the model

//MainActivity.kt

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        arFragment = supportFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment
        arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
            if (plane.type != Plane.Type.HORIZONTAL_UPWARD_FACING) {
                return@setOnTapArPlaneListener
            }

            val anchor = hitResult.createAnchor()
            placeObject(arFragment, anchor, Uri.parse("scene.sfb"))
        }
    }

The ArFragment responds to user taps in the AR scene by adding a virtual element at the tapped location. Uri.parse("scene.sfb") retrieves the model file generated under assets.

    private fun placeObject(fragment: ArFragment, anchor: Anchor, model: Uri) {
        ModelRenderable.builder()
                .setSource(fragment.context, model)
                .build()
                .thenAccept {
                    addNodeToScene(fragment, anchor, it)
                }
                .exceptionally { throwable: Throwable ->
                    Toast.makeText(fragment.context, "Error: ${throwable.message}", Toast.LENGTH_LONG).show()
                    return@exceptionally null
                }
    }

Sceneform provides ModelRenderable for model rendering; setSource loads the .sfb model file.
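The thenAccept / exceptionally chaining above is the standard java.util.concurrent.CompletableFuture API, which is what ModelRenderable.builder().build() returns. Stripped of the Android parts, the pattern looks like this (the simulated "load" is my own illustration):

```kotlin
import java.util.concurrent.CompletableFuture

fun main() {
    // Simulate an async "model load" with a plain CompletableFuture.
    CompletableFuture.supplyAsync { "scene.sfb loaded" }
        .thenAccept { result -> println(result) }   // success path
        .exceptionally { throwable ->               // failure path
            println("Error: ${throwable.message}")
            null
        }
        .join()  // block only for this demo; the AR app stays asynchronous
}
```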

   private fun addNodeToScene(fragment: ArFragment, anchor: Anchor, renderable: Renderable) {
        val anchorNode = AnchorNode(anchor)
        val node = TransformableNode(fragment.transformationSystem)
        node.renderable = renderable
        node.setParent(anchorNode)
        fragment.arSceneView.scene.addChild(anchorNode)
        node.select()
    }

The ArSceneView holds a Scene, a tree data structure whose root is the root node of the AR scene; virtual elements are added to the Scene as its children in order to be rendered.

val node = TransformableNode(fragment.transformationSystem)
node.renderable = renderable
node.setParent(anchorNode)

So rendering a 3D model simply means adding a Node and setting its Renderable.

The HitResult carries the location the user tapped, and hitResult.createAnchor() creates an Anchor there. The AnchorNode built from it is added as a child of the Scene's root node and also serves as the parent of the TransformableNode. The TransformableNode hosts the 3D model and can be dragged and scaled with gestures; attaching it to the AnchorNode effectively places the 3D model at the tapped position.
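To make the parent/child idea concrete, here is a toy scene-graph sketch in plain Kotlin. These are my own illustrative classes, not Sceneform's; the point is that a child node's world position is composed from its parents up the tree, which is why the model "follows" its anchor:

```kotlin
// Toy scene graph: world position = sum of local offsets up to the root.
class Node(var localX: Float = 0f, var localY: Float = 0f) {
    var parent: Node? = null
        private set
    private val children = mutableListOf<Node>()

    fun addChild(child: Node) {
        child.parent = this
        children.add(child)
    }

    fun worldX(): Float = localX + (parent?.worldX() ?: 0f)
    fun worldY(): Float = localY + (parent?.worldY() ?: 0f)
}

fun main() {
    val scene = Node()                 // root, like Sceneform's Scene
    val anchorNode = Node(localX = 2f) // like an AnchorNode at a tapped point
    val modelNode = Node(localY = 1f)  // like a TransformableNode holding the model
    scene.addChild(anchorNode)
    anchorNode.addChild(modelNode)
    // The model inherits the anchor's position automatically:
    println(modelNode.worldX() to modelNode.worldY())  // (2.0, 1.0)
}
```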

6. Complete code

import android.app.Activity
import android.app.ActivityManager
import android.content.Context
import android.net.Uri
import android.os.Build
import android.os.Bundle
import android.view.MotionEvent
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.core.Anchor
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.rendering.Renderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

class MainActivity : AppCompatActivity() {
    lateinit var arFragment: ArFragment

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (!checkIsSupportedDeviceOrFinish(this)) return
        setContentView(R.layout.activity_main)

        arFragment = supportFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment
        arFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent? ->
            if (plane.type != Plane.Type.HORIZONTAL_UPWARD_FACING) return@setOnTapArPlaneListener
            val anchor = hitResult.createAnchor()
            placeObject(arFragment, anchor, Uri.parse("scene.sfb"))
        }
    }

    private fun placeObject(arFragment: ArFragment, anchor: Anchor, model: Uri) {
        ModelRenderable.builder()
                .setSource(arFragment.context, model)
                .build()
                .thenAccept { modelRenderable: ModelRenderable -> addNodeToScene(arFragment, anchor, modelRenderable) }
                .exceptionally { throwable: Throwable ->
                    Toast.makeText(arFragment.context, "Error: ${throwable.message}", Toast.LENGTH_LONG).show()
                    return@exceptionally null
                }
    }

    private fun addNodeToScene(arFragment: ArFragment, anchor: Anchor, renderable: Renderable) {
        val anchorNode = AnchorNode(anchor)
        val node = TransformableNode(arFragment.transformationSystem)
        node.renderable = renderable
        node.setParent(anchorNode)
        arFragment.arSceneView.scene.addChild(anchorNode)
        node.select()
    }

    private fun checkIsSupportedDeviceOrFinish(activity: Activity): Boolean {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show()
            activity.finish()
            return false
        }
        val openGlVersionString = (activity.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager)
                .deviceConfigurationInfo
                .glEsVersion
        if (openGlVersionString.toDouble() < MIN_OPENGL_VERSION) {
            Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
                    .show()
            activity.finish()
            return false
        }
        return true
    }

    companion object {
        private const val MIN_OPENGL_VERSION = 3.0
    }
}

checkIsSupportedDeviceOrFinish verifies the runtime environment: Sceneform requires Android N (API 24) or later and OpenGL ES 3.0 or later.
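Stripped of the Android plumbing, the OpenGL part of that check boils down to a string-to-number comparison: ActivityManager reports glEsVersion as a string like "3.2". As a standalone sketch (the helper name is my own, not an Android API):

```kotlin
// Hypothetical helper mirroring the check above: glEsVersion is a string
// like "3.2"; Sceneform needs at least 3.0. Unparseable input fails closed.
fun meetsOpenGlRequirement(glEsVersion: String, minVersion: Double = 3.0): Boolean =
    glEsVersion.toDoubleOrNull()?.let { it >= minVersion } ?: false

fun main() {
    println(meetsOpenGlRequirement("3.2"))  // true
    println(meetsOpenGlRequirement("2.0"))  // false
}
```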

That's all the code. For so little code, the effect is remarkably good.

Finally

Sceneform and ARCore let you build AR applications quickly. Besides loading static 3D models, Sceneform can also load animated models.

With the rise of the "metaverse" concept, Google, Facebook and other giants are bound to increase their research investment in AR and even VR. Virtual reality technology may become the next generation of social and entertainment platforms after the mobile Internet, with huge room for imagination.

That's it for today; I'm off to hang out with a girl I just met 🙈

Finally, here are a few websites where you can download some "interesting" 3D models:

sketchfab.com/

123free3dmodels.com/

(The end)


Surprise at the end! 🎉 🎉

Thank you for your attention and welcome to comment on this article! 🙏 🙏

I will pick the two users whose comments are most popular and send each a small gift: a gold badge 🏅

  • Popular-comment ranking: likes + replies (excluding the author's own comments)
  • Deadline: midnight, September 10th

After that, I will contact the winners to send the gifts. Thanks to the Juejin platform for its support!