Phenomenon of the problem

Bottom tab bars are familiar to everyone: tap a tab to switch to the corresponding fragment. The current mainstream practice has gradually evolved into also playing a small animation on the tab icon when it is tapped. In general, Lottie is the library we rely on for this kind of animation (don't ask why; it just is). In actual development, however, we found that if the animation was even slightly complex, it would silently drop frames. This typically manifests as:

  • The first tap on the tab drops a few frames of the Lottie animation because the corresponding fragment is being initialized (which can take some time): an animation that was supposed to run 10 frames ends up running fewer than 10.

How to prove that this animation is not executed perfectly?

For example, we can print the progress of the animation as it executes:

mLottieAnimationView.addAnimatorUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
    @Override
    public void onAnimationUpdate(ValueAnimator animation) {
        Log.v("wuyue", "onAnimationUpdate ===" + animation.getAnimatedFraction());
    }
});

The first time the animation runs, note that the corresponding fragment is being initialized at the same time.

The second time the animation runs, the fragment has already been initialized.

We can clearly see that the first run produces far fewer update callbacks, which means many frames were never shown. If your animation is detailed and complex enough, the imperfection is visible to the naked eye; if it is not, you may only be able to detect it by printing the log this way.

What did Lottie do? Why does an animation drop frames?

This is not hard to guess: the fragment may consume system resources when initialized for the first time, or the fragment may simply not be written well enough, either of which can stall the main thread.

But why does a stalled main thread cause a Lottie animation to execute imperfectly rather than just finish late? What magic does Lottie have that lets it actively detect the frame rate and throttle itself? With that question in mind, let's look at the source code and see what Lottie does.

Lottie's animator is ultimately driven by Android's Choreographer.FrameCallback interface, so let's explain it a little first:

public interface FrameCallback {
        /**
         * Called when a new display frame is being rendered.
         * <p>
         * This method provides the time in nanoseconds when the frame started being rendered.
         * The frame time provides a stable time base for synchronizing animations
         * and drawing.  It should be used instead of {@link SystemClock#uptimeMillis()}
         * or {@link System#nanoTime()} for animations and drawing in the UI.  Using the frame
         * time helps to reduce inter-frame jitter because the frame time is fixed at the time
         * the frame was scheduled to start, regardless of when the animations or drawing
         * callback actually runs.  All callbacks that run as part of rendering a frame will
         * observe the same frame time so using the frame time also helps to synchronize effects
         * that are performed by different callbacks.
         * </p><p>
         * Please note that the framework already takes care to process animations and
         * drawing using the frame time as a stable time base.  Most applications should
         * not need to use the frame time information directly.
         * </p>
         *
         * @param frameTimeNanos The time in nanoseconds when the frame started being rendered,
         * in the {@link System#nanoTime()} timebase.  Divide this value by {@code 1000000}
         * to convert it to the {@link SystemClock#uptimeMillis()} time base.
         */
        public void doFrame(long frameTimeNanos);
    }


You can think of the doFrame method in this callback as being invoked every time the system renders a frame. By computing the time difference from the previous callback, we can tell whether UI rendering is falling behind. This is the mainstream frame-rate detection scheme in apps today.
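To make the idea concrete, here is a minimal sketch of that detection logic in plain Java. It models only the delta computation described above; in a real app you would register with Choreographer and feed it the frameTimeNanos from doFrame. The class and method names are illustrative, not from any real library.

```java
// Sketch of frame-drop detection: diff each doFrame timestamp against the
// previous one; a delta much larger than one vsync interval (~16.6 ms at
// 60 fps) means frames were dropped on the UI thread.
public class FrameDropDetector {
    private static final long FRAME_INTERVAL_NANOS = 16_666_667L; // ~60 fps
    private long lastFrameTimeNanos = -1;

    /** Returns how many whole frames were skipped since the last callback. */
    public int onDoFrame(long frameTimeNanos) {
        if (lastFrameTimeNanos < 0) {
            lastFrameTimeNanos = frameTimeNanos; // first callback: no delta yet
            return 0;
        }
        long delta = frameTimeNanos - lastFrameTimeNanos;
        lastFrameTimeNanos = frameTimeNanos;
        // e.g. a 50 ms gap at 60 fps means roughly 2 vsync slots were missed
        int skipped = (int) Math.round((double) delta / FRAME_INTERVAL_NANOS) - 1;
        return Math.max(skipped, 0);
    }

    public static void main(String[] args) {
        FrameDropDetector d = new FrameDropDetector();
        d.onDoFrame(0L);
        System.out.println(d.onDoFrame(16_666_667L)); // on time: 0 skipped
        System.out.println(d.onDoFrame(66_666_667L)); // 50 ms gap: 2 skipped
    }
}
```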

So what does Lottie do with this callback?

Looking back, Lottie's animation optimization is quite sophisticated. Instead of forcing the animation to advance exactly one frame per callback, it uses the doFrame timestamp to decide which frame to render, avoiding extra load on the whole UI system. If the system is running smoothly, the animation executes flawlessly; if the system drops frames, Lottie skips some animation frames so that the animation still finishes on time.
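The frame-skipping behavior described above can be sketched as follows: the next frame is derived from wall-clock elapsed time, not from blindly stepping frame++ on every callback. This is a simplified model of the idea, not Lottie's actual LottieValueAnimator code; all names here are illustrative.

```java
// Time-based frame selection: if the UI thread stalls, elapsed time jumps
// and the intermediate frames are simply never rendered, so the animation
// still ends on schedule.
public class TimeBasedAnimator {
    private final float frameRate;   // composition frame rate, e.g. 60 fps
    private final float totalFrames; // e.g. a 10-frame tab icon animation
    private float currentFrame = 0f;
    private long lastFrameTimeNanos = -1;

    public TimeBasedAnimator(float frameRate, float totalFrames) {
        this.frameRate = frameRate;
        this.totalFrames = totalFrames;
    }

    /** Called on every vsync; returns the frame that should be rendered now. */
    public float doFrame(long frameTimeNanos) {
        if (lastFrameTimeNanos < 0) {
            lastFrameTimeNanos = frameTimeNanos;
            return currentFrame;
        }
        long elapsedNanos = frameTimeNanos - lastFrameTimeNanos;
        lastFrameTimeNanos = frameTimeNanos;
        // frames to advance = elapsed time / duration of one frame
        float nanosPerFrame = 1_000_000_000f / frameRate;
        currentFrame = Math.min(currentFrame + elapsedNanos / nanosPerFrame, totalFrames);
        return currentFrame;
    }

    public static void main(String[] args) {
        TimeBasedAnimator a = new TimeBasedAnimator(60f, 10f);
        a.doFrame(0L);
        System.out.println(a.doFrame(16_666_667L)); // smooth vsync: advances ~1 frame
        System.out.println(a.doFrame(66_666_667L)); // 50 ms stall: jumps ~3 frames ahead
    }
}
```

Note the trade-off this design makes: animation duration stays correct under load, at the cost of visual smoothness, which is exactly the frame loss observed in the logs earlier.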

How to fix the problem?

There are actually two fixes to this problem:

  • Use Systrace to find where the fragment stalls during its first initialization and fix the root cause directly. This fix is more thorough, but the fly in the ointment is that it takes time; if you are not familiar with the code, it is not recommended too close to the project's release.
  • Since the problem only occurs the first time the fragment is initialized, delay the first animation until the fragment has finished initializing.

The code is as follows:

@Override
public void onResume() {
    super.onResume();
    if (mIsFirstInitFlag) {
        ((MainActivity) getActivity()).playForumLottieanim();
        mIsFirstInitFlag = false;
    }
}