First, the AppWidgetProvider

A widget is a miniature view of an application: it can be embedded in a host such as the launcher (the home screen) and act as a carrier for a small slice of our app's functionality. Since the widget itself is a BroadcastReceiver and the app widget layout is built on RemoteViews, not every layout or view class is supported. Currently only the following classes are supported; if you need other views or custom views, support has to be added at the framework layer:

- `FrameLayout`
- `LinearLayout`
- `RelativeLayout`
- `GridLayout`

and the following widget classes:

- `AnalogClock`
- `Button`
- `Chronometer`
- `ImageButton`
- `ImageView`
- `ProgressBar`
- `TextView`
- `ViewFlipper`
- `ListView`
- `GridView`
- `StackView`
- `AdapterViewFlipper`
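A RemoteViews is driven entirely through a fixed set of setter methods, which is why the view set is restricted. As a quick orientation, here is a minimal sketch of how a RemoteViews is populated and pushed to the widget; it is only illustrative and borrows the layout, IDs, and RecorderAppWidget class that we create later in this article.

import android.appwidget.AppWidgetManager
import android.content.ComponentName
import android.content.Context
import android.widget.RemoteViews

// Minimal sketch: build a RemoteViews from a layout that uses only supported views,
// change some content through setters, and push it to every instance of our widget.
fun refreshWidget(context: Context) {
    val remoteViews = RemoteViews(context.packageName, R.layout.widget_recorder_remote_view)
    remoteViews.setTextViewText(R.id.widget_title_text, "Recording")
    remoteViews.setImageViewResource(R.id.widget_stop_bn, R.drawable.notification_btn_pause)

    val awm = AppWidgetManager.getInstance(context)
    val ids = awm.getAppWidgetIds(ComponentName(context, RecorderAppWidget::class.java))
    awm.updateAppWidget(ids, remoteViews)
}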

Second, common visual features of widgets

Judging from the official docs and other people's blogs, it seems the most we can do is a simple layout or list, at most a draggable card or a working clock, and of course make our own widgets look nice.

TikTok: the same search box offers six styles to choose from after installation.

OneNote: Simple design.

Our calendar: it looks good

Most widgets on the market simply provide shortcuts into common application modules; there is not much deeper interaction to dig into. Google probably restricted widgets this way to avoid memory churn, CPU load, and home-screen performance problems. But as developers we care about how our products look and how satisfied our users are, so animation and custom drawing are worth pursuing.

Can't widgets be animated? Can't they be custom-drawn?

Can we still pull off all those fancy Android custom-view tricks? Doesn't animation smell good? Isn't a custom View beautiful? So can we work around this? Modifying the framework would do it, but most developers have no way to change the system unless they ship their own ROM, so we have to grope our way forward on the existing widget APIs. Today we will work through it step by step; my write-up may be a little repetitive, but I hope readers who have been through the same thing can follow it. So, can the animations and customizations listed below be done on a desktop widget? Most articles and docs about widgets say no. My answer is yes, so let's get down to business and start exploring.

1. Animation:

Some nice animations done before in a View:

- Water wave animation
- Sound wave animation

2. Customization:

Some nice statistical charts done before in a View:

- Line chart
- Candlestick (K-line) chart



If you haven't used the custom-drawing APIs much, you can check my earlier custom View articles; they really were written with sweat. I hope they help, and a "like" would be nice. A while back a reader was so excited about my articles that he liked every single one, added me as a friend, and praised me in all sorts of ways, then went quiet for a long time. The day before yesterday he suddenly asked whether the line-chart customization in ECharts could be done!

Previous related articles:

- Jetpack Compose basic layout
- Jetpack Compose custom drawing
- Jetpack Compose: a Flutter-style dynamic UI?
- Jetpack Compose UI wrap-up: the ink painting effect

Widget animation

By the way, Compose now also has an API for building simple widgets; you can look up Jetpack Glance. And of course, wouldn't it be nice to pick up a copy of "Jetpack Compose: New UI Programming for Android"? The first two animations above are just a warm-up; the animations below, the sound wave and water wave effects, are the module we want to implement today.

1. Create the Widget in four steps

  • Step 1: Register in AndroidManifest.xml

As mentioned earlier, the widget itself is a broadcast receiver, so in principle it can also be registered dynamically. Keep in mind, though, that the widget lives on the launcher (the host), not inside our own app's UI, so the choice between static and dynamic registration depends on the product requirements. If the widget should work without depending on our app process being alive, static registration of this resident-style receiver is preferred. If that isn't clear, look up the difference between static and dynamic broadcast registration. Next we register it statically in the AndroidManifest:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="com.zui.recorder">
    
 <application
    android:name=".RecorderApplication"
    android:icon="@mipmap/ic_launcher_soundrecorder"
    android:label="@string/app_name"
    android:requestLegacyExternalStorage="true"
    android:resizeableActivity="true"
    android:supportsRtl="true"
    android:testOnly="false"
    android:theme="@style/AppBaseTheme">
        <!-- android:name: the concrete class for our Widget, created later -->
        <receiver
            android:name=".ui.translation.widget.RecorderAppWidget"
            android:exported="true">
            <intent-filter>
                <!-- Action: used to trigger updates of our widget -->
                <action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
            </intent-filter>
            <meta-data
                android:name="android.appwidget.provider"
                android:resource="@xml/recorder_widget" />
        </receiver>
 </application>
</manifest>
  • Step 2: Define the basic characteristics of the app widget

For details, see the official docs on defining app widget attributes; they are fairly readable. Here we set the widget's size constraints on the home screen, the initial layout, the update period, the resize mode, and so on. If you want to set the maximum-size limits, you need to raise compileSdkVersion to 31 (Android 12). Also note that in practice the system will not honor an updatePeriodMillis shorter than roughly 30 minutes; the frequent refreshes later in this article come from the recorder service's own broadcasts instead.

<?xml version="1.0" encoding="utf-8"?>
<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:description="@string/app_name"
    android:initialKeyguardLayout="@layout/widget_recorder_remote_view"
    android:initialLayout="@layout/widget_recorder_remote_view"
    android:minWidth="255dp"
    android:minHeight="100dp"
    android:minResizeWidth="255dp"
    android:minResizeHeight="100dp"
    android:previewImage="@drawable/blur_bg"
    android:resizeMode="horizontal|vertical"
    android:updatePeriodMillis="20000"
    android:widgetCategory="home_screen" />
  • Step 3: Define the initial layout of the app widget

Create the app widget layout: define the initial layout in XML and save it in the project's res/layout/ directory. As mentioned above, the widget layout is based on RemoteViews, so not every layout or view class is supported; the supported set is listed above and on the official site. Here is the first view we will build today: on the left, a button that pauses/resumes playback, a finish-recording button, and a recording label; on the right, an area that follows the recording state with a wave animation. Starting to feel like we're bending the rules a bit? Can widgets be animated after all? Let's first look at the widget layout.

<!-- The dimens and names here don't follow project conventions; for speed, dp values are hard-coded in the layout. -->
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_margin="10dp"
    android:background="@drawable/widget_recorder_shape"
    android:elevation="10dp"
    android:orientation="vertical">

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content">

        <LinearLayout
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginStart="10dp"
            android:layout_marginTop="10dp"
            android:layout_marginEnd="10dp"
            android:background="@drawable/widget_recorder_inner_shape"
            android:elevation="10dp"
            android:orientation="vertical"
            android:padding="5dp">

            <TextView
                android:id="@+id/widget_title_text"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:layout_marginStart="10dp"
                android:layout_marginTop="3dp"
                android:layout_marginEnd="10dp"
                android:layout_marginBottom="2dp"
                android:gravity="start"
                android:text="@string/app_name"
                android:textColor="@color/recorder_widget_title"
                android:textSize="14sp" />

            <LinearLayout
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:layout_marginStart="5dp"
                android:layout_marginEnd="20dp"
                android:orientation="horizontal">
                <!-- Left: play/pause button -->
                <ImageView
                    android:id="@+id/widget_stop_bn"
                    android:layout_width="wrap_content"
                    android:layout_height="wrap_content"
                    android:background="@drawable/notification_btn_pause" />
                <!-- Left: end-recording button -->
                <ImageView
                    android:id="@+id/widget_finish_bn"
                    android:layout_width="wrap_content"
                    android:layout_height="wrap_content"
                    android:layout_marginStart="20dp"
                    android:background="@drawable/notification_finish" />
            </LinearLayout>
        </LinearLayout>

        <RelativeLayout
            android:layout_width="0dp"
            android:layout_height="66dp"
            android:layout_marginTop="10dp"
            android:layout_marginEnd="10dp"
            android:layout_weight="1"
            android:background="@drawable/widget_recorder_inner_shape"
            android:padding="5dp">
            <!-- Right: wave animation -->
            <ImageView
                android:id="@+id/widget_wave"
                android:layout_width="wrap_content"
                android:layout_height="35dp"
                android:layout_centerInParent="true"
                android:layout_marginStart="20dp"
                android:scaleType="fitXY" />
        </RelativeLayout>
    </LinearLayout>
</LinearLayout>
  • Step 4: Extend AppWidgetProvider

The AppWidgetProvider class extends BroadcastReceiver and serves as a helper class for handling app widget broadcasts. It only receives broadcasts related to app widgets, such as when a widget is updated, deleted, enabled, or disabled. When those broadcasts arrive, the corresponding callbacks are invoked: onUpdate(), onReceive(), onAppWidgetOptionsChanged(), onDeleted(), onRestored(), onEnabled(), onDisabled(); see the source or the official docs for the details of each. Next we implement our recorder widget class by extending AppWidgetProvider.

/** Created by wangfei44 on 2021/12/28. */
class RecorderAppWidget : AppWidgetProvider() {

    override fun onUpdate(
        context: Context?,
        appWidgetManager: AppWidgetManager?,
        appWidgetIds: IntArray?,
    ) {
        Log.i(TAG, "onUpdate")
        super.onUpdate(context, appWidgetManager, appWidgetIds)
    }

    override fun onEnabled(context: Context) {
        super.onEnabled(context)
    }

    override fun onDisabled(context: Context) {
        super.onDisabled(context)
    }

    override fun onReceive(context: Context?, intent: Intent?) {
        super.onReceive(context, intent)
    }
}

Next we make sure RecorderAppWidget is registered in the manifest (`<receiver android:name=".ui.translation.widget.RecorderAppWidget" ... />`) and run. Long-press the home screen, pick our widget, and it looks like this:

2. Communication and refresh between widgets and apps

For interaction between the application layer and the widget, the two sides can refresh each other with intents, carrying data as intent extras. For example, while recording audio, the service can send a broadcast carrying data so the widget refreshes and shows the elapsed recording time or other information. Conversely, tapping the pause or finish button on the widget sends a broadcast back to the recorder service to update the app's current state. The widget itself is updated in real time through AppWidgetManager.

private BroadcastReceiver widgetBroadcastReceiver;

private void registerWidgetReceiver() {
    if (null == widgetBroadcastReceiver) {
        widgetBroadcastReceiver = new BroadcastReceiver() {
            @Override
            public String toString() {
                return "$classname{}";
            }

            @Override
            public void onReceive(Context context, Intent intent) {
                switch (intent.getAction()) {
                    case ACTION_CANCEL_TIMER: {
                        if (isRecording()) {
                            // The widget notifies the recorder to enter the paused state
                            pauseRecording(true);
                            // Refresh the widget's icon and content
                            sendBroadCastToRecorderWidget();
                        } else if (getState() == State.RECORD_PAUSED) {
                            resumeRecording(true);
                        }
                        break;
                    }
                    case ACTION_RESUME_TIMER: {
                        // The widget notifies us to finish recording and enter IDLE
                        stopRecording();
                        break;
                    }
                }
            }
        };
    }
    IntentFilter filter = new IntentFilter();
    filter.addAction(ACTION_CANCEL_TIMER);
    filter.addAction(ACTION_RESUME_TIMER);
    try {
        registerReceiver(widgetBroadcastReceiver, filter);
    } catch (Exception e) {
        Logger.i("registerWidgetReceiver error ::: " + e);
    }
}

private void unregisterWidgetReceiver() {
    if (widgetBroadcastReceiver == null) {
        return;
    }
    try {
        unregisterReceiver(widgetBroadcastReceiver);
    } catch (Exception e) {
        Logger.e("unregisterWidgetReceiver error ::: " + e);
    }
    widgetBroadcastReceiver = null;
}



// Inside RecorderService: send broadcasts at high frequency to refresh the widget
private void sendBroadCastToRecorderWidget() {
   
    Intent updateWidgetIntent = new Intent();
    // Specify the name of the broadcast action
    updateWidgetIntent.setAction(RecorderAppWidget.UPDATE_ACTION);
    // Transfers the current recording status of the recorder
    updateWidgetIntent.putExtra(WIDGET_STATE_EXTRA_NAME,getState().ordinal());
    // Transfers the current recording time of the recorder
    updateWidgetIntent.putExtra(WIDGET_TIME_EXTRA_NAME, Utils.formatTime(getRecordingTime()));
    // Send a broadcast
    sendBroadcast(updateWidgetIntent);
}


//RecorderAppWidget
companion object {
    const val TAG = "RecorderAppWidget"
    const val UPDATE_ACTION = "android.appwidget.action.APPWIDGET_UPDATE"

    // The current status of the recorder and the recording time
    const val WIDGET_STATE_EXTRA_NAME = "state"
    const val WIDGET_TIME_EXTRA_NAME = "time"
    // Record status of the tape recorder

    const val STATE_IDLE = 0
    const val STATE_PLAYING = 1
    const val STATE_PLAY_PAUSED = 2
    const val STATE_RECORDING = 3
    const val STATE_RECORDING_FROM_PAUSED = 4
    const val STATE_RECORD_PAUSED = 5
}

// Inside RecorderAppWidget: update the widget view according to the different received states
override fun onReceive(context: Context, intent: Intent) {
    this.context = context
    super.onReceive(context, intent)
    Log.i(TAG, "onReceive")
    val remoteViews = RemoteViews(context.packageName, R.layout.widget_recorder_remote_view)
    val appWidgetIds = AppWidgetManager.getInstance(context)
        .getAppWidgetIds(
            ComponentName(
                context,
                RecorderAppWidget::class.java
            )
        )
    if (null == intent.action || UPDATE_ACTION != intent.action) {
        return
    }
    val titleStart = getTitleStart(context, getState(intent))
    when (getState(intent)) {
        STATE_RECORDING, STATE_RECORDING_FROM_PAUSED -> {
            remoteViews.setTextViewText(
                R.id.widget_title_text,
                getTimeString(titleStart, context, intent))
            remoteViews.setImageViewResource(
                R.id.widget_stop_bn,
                R.drawable.notification_btn_pause)
            remoteViews.setWidgetOnClickPendingIntent(context,
                R.id.widget_stop_bn,
                ACTION_CANCEL_TIMER)
            remoteViews.setWidgetOnClickPendingIntent(context,
                R.id.widget_finish_bn,
                ACTION_RESUME_TIMER)
            remoteViews.setTextViewText(R.id.widget_time,
                getTimeString("", context, intent))
            remoteViews.setTextViewText(R.id.widget_time_center,
                getTimeString("", context, intent))
        }
        STATE_IDLE, STATE_RECORD_PAUSED -> {
            updateAnimate(getState(intent))
            remoteViews.setTextViewText(
                R.id.widget_title_text,
                getTimeString(titleStart, context, intent))
            remoteViews.setImageViewResource(
                R.id.widget_stop_bn,
                R.drawable.notification_btn_resume)
        }

    }

    // Updating the widget is handled by AppWidgetManager
    val awm = AppWidgetManager.getInstance(context.applicationContext)
    awm.updateAppWidget(appWidgetIds, remoteViews)

}
// Gets the recording status of the recorder corresponding to the widget
private fun getState(intent: Intent): Int {
    return intent.getIntExtra(WIDGET_STATE_EXTRA_NAME, STATE_IDLE)
}

// Get the recording time of the recorder
private fun getTimeString(titleStart: String, context: Context, intent: Intent): String {
    var time = intent.getStringExtra(WIDGET_TIME_EXTRA_NAME)
    if (null == time) {
        time = ""
    }
    if (time.isNotEmpty()) {
        time = "$titleStart $time"
    }
    return time
}

private fun RemoteViews.setWidgetOnClickPendingIntent(
    context: Context,
    id: Int,
    action: String,
) = this.apply {
    setOnClickPendingIntent(id, PendingIntent
        .getBroadcast(
            context, 0, Intent().setAction(action),
            PendingIntent.FLAG_IMMUTABLE
        ))
}

At this point the app and the widget can notify and refresh each other.

Widget animation implementation

Think about what animation actually is. Put simply: playing a sequence of images. Most developers have played with frame animation and tween animation. Smoothness (FPS) is determined by how many frames are shown per second. Now let's look at the animation we need to achieve; watch what happens on the right.

Widgets are refreshed through RemoteViews, not through View animation, so how do we refresh RemoteViews? Once we understand the refresh mechanism we have our opening: if we keep swapping the ImageView's image resource through a sequence of frames, isn't that frame animation? At roughly 30 frames per second a frame animation looks and feels smooth. Next we get the material from the UI designer, that is, an image for each frame. Of course, if you are writing a demo like I am, you can make the frames yourself. Let's find some material online:

  • The first step

    Find GIF or MP4 material. With Gifski an MP4 can be converted to a GIF, or with Kap you can capture part of the screen directly as MP4 or GIF.

  • The second step

    Create and export the frames with Photoshop. After opening the GIF, Shift-select all the layers in the panel on the right.

Then select the required region with the rectangular marquee tool and crop it via Image -> Crop.

Select all the layers and quick-export them as PNG. Done.

Then we drop the frames into the drawable directory.

To refresh the image, all we need is something that swaps the view's resource at a fixed interval. What can drive that? You might think of a Handler plus a Runnable, or a CountDownTimer. Here we use a ValueAnimator instead, whose frame-synchronized update callbacks make it a better fit than the other options.
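For contrast, here is a minimal sketch of the Handler/Runnable alternative just mentioned. It assumes it lives inside RecorderAppWidget so it can reuse the updateWave(context, index) helper and the IMAGES frame list defined later, and it has the same lifetime caveat as the ValueAnimator approach: it only keeps ticking while the app process is alive.

import android.content.Context
import android.os.Handler
import android.os.Looper

// Sketch only: drive the frame swap with a Handler instead of a ValueAnimator.
private val frameHandler = Handler(Looper.getMainLooper())
private var frameIndex = 0

private fun startHandlerAnimation(context: Context) {
    frameHandler.post(object : Runnable {
        override fun run() {
            updateWave(context, frameIndex)              // swap the ImageView frame
            frameIndex = (frameIndex + 1) % IMAGES.size  // loop over the frame list
            frameHandler.postDelayed(this, 55L)          // ~55 ms per frame
        }
    })
}

private fun stopHandlerAnimation() {
    frameHandler.removeCallbacksAndMessages(null)
}

Now back to the ValueAnimator version actually used here: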

val IMAGES = arrayListOf(R.drawable.wave_animal_01, /* ... */ R.drawable.wave_animal_55)
// The animated value runs from 0 to size - 1, i.e. from the first to the last image in the array.
val valueAnimator: ValueAnimator = ValueAnimator.ofInt(0, IMAGES.size - 1)
var duration = IMAGES.size * 55L
class RecorderAppWidget : AppWidgetProvider() {
    companion object {
        const val TAG = "RecorderAppWidget"
        const val UPDATE_ACTION = "android.appwidget.action.APPWIDGET_UPDATE"

        // The current status of the recorder and the recording time
        const val WIDGET_STATE_EXTRA_NAME = "state"
        const val WIDGET_TIME_EXTRA_NAME = "time"
        // Record status of the tape recorder

        const val STATE_IDLE = 0
        const val STATE_PLAYING = 1
        const val STATE_PLAY_PAUSED = 2
        const val STATE_RECORDING = 3
        const val STATE_RECORDING_FROM_PAUSED = 4
        const val STATE_RECORD_PAUSED = 5

        var isFirst = true
        var lastIndex = 0

        val IMAGES = arrayListOf(
            R.drawable.wave_animal_01,
            R.drawable.wave_animal_02,
            R.drawable.wave_animal_03,
            R.drawable.wave_animal_04,
            R.drawable.wave_animal_05,
            R.drawable.wave_animal_06,
            R.drawable.wave_animal_07,
            R.drawable.wave_animal_08,
            R.drawable.wave_animal_09,
            R.drawable.wave_animal_10,
            R.drawable.wave_animal_11,
            R.drawable.wave_animal_12,
            R.drawable.wave_animal_13,
            R.drawable.wave_animal_14,
            R.drawable.wave_animal_15,
            R.drawable.wave_animal_16,
            R.drawable.wave_animal_17,
            R.drawable.wave_animal_18,
            R.drawable.wave_animal_19,
            R.drawable.wave_animal_20,
            R.drawable.wave_animal_21,
            R.drawable.wave_animal_22,
            R.drawable.wave_animal_23,
            R.drawable.wave_animal_24,
            R.drawable.wave_animal_25,
            R.drawable.wave_animal_26,
            R.drawable.wave_animal_27,
            R.drawable.wave_animal_28,
            R.drawable.wave_animal_30,
            R.drawable.wave_animal_31,
            R.drawable.wave_animal_32,
            R.drawable.wave_animal_33,
            R.drawable.wave_animal_34,
            R.drawable.wave_animal_35,
            R.drawable.wave_animal_36,
            R.drawable.wave_animal_37,
            R.drawable.wave_animal_38,
            R.drawable.wave_animal_39,
            R.drawable.wave_animal_40,
            R.drawable.wave_animal_41,
            R.drawable.wave_animal_42,
            R.drawable.wave_animal_43,
            R.drawable.wave_animal_44,
            R.drawable.wave_animal_45,
            R.drawable.wave_animal_46,
            R.drawable.wave_animal_47,
            R.drawable.wave_animal_48,
            R.drawable.wave_animal_49,
            R.drawable.wave_animal_50,
            R.drawable.wave_animal_51,
            R.drawable.wave_animal_52,
            R.drawable.wave_animal_54,
            R.drawable.wave_animal_55,
            R.drawable.wave_animal_56,
            R.drawable.wave_animal_57,
        )
        val valueAnimator: ValueAnimator = ValueAnimator.ofInt(0, IMAGES.size - 1)
        var duration = IMAGES.size * 55L
    }

    private lateinit var context: Context
    lateinit var viewModel: SmartTranslationViewModel

    override fun onUpdate(
        context: Context?,
        appWidgetManager: AppWidgetManager?,
        appWidgetIds: IntArray?,
    ) {
        Log.i(TAG, "onUpdate")
        super.onUpdate(context, appWidgetManager, appWidgetIds)
    }

    // This method is called when the Widget is first created and then starts the service in the background
    override fun onEnabled(context: Context) {
        super.onEnabled(context)
    }

    // This method is called when all the widgets on the desktop have been deleted
    override fun onDisabled(context: Context) {
        super.onDisabled(context)
    }
    // RecorderService sends a broadcast every second; the widget's onReceive refreshes the view when it arrives.
    override fun onReceive(context: Context, intent: Intent) {
        this.context = context
        super.onReceive(context, intent)
        Log.i(TAG, "onReceive")
        val remoteViews = RemoteViews(context.packageName, R.layout.widget_recorder_remote_view)
        val appWidgetIds = AppWidgetManager.getInstance(context)
            .getAppWidgetIds(
                ComponentName(
                    context,
                    RecorderAppWidget::class.java
                )
            )
        if (null == intent.action || UPDATE_ACTION != intent.action) {
            return
        }
        val titleStart = getTitleStart(context, getState(intent))
        when (getState(intent)) {
            STATE_RECORDING, STATE_RECORDING_FROM_PAUSED -> {
                remoteViews.setTextViewText(
                    R.id.widget_title_text,
                    getTimeString(titleStart, context, intent))
                remoteViews.setImageViewResource(
                    R.id.widget_stop_bn,
                    R.drawable.notification_btn_pause)
                remoteViews.setWidgetOnClickPendingIntent(context,
                    R.id.widget_stop_bn,
                    ACTION_CANCEL_TIMER)
                remoteViews.setWidgetOnClickPendingIntent(context,
                    R.id.widget_finish_bn,
                    ACTION_RESUME_TIMER)
                remoteViews.setTextViewText(R.id.widget_time,
                    getTimeString("", context, intent))
                remoteViews.setTextViewText(R.id.widget_time_center,
                    getTimeString("", context, intent))
                if (isFirst) {
                    updateAnimate(getState(intent))
                    isFirst = false
                }
            }
            STATE_IDLE, STATE_RECORD_PAUSED -> {
                updateAnimate(getState(intent))
                remoteViews.setTextViewText(
                    R.id.widget_title_text,
                    getTimeString(titleStart, context, intent))
                remoteViews.setImageViewResource(
                    R.id.widget_stop_bn,
                    R.drawable.notification_btn_resume)
            }

        }

        // Updating the widget is handled by AppWidgetManager
        val awm = AppWidgetManager.getInstance(context.applicationContext)
        awm.updateAppWidget(appWidgetIds, remoteViews)

    }

    @Synchronized
    private fun updateWave(context: Context, index: Int) {
        val remoteViews = RemoteViews(context.packageName, R.layout.widget_recorder_remote_view)
        val appWidgetIds = AppWidgetManager.getInstance(context)
            .getAppWidgetIds(
                ComponentName(
                    context,
                    RecorderAppWidget::class.java
                )
            )
        if (index != lastIndex) {
            lastIndex = index
            remoteViews.setImageViewResource(R.id.widget_wave, IMAGES[index])
            remoteViews.setImageViewResource(R.id.item_content, IMAGES_CIRCLE[index])
            remoteViews.setImageViewResource(R.id.item_content_center, IMAGES_CIRCLE[index])
        }
        // Updating the widget is handled by AppWidgetManager
        val awm = AppWidgetManager.getInstance(context.applicationContext)
        awm.updateAppWidget(appWidgetIds, remoteViews)
    }

   

    // Update text prefixes based on status
    private fun getTitleStart(context: Context, state: Int): String {
        return if (state == STATE_RECORD_PAUSED) {
            context.resources.getString(R.string.title_record_pause)
        } else if (state == STATE_RECORDING || state == STATE_RECORDING_FROM_PAUSED) {
            context.resources.getString(R.string.title_recording)
        } else {
            ""}}// Gets the recording status of the recorder corresponding to the widget
    private fun getState(intent: Intent): Int {
        return intent.getIntExtra(WIDGET_STATE_EXTRA_NAME, STATE_IDLE)
    }

    // Get the recording time of the recorder
    private fun getTimeString(titleStart: String, context: Context, intent: Intent): String {
        var time = intent.getStringExtra(WIDGET_TIME_EXTRA_NAME)
        if (null == time) {
            time = ""
        }
        if (time.isNotEmpty()) {
            time = "$titleStart $time"
        }
        return time
    }

    private fun RemoteViews.setWidgetOnClickPendingIntent(
        context: Context,
        id: Int,
        action: String,
    ) = this.apply {
        setOnClickPendingIntent(id, PendingIntent
            .getBroadcast(
                context, 0, Intent().setAction(action),
                PendingIntent.FLAG_IMMUTABLE
            ))
    }

    private fun updateAnimate(state: Int) {
        Log.i("valueAnimator:value=", valueAnimator.toString())
        valueAnimator.repeatCount = INFINITE
        valueAnimator.duration = duration
        valueAnimator.repeatMode = RESTART
        valueAnimator.interpolator = LinearInterpolator()
        valueAnimator.addUpdateListener {
            updateWave(context, it.animatedValue as Int)
        }
        Log.i("state::==", state.toString())
        when (state) {
            STATE_RECORDING, STATE_RECORDING_FROM_PAUSED -> {
                if (valueAnimator.isPaused) {
                    valueAnimator.resume()
                } else if (!valueAnimator.isRunning) {
                    valueAnimator.start()
                }
            }
            STATE_IDLE, STATE_RECORD_PAUSED -> {
                valueAnimator.removeAllUpdateListeners()
                valueAnimator.pause()
                isFirst = true
            }
        }
    }
}

Run it and the result looks like this:

Similarly, implementing the water-wave effect needs nothing more than another set of frame images; just follow the steps above to find the material.

By this point the result may not look all that impressive, since it is achieved by swapping pre-made frames. If you are up for it, you could render the widget's water ripples or sound waves in code instead. Before going more advanced, though, let's explore how to do Canvas custom drawing on a widget; once we get past widget customization, this kind of animation follows naturally. So, how do we bring Canvas to desktop widgets?

Customization of Widgets

Developers who are familiar with the Canvas API, and with RemoteViews.setImageViewBitmap(viewId, bitmap), should naturally think of the Canvas(@NonNull Bitmap bitmap) constructor. The Bitmap is the real carrier of the pixels; the Canvas is just the drawing surface, and everything fancy we draw ends up stored in the Bitmap, which we then set on the widget's ImageView. So let's draw a line first and find out whether a wave is possible.

private fun drawCanvas(remoteViews: RemoteViews, index: Int) {
    val width = context.resources.getDimensionPixelSize(R.dimen.widget_canvas_width)
    val height = context.resources.getDimensionPixelSize(R.dimen.widget_canvas_height)
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(bitmap)
    val paint = Paint().apply {
        this.color = Color.argb(115, 194, 108, 57)
        this.strokeWidth = 2f
        this.style = Paint.Style.STROKE
    }
    canvas.drawLine(0f, height / 2f, width.toFloat(), height / 2f, paint)
    remoteViews.setImageViewBitmap(R.id.widget_canvas, bitmap)
}


The running effect is as follows:

So far we have found our breakthrough: as long as we can push a Bitmap into the widget, Canvas customization is not a problem.
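To make the wiring explicit, here is a minimal sketch of how one drawn frame reaches the widget. It simply glues together the drawCanvas() shown above and the same AppWidgetManager calls used earlier in this article; treat it as illustrative, not as the final implementation.

import android.appwidget.AppWidgetManager
import android.content.ComponentName
import android.content.Context
import android.widget.RemoteViews

// Sketch: draw one frame onto a Bitmap via drawCanvas() and push it to every widget instance.
fun pushCanvasFrame(context: Context) {
    val remoteViews = RemoteViews(context.packageName, R.layout.widget_recorder_remote_view)
    drawCanvas(remoteViews, 0)  // draws onto a Bitmap and sets it on R.id.widget_canvas
    val awm = AppWidgetManager.getInstance(context.applicationContext)
    val appWidgetIds = awm.getAppWidgetIds(ComponentName(context, RecorderAppWidget::class.java))
    awm.updateAppWidget(appWidgetIds, remoteViews)
}

Call something like this from the animator's update callback and the widget redraws itself frame by frame.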

1. Custom ECharts-style radar chart

I've previously written front-end ECharts in JS and also hand-drawn ECharts-style charts; you can check those out for the effect. Here we bring the radar chart to Android.

If you're not familiar with custom drawing, see my earlier articles. Of course, the only way to really learn it is to write it yourself; just reading is quickly forgotten. Let's see what the basic custom-drawing API plus some junior-high math can do for us.

1. Analysis before drawing

- The convenience of translating the coordinate origin to the center of the canvas
- Drawing the several skeleton line segments
- Mapping the actual data onto the screen
- Drawing the outlines and filling the shapes

2. Translate the coordinate origin to the center of the screen

canvas.translate(width / 2f, height / 2f)
canvas.scale(1f, -1f)
canvas.save()

3. Draw multiple skeleton line segments

We can see that three skeleton lines divide the circle into six equal sectors, and the equations of the three line segments are easy to write down; junior-high math is enough.

y = -tan30° * x

y = tan30° * x

(The middle skeleton is simply the vertical line x = 0.)

// The one on the right
val pathRight = Path()
val tan30 = kotlin.math.tan(Math.PI / 180 * 30)
val y1 = tan30 * (-width / 2)
val y2 = tan30 * width / 2
pathRight.moveTo(-width.toFloat() / 2, y1.toFloat())
pathRight.lineTo(width.toFloat() / 2, y2.toFloat())
canvas.drawPath(pathRight, paint)
// Draw the middle one
canvas.drawLine(0f, y1.toFloat() * 1.7f, 0f, -y1.toFloat() * 1.7f, paint)
// The one on the left
val pathLeft = Path()
pathLeft.moveTo(-width.toFloat() / 2, -y1.toFloat())
pathLeft.lineTo(width.toFloat() / 2, -y2.toFloat())
canvas.drawPath(pathLeft, paint)

paint.color = Color.argb(255, 66, 39, 39)
for (index in 0..10) {
    canvas.drawCircle(0f, 0f, 50f * index, paint)
}

4. How does the actual data map to the screen

Similarly, the radius of each circle can be viewed as the scale along each skeleton axis, while our actual data is just a length. How do we map a plain length onto each slanted skeleton axis? Again, simple math. Take the value 250 where the two white dashed lines meet in the picture below: 250 is the distance from the origin along that axis, but to plot it we need its virtual (x, y) coordinates in our coordinate system. A bit of junior-high trigonometry gives (x, y) = (length * cos30°, length * sin30°); for length = 250 that is roughly (216.5, 125). If you look carefully, every point on the slanted skeleton axes satisfies this same relation. Let's go to the code and see what it looks like.

paint.style = Paint.Style.FILL
paint.color = Color.argb(60, 154, 108, 57)
val arrData = arrayListOf(
    arrayOf(300f, 200f, 300f, 300f, 266f, 133f),
    arrayOf(200f, 245f, 300f, 201f, 220f, 200f),
    arrayOf(130f, 295f, 180f, 151f, 220f, 120f),
    arrayOf(220f, 235f, 200f, 199f, 200f, 130f),
    arrayOf(110f, 135f, 300f, 199f, 150f, 220f),
    arrayOf(150f, 235f, 100f, 300f, 50f, 110f),
    arrayOf(100f, 40f, 80f, 70f, 36f, 23f))
for (index in 0 until arrData.size) {
    val result = Path().apply {
        moveTo(0f, arrData[index][0])
        val random2 = arrData[index][1]
        lineTo(random2, (random2 * tan30).toFloat())
        val random4 = arrData[index][2]
        lineTo(random4, -(random4 * tan30).toFloat())
        val random5 = arrData[index][3]
        lineTo(0f, -random5)
        val random6 = arrData[index][4]
        lineTo(-random6, -(random6 * tan30).toFloat())
        val random7 = arrData[index][5]
        lineTo(-random7, (random7 * tan30).toFloat())
        close()
    }
    canvas.drawPath(result, paint)

}
paint.strokeWidth = 2f
paint.style = Paint.Style.STROKE
paint.color = Color.argb(35, 254, 108, 57)

5. Add the outline strokes

paint.strokeWidth = 2f
paint.style = Paint.Style.STROKE
paint.color = Color.argb(35, 254, 108, 57)
for (index in 0 until arrData.size) {
    val result = Path().apply {
        moveTo(0f, arrData[index][0])
        val random2 = arrData[index][1]
        lineTo(random2, (random2 * tan30).toFloat())
        val random4 = arrData[index][2]
        lineTo(random4, -(random4 * tan30).toFloat())
        val random5 = arrData[index][3]
        lineTo(0f, -random5)
        val random6 = arrData[index][4]
        lineTo(-random6, -(random6 * tan30).toFloat())
        val random7 = arrData[index][5]
        lineTo(-random7, (random7 * tan30).toFloat())
        close()
    }
    canvas.drawPath(result, paint)
}

The running effect is as follows:

The final result

Now that we can draw whatever we like on a widget, what about the water ripple and audio jitter we built with frame animation? Frame animation can only approximate the motion; to restore a more realistic ripple and wave we should draw them ourselves, so next we will implement the water ripple and sound wave animations with custom drawing.
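As a rough taste of that direction (a sketch under my own assumptions, not the implementation the follow-up article will use): one frame of a sound wave can be rendered as a sine curve whose phase shifts a little on every frame, and the resulting Bitmap then goes through exactly the same setImageViewBitmap / updateAppWidget pipeline as drawCanvas() above.

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Path
import kotlin.math.sin

// Sketch: render one frame of a moving sine "sound wave" into a Bitmap.
// Advancing `phase` a little on every frame is what makes the wave appear to move.
fun drawWaveFrame(width: Int, height: Int, phase: Float): Bitmap {
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(bitmap)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.argb(255, 194, 108, 57)
        strokeWidth = 3f
        style = Paint.Style.STROKE
    }
    val path = Path()
    val amplitude = height / 3f
    for (x in 0..width) {
        val y = height / 2f + amplitude * sin(2 * Math.PI * x / width + phase).toFloat()
        if (x == 0) path.moveTo(x.toFloat(), y) else path.lineTo(x.toFloat(), y)
    }
    canvas.drawPath(path, paint)
    // Push the bitmap with RemoteViews.setImageViewBitmap(...) and
    // AppWidgetManager.updateAppWidget(...), exactly as in drawCanvas() above.
    return bitmap
}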

That part will have to wait a while; I've been busy at work lately and only have weekend time to write, and other things compete for it too. I remember someone in a QQ group saying I must not have enough work to do if I can write articles like this. Not really: you can write one in three or four hours on a weekend. Programming is mostly practice, with no shortcuts; write more, read more, talk less, and ask more questions. Keep at it, friends.

I started writing these a while ago but didn't finish; the two follow-ups will be presented to you later.

Happy New Year

New Year wishes ㊗️ to everyone: health, wealth, and maybe even tiger triplets in the Year of the Tiger. May you have tiger-tiger blessings, tiger-tiger sweetness, and tiger-tiger luck.