This article is reprinted with authorization from guolin_blog, a WeChat official account.
This article is about memory management on Android.
Overview
Both the Android Runtime (ART) and the Dalvik virtual machine use paging and memory-mapped files to manage memory. This means that any memory an app modifies, whether by allocating new objects or touching memory-mapped pages, stays resident in RAM and cannot be paged out. The only way to release memory from an app is to release object references that the app holds, making the memory available to the garbage collector. There is one exception: any memory-mapped file that has not been modified, such as code, can be paged out of RAM if the system wants to use that memory elsewhere.
Garbage collection
A managed memory environment, such as the ART or Dalvik virtual machine, keeps track of each memory allocation. Once it determines that a piece of memory is no longer being used by the program, it frees that memory back to the heap without any programmer intervention. This mechanism for reclaiming unused memory within a managed memory environment is known as garbage collection. Garbage collection has two goals: to find data objects in the program that cannot be accessed in the future, and to reclaim the resources used by those objects.
Android's memory heap is generational, meaning that it tracks different buckets of allocations based on the expected life and size of the objects being allocated. For example, recently allocated objects belong to the young generation; when an object stays active long enough, it can be promoted to an older generation, followed by a permanent generation.
Each generation of the heap has its own dedicated upper limit on the amount of memory that the corresponding object can occupy. Whenever the generation starts to fill up, the system performs a garbage collection event to free memory. The duration of a garbage collection depends on which generation of objects it is collecting and how many live objects there are in each generation.
Although garbage collection is usually fast, it can still affect application performance. Code generally has no control over when a garbage collection event occurs; the system has its own criteria for deciding when to collect, and when those criteria are met, it pauses the executing process and begins garbage collection. If garbage collection occurs in the middle of an intensive processing loop, such as an animation or music playback, it can increase processing time, which in turn can push code execution in the application past the recommended 16 ms threshold for efficient, smooth frame rendering.
In addition, our code may perform work that forces garbage collection events to occur more frequently or last longer than normal. For example, if we allocate multiple objects in the innermost part of a for loop during each frame of an alpha-blending animation, we pollute the heap with a large number of objects; in that case, the garbage collector runs multiple garbage collection events, which can degrade the application's performance.
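To make the loop-allocation point concrete, here is a minimal plain-Kotlin sketch (not Android UI code; the function names and the four-element scratch buffer are our own illustration) contrasting a version that allocates a temporary array on every iteration with one that hoists a single reusable buffer out of the loop:

```kotlin
// Churn-heavy: allocates a new FloatArray on every iteration,
// handing the garbage collector one short-lived object per frame.
fun blendChurny(frames: Int): Float {
    var acc = 0f
    for (i in 0 until frames) {
        val scratch = FloatArray(4) { it * 0.1f } // new object each frame
        acc += scratch.sum()
    }
    return acc
}

// Churn-free: the scratch buffer is allocated once and reused,
// so the loop creates no per-iteration garbage.
fun blendReused(frames: Int): Float {
    val scratch = FloatArray(4)
    var acc = 0f
    for (i in 0 until frames) {
        for (j in scratch.indices) scratch[j] = j * 0.1f
        acc += scratch.sum()
    }
    return acc
}
```

Both functions compute the same result, but the first creates one temporary object per iteration for the garbage collector to reclaim, while the second creates none.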
Shared memory
To accommodate everything you need in RAM, Android tries to share RAM pages across processes. This can be done in the following ways:
- Each application process forks from an existing process called Zygote. The Zygote process starts when the system boots and loads common framework code and resources (for example, activity themes). To start a new application process, the system forks the Zygote process and then loads and runs the application's code in the new process. This approach allows most of the RAM pages allocated for framework code and resources to be shared across all application processes.
- Most static data is memory-mapped into a process, which allows that data to be shared between processes and paged out as needed. Examples of static data include: Dalvik code (mapped directly by placing it in a pre-linked .odex file), app resources (by designing the resource table to be a memory-mappable structure and by aligning the APK's zip entries), and traditional project elements such as native code in .so files.
- In many places, Android shares the same dynamic RAM across processes using explicitly allocated shared memory regions (via ashmem or gralloc). For example, window surfaces use memory shared between the app and the screen compositor, and cursor buffers use memory shared between the content provider and the client.
Allocate and reclaim application memory
The Dalvik heap is limited to a single virtual memory range for each application process. This defines the logical heap size, which can grow as needed but cannot exceed the upper limit defined by the system for each application.
The logical size of the heap is not the same as the amount of physical memory the heap uses. When inspecting an app's heap, Android computes a value called proportional set size (PSS), which accounts for both dirty and clean pages shared with other processes, in proportion to how many apps share that RAM. This (PSS) total is what the system considers to be the physical memory footprint.
The Dalvik heap does not compact the logical size of the heap, meaning that Android does not defragment the heap to close up space. Android can shrink the logical heap size only when there is unused space at the end of the heap; however, the system can still reduce the physical memory used by the heap. After garbage collection, Dalvik walks the heap, finds unused pages, and returns those pages to the kernel using madvise(). So paired allocations and deallocations of large chunks should reclaim all (or nearly all) of the physical memory used, but reclaiming memory from small allocations is much less efficient, because a page used for a small allocation may still be shared with another block of data that has not yet been freed.
Limit application memory
To maintain a functional multitasking environment, Android sets a hard cap on the heap size of each application. The exact heap size limit varies between devices based on the device's total RAM. If an application tries to allocate more memory after reaching its heap limit, it receives an OutOfMemoryError.
In some cases, for example to determine how much data is safe to keep in a cache, we can query the system for the exact amount of heap space available on the current device by calling the getMemoryClass() method, which returns an integer indicating the number of megabytes available to the application's heap.
Switching applications
When the user switches between applications, Android keeps the non-foreground applications in a cache. A non-foreground application is one whose activities are not visible to the user and which is not running a foreground service (such as music playback). For example, when a user starts an application for the first time, the system creates a process for it; when the user leaves the application, that process is kept in the cache. If the user later returns to the application, the system reuses the process, which speeds up app switching.
If an application has a cached process and retains resources it does not currently need, it affects overall system performance even while the user is not using it. When system resources such as memory run low, the system terminates processes in the cache, and it may also consider terminating the processes that hold the most memory in order to free up RAM.
Note that while an application's process is cached, the less memory it holds, the more likely it is to escape termination and resume quickly; however, the system may terminate a cached process at any time based on current needs, regardless of that process's resource usage.
Memory allocation between processes
The Android platform does not waste available memory at runtime; it always tries to use all available memory. For example, the system keeps apps in memory after they are closed so that users can quickly switch back to them. As a result, Android devices typically have little memory available at runtime, so memory management is critical to properly allocate memory between important system processes and many user applications.
Here are the basics of how Android allocates memory to the system and user applications and how the operating system copes with low memory.
Memory types
Android devices contain three different types of memory: RAM, zRAM, and storage, as shown below:
Note that the CPU and GPU access the same RAM.
- RAM is the fastest type of memory, but its size is usually limited. High-end devices typically have the largest RAM capacity.
- zRAM is a partition of RAM used for swap space. All data is compressed when placed into zRAM and then decompressed when copied out of zRAM. This portion of RAM grows or shrinks in size as pages move into or out of zRAM. Device manufacturers can set an upper limit on zRAM size.
- Storage contains all of the persistent data (such as the file system) along with the object code for all apps, libraries, and the platform. Storage has much more capacity than the other two types of memory. On Android, storage is not used for swap space as it is on other Linux implementations, because frequent writing can cause wear on this memory and shorten the life of the storage medium.
Memory pages
Random-access memory (RAM) is divided into pages. Typically, each page is 4 KB of memory.
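As a back-of-the-envelope illustration of this 4 KB granularity (plain Kotlin, not an Android API), the helper below computes how many pages an allocation occupies; note that even a one-byte allocation consumes a full page:

```kotlin
const val PAGE_SIZE = 4096L // typical page size: 4 KB

// Number of pages needed to hold `bytes` of memory, rounded up:
// any allocation, however small, occupies at least one whole page.
fun pagesNeeded(bytes: Long): Long =
    if (bytes <= 0L) 0L else (bytes + PAGE_SIZE - 1) / PAGE_SIZE
```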
The system considers a page either free or used. Free pages are unused RAM; used pages are RAM the system is actively using, and they fall into the following categories:
- Cached pages: memory backed by a file in storage (for example, code or memory-mapped files). There are two types of cached memory:
  - Private pages: owned by one process and not shared.
    - Clean pages: unmodified copies of a file in storage. They can be deleted by the kernel swap daemon (kswapd) to increase free memory.
    - Dirty pages: modified copies of a file in storage. They can be moved to, or compressed in, zRAM by kswapd to increase free memory.
  - Shared pages: used by multiple processes.
    - Clean pages: unmodified copies of a file in storage. They can be deleted by kswapd to increase free memory.
    - Dirty pages: modified copies of a file in storage. They allow the changes to be written back to the file in storage, either by kswapd or explicitly with msync() or munmap(), to increase free memory.
- Anonymous pages: memory not backed by a file in storage (for example, allocated by mmap() with the MAP_ANONYMOUS flag set).
  - Dirty pages: they can be moved to, or compressed in, zRAM by kswapd to increase free memory.
Note that clean pages contain an exact copy of a file (or portion of a file) that exists in storage. When a clean page no longer contains an exact copy of the file (for example, as the result of an application operation), it becomes a dirty page. Clean pages can be deleted because they can always be regenerated from the data in storage; dirty pages cannot be deleted, or the data would be lost.
Insufficient Memory Management
Android has two main mechanisms for handling low-memory situations: the kernel swap daemon (kswapd) and the low memory termination daemon (LMK).
Kernel swap daemon (kswapd)
The kernel swap daemon (kswapd) is part of the Linux kernel, and it converts used memory into free memory. The daemon becomes active when free memory on the device runs low. The Linux kernel maintains low and high free-memory thresholds: when free memory falls below the low threshold, kswapd starts reclaiming memory, and when free memory reaches the high threshold, kswapd stops.
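This low/high threshold behavior is a simple hysteresis, which can be sketched as a toy model (plain Kotlin; the watermark values and page counts below are invented for illustration, not the kernel's real numbers):

```kotlin
// Toy model of kswapd's activation: reclaim starts when free memory
// drops below the low watermark and continues until free memory
// reaches the high watermark.
class KswapdModel(private val lowWatermark: Int, private val highWatermark: Int) {
    var reclaiming = false
        private set

    // Returns whether kswapd would be reclaiming at this free-page count.
    fun onFreePages(free: Int): Boolean {
        if (free < lowWatermark) reclaiming = true
        else if (free >= highWatermark) reclaiming = false
        return reclaiming
    }
}
```

Because reclaiming starts only below the low watermark and runs until the high watermark is reached, free memory oscillates between the two bounds instead of flapping around a single threshold.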
Kswapd can reclaim clean pages by deleting them, because they are backed by storage and have not been modified. If a process tries to address a clean page that has been deleted, the system copies the page from storage back into RAM. This operation is known as demand paging.
The following figure shows the clean pages supported by storage that have been deleted:
Kswapd can move cached private dirty pages and anonymous dirty pages to zRAM, where they are compressed. Doing so frees up available memory (free pages) in RAM. If a process tries to address a dirty page that is in zRAM, the page is decompressed and moved back into RAM. If the process associated with a compressed page is killed, the page is deleted from zRAM. If the amount of free memory falls below a certain threshold, the system starts killing processes.
The image below shows the dirty pages being moved to zRAM and compressed:
Low memory termination daemon (LMK)
Many times, the kernel swap daemon (kswapd) cannot free enough memory for the system. In this case, the system uses the onTrimMemory() method to notify applications that memory is running low and that they should reduce their allocations. If that is not sufficient, the Linux kernel starts killing processes to free up memory, using the **low memory termination daemon (LMK)** to do so.
LMK uses an out-of-memory score called oom_adj_score to determine the priority of running processes, and thereby which process to kill. The process with the highest score is killed first. Background apps are killed first, and system processes are killed last.
The figure below shows the LMK rating categories from high to low, with the highest rating category, the item in the first row, being the first to be terminated:
- Background apps: applications that were run previously and are not currently active. LMK first kills the background app with the highest oom_adj_score.
- Previous App: Background application used recently. The previous app has a higher priority (lower score) than the background app because the user is more likely to switch to the previous app than to a background app.
- Home App: This is the launcher app. Terminating the application causes the wallpaper to disappear.
- Services: Services are started by applications, such as synchronization or uploading to the cloud.
- Perceptible apps: non-foreground applications that are perceptible to the user in some way, such as one running a search that displays a small interface, or playing music.
- Foreground App: The application that is currently used. Terminating a foreground app looks like the app crashed and may alert the user that there is a problem with the device.
- Persistent apps (services): core services of the device, such as telephony and WLAN.
- System: system processes. When these processes are killed, the phone may appear ready to reboot.
- Native: very low-level processes used by the system (for example, the kernel swap daemon, kswapd).
Note that device manufacturers can change the behavior of LMK.
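LMK's selection rule can be sketched as "kill the running process with the highest score" (a simplified plain-Kotlin model; the process names and score values below are illustrative, not the real oom_adj values):

```kotlin
// Simplified LMK model: each process carries an oom_adj score;
// higher scores mean less important, and are killed first.
data class Proc(val name: String, val oomAdjScore: Int)

// Returns the process LMK would terminate first, or null if none exist.
fun pickVictim(procs: List<Proc>): Proc? = procs.maxByOrNull { it.oomAdjScore }
```

In this model a cached background app with a high score is always chosen before the foreground app or system processes, matching the ordering of the categories above.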
Calculate memory usage
The kernel keeps track of all the memory pages in the system.
The following figure shows the pages used by different processes:
When determining how much memory an application uses, the system must account for shared pages. Apps that access the same service or library share memory pages. For example, Google Play services and a game app might share a location service, which makes it hard to say how much memory belongs to the service as a whole and how much belongs to each app. The following image shows a page shared by two apps (middle):
If you need to determine the memory footprint of your application, you can use any of the following metrics:
- Resident Set Size (RSS): the number of shared and unshared pages used by the app.
- Proportional Set Size (PSS): the number of unshared pages used by the app, plus an even split of the shared pages (for example, if three processes share 3 MB, each process's PSS is 1 MB).
- Unique Set Size (USS): the number of unshared pages used by the app (shared pages are not counted).
Proportional Set Size (PSS) is useful when the operating system wants to know how much memory all processes use, because pages are counted only once; however, it takes a long time to compute, because the system must determine which pages are shared and by how many processes. Resident Set Size (RSS) does not distinguish between shared and unshared pages, so it is faster to compute and better suited for tracking changes in memory allocation.
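The three metrics can be made concrete with a small calculation in plain Kotlin (a sketch; the region sizes are arbitrary). For a process with some number of private (unshared) pages and a set of shared regions, each shared by a known number of processes:

```kotlin
// Memory metrics for one process, expressed in pages:
//   USS = private pages only
//   RSS = private pages + all shared pages, counted in full
//   PSS = private pages + each shared page divided by its sharer count
data class SharedRegion(val pages: Int, val sharers: Int)

fun uss(privatePages: Int): Double = privatePages.toDouble()

fun rss(privatePages: Int, shared: List<SharedRegion>): Double =
    privatePages + shared.sumOf { it.pages }.toDouble()

fun pss(privatePages: Int, shared: List<SharedRegion>): Double =
    privatePages + shared.sumOf { it.pages.toDouble() / it.sharers }
```

With no private pages and one 768-page (3 MB) region shared by three processes, RSS reports all 768 pages, PSS reports 256 (1 MB), and USS reports 0, matching the 3 MB / 1 MB example above.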
Managing Application Memory
Random-access memory (RAM) is a valuable resource in any software development environment, and it is even more valuable on a mobile operating system, where physical memory is often constrained. Although both the Android Runtime (ART) and the Dalvik virtual machine perform routine garbage collection, that does not mean we can ignore where and when our application allocates and releases memory. We still need to avoid introducing memory leaks (usually caused by holding object references in static member variables) and to release any Reference objects at the appropriate time (for example, in lifecycle callbacks).
Monitor available memory and memory usage
We need to find the memory usage problem in our application before we can fix it. We can use the Memory Profiler in Android Studio to help us find and diagnose Memory problems:
- Learn how our application allocates memory over time. The Memory Profiler displays a real-time graph of the application's memory usage, the number of allocated Java objects, and when garbage collection events occur.
- Initiate a garbage collection event and take a snapshot of the Java heap while the application is running.
- Log your application’s memory allocation, then examine the allocated objects, look at the stack trace for each allocation, and jump to the corresponding code in the Android Studio editor.
Free memory in response to events
As mentioned above, Android can reclaim memory from an application in several ways, or terminate the application entirely if necessary, to free up memory for critical tasks. To help balance system memory further and avoid the need for the system to kill our application process, we can implement the ComponentCallbacks2 interface in the Activity class and override its onTrimMemory() method to listen for memory-related events while in the foreground or background, then release objects in response to application lifecycle or system events that indicate the system needs to reclaim memory, as in the following sample code:
```kotlin
/**
 * Created by TanJiaJun on 2020/7/7.
 */
class MainActivity : AppCompatActivity(), ComponentCallbacks2 {

    /**
     * Free memory when the UI is hidden or when system resources are low.
     * @param level the memory-related event that was raised
     */
    override fun onTrimMemory(level: Int) {
        super.onTrimMemory(level)
        when (level) {
            ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN -> {
                // Release any UI objects that currently hold memory.
                // The user interface has moved to the background.
            }
            ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE,
            ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW,
            ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL -> {
                // Release memory the app does not need to keep running.
                // The device is running low on memory while the app runs.
                // The event raised indicates the severity of the shortage;
                // at TRIM_MEMORY_RUNNING_CRITICAL the system begins killing
                // background processes.
            }
            ComponentCallbacks2.TRIM_MEMORY_BACKGROUND,
            ComponentCallbacks2.TRIM_MEMORY_MODERATE,
            ComponentCallbacks2.TRIM_MEMORY_COMPLETE -> {
                // Release as much memory as the process can.
                // The app is on the LRU list and the system is low on memory.
                // The event raised indicates the app's position in the list;
                // at TRIM_MEMORY_COMPLETE the process will be among the first
                // to be terminated.
            }
            else -> {
                // Release any non-critical data structures.
                // The app received an unrecognized memory level from the
                // system; treat it as a generic low-memory message.
            }
        }
    }
}
```
Note that the onTrimMemory() method was only added in Android 4.0. On earlier versions we can use the onLowMemory() method, which is roughly equivalent to the TRIM_MEMORY_COMPLETE event.
See how much memory you should use
To allow multiple processes to run at once, Android sets a hard limit on the heap size allotted to each application, which varies depending on how much RAM the device has available overall. If our application has reached its heap limit and tries to allocate more memory, the system throws an OutOfMemoryError.
To avoid running out of memory, we can query the system for the amount of heap space available on the current device by calling the getMemoryInfo() method. This method fills in an ActivityManager.MemoryInfo object, which provides information about the device's current memory state, including the available memory, the total memory, and the memory threshold (the memory level at which the system begins killing processes). The ActivityManager.MemoryInfo object also exposes a boolean field, lowMemory, which we can use to determine whether the device is running low on memory. The sample code is as follows:
```kotlin
fun doSomethingMemoryIntensive() {
    // Check whether the device is in a low-memory state before
    // executing logic that requires a large amount of memory.
    if (!getAvailableMemory().lowMemory) {
        // Execute logic that requires a lot of memory.
    }
}

// Get a MemoryInfo object describing the device's current memory state.
private fun getAvailableMemory(): ActivityManager.MemoryInfo =
    ActivityManager.MemoryInfo().also {
        (getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager).getMemoryInfo(it)
    }
```
Use more memory-efficient code structures
We can choose a more efficient solution in our code to minimize the memory usage of our application.
Use Services with caution
If our application needs a service to perform work in the background, we should not keep it running unless it actually needs to run a job, and we should stop the service once its task is complete; otherwise it can leak memory.
After we start a service, the system prefers to keep that service's process running. This behavior makes service processes very expensive, because the RAM a service uses remains unavailable to other processes. That reduces the number of cached processes the system can keep in its LRU cache, making app switching less efficient. It can even lead to thrashing when memory is tight and the system cannot maintain enough processes to host all of the services currently running.
Persistent services should generally be avoided because of the ongoing demands they place on available memory; instead, we can use JobScheduler to schedule background work.
If we must use a service, the best way to limit its lifespan is to use an IntentService, which finishes itself as soon as it has handled the intent that started it.
Use optimized data containers
Some of the classes provided by the programming language are not optimized for use on mobile devices. For example, a generic HashMap implementation can be quite memory-inefficient because it needs a separate entry object for every mapping.
The Android framework includes several optimized data containers, such as SparseArray, SparseBooleanArray, and LongSparseArray. These are more efficient because, among other things, they avoid the need for the system to autobox the keys (and sometimes the values), which would create one or two extra objects per entry.
Where business needs allow, use the most compact data structure possible, such as a plain array.
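To show where the savings come from, here is a deliberately simplified sketch of the idea behind SparseArray (plain Kotlin, not the Android class): keys live in a primitive IntArray kept sorted and searched with binary search, so no boxed Integer keys and no per-entry map nodes are created.

```kotlin
// Minimal sketch of SparseArray's core idea: a sorted primitive key
// array plus a parallel value array, looked up by binary search.
class SimpleSparseArray<T : Any> {
    private var keys = IntArray(0)           // primitive keys: no boxing
    private var values = arrayOfNulls<Any>(0)
    var size = 0
        private set

    // Binary search over the first `size` entries; returns the index if
    // found, otherwise (-insertionPoint - 1), like Arrays.binarySearch.
    private fun indexOf(key: Int): Int {
        var lo = 0
        var hi = size - 1
        while (lo <= hi) {
            val mid = (lo + hi) ushr 1
            when {
                keys[mid] < key -> lo = mid + 1
                keys[mid] > key -> hi = mid - 1
                else -> return mid
            }
        }
        return -(lo + 1)
    }

    fun put(key: Int, value: T) {
        val i = indexOf(key)
        if (i >= 0) { values[i] = value; return }
        val insert = -(i + 1)
        if (size == keys.size) {             // grow both parallel arrays
            keys = keys.copyOf(maxOf(4, size * 2))
            values = values.copyOf(maxOf(4, size * 2))
        }
        for (j in size downTo insert + 1) {  // shift the tail right by one
            keys[j] = keys[j - 1]
            values[j] = values[j - 1]
        }
        keys[insert] = key
        values[insert] = value
        size++
    }

    @Suppress("UNCHECKED_CAST")
    fun get(key: Int): T? {
        val i = indexOf(key)
        return if (i >= 0) values[i] as T else null
    }
}
```

Lookups cost O(log n) instead of HashMap's O(1), which is why sparse arrays pay off for hundreds of entries rather than millions; on mobile that trade is usually worthwhile.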
Be careful with code abstraction
Developers often reach for abstraction simply as a good programming practice, because abstraction can improve code flexibility and maintainability. However, abstraction comes at a significant cost: it generally requires more code to execute, more time, and more RAM to map that code into memory. So if an abstraction does not provide a significant benefit, we should avoid it.
Use protobuf lite for serialized data
Protocol Buffers is a language-neutral, platform-neutral, extensible mechanism designed by Google for serializing structured data. It is similar to XML, but smaller, faster, and simpler. Use the lite version of protobuf on mobile: regular protobuf generates extremely verbose code, which can cause problems for an app such as increased RAM usage, significantly larger APK size, and slower execution.
Avoid memory jitter
As mentioned earlier, garbage collection events normally do not affect application performance. However, many garbage collection events occurring in a short period can quickly eat up the frame time; the more time the system spends on garbage collection, the less time it has for other tasks such as rendering the interface or streaming audio.
In general, memory jitter (churn) can cause a large number of garbage collection events. Memory jitter describes the number of temporary objects allocated within a given period, for example allocating multiple temporary objects inside a for loop, or creating Paint or Bitmap objects inside a View's onDraw() method. In both cases, the app creates many objects quickly; these allocations can rapidly consume all the available memory in the young generation, forcing a garbage collection event to occur.
We can use the Memory Profiler in Android Studio to find places where Memory jitter is high. Once we have identified problem areas in our code, we can try to reduce the number of allocations in areas that are critical to performance. We can consider moving some code logic out of the internal loop or using the factory method pattern.
Remove resources and libraries that take up a lot of memory
Some resources and libraries in our code can consume memory without our knowledge. The overall size of the APK, including third-party libraries and embedded resources, affects how much memory the app consumes. We can reduce memory consumption by removing any redundant, unnecessary, or bloated components, resources, or libraries from the code.
Reduce the overall APK size
We can significantly reduce the memory usage of an application by reducing its overall size. Bitmap size, resources, animation frames, and third-party libraries all affect the size of APK. Android Studio and the Android SDK provide a variety of tools to help reduce the size of resources and external dependencies, such as R8 compilation.
When we build a project with Android Gradle plugin 3.4.0 or later, the plugin no longer uses ProGuard for compile-time code optimization; instead it works with the R8 compiler to handle the following compile-time tasks:
- Code shrinking (tree shaking): detects and safely removes unused classes, fields, methods, and attributes from the app and its library dependencies (which makes it a valuable tool for working around the 64K reference limit). For example, if we use only a few APIs from a library dependency, shrinking can identify the library code the app does not use and remove only that code from the app.
- Resource shrinking: removes unused resources from the packaged app, including unused resources in library dependencies. It works in conjunction with code shrinking: once unused code is removed, any resources no longer referenced can be safely removed as well.
- Obfuscation handling: Shorten the names of classes and members to reduce the size of DEX files.
- Optimization: inspects and rewrites code to further reduce the size of the app's DEX files. For example, if R8 detects that the else branch of an if/else statement is never taken, it removes the else branch.
Upload an App using the Android App Bundle (Google Play only)
The easiest way to immediately reduce app size when publishing to Google Play is to publish the app as an Android App Bundle, a new upload format that includes all of the app's compiled code and resources while deferring APK generation and signing to Google Play.
Google Play's new app-serving model, Dynamic Delivery, uses the app bundle we provide to generate and serve optimized APKs for each user's device configuration, so each user downloads only the code and resources needed to run our app. We no longer have to build, sign, and manage multiple APKs to support different devices, and users get smaller, more optimized downloads.
Note that Google Play limits the compressed download size of a signed APK we upload to 100 MB, and the compressed download size of an app distributed with an app bundle to 150 MB.
Use Android Size Analyzer
The Android Size Analyzer tool makes it easy to discover and implement a variety of strategies for reducing application size. It is available as an Android Studio plugin or as a standalone JAR.
Use Android Size Analyzer in Android Studio
You can install the Android Size Analyzer plugin from the plugin marketplace in Android Studio by following these steps:
- Select Android Studio > Preferences (or File > Settings on Windows).
- Select the Plugins section in the left panel.
- Click the Marketplace tab.
- Search for the Android Size Analyzer plug-in.
- Click the Install button for the analyzer plug-in.
As shown below:
After installing the plugin, run an analysis on the current project by selecting Analyze > Analyze App Size from the menu bar. After analyzing the project, a tool window appears with recommendations on how to reduce the app's size, as shown in the following image:
Use the analyzer from the command line
We can download the latest version of Android Size Analyzer from GitHub as a TAR or ZIP file. After extracting the file, run the size-analyzer script (on Linux or macOS) or the size-analyzer.bat script (on Windows) against an Android project or Android App Bundle with one of the following commands:
```shell
./size-analyzer check-bundle <path-to-aab>
./size-analyzer check-project <path-to-project-directory>
```
Understand APK structure
Before discussing how to reduce the size of an application, it helps to understand the structure of an APK. An APK file is a zip archive containing all the files that make up the app, including Java class files, resource files, and a file containing compiled resources.
An APK contains the following directories:
- META-INF/: contains the CERT.SF and CERT.RSA signature files, as well as the MANIFEST.MF manifest file.
- assets/: contains the app's assets, which the app can retrieve using an AssetManager object.
- res/: contains resources that are not compiled into resources.arsc.
- lib/: contains compiled code that is specific to a processor architecture. This directory contains a subdirectory for each platform type, such as armeabi, armeabi-v7a, arm64-v8a, x86, x86_64, and mips.
An APK also contains the following files. Only AndroidManifest.xml is mandatory:
- resources.arsc: contains the compiled resources. This file holds the XML content from all configurations of the res/values/ folder. The packaging tool extracts this XML content, compiles it to binary form, and archives the content, which includes language strings and styles, as well as paths to content not included directly in the resources.arsc file, such as layout files and images.
- classes.dex: contains the classes compiled into the DEX file format understood by the Android Runtime (ART) and the Dalvik virtual machine.
- AndroidManifest.xml: contains the core Android manifest file, which lists the app's name, version, access rights, and referenced library files, in Android's binary XML format.
Reduce the number and size of resources
The size of APK affects how fast an application loads, how much memory it uses, and how much power it consumes. An easy way to reduce the size of an APK is to reduce the number and size of resources it contains. Specifically, we can remove resources that are no longer used by the application, and we can replace image files with scalable Drawable objects.
Remove unused resources
Lint is a static code analyzer shipped with Android Studio that can detect resources in the res/ folder that are not referenced by code. When Lint detects a potentially unused resource in a project, it displays a message like the following:
res/layout/preferences.xml: Warning: The resource R.layout.preferences appears
to be unused [UnusedResources]
Note that the Lint tool does not scan the assets/ folder, resources referenced via reflection, or library files linked to the app. In addition, it does not remove resources; it only notifies us of their existence.
Gradle can automatically remove unused resources if we enable shrinkResources in our build.gradle file.
android {
    buildTypes {
        release {
            minifyEnabled true
            shrinkResources true
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}
To use shrinkResources, we must also enable code shrinking (minifyEnabled). During compilation, R8 first removes unused code, and then the Android Gradle plugin removes unused resources.
In Android Gradle plugin 0.7 and later, you can declare the configurations your application supports. Gradle passes this information to the build system via the resConfig and resConfigs properties in the defaultConfig block or a product flavor. The build system then prevents resources for unsupported configurations from appearing in the APK, reducing its size.
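As a minimal sketch, assuming an app that only ships English strings and xxhdpi drawables (the specific configurations here are illustrative), the declaration might look like this in build.gradle:

```groovy
android {
    defaultConfig {
        // Keep only English string resources and xxhdpi density resources;
        // resources for all other locales and densities are stripped from the APK.
        resConfigs "en", "xxhdpi"
    }
}
```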
Note that code shrinking can clean up some unnecessary code in a library, but may not remove large internal dependencies.
Minimize the amount of resources used in the library
When developing Android applications, we often need to use external libraries to improve the usability and versatility of the application. For example, we can use Glide to load images.
If the library is designed for a server or desktop device, it may contain many objects and methods that the application does not need. If the library license allows us to modify the library, we can edit the library’s files to remove unwanted parts. We can also use the library suitable for mobile devices.
Only specific densities are supported
Android supports a variety of devices, covering a variety of screen densities. In Android 4.4 (API level 19) and later, the framework supports various densities: LDPI, MDPI, TVDPI, HDPI, XHDPI, XXHDPI, and XXXHDPI. Although Android supports all of these densities, there is no need to export rasterized resources to each density.
If we don’t add resources for a particular screen density, Android will automatically scale resources designed for other screen densities. It is recommended that each app include at least one xxHDPI image variant.
Use drawable objects
Some images do not need a static image resource; the framework can draw them dynamically at run time. We can use Drawable objects (shape elements in XML) to draw images dynamically. They take up only a small amount of space in the APK, and XML Drawable objects can produce monochromatic images that conform to the Material Design guidelines.
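For example, a rounded-rectangle background can be drawn entirely in XML instead of shipping a bitmap; this is a sketch in which the file name and colors are illustrative:

```xml
<!-- res/drawable/bg_rounded.xml (hypothetical name) -->
<shape xmlns:android="http://schemas.android.com/apk/res/android"
    android:shape="rectangle">
    <!-- Solid fill color and rounded corners, rendered at any resolution -->
    <solid android:color="#FF6200EE" />
    <corners android:radius="8dp" />
</shape>
```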
Reusing resources
We could add a separate resource for each variation of an image, such as toned, shaded, or rotated versions of the same image, but it is recommended to reuse a single set of resources and customize them as needed at run time.
On Android 5.0 (API level 21) and later, you can change the color of a resource using the android:tint and android:tintMode attributes; on earlier platforms, use the ColorFilter class.
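As a sketch, a single grayscale icon can be recolored per usage with android:tint in a layout file (the drawable and color names here are illustrative):

```xml
<!-- Fragment of a layout file; reuses one drawable in different colors. -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_star"
    android:tint="@color/accent" />
```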
We can also omit a resource that is merely a rotated version of another resource. The following example shows how to turn a “thumb up” into a “thumb down” by rotating it 180 degrees around the center of the image:
<?xml version="1.0" encoding="utf-8"?>
<rotate xmlns:android="http://schemas.android.com/apk/res/android"
android:drawable="@drawable/ic_thumb_up"
android:fromDegrees="180"
android:pivotX="50%"
android:pivotY="50%" />
Render from code
We can reduce the APK size by rendering images programmatically, which frees up space because we no longer need to store the image files in the APK.
Compressing PNG files
The AAPT tool can optimize the image resources placed in res/drawable/ with lossless compression during compilation. For example, the AAPT tool can convert a true-color PNG that does not require more than 256 colors into an 8-bit PNG with a palette, which produces an image of the same quality with a smaller memory footprint.
Note that the AAPT tool has the following limitations:
- The AAPT tool does not shrink PNG files contained in the assets/ folder.
- Image files need 256 or fewer colors to be optimized by the AAPT tool.
- The AAPT tool may inflate PNG files that are already compressed. To prevent this, we can disable the process for PNG files with the cruncherEnabled flag in Gradle:
aaptOptions {
    cruncherEnabled = false
}
Compress PNG and JPEG files
We can use tools like pngcrush, pngquant, or zopflipng to reduce the size of PNG files while maintaining visual quality.
The pngcrush tool is particularly effective: it iterates over PNG filters and zlib (Deflate) parameters, compressing the image with each combination of filter and parameter, and then selects the configuration that yields the smallest compressed output.
To compress JPEG files, we can use tools such as packJPG and Guetzli.
Use WebP file format
If you are targeting Android 3.2 (API level 13) and higher, you can use images in the WebP file format instead of PNG or JPEG files. The WebP format provides lossy compression (like JPEG) as well as transparency (like PNG), but achieves better compression than either.
You can use Android Studio to convert existing BMP, JPG, PNG, or static GIF images to WebP format.
Note that Google Play only accepts PNG launcher ICONS.
Using vector graphics
Vector graphics can be used to create resolution-independent icons and other scalable media, and can greatly reduce the space taken up in the APK. Vector images are represented in Android as VectorDrawable objects; with a VectorDrawable, a 100-byte file can produce a sharp image the size of the screen.
Note that the system needs significant time to render each VectorDrawable object, and larger images take even longer to appear on the screen, so it is recommended to use a VectorDrawable only when displaying small images.
Use vector graphics for animated pictures
Do not use AnimationDrawable to create frame-by-frame animations, as doing so requires adding a separate bitmap file for each frame of the animation, which greatly increases the APK size. Instead, use AnimatedVectorDrawableCompat to create animated vector drawables.
Reduce Native and Java code
There are several ways to reduce the size of Native and Java codebase in your application.
Remove unnecessary generated code
Make sure you understand the footprint of any automatically generated code. For example, many protocol buffer tools generate an excessive number of classes and methods, which can double or triple the size of your application.
Avoid enumerations
A single enumeration can increase the size of your application’s classes.dex file by about 1.0 to 1.4 KB. These increases add up quickly for complex systems or shared libraries. If possible, consider using the @IntDef annotation and code shrinking to remove enumerations and convert them to integers. This type conversion preserves all the type-safety benefits of enumerations.
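A sketch of the enum-to-int conversion, with illustrative class and constant names. The @IntDef annotation itself lives in androidx.annotation and only affects compile-time lint checking; at run time these are plain ints, so no enum classes end up in classes.dex:

```java
public class ConnectionState {
    // With androidx.annotation.IntDef, a @State annotation would additionally be
    // defined over these constants so lint can enforce that only they are passed.
    public static final int IDLE = 0;
    public static final int CONNECTING = 1;
    public static final int CONNECTED = 2;

    // Accepts a plain int instead of an enum value.
    public static String describe(int state) {
        switch (state) {
            case IDLE:       return "idle";
            case CONNECTING: return "connecting";
            case CONNECTED:  return "connected";
            default: throw new IllegalArgumentException("unknown state: " + state);
        }
    }
}
```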
Reduce the size of native binaries
If our application uses native code and the Android NDK, we can also reduce the size of the release version by optimizing the code. Two useful techniques are removing debug symbols and avoiding extraction of native libraries.
Remove debug symbol
Debug symbols are appropriate while the application is in development and still needs debugging. Before compiling the release build, we can use the arm-eabi-strip tool provided in the Android NDK to remove unnecessary debug symbols from native libraries.
Avoid decompressing native libraries
When building the release version, we can set android:extractNativeLibs="false" on the application element of the app’s manifest to package uncompressed .so files in the APK. Disabling this flag prevents the PackageManager from copying the .so files out of the APK onto the file system during installation, and has the added benefit of making application updates smaller. This property is set to false by default when building with Android Gradle plugin 3.6.0 and later.
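A minimal manifest sketch showing where the attribute goes (the package name and label are illustrative):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app">
    <!-- Keep .so files uncompressed inside the APK; loaded directly, not extracted. -->
    <application
        android:extractNativeLibs="false"
        android:label="MyApp">
    </application>
</manifest>
```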
Maintain multiple thin APKs
An APK may contain content that users download but never use, such as resources for other languages or for particular screen densities. To ensure a minimal download for users, we should upload the app to Google Play using the Android App Bundle. By uploading an App Bundle, Google Play can generate and serve optimized APKs for each user’s device configuration, so users download only the code and resources needed to run our app, and we don’t have to compile, sign, and manage multiple APKs to support different devices. Users also get smaller, more optimized download packages.
If we don’t plan to publish the app to Google Play, we can divide the app into multiple APKs and differentiate them by factors like screen size or GPU texture support.
When users download our application, their device receives the correct APK based on its features and settings, so the device does not receive features and resources it cannot use. For example, if a user has an hdpi device, they do not need the xxxhdpi resources intended for higher-density displays.
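For the multiple-APK route, density-based splits can be declared in build.gradle; a minimal sketch, where the excluded densities are illustrative:

```groovy
android {
    splits {
        // Generate a separate APK per screen density.
        density {
            enable true
            // Skip densities we do not want dedicated APKs for.
            exclude "ldpi", "tvdpi"
        }
    }
}
```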
Dependency injection is implemented using Dagger2
A dependency injection framework simplifies the code we write and provides an adaptive environment for testing and other configuration changes.
If we are going to use a dependency injection framework in our application, consider using Dagger2. Dagger2 doesn’t use reflection to scan an application’s code, and its static compile-time implementation means it can be used in Android applications without unnecessary run-time costs or memory consumption.
Other dependency injection frameworks that use reflection tend to initialize by scanning the code for annotations, a process that can require more CPU cycles and RAM and can cause a noticeable delay when the application starts.
Use external libraries sparingly
External library code is usually not written for mobile environments and can be inefficient on mobile clients. If we decide to use an external library, we may need to optimize it for mobile devices, so plan ahead and analyze the library in terms of code size and RAM consumption before committing to it.
Even some libraries optimized for mobile devices can cause problems due to different implementations. For example, one library may use a compact Protobuf while another uses Micro Protobuf, resulting in two different Protobuf implementations for our application. Different implementations of logging, analysis, image-loading frameworks, and many other features besides our own can lead to this.
While ProGuard can remove APIs and resources with appropriate tags, it cannot remove a library’s large internal dependencies. The functionality we need in these libraries may rely on lower-level dependencies. This becomes especially problematic when we use an Activity subclass from a library (which tends to pull in many dependencies), or when libraries use reflection (which is common, and means we spend a lot of time manually tweaking ProGuard rules to make the app work).
Also, avoid using a shared library for only one or two of its dozens of features; doing so pulls in a large amount of code and overhead we don’t need. When considering a library, look for an implementation that closely fits our needs; otherwise, consider building our own.
Reference: Android Developers
My GitHub: TanJiaJunBeyond
Common Android Framework: Common Android framework
My Juejin: Tan Jiajun
My Jianshu: Tan Jiajun
My CSDN: Tan Jiajun