Memory Management In Android

Arsen Asatryan
Nov 9, 2020 · 8 min read

When developing an Android app, it’s important to learn what goes into optimization and memory management. Learn how to improve memory usage in this article.

Android uses paging and memory-mapping (mmapping) instead of providing swap space, which means any memory your application touches cannot be paged out unless you release all references. The Dalvik Virtual Machine’s heap size for application processes is limited. Applications start with a 2 MB heap, and the maximum allocation, even when marked as “largeHeap,” is limited to 36 MB (depending on the specific device configuration). Examples of large-heap applications are photo/video editors, the camera, the gallery, and the home screen.

Paging and Memory-Maping

In computer operating systems, paging is a memory-management scheme by which a computer stores and retrieves data from secondary storage for use in main memory. In this scheme, the operating system retrieves data from secondary storage in same-size blocks called pages. Paging is an important part of virtual memory implementations in modern operating systems, using secondary storage to let programs exceed the size of available physical memory. For simplicity, main memory is called “RAM” (an acronym of “random-access memory”) and secondary storage is called “disk” (shorthand for a hard disk drive, drum memory, or solid-state drive), but the concepts do not depend on whether these terms apply literally to a specific computer system.

A memory-mapped file is a segment of virtual memory that has been assigned a direct byte-for-byte correlation with some portion of a file or file-like resource. This resource is typically a file that is physically present on disk, but can also be a device, shared memory object, or other resource that the operating system can reference through a file descriptor. Once present, this correlation between the file and the memory space permits applications to treat the mapped portion as if it were primary memory.
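On the JVM, this correspondence is exposed through `java.nio`: `FileChannel.map()` returns a `MappedByteBuffer` whose reads go through the page cache rather than explicit I/O calls. A minimal, self-contained sketch (the file name and contents are made up for illustration):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MmapDemo {
    // Map a file into memory and read it as if it were an in-memory byte array.
    static String readMapped(Path path) throws IOException {
        try (FileChannel channel = FileChannel.open(path, StandardOpenOption.READ)) {
            MappedByteBuffer buffer =
                channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes); // reads are served from the page cache, no read() syscalls
            return new String(bytes, StandardCharsets.UTF_8);
        }
    }

    // Creates a throwaway file and reads it back through the mapping.
    static String demo() {
        try {
            Path tmp = Files.createTempFile("mmap-demo", ".txt");
            tmp.toFile().deleteOnExit();
            Files.write(tmp, "hello, mmap".getBytes(StandardCharsets.UTF_8));
            return readMapped(tmp);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints "hello, mmap"
    }
}
```

This is how Android loads read-only resources such as DEX code and assets: the mapped pages can be dropped and re-read from disk at any time, so they cost nothing to evict.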

Android Virtual Memory

Android doesn’t use virtual memory in the sense of swapping to disk by default, because it has a higher-level mechanism. Transparently writing pages of memory to flash storage is bad for battery life, for performance, and for the life of your flash storage (which can only handle a limited number of writes), especially since the application has no control over which parts of memory are “paged out”.

Instead, Android manages memory using the same mechanism it uses to decide when to terminate apps that are no longer running. When it finds it needs to free up some RAM, it chooses an app that was cached (one that isn’t currently in use). It terminates this app to free its RAM, but first it gives that app’s activities a chance to save some state by writing it to storage. By making the app explicitly choose what to save to storage, instead of just saving the whole of the app’s RAM contents, Android can reduce the amount it has to write to storage and later read back. This saves storage, battery power, and time, because each write to and read from storage costs time and power.

Of course, Android is based on Linux, and it uses virtual memory in other ways not visible to the user. Android systems must have an MMU, so apps use virtual, not physical, addresses. This protects apps from having their private data in RAM read by other apps, which is necessary for a secure system. Android also uses Linux’s delayed commit to save memory: when a process asks for more memory, it only gets pages of physical memory when it actually uses them. Virtual memory also allows memory-mapped access to files in the filesystem and to memory-mapped hardware. None of this has anything to do with swap files, but it means it’s not quite accurate to say that Android doesn’t use virtual memory at all.

LRU Cache

Android stores background application processes in an LRU cache. When the system runs low on memory, it kills processes according to the LRU strategy, but it also considers which application is the largest memory consumer. Currently, the maximum background process count is 20 (depending on the specific device configuration). If you need your app to live longer in the background, deallocate unnecessary memory before moving to the background, and the Android system will be less likely to generate an error or terminate the app.

A cache allows reusing objects that are expensive to create. If you load an object into memory, you can think of that as a cache for the object. For example, if you download images from the Internet to display them in a list, you should hold them in memory to avoid downloading them several times.

At some point you need to recycle some of your objects; otherwise, you run out of memory. A good approach is to recycle the objects that have gone unused the longest in your application.

The Android platform provides the LruCache class as of API 12 (and in the support-v4 library). The LruCache class provides a least-recently-used (LRU) cache implementation. An LRU cache keeps track of the usage of its members: it has a given size, and if this size is exceeded, it removes the items that have not been accessed for the longest time.
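Android's LruCache isn't available on a plain JVM, but the same eviction policy can be sketched with `java.util.LinkedHashMap` in access-order mode, which is roughly how such caches are built (this is a minimal illustration, not Android's implementation, which adds `sizeOf()` and `entryRemoved()` hooks):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: LinkedHashMap with accessOrder=true keeps entries
// ordered by most recent access, so the eldest entry is the LRU victim.
public class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public SimpleLruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder=true: get() moves entries to the tail
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict the least recently used entry on overflow
    }
}
```

With a capacity of 2, putting "a" and "b", touching "a" with get(), and then putting "c" evicts "b", because "b" is now the entry accessed longest ago.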

Garbage Collection

A managed memory environment, like the ART or Dalvik virtual machine, keeps track of each memory allocation. Once it determines that a piece of memory is no longer being used by the program, it frees it back to the heap, without any intervention from the programmer. The mechanism for reclaiming unused memory within a managed memory environment is known as garbage collection. Garbage collection has two goals: find data objects in a program that cannot be accessed in the future; and reclaim the resources used by those objects.

Android’s memory heap is a generational one, meaning that there are different buckets of allocations that it tracks, based on the expected life and size of an object being allocated. For example, recently allocated objects belong in the Young generation. When an object stays active long enough, it can be promoted to an older generation, followed by a permanent generation.

Each heap generation has its own dedicated upper limit on the amount of memory that objects there can occupy. Any time a generation starts to fill up, the system executes a garbage collection event in an attempt to free up memory. The duration of the garbage collection depends on which generation of objects it’s collecting and how many active objects are in each generation.

Even though garbage collection can be quite fast, it can still affect your app’s performance. You don’t generally control when a garbage collection event occurs from within your code. The system has a running set of criteria for determining when to perform garbage collection. When the criteria are satisfied, the system stops executing the process and begins garbage collection. If garbage collection occurs in the middle of an intensive processing loop like an animation or during music playback, it can increase processing time. This increase can potentially push code execution in your app past the recommended 16ms threshold for efficient and smooth frame rendering.

Additionally, your code flow may perform kinds of work that force garbage collection events to occur more often or make them last longer than normal. For example, if you allocate multiple objects in the innermost part of a for loop during each frame of an alpha-blending animation, you might pollute your memory heap with a lot of objects. In that case, the garbage collector executes multiple garbage collection events and can degrade the performance of your app.
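The per-frame allocation problem can be shown in plain Java. The `Point` class and the frame loop below are invented for illustration; the point is that the reusing variant creates no garbage in its steady state, so it never feeds the young generation:

```java
public class AllocationChurn {
    static final class Point { float x, y; }

    // Wasteful pattern: one short-lived allocation per call. Called once per
    // frame, this fills the young generation and triggers frequent GC events.
    static Point wasteful(float x, float y) {
        Point p = new Point();
        p.x = x;
        p.y = y;
        return p;
    }

    // Allocation-free steady state: the caller supplies a reusable instance
    // that is overwritten in place each frame.
    static Point reuse(Point out, float x, float y) {
        out.x = x;
        out.y = y;
        return out;
    }

    public static void main(String[] args) {
        Point scratch = new Point(); // allocated once, outside the loop
        for (int frame = 0; frame < 1000; frame++) {
            reuse(scratch, frame, frame * 2f); // no garbage created per frame
        }
        System.out.println(scratch.x + "," + scratch.y);
    }
}
```

The same idea underlies object pools and Android patterns like `Message.obtain()`: hoist the allocation out of the hot loop and overwrite instead of reallocating.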

How to Avoid Memory Leaks

Using memory carefully with the tips above benefits your application incrementally and helps it stay in the system longer. But all of that benefit is lost if a memory leak happens. Here are some common potential leaks that developers need to keep in mind.

  1. Remember to close the cursor after querying the database. If you want to keep the cursor open long-term, use it carefully and close it as soon as the database task is finished.
  2. Remember to call unregisterReceiver() after calling registerReceiver().
  3. Avoid Context leakage. If you declare a static member variable Drawable in your Activity and then call view.setBackground(drawable) in onCreate(), then after a screen rotation a new Activity instance is created, and the old instance can never be deallocated, because the drawable holds a reference to the view as its callback and the view holds a reference to the Activity (Context). A leaked Activity instance means a significant amount of memory, which can easily cause an OOM error.
  4. There are two ways to avoid this kind of leak:

1) Do not keep long-lived references to an Activity context. A reference to an activity should have the same lifecycle as the activity itself.

2) Try using the Application context instead of an Activity context.
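Rule 1 can be illustrated without the Android framework. In the sketch below, the hypothetical `Screen` class stands in for an Activity, and the static holder keeps only a `WeakReference`, so the garbage collector can reclaim the old instance once the framework drops its own references (e.g. after rotation):

```java
import java.lang.ref.WeakReference;

public class LeakFreeHolder {
    // "Screen" is a stand-in for an Android Activity in this illustration.
    static class Screen {
        final String name;
        Screen(String n) { name = n; }
    }

    // BAD: a static strong reference pins the Screen in memory for the
    // lifetime of the process, exactly the leak described above.
    static Screen leakyRef;

    // BETTER: a WeakReference does not prevent collection, so the old
    // instance becomes reclaimable as soon as nothing else references it.
    static WeakReference<Screen> safeRef;

    static String currentName() {
        Screen s = (safeRef == null) ? null : safeRef.get();
        return (s == null) ? "gone" : s.name; // always null-check a weak ref
    }

    public static void main(String[] args) {
        Screen screen = new Screen("main");
        safeRef = new WeakReference<>(screen);
        System.out.println(currentName()); // "main" while strongly reachable
    }
}
```

The null check in `currentName()` is the price of the pattern: a weak referent can disappear at any GC, so callers must handle the "gone" case.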

Be careful about using threads. Threads in Java are garbage collection roots; that is, the Dalvik Virtual Machine (DVM) keeps hard references to all active threads in the runtime system, and as a result, threads that are left running are never eligible for garbage collection. Java threads persist until they either finish executing or the entire process is killed by the Android system. Instead, the Android application framework provides many classes designed to make background threading easier for developers:

  • Use Loader instead of a thread for performing short-lived asynchronous background queries in conjunction with the Activity lifecycle.
  • Use Service and report the results back to the Activity using a BroadcastReceiver.
  • Use AsyncTask for short-lived operations.
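The underlying discipline these classes enforce can be shown on a plain JVM with an executor that is shut down when its work is done, so no stray worker thread remains alive as a GC root. The `fetchCount()` task is a made-up stand-in for a background query; on Android you would scope the shutdown to a lifecycle callback:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BackgroundWork {
    // Runs a short task off the calling thread and tears the worker down.
    static int fetchCount() {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            // submit() hands the work to the pooled thread; get() waits with
            // a timeout instead of blocking forever.
            return pool.submit(() -> 42).get(1, TimeUnit.SECONDS);
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown(); // lets the worker thread exit and become collectible
        }
    }

    public static void main(String[] args) {
        System.out.println(fetchCount()); // prints 42
    }
}
```

Without the `shutdown()` call, the pool's idle worker thread would keep running, and everything reachable from it would stay ineligible for collection, which is precisely the thread leak described above.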

How can I detect a memory leak?

The simplest way to detect a memory leak is also the way you’re most likely to find one: running out of memory. That’s also the worst way to discover a leak! Before you run out of memory and crash your application, you’re likely to notice your system slowing down. If you do, it may be time to start digging into your code to figure out just what’s using up all your RAM.

Often, you’ll do this using a profiling tool. Modern IDEs like Android Studio have profilers built in that show you how much memory is being used by which parts of your application. A profiling tool isn’t a silver bullet: it won’t tell you right away which parts of your application are leaking memory. What it will tell you is which parts of your application are using the most memory. If you run them for an extended period of time, you can also see which parts of the application use more memory over time. Armed with this knowledge, you can narrow down the search in your application for leaky code.

Another method for memory leak detection is to use logging intelligently. Sometimes, faulty code doesn’t cause a memory leak, but your users do. Maybe a user has uploaded a very large file that they’re trying to access on your servers. If you’re loading that entire file into memory, you might exhaust the application’s memory through no fault of your own. Mature software organizations will often use automated tools to detect memory leaks in running applications.
