
The Memory Map: Everything Your App's RAM Is Actually Doing

Flutter App Memory Explained: Heap, Stack, Android vs iOS

March 19, 2026

RAM Is Just a Very Long Array

Start with the simplest possible model. Your phone has, say, 8GB of RAM. Physically, that's 8,589,934,592 bytes (8 × 2^30) of storage. Every byte has an address — a number from 0 to 8,589,934,591. Think of it as an enormous array:

text
Address   Value
──────────────
0         0x4A
1         0x00
2         0xFF
...
8589934591  0x12

When code runs, it reads and writes values to specific addresses. A pointer — a variable that "points to" something — is just a number: the address of the thing it points to. Pointer<Uint8> in Dart FFI isn't magic. It's an integer that says "the byte you want is at address 140,734,789,632."

That's the whole model. Everything else is built on top of this.
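
The same idea can be poked at directly from Python's ctypes module. This is an illustrative sketch (Python rather than Dart, purely to make the "a pointer is an integer" claim tangible):

```python
import ctypes

# A small buffer somewhere in this process's memory.
buf = ctypes.create_string_buffer(b"\x4a\x00\xff\x12")

# A "pointer" is literally just an integer: the buffer's address.
addr = ctypes.addressof(buf)
assert isinstance(addr, int)

# Dereferencing means reading the byte stored at that address.
first_byte = ctypes.c_ubyte.from_address(addr).value
print(hex(first_byte))  # 0x4a, the byte we stored at that address
```

Dart's Pointer<Uint8> works the same way: the `.address` property exposes exactly this kind of plain integer.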

Virtual Memory: Every Process Gets Its Own Illusion

There's one problem with the "enormous array" model: multiple apps run simultaneously. If App A stores data at address 1,000,000 and App B also stores data at address 1,000,000, they'd overwrite each other.

The operating system solves this with virtual memory. Every process — every app — gets its own private virtual address space. On a 64-bit device, that's a range from 0 to 2^64 − 1 (about 18 quintillion addresses). The process believes it owns all of that space. In reality, the OS maps the addresses it actually uses to physical RAM, transparently.

text
App A's view:              App B's view:
Virtual 0x1000 → Physical 0x8400    Virtual 0x1000 → Physical 0x2C00
Virtual 0x2000 → Physical 0xA100    Virtual 0x2000 → Physical 0x5000

App A's pointer value 0x1000 and App B's pointer value 0x1000 look the same as integers, but they point to completely different physical memory. This is why pointers from one process mean nothing in another — the address is private to the virtual address space of the process that created it.

The OS manages this translation using a structure called a page table. Memory is divided into 4KB chunks called pages. The page table maps virtual pages to physical pages. When you allocate memory, the OS finds free physical pages and maps them into your virtual address space. When you free memory, those mappings are removed.
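
Because pages are 4KB (2^12 bytes), any virtual address splits into a page number (the upper bits) and an offset within that page (the low 12 bits). A small Python sketch of the arithmetic:

```python
PAGE_SIZE = 4096  # 4KB pages, i.e. 12 offset bits

def page_of(addr):
    """Which virtual page an address falls in."""
    return addr >> 12          # same as addr // PAGE_SIZE

def offset_of(addr):
    """Position of the byte within its page."""
    return addr & 0xFFF        # same as addr % PAGE_SIZE

addr = 140_734_789_632  # the example address from earlier
print(page_of(addr), offset_of(addr))  # 34359079 2048
```

The page table only needs an entry per page, not per byte: translating an address means looking up the page number and reattaching the untranslated 12-bit offset.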

This is also where memory limits come from. An app hitting 800MB of memory means it has 800MB worth of physical pages mapped into its virtual address space. The OS watches this, and when it gets too high, it takes action — differently on Android and iOS, as we'll get to.

The Regions of Your App's Memory

Within a process's virtual address space, memory is organized into distinct regions. Each region has a different purpose, different permissions, and different lifetime rules.

Here's a simplified map of what's in memory when a Flutter app runs:

text
High addresses
┌─────────────────────────────┐
│        Kernel space         │  ← OS only, your code can't touch this
├─────────────────────────────┤
│          Stack(s)           │  ← function frames, local variables
├─────────────────────────────┤
│                             │
│    Memory-mapped region     │  ← shared libraries (.so/.dylib),
│    (mmap)                   │    mmap'd files, image assets
│                             │
├─────────────────────────────┤
│                             │
│           Heap              │  ← dynamically allocated memory
│      (grows upward ↑)       │    (malloc, calloc, Dart GC heap)
│                             │
├─────────────────────────────┤
│     BSS segment             │  ← uninitialized global/static variables
├─────────────────────────────┤
│     Data segment            │  ← initialized global/static variables
├─────────────────────────────┤
│     Text/Code segment       │  ← your compiled machine code (read-only)
└─────────────────────────────┘
Low addresses

Let's walk through each one.

The Code Segment: Your App's Instructions

The bottom of the address space holds your compiled code. On iOS, Flutter.framework and your app's compiled Dart code (AOT-compiled to machine code). On Android, libflutter.so and libapp.so. These files are loaded into the code segment when the app starts.

The code segment is read-only and executable. The CPU reads instructions from here. If code attempts to write to this region — which would mean modifying running code — the OS throws a segfault and kills the process. This is a security feature.

For Flutter specifically: the Dart code you wrote is in libapp.so on Android, and compiled into the app binary on iOS. When add(3, 4) runs, the CPU is literally executing machine code at some address in this segment. When you profile and see "time spent in X function," you're seeing time spent executing code at specific addresses in this region.

The Stack: Fast, Automatic, Finite

Every thread in your process has its own stack. The stack is where function call state lives.

When a function is called, the CPU pushes a stack frame onto the stack. That frame contains:

  • The return address (where to go when the function finishes)
  • The function's parameters
  • The function's local variables

When the function returns, its frame is popped. The memory is reclaimed instantly — no garbage collector needed, no deallocation call required. This is why local variables are "free" in terms of memory management: they disappear automatically when the function exits.

dart
double processOrder(Order order) {  // ← frame pushed
  final discount = order.total * 0.1;  // ← local variable, lives on the stack
  final tax = calculateTax(discount);  // ← another frame pushed for calculateTax
  return applyDiscount(order, discount, tax);
} // ← frame popped, discount and tax are gone

The stack is fast — pushing and popping frames is just incrementing and decrementing a register (the stack pointer). But it has two important constraints:

Fixed size. On Android, the main thread stack is 8MB. Worker threads get less (typically 1MB). On iOS, the main thread gets 1MB, and secondary threads default to 512KB. If your function call chain goes deep enough that frames exceed this limit, you get a stack overflow — the process crashes. Deep recursion without a base case is the classic way to trigger this.
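
Python offers a safe way to demo this: the interpreter counts frames and raises RecursionError before the native stack limit is reached, so the process survives, but the failure shape is the same as a stack overflow. A sketch:

```python
import sys

def recurse(n):
    # Each call pushes a new frame; with no base case,
    # frames accumulate until the limit is hit.
    return recurse(n + 1)

sys.setrecursionlimit(1000)  # keep the demo cheap

try:
    recurse(0)
except RecursionError as e:
    print("frames exhausted:", e)
```

In a native stack overflow there is no exception to catch: the frame write lands outside the mapped stack pages and the OS delivers a fatal signal.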

Values only, no objects. The stack is for small, fixed-size values: integers, floats, pointers (which are just integers), small structs. Large data — arrays, strings, complex objects — lives on the heap. A local variable that "holds" an array is actually a pointer (stack) to data (heap). The stack stores the address; the heap stores the contents.

In Dart: local variables live on the Dart VM's stack (Dart has its own stack management inside each isolate). But Dart objects — class instances, strings, lists — always live on the heap and the local variable holds a reference to them.

The Heap: Where Everything Interesting Lives

The heap is the large, flexible pool of memory for objects whose lifetime can't be determined at compile time. "I'll create this Order object when the user taps a button, and it'll live until they navigate away" — that's a heap allocation.

The heap is managed by an allocator. When you call malloc(n) in C, or calloc<Uint8>(n) in Dart FFI, you're asking the allocator to find n contiguous bytes in the heap and give you a pointer to them. The allocator tracks which regions are in use and which are free. When you call free(ptr), you're telling the allocator those bytes are available again.

On Android, the system allocator is jemalloc (replaced by Scudo, a hardened allocator, on Android 11 and later). On iOS, it's libmalloc (part of libSystem). These are sophisticated pieces of software — they handle fragmentation, thread-safety, and large-vs-small allocation strategies. Treat them as the black box that turns "I need 256 bytes" into a valid pointer.

Fragmentation is worth understanding. Imagine you allocate blocks of 100 bytes, 200 bytes, 100 bytes, then free the first and third:

text
[FREE 100][IN USE 200][FREE 100]

You have 200 bytes free, but if you ask for a 150-byte allocation, the allocator can't give you a contiguous 150-byte block — the two free regions are separated by the 200-byte used block. This is fragmentation. Real allocators have strategies to manage it, but it's why "free memory" and "allocatable memory" aren't always the same number.
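
A toy Python model of exactly this sequence, tracking each block as (size, in_use), shows the gap between total free memory and allocatable memory:

```python
# Toy heap mirroring the sequence above: allocate 100, 200, 100
# bytes, then free the first and third blocks.
heap = [(100, False), (200, True), (100, False)]

def total_free(blocks):
    return sum(size for size, in_use in blocks if not in_use)

def can_allocate(blocks, n):
    # A real allocator needs n CONTIGUOUS bytes inside one free block.
    return any(size >= n for size, in_use in blocks if not in_use)

print(total_free(heap))         # 200 bytes free in total...
print(can_allocate(heap, 150))  # False: no single free block fits 150
```

Real allocators fight this with size-class bins, splitting, and coalescing of adjacent free blocks, but the contiguity constraint never goes away.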

The Dart Heap: A Managed World Inside the Native Heap

Dart doesn't call malloc for every object. Instead, it allocates a large block of native heap memory upfront and manages it internally — this is the Dart GC heap. Every object you create in Dart (User(), [], "hello") lives in this managed region.

The Dart heap is split into generations, as covered in the Flutter Under the Hood: Garbage Collection post:

text
Dart GC Heap (inside native heap)
┌─────────────────────────────────────────┐
│  New Space (Young Generation)           │
│  ┌──────────────────────────────────┐   │
│  │ From-space │ To-space            │   │  ← scavenger
│  └──────────────────────────────────┘   │
│                                         │
│  Old Space (Old Generation)             │
│  ┌──────────────────────────────────┐   │
│  │ Survived objects                 │   │  ← mark-sweep-compact
│  └──────────────────────────────────┘   │
└─────────────────────────────────────────┘

Objects are first allocated in new space. If they survive enough GC cycles (they're still referenced), they're promoted to old space. The GC collects new space frequently and cheaply (just flipping from-space and to-space). Old space is collected less often but takes longer.
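
A heavily simplified Python sketch of the from-space/to-space copy (a toy, not Dart's actual implementation) shows why scavenging is cheap: only survivors cost anything, and garbage is never touched:

```python
# Toy scavenger: live objects are copied from from_space to to_space,
# then the two spaces swap roles. Dead objects are simply left behind,
# so collection cost scales with survivors, not with garbage.
from_space = ["user", "tmp1", "order", "tmp2"]
to_space = []
live = {"user", "order"}  # objects still referenced from roots

def scavenge(from_space, to_space, live):
    forwarding = {}  # old slot -> new slot, for updating references
    for i, obj in enumerate(from_space):
        if obj in live:
            forwarding[i] = len(to_space)
            to_space.append(obj)
    from_space.clear()  # everything left behind was garbage
    return forwarding, to_space, from_space  # roles have flipped

forwarding, from_space, to_space = scavenge(from_space, to_space, live)
print(from_space)   # ['user', 'order']: survivors at new addresses
print(forwarding)   # {0: 0, 2: 1}
```

Note that the survivors now sit at different addresses, which is why a copying collector must also rewrite every reference to a moved object (the forwarding table's job).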

Each Dart isolate has its own heap. This is why isolates can't share objects — they live in separate heap regions. When you pass a List<Product> to Isolate.run(), the data is serialized and copied into the receiving isolate's heap. They're different objects at different addresses that happen to contain the same values.

From the OS's perspective, all of this Dart heap activity — GC scavenges, promotions, compactions — is just writes to a region of the native heap. The OS doesn't know about Dart objects; it just sees memory pages being read and written.

The Native Heap: What calloc Actually Does

When Dart FFI code calls calloc<Uint8>(size), it allocates from the native heap — the same heap that C/C++ code uses. This allocation is invisible to the Dart GC.

This is the fundamental split that the FFI series post on why FFI exists introduces:

text
Process memory
├── Dart GC heap ──── Dart objects: List, String, your classes
│                     GC manages lifetime automatically
│
└── Native heap ───── calloc allocations: Pointer<Uint8>, Struct instances
                      YOU manage lifetime: calloc.free() or NativeFinalizer

A Pointer<Uint8> in Dart is a Dart object (lives in the Dart GC heap, will be collected when unreferenced) that contains a number — the address of a byte in the native heap. The Dart GC will collect the Pointer object itself. It will never touch the bytes the pointer points to. Those bytes are yours to free.

This is why the pointers post says: "a pointer is just a number." Concretely: it's an integer stored in the Dart heap, whose value is a virtual address in the native heap.
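
You can reproduce the same split from Python with ctypes on a POSIX system (an illustrative sketch; libc loading details vary by platform): libc's calloc hands back a plain integer address into the native heap, and the language runtime's GC never manages those bytes.

```python
import ctypes, ctypes.util

# Load the C library (POSIX; falls back to the main program's symbols).
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)
libc.calloc.restype = ctypes.c_void_p  # address as a plain integer
libc.calloc.argtypes = [ctypes.c_size_t, ctypes.c_size_t]
libc.free.argtypes = [ctypes.c_void_p]

addr = libc.calloc(256, 1)  # 256 zeroed bytes on the NATIVE heap
assert addr is not None

# The Python int `addr` is a managed object; the 256 bytes are not.
# Python's GC will never touch them. Same situation as Pointer<Uint8>.
print(ctypes.c_ubyte.from_address(addr).value)  # 0: calloc zero-fills

libc.free(addr)  # your job, just like calloc.free() in Dart FFI
```

Forget the last line and those 256 bytes stay mapped for the life of the process: the definition of a native memory leak.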

The Memory-Mapped Region: Libraries, Assets, Files

Between the heap and the stack, there's a large region managed by mmap — the OS call that maps files or anonymous memory directly into the virtual address space.

This is where shared libraries live. When Flutter loads libflutter.so on Android, the OS maps the file's contents into memory. Multiple apps using the same .so file can share the physical pages — the OS maps the same physical page into each app's address space. This is why shared libraries save memory at the system level: libflutter.so is loaded once into physical RAM, shared across all Flutter apps running simultaneously.

Your image assets work similarly when memory-mapped: the file on disk is mapped into the address space, and the OS pages in only the parts of the file that are actually read. A 10MB image file that you only partially decode doesn't cost 10MB of physical RAM.
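
Python's mmap module wraps the same OS call, so the lazy-paging behavior is easy to sketch (a toy example with a temp file):

```python
import mmap, os, tempfile

# Write a small file, then map it into the address space
# instead of read()ing it into a buffer.
fd, path = tempfile.mkstemp()
os.write(fd, b"A" * 4096 + b"B" * 4096)  # two pages on disk
os.close(fd)

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        # No copy into a process buffer happened at map time; the OS
        # faults in roughly one page per region we actually touch.
        print(m[4096:4100])  # b'BBBB': byte 4096 starts the second page

os.remove(path)
```

Because the mapping is read-only and file-backed, these are clean pages: the OS can evict them at any time and reload them from disk, which is exactly why mapped assets are cheap under memory pressure.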

Understanding this region matters for one practical reason: memory-mapped data doesn't show up in the Dart heap. Shared libraries, fonts, and on-disk assets are mapped here. Flutter's image cache — the decoded pixels from CachedNetworkImage — lives in the native heap (allocated by Skia/Impeller's internal allocator), also outside the Dart GC heap. This is why the memory profiling post says to watch RSS (total process memory) separately from Dart heap: image cache growth appears in the "Native Heap" line of dumpsys meminfo, not in Dart heap snapshots.

Android Memory Management: The Low Memory Killer

Android devices range from 1GB to 16GB of RAM, shared across dozens of processes. Android's approach to memory pressure is aggressive and systemic.

The Low Memory Killer (LMK) is the component that monitors memory and reclaims it by killing processes — an in-kernel driver on older Android versions, the userspace lmkd daemon on modern ones. Every process in Android has an OOM score (Out of Memory score) — a number from -1000 to 1000 that represents how expendable the process is. Higher score = killed first.

The score is determined by the process's importance:

text
Score   State
──────────────────────────────────────────────────
-900    System process (never killed)
 0      Foreground app (what the user sees right now)
 100    Visible app (partially visible, like a dialog host)
 200    Service (background service)
 500    Cached background app (user navigated away)
 900    Empty process (no components running)

When the system runs low on memory, the LMK kills processes starting from the highest score downward. Your Flutter app in the foreground (score 0) is safe. Your app in the background, two apps ago (score ~600), is the first to go when someone opens Chrome and loads a heavy page.
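
On a desktop Linux box (or a rooted device shell) you can read a process's score adjustment straight out of /proc. A hedged Python sketch — the path is Linux-only, so the function returns None elsewhere:

```python
def oom_score_adj(pid=None):
    """Read a process's OOM score adjustment on Linux; None elsewhere."""
    path = f"/proc/{pid or 'self'}/oom_score_adj"
    try:
        with open(path) as f:
            return int(f.read())
    except OSError:
        return None  # not Linux, or no /proc available

score = oom_score_adj()
print(score)  # e.g. 0 for a typical process; the range is -1000..1000
```

Android's ActivityManager adjusts this value as your app moves between the foreground, visible, service, and cached states in the table above.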

Your app gets callbacks before being killed — but these are native Android callbacks (Kotlin/Java), not Dart callbacks. They live in your MainActivity:

kotlin
// In MainActivity.kt (Kotlin — this is Android-native, not Dart)
override fun onLowMemory() {
    super.onLowMemory()
    // System memory is critically low — free everything you can
}

override fun onTrimMemory(level: Int) {
    super.onTrimMemory(level)
    if (level >= ComponentCallbacks2.TRIM_MEMORY_MODERATE) {
        // Moderate pressure — trim caches
    }
}

From Dart, you can observe memory pressure through WidgetsBindingObserver.didHaveMemoryPressure(), though the granularity is coarser than the native callbacks.

Android memory numbers to know:

On a running Flutter app, adb shell dumpsys meminfo <your.package.name> gives you:

text
App Summary
                       Pss(KB)
                        ------
           Java Heap:    12,544   ← ART/JVM heap (Flutter uses this minimally)
         Native Heap:    38,291   ← native malloc heap (Flutter engine + your FFI)
                Code:    18,432   ← code segment (libflutter.so, libapp.so)
               Stack:       132   ← thread stacks
            Graphics:    24,576   ← GPU buffers, image textures
       Private Other:     6,144   ← misc private mappings
              System:    11,234   ← system-shared mappings
           TOTAL PSS:   111,353   ← total proportional memory usage

PSS (Proportional Set Size) is the honest number: shared pages (like libflutter.so) are counted proportionally across all apps sharing them. If libflutter.so uses 10MB and three apps use it, each app's PSS includes ~3.3MB for that library.

The Native Heap line is what FFI allocations add to. Every calloc call in your Dart FFI code shows up here. If this number grows unboundedly as the user navigates around, you have a native memory leak.

iOS Memory Management: Jetsam and No Swap

iOS handles memory pressure differently — and more aggressively.

There is no swap space on iOS. On desktop Linux or Android (which can use zRAM), when physical RAM fills up, the OS can swap inactive pages to disk (or compress them). iOS has no swap partition. Once physical RAM is full, something has to go.

iOS uses Jetsam — its memory pressure daemon — which is faster and less forgiving than Android's LMK. Jetsam can terminate a background app in milliseconds when the foreground app requests a large allocation. There's no warning, no callback, no grace period. The app is just gone. This is why iOS "multitasking" is actually app state restoration — the app was killed, and iOS makes it look like it was suspended.

One thing iOS does instead of swap: memory compression. When memory pressure rises, iOS compresses the pages of inactive processes. A process using 200MB of RAM might compress to 50MB of compressed pages. This buys time before Jetsam has to kill anything. You can see this in Instruments under "VM Tracker" — the "Swapped" column on iOS is actually compressed memory.

iOS memory concepts to know:

  • Dirty pages: pages your app has written to. These are the pages iOS must either keep in RAM or kill the app to reclaim. This is the number that matters most.
  • Clean pages: read-only or file-backed pages (your code, memory-mapped assets). iOS can evict these freely and reload them from disk if needed.
  • Memory footprint: dirty pages + compressed pages. This is what Instruments reports, and it's the number that triggers Jetsam; os_proc_available_memory() tells you how much headroom remains before you hit the limit.

Your app gets one warning before termination:

swift
// iOS — in AppDelegate (subclassing FlutterAppDelegate)
override func applicationDidReceiveMemoryWarning(_ application: UIApplication) {
    super.applicationDidReceiveMemoryWarning(application)
    // Last chance to free memory before Jetsam terminates
    URLCache.shared.removeAllCachedResponses()
}

In Flutter, this surfaces as WidgetsBindingObserver.didHaveMemoryPressure() (delivered over SystemChannels.system), but the window is narrow.

The practical implication for Flutter apps: the image cache is the most dangerous consumer of dirty pages on iOS. A 30-image feed where each image decoded to a 1080x1920 RGBA buffer uses 30 × 1080 × 1920 × 4 bytes ≈ 250MB of dirty pages. On a device with 3GB of RAM running multiple apps, Jetsam will terminate your app before the user even scrolls to the bottom. memCacheWidth/memCacheHeight in CachedNetworkImage — decoding images at display size instead of original size — directly reduces dirty page count.
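
The arithmetic is worth keeping as a function. This Python sketch (illustrative; the 360x640 "display slot" is a hypothetical size) shows the dirty-page math and what decoding at display size buys:

```python
def decoded_bytes(width, height, count=1):
    # Decoded RGBA is 4 bytes per pixel, regardless of how small
    # the compressed file on disk was.
    return width * height * 4 * count

feed = decoded_bytes(1080, 1920, count=30)
print(f"{feed / 1e6:.0f} MB")     # 249 MB of dirty pages for the feed

# Decoding at display size (say, a 360x640 slot) instead:
resized = decoded_bytes(360, 640, count=30)
print(f"{resized / 1e6:.0f} MB")  # 28 MB, a 9x reduction
```

The reduction is quadratic in the scale factor: decoding at one third the width and one third the height costs one ninth the memory.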

The Full Picture: Flutter's Memory in One View

Here's how all the regions relate to each other in a running Flutter app:

text
Flutter App Process (virtual address space)
┌─────────────────────────────────────────────────────────┐
│                     Kernel space                        │
├─────────────────────────────────────────────────────────┤
│  UI thread stack (8MB)    │  Raster thread stack        │
│  (= main/platform thread) │  IO thread stack            │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  Native heap                                            │
│  ┌─────────────────────────────────────────────────┐   │
│  │  Dart GC Heap (Isolate 1)                        │   │
│  │  ┌──────────────────────────────────────────┐   │   │
│  │  │ New space │ Old space                    │   │   │
│  │  └──────────────────────────────────────────┘   │   │
│  │                                                  │   │
│  │  Dart GC Heap (Isolate 2 — worker isolates)      │   │
│  │                                                  │   │
│  │  Flutter engine internal allocations             │   │
│  │  (frame pipeline, text layout, etc.)             │   │
│  │                                                  │   │
│  │  YOUR FFI allocations (calloc)  ← YOU manage     │   │
│  └─────────────────────────────────────────────────┘   │
│                                                         │
│  Memory-mapped region                                   │
│  ┌─────────────────────────────────────────────────┐   │
│  │ libflutter.so / Flutter.framework (code)         │   │
│  │ libapp.so (your compiled Dart code)              │   │
│  │ Image textures (GPU-uploaded decoded images)     │   │
│  │ Asset files (fonts, images — lazily paged)       │   │
│  └─────────────────────────────────────────────────┘   │
│                                                         │
├─────────────────────────────────────────────────────────┤
│  BSS + Data segments (global variables)                 │
├─────────────────────────────────────────────────────────┤
│  Text/Code segment (read-only, executable)              │
└─────────────────────────────────────────────────────────┘

When DevTools shows you "Dart heap: 45MB," it's showing you the Dart GC heap region. When dumpsys meminfo shows "Native Heap: 92MB," it's showing you everything in the native heap box — Dart GC heap plus Flutter engine plus your FFI allocations. When Instruments shows "Memory Footprint: 280MB," it's showing you all dirty pages across all regions — native heap, graphics, stack, everything except clean file-backed pages.

These numbers measure different things. Comparing them directly leads to confusion. The memory profiling post covers which tool to use for which question.
