Part 11 of the Flutter Security Beyond the Basics series.
The data that survives logout
A user opens a banking app, logs in, reviews their account balances, scrolls through recent transactions, checks a pending transfer. Then they tap "Log out." The app navigates to the login screen. The logout handler clears the auth token from secure storage, exactly as it should.
But the AccountBloc that fetched and held all of that data — balances, transaction history, account numbers, the user's full name and address — was never disposed. The widget tree that displayed it is gone. The screen is gone. The route has been popped. But the BLoC is still alive in memory, holding every piece of sensitive data it ever received from the API.
On a stock device, this is a minor housekeeping issue. On a rooted device, another process can dump the app's memory and read that data directly. The user believed they logged out. Their data says otherwise.
This post covers an area of mobile security that almost nobody discusses in the Flutter community: memory as an attack surface. Not memory leaks in the performance sense — memory leaks in the security sense. The distinction matters, and it is more practical than it sounds.
Why memory matters for security
Most Flutter security discussions focus on storage (secure storage vs SharedPreferences), transport (TLS, certificate pinning), and authentication (tokens, biometrics). Memory sits beneath all of these. Every token you read from secure storage passes through RAM. Every API response containing user data lives in RAM while your app processes and displays it. Every password the user types exists as a Dart string in RAM.
The question is not whether sensitive data enters memory. It always does. The question is how long it stays there and who else can read it.
Memory is readable on compromised devices
On a rooted Android device or a jailbroken iOS device, an attacker with the right tools can read the memory of any running process. Tools like Frida can attach to your app's process and inspect heap contents — searching for strings that look like tokens, account numbers, or credentials. GameGuardian, a tool originally built for cheating in games, can scan any app's memory for specific values. On Android, /proc/<pid>/mem is readable with root access.
If this series has covered anything consistently, it is that rooted and jailbroken devices are a realistic part of your threat model. Everything in your app's memory is accessible to anyone with sufficient device access.
The OS may write your memory to disk
Under memory pressure — when the device is running low on RAM — the operating system may write memory pages to storage. On Android, this is zRAM (compressed swap in RAM) on most modern devices, but some devices and custom ROMs still use traditional swap on flash storage. On iOS, the system uses compressed memory rather than disk-based swap for foreground apps, but background apps can have their memory written to disk before being terminated.
The practical consequence: sensitive data that you thought existed only in volatile RAM may end up on persistent storage, unencrypted, outside your app's control. You cannot clear it because you do not know it is there.
Crash dumps capture heap contents
When your app crashes, the crash reporting framework — Crashlytics, Sentry, Bugsnag — captures diagnostic information. Depending on the configuration, this can include portions of the heap, local variable values, and string contents. If your app crashes while a user's access token, account number, or password is in memory, that data may end up in a crash report, transmitted to a third-party service, stored in a dashboard that your entire development team can access.
This is not a theoretical concern. It is a compliance concern. If you handle financial data or health data and a crash report containing PII lands in a third-party dashboard, you may have a data breach notification obligation depending on your jurisdiction.
Time is the multiplier
The longer sensitive data remains in memory, the larger the window during which any of the above vectors can capture it. A password that exists in memory for 200 milliseconds — long enough to hash and send to the server — is a much smaller risk than one that persists for the entire session because a TextEditingController was never disposed.
Memory leaks as a security risk
Every Flutter developer has encountered memory leaks. An undisposed AnimationController, an uncancelled StreamSubscription, a ScrollController that outlives its widget. These are performance problems. You notice them when your app gets sluggish or when DevTools shows a steadily climbing memory graph.
But when the leaked object holds sensitive data, the leak becomes a security problem. The object cannot be garbage collected because something still references it. The data it contains stays in memory indefinitely — or at least far longer than the user expects.
The user's mental model vs reality
When a user taps "Log out," they have a reasonable expectation: their session is over, their data is no longer accessible in the app. If you showed them a memory dump of the app's process five minutes after logout and they could see their account balance, transaction history, and home address sitting in a leaked BLoC, they would consider that a bug. They would be right.
The disconnect between what the user expects and what actually happens in memory is the core of the problem. It is not that every memory leak is a critical vulnerability. It is that memory leaks involving sensitive data violate a reasonable expectation of data lifecycle.
Specific Flutter patterns that cause security-relevant leaks
StreamSubscription to a real-time data feed, not cancelled on logout. Your app subscribes to a WebSocket or Firestore stream that pushes account updates in real time. The subscription holds a reference to the callback, which holds a reference to the BLoC or state object, which holds the latest data. If you navigate to the login screen without cancelling the subscription, the entire chain stays alive. The stream may even continue receiving data after logout if the server-side session has not been invalidated yet — meaning new sensitive data is flowing into a screen the user can no longer see.
```dart
import 'dart:async';

// This subscription keeps the bloc alive after logout
class AccountBloc {
  StreamSubscription<AccountUpdate>? _subscription;
  double? _latestBalance;
  List<Transaction>? _transactions;

  void startListening(String userId) {
    _subscription = accountStream(userId).listen((update) {
      // This closure holds a reference to `this`
      _latestBalance = update.balance;
      _transactions = update.transactions;
    });
  }

  // If this is never called, the bloc and all its data persist
  void dispose() {
    _subscription?.cancel();
    _subscription = null;
  }
}
```
TextEditingController with a password field, never disposed. The user types their password into a TextField. The TextEditingController stores the password as a Dart String. If the controller is not disposed — or even if it is disposed but the String is not cleared first — the password remains in memory as an immutable Dart string until the garbage collector reclaims it, which may be much later.
```dart
// The password lives in _controller.text until GC collects the controller
final _passwordController = TextEditingController();

// Better: clear it as soon as you've used it
void _handleSubmit() {
  final password = _passwordController.text;
  _authService.login(email, password);
  _passwordController.clear(); // Remove the password from the controller
}
```
A global singleton service that caches user data. This is perhaps the most common pattern. A service registered as a singleton in GetIt or provided at the root of the widget tree caches user data — profile information, account details, preferences. Because it is a singleton, it lives for the entire lifetime of the app. When the user logs out and a different user logs in, the new user's data is loaded, but if the cache is not explicitly cleared, remnants of the previous user's data may remain in the service's internal state.
```dart
class UserProfileService {
  UserProfile? _cachedProfile;
  List<Transaction>? _cachedTransactions;

  // These survive logout unless explicitly cleared
  Future<UserProfile> getProfile() async {
    _cachedProfile ??= await _api.fetchProfile();
    return _cachedProfile!;
  }

  // This must be called on logout
  void clearCache() {
    _cachedProfile = null;
    _cachedTransactions = null;
  }
}
```
NavigatorObserver or RouteObserver holding references to disposed screens. If you use a custom NavigatorObserver that stores references to routes or their associated widgets for analytics or debugging, those references prevent the garbage collector from reclaiming the widget and all the state it holds. A route that displayed a user's medical records, financial statements, or personal messages stays in memory because an observer is holding a reference to it.
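The retention problem can be shown without Flutter types. A pure-Dart sketch (the class and method names here are hypothetical; a real implementation would override NavigatorObserver.didPush):

```dart
// Hypothetical analytics logs, illustrating retention. The "leaky" version
// stores the route objects themselves, pinning every screen's state in
// memory; the "safe" version records only identifying strings.
class LeakyRouteLog {
  final List<Object> visitedRoutes = []; // keeps each route (and its state) alive

  void didPush(Object route) => visitedRoutes.add(route);
}

class SafeRouteLog {
  final List<String> visitedRouteNames = []; // plain strings, nothing retained

  void didPush(Object route, {required String name}) =>
      visitedRouteNames.add(name);
}
```

The analytics value (which screens were visited, in what order) survives in the safe version, while the references that would keep a medical-records screen alive do not.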
Profiling for security-relevant leaks with DevTools
Flutter DevTools includes a memory profiler that is primarily used for performance analysis, but the same tool is effective for identifying security-relevant leaks. The process is straightforward but requires a specific methodology.
Step 1: Establish a baseline
Open DevTools and navigate to the Memory tab. With your app on the login screen (no user data loaded), take a heap snapshot. This is your baseline — it shows what your app's memory looks like with no sensitive data present.
Step 2: Log in and interact
Log into the app with a test account. Navigate to the screens that display sensitive data — account details, transaction history, profile information. Let the app settle for a few seconds so all data is loaded and rendered.
Take a second heap snapshot. This captures everything that is in memory during an active session.
Step 3: Log out
Trigger your logout flow. Wait for the app to return to the login screen. Give it a few seconds for any asynchronous cleanup to complete.
Take a third heap snapshot.
Step 4: Compare snapshots
This is where the security analysis happens. Filter the heap snapshot for your app's classes — your BLoCs, your repository classes, your model classes (UserProfile, Transaction, AccountDetails, whatever you have named them).
Objects from the second snapshot (logged in) that still appear in the third snapshot (logged out) are security-relevant leaks. They are instances that should have been collected but were not, because something still holds a reference to them.
Step 5: Trace the retention path
For each leaked object, DevTools can show you the retention path — the chain of references that prevents garbage collection. This tells you exactly what is keeping the object alive. Common culprits:
- A StreamSubscription that was not cancelled
- A closure capturing this in a callback registered with a long-lived object
- A singleton service holding a reference to user-scoped data
- A global list or map that was never cleared
The retention path is the diagnosis. It tells you what to fix.
Making this part of your process
For apps that handle sensitive data, this should not be a one-off exercise. Run this check after any significant change to your authentication flow, navigation structure, or state management setup. It takes ten minutes and catches problems that no other test will find.
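A lightweight in-code complement to heap snapshots is Dart's WeakReference (in dart:core since Dart 2.17), which lets you probe whether an object is still strongly referenced without keeping it alive yourself. A sketch, using a hypothetical AccountState class:

```dart
// Sketch: probe an object's liveness with a WeakReference. The probe does
// not keep its target alive, so if `probe.target` is still non-null long
// after logout (and after the GC has had a chance to run), something is
// still referencing the object.
class AccountState {
  AccountState(this.balance);
  final int balance;
}

/// Wraps [state] in a weak probe. After the caller drops its own strong
/// reference, the GC is free to collect the object; a target that never
/// becomes null across GC cycles indicates a leak.
WeakReference<AccountState> probeFor(AccountState state) =>
    WeakReference(state);
```

Note that the GC gives no timing guarantees, so this is a diagnostic signal rather than a pass/fail test; the leak_tracker tooling used by the Flutter team builds on the same primitive.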
The Dart GC problem — you cannot zero memory
In languages like C and C++, when you are done with sensitive data, you can overwrite the memory it occupied with zeros. Functions like memset or SecureZeroMemory on Windows write zeros to the exact bytes where the data was stored. Once zeroed, the data is gone — not just unreferenced, but actually destroyed.
Dart does not give you this option for its most common data type. Dart strings are immutable. Once created, they cannot be modified. You cannot overwrite a string's contents with zeros. You can remove all references to it, but the actual bytes remain in memory until the garbage collector decides to reclaim that memory region — and even then, the GC does not zero the memory. It simply marks it as available for reuse. The old data sits there until something else happens to be allocated in the same location.
This means that a password, token, or account number stored as a Dart String will persist in memory for an indeterminate period after your code has finished using it. You have no mechanism to force its destruction.
Uint8List — the closest Dart has to a mutable secure buffer
For data where this matters — encryption keys, temporary credentials, sensitive byte sequences — Uint8List provides a mutable buffer that you can zero explicitly:
```dart
import 'dart:typed_data';

// Create a buffer for sensitive key material
final keyBytes = Uint8List(32);

// ... populate keyBytes from a secure source ...
// ... use keyBytes for encryption/decryption ...

// Zero the memory when done
keyBytes.fillRange(0, keyBytes.length, 0);
```
This pattern is used in cryptographic libraries that handle key material. It is the Dart equivalent of zeroing sensitive buffers in C.
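One way to keep the zeroing from being forgotten is to scope the byte buffer to a single operation. A minimal sketch (the helper name is mine, not a standard API), with the usual caveat that the original String still exists until the GC reclaims it:

```dart
import 'dart:convert';
import 'dart:typed_data';

/// Runs [action] with the UTF-8 bytes of [secret], then zeroes the buffer
/// whether or not [action] throws. Limits: the immutable [secret] String
/// itself cannot be zeroed and lives until the GC reclaims it.
T useSecretBytes<T>(String secret, T Function(Uint8List bytes) action) {
  final bytes = Uint8List.fromList(utf8.encode(secret));
  try {
    return action(bytes);
  } finally {
    bytes.fillRange(0, bytes.length, 0); // zero before releasing the buffer
  }
}
```

A key-derivation call, for example, could run as useSecretBytes(password, (bytes) => deriveKey(bytes)) so the byte copy never outlives the operation.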
The honesty caveat
This is a mitigation, not a guarantee. The Dart garbage collector is a generational, compacting collector. It may copy objects between memory regions as part of its normal operation. When it copies a Uint8List, the old copy remains in the old memory location until that region is reused. You zeroed the current copy, but you cannot zero the copies the GC may have made.
The OS may have paged the memory to disk before you zeroed it. A crash dump may have captured it. The data may exist in CPU cache lines.
For most Flutter apps, this level of concern is disproportionate to the actual risk. If your app displays a user's order history and you worry about the GC's copy behaviour, you are optimising for a threat that is several orders of magnitude less likely than, say, someone shoulder-surfing the screen.
For apps that handle encryption keys, cryptocurrency private keys, or financial credentials that directly enable monetary transactions, the Uint8List pattern is worth using. For everything else, the far more impactful step is ensuring proper disposal and clearing of state on logout.
Secure logout — the complete checklist
A proper logout is not a single action. It is a sequence of cleanup steps that, taken together, ensure the user's session and data are genuinely terminated. Here is the full sequence, with code for each step.
1. Clear tokens from secure storage
```dart
import 'package:flutter_secure_storage/flutter_secure_storage.dart';

final _storage = const FlutterSecureStorage();

Future<void> _clearTokens() async {
  await _storage.delete(key: 'access_token');
  await _storage.delete(key: 'refresh_token');
}
```
2. Invalidate the refresh token server-side
Clearing the token locally is necessary but not sufficient. If the refresh token is still valid on the server, anyone who captured it can use it to obtain new access tokens.
```dart
Future<void> _invalidateServerSession() async {
  try {
    await httpClient.post('/api/auth/logout');
  } catch (_) {
    // Log but don't block — the local cleanup must proceed
    // even if the server call fails
  }
}
```
3. Close and dispose all BLoCs and controllers that hold user data
```dart
// If using BLoC with manual lifecycle management
accountBloc.close();
transactionBloc.close();
profileBloc.close();
notificationsBloc.close();
```
If your BLoCs are provided via BlocProvider, ensure the provider is scoped to the authenticated portion of your widget tree so that it is automatically disposed when the user navigates away from that subtree.
4. Clear in-memory caches
```dart
// Repository and service caches
userProfileService.clearCache();
transactionRepository.clearCache();

// Image cache — relevant if your app displays sensitive images
// (medical images, identity documents, private photos).
// The top-level `imageCache` getter is the same object as
// PaintingBinding.instance.imageCache, so clearing it once is enough.
PaintingBinding.instance.imageCache.clear();
PaintingBinding.instance.imageCache.clearLiveImages();
```
5. Reset the navigation stack
Prevent the user (or an attacker) from pressing the back button and returning to an authenticated screen that may still display cached data.
```dart
// Using Navigator 2.0 / GoRouter
GoRouter.of(context).go('/login');

// Using Navigator 1.0 — replace the entire stack
Navigator.of(context).pushAndRemoveUntil(
  MaterialPageRoute(builder: (_) => const LoginScreen()),
  (route) => false, // Remove all routes
);
```
6. Reset service locator singletons (GetIt)
If you use GetIt to register services that hold user state, reset them on logout.
```dart
// Option A: Reset specific registrations
getIt.resetLazySingleton<UserProfileService>();
getIt.resetLazySingleton<TransactionRepository>();

// Option B: If you scope user services to a named scope
// (this is the cleaner approach), drop the whole scope at once
await getIt.dropScope('userSession');
```
7. Invalidate user-scoped providers (Riverpod)
If you use Riverpod, invalidate any provider that holds user-specific data.
```dart
// Invalidate individual providers
ref.invalidate(userProfileProvider);
ref.invalidate(transactionsProvider);
ref.invalidate(notificationsProvider);

// Or, if you use a userSession provider that others depend on,
// invalidating it will cascade
ref.invalidate(userSessionProvider);
```
Putting it together
```dart
Future<void> logout(BuildContext context) async {
  // Server-side invalidation (best-effort)
  await _invalidateServerSession();

  // Clear persisted tokens
  await _clearTokens();

  // Dispose state management
  context.read<AccountBloc>().close();
  context.read<TransactionBloc>().close();

  // Clear caches
  userProfileService.clearCache();
  PaintingBinding.instance.imageCache.clear();
  PaintingBinding.instance.imageCache.clearLiveImages();

  // Reset navigation
  if (context.mounted) {
    Navigator.of(context).pushAndRemoveUntil(
      MaterialPageRoute(builder: (_) => const LoginScreen()),
      (route) => false,
    );
  }
}
```
The key insight is that each step addresses a different vector. Clearing tokens prevents future authentication. Disposing BLoCs removes sensitive data from the heap. Clearing caches removes data from in-memory stores that the GC does not manage. Resetting navigation prevents visual access to stale data. Missing any one of these leaves a gap.
Design patterns that minimise the risk
The secure logout checklist is necessary, but it is reactive — you are cleaning up after the fact. Better to structure your app so that sensitive data has a short, well-defined lifecycle by design.
Scope user data to the auth session
The single most effective pattern is to ensure that everything user-scoped is created when the user logs in and destroyed when the user logs out, automatically.
In practice, this means placing your user-scoped providers, BLoCs, and services inside a widget subtree that only exists when the user is authenticated.
```dart
class AppRoot extends StatelessWidget {
  const AppRoot({super.key});

  @override
  Widget build(BuildContext context) {
    return BlocBuilder<AuthBloc, AuthState>(
      builder: (context, state) {
        if (state is Authenticated) {
          // This entire subtree — and all its BlocProviders — is
          // created on login and destroyed on logout
          return MultiBlocProvider(
            providers: [
              BlocProvider(create: (_) => AccountBloc(state.userId)),
              BlocProvider(create: (_) => TransactionBloc(state.userId)),
              BlocProvider(create: (_) => ProfileBloc(state.userId)),
            ],
            child: const AuthenticatedApp(),
          );
        }
        return const LoginScreen();
      },
    );
  }
}
```
When AuthBloc emits an unauthenticated state, Flutter tears down the MultiBlocProvider and its children. Each BlocProvider calls close() on its BLoC automatically. No manual cleanup needed. No BLoCs surviving logout. The sensitive data's lifetime is tied to the widget subtree's lifetime, which is tied to the auth state.
This is the pattern that prevents the opening scenario of this post. The AccountBloc cannot outlive the authenticated session because it does not exist outside of it.
Do not cache sensitive data in singletons
A singleton lives for the entire lifetime of the app. If it caches user data, that data persists across login/logout cycles. This is convenient for performance but problematic for security.
Instead, use session-scoped containers. Create a UserSession object on login that holds or provides access to user-specific services. Destroy it on logout. If you need a singleton for infrastructure (HTTP client, analytics), keep it. But user data should not live in infrastructure singletons.
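A minimal sketch of the idea (the class is hypothetical; in practice it would hold your repositories and services rather than a raw map):

```dart
/// Hypothetical session-scoped container. Created on login, disposed on
/// logout; everything user-scoped hangs off it, so dropping the session
/// drops the data with it.
class UserSession {
  UserSession(this.userId);

  final String userId;
  final Map<String, Object?> _cache = {};

  void put(String key, Object? value) => _cache[key] = value;
  Object? get(String key) => _cache[key];

  /// Called on logout: clear internal state so nothing user-scoped
  /// remains reachable through this object.
  void dispose() => _cache.clear();
}
```

Infrastructure singletons (HTTP client, analytics) stay outside the session; anything holding user data lives inside it and dies with it.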
Clear password fields immediately after use
A TextEditingController attached to a password field will hold the password string for as long as the controller exists. Clear it the moment you have extracted the value.
void _handleLogin() {
final email = _emailController.text;
final password = _passwordController.text;
// Clear the password from the controller immediately
_passwordController.clear();
// Now submit — the password is in a local variable that will
// go out of scope when this method returns
_authService.login(email: email, password: password);
}The password still exists as a Dart string in the local variable password, and it will persist until the GC collects it. But the window is much shorter than leaving it in a TextEditingController that lives for the lifetime of the screen.
Do not log sensitive data
This one is simple but consistently violated. During development, it is natural to add print statements or logger calls to debug authentication flows:
```dart
// Do not do this
print('User data: ${user.toJson()}');
log('Token refreshed: $newAccessToken');
debugPrint('Login response: ${response.body}');
```
On Android, print writes to the system log, which is readable by any app with the READ_LOGS permission on older Android versions, and by adb logcat on any device connected to a computer. On iOS, print writes to the unified logging system, accessible via Console.app.
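Where you genuinely need to log structured data during development, one option is to redact known-sensitive keys before serialising. A minimal sketch; the helper and its key list are illustrative and would need to match your own payloads:

```dart
// Hypothetical redaction helper: replaces values of known-sensitive keys
// before a map is logged or serialised. Extend the key set to match the
// fields your own models actually carry.
const Set<String> _sensitiveKeys = {
  'password',
  'token',
  'accessToken',
  'refreshToken',
};

Map<String, Object?> redactForLogging(Map<String, Object?> json) => {
  for (final entry in json.entries)
    entry.key:
        _sensitiveKeys.contains(entry.key) ? '<redacted>' : entry.value,
};
```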
Even if you remove these before release, they persist in debug builds that testers and stakeholders install. Use a logger that respects build modes and never logs sensitive data even in debug:
```dart
// Safe: log the event, not the data
logger.d('Login successful for user ID: ${user.id}');

// Unsafe: logs the entire user object including tokens
logger.d('Login successful: ${user.toJson()}');
```
The proportionate assessment
Memory security sits on a spectrum, and where your app falls on that spectrum determines how much effort is warranted.
For most apps
Dispose your controllers. Cancel your stream subscriptions. Close your BLoCs on logout. Clear your caches. Do not log sensitive data. Scope user data to the authenticated session.
This covers the vast majority of the risk. These are not exotic security measures — they are standard Flutter development practices that happen to have security implications. If you are already following them for performance and correctness reasons, you are already doing most of what memory security requires.
For fintech, health, and crypto apps
Add the following on top of the baseline:
- Use `Uint8List` for key material and zero it after use. Do not store encryption keys or private keys as Dart strings.
- Run heap snapshot analysis as part of your security testing process, not just when you notice performance problems. Verify that no sensitive model objects survive logout.
- Consider the swap and crash dump vectors. Configure your crash reporting to strip or redact sensitive fields. On Android, set `android:allowBackup="false"` (or a restrictive `android:fullBackupContent` rule) in the manifest to limit what app data can be backed up off the device.
- Audit third-party packages for their memory behaviour. A package that caches data internally may retain sensitive information longer than you expect.
The underlying principle
Memory security is where engineering discipline meets security awareness. The reassuring thing is that the fix is almost always "do the thing you were supposed to do anyway." Dispose controllers. Cancel subscriptions. Clear caches. Scope data to sessions. Do not log secrets.
The app that does these things for performance and correctness reasons is, without additional effort, an app that handles memory security well. The gap exists when developers skip cleanup because it is not visibly broken — when the app works fine despite leaked BLoCs and uncancelled streams. It works fine until someone with a rooted device and Frida decides to check what is in memory after logout.
What comes next
This post focused on memory as a passive attack surface — data that lingers longer than it should. The next post in this series will cover runtime integrity: detecting whether your app's code has been modified, whether a debugger is attached, and whether someone is actively tampering with your app while it runs. Memory security is about what an attacker can observe. Runtime integrity is about what they can change.