Your app needs to trim a video before upload. Or compress it to save bandwidth. Or extract a thumbnail at the 3-second mark. Or transcode from HEVC to H.264 because your server doesn't speak HEVC.
You look at the Flutter ecosystem. There are packages. Most of them wrap FFmpeg — because there's no real alternative. FFmpeg is 20+ years of media engineering, handles every codec and container format that exists, and is the engine behind VLC, YouTube's ingest pipeline, and about half the video infrastructure on the internet.
The question isn't whether to use FFmpeg. It's how to get it into your Flutter app without the build configuration driving you insane.
The options
There are two mainstream approaches. One important caveat before we start: arthenica archived the `ffmpeg-kit` project in January 2025. The original packages (ffmpeg_kit_flutter_*) are still on pub.dev but receive no updates. Community forks have emerged — ffmpeg_kit_flutter_new, ffmpeg_kit_flutter_community and others — which pick up the maintenance burden and ship updated FFmpeg versions. The API surface is identical, so code written against the original works against the forks.
1. A maintained ffmpeg-kit fork (recommended for most projects)
These packages bundle prebuilt FFmpeg binaries for all platforms. You install, call, done. Pick an actively maintained fork:
dependencies:
  # Check pub.dev for current recommended fork — the ecosystem is still settling
  # after arthenica/ffmpeg-kit was archived.
  ffmpeg_kit_flutter_new: ^x.y.z # example — verify before using

Multiple variants typically exist with different codec support:
- `*_min`: Smallest binary, basic codecs
- `*_full`: Most codecs, LGPL license
- `*_full_gpl`: Everything including x264/x265, GPL license
// If you're using a fork, the package name in the import path changes
// accordingly (e.g. package:ffmpeg_kit_flutter_new/...).
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
Future<void> trimVideo(String inputPath, String outputPath) async {
final session = await FFmpegKit.execute(
'-i "$inputPath" -ss 00:00:05 -t 00:00:10 -c copy "$outputPath"'
);
final returnCode = await session.getReturnCode();
if (ReturnCode.isSuccess(returnCode)) {
print('Trim complete');
} else {
final logs = await session.getLogsAsString();
print('FFmpeg failed: $logs');
}
}

This isn't FFI in the traditional sense — ffmpeg_kit uses platform channels internally. But it's the pragmatic choice for most apps. The FFmpeg binary is prebuilt, tested, and packaged correctly for every platform. If no fork meets your needs, the direct FFI path below is the fallback — a larger investment but one you fully control.
2. Direct FFI (for custom pipelines)
If you need programmatic control over FFmpeg's C API — not just running commands, but feeding frames into encoders, decoding specific streams, or building custom processing pipelines — you need the actual libraries: libavcodec, libavformat, libavutil, libswscale.
This is significantly more complex. Let's walk through it.
Building FFmpeg for mobile
FFmpeg doesn't ship prebuilt binaries for mobile. You compile it yourself, or you use someone else's build scripts.
Android
The best approach is using ffmpeg-kit's build system, which handles the cross-compilation:
# Clone the build scripts
git clone https://github.com/arthenica/ffmpeg-kit.git
cd ffmpeg-kit
# Build for Android (arm64 + x86_64 for emulator)
./android.sh --enable-gpl --enable-x264

This produces shared libraries in prebuilt/android-arm64/ffmpeg/lib/:
- libavcodec.so
- libavformat.so
- libavutil.so
- libswscale.so
- libswresample.so
Place them in your Android project:
android/app/src/main/jniLibs/
├── arm64-v8a/
│ ├── libavcodec.so
│ ├── libavformat.so
│ ├── libavutil.so
│ ├── libswscale.so
│ └── libswresample.so
└── x86_64/ # For emulator
    ├── ...

iOS

./ios.sh --enable-gpl --enable-x264

This produces .xcframework bundles. Add them to your Xcode project's "Frameworks, Libraries, and Embedded Content" section, set to "Embed & Sign."
Dart FFI bindings for libavformat/libavcodec
FFmpeg's C API is enormous. You won't bind the whole thing. Bind what you need. Here's a minimal set for opening a video, reading stream info, and extracting a frame:
import 'dart:ffi';
import 'dart:io' show Platform; // needed for the Platform checks below
import 'package:ffi/ffi.dart';
// Load the libraries
final DynamicLibrary _avformat = Platform.isAndroid
? DynamicLibrary.open('libavformat.so')
: DynamicLibrary.process();
final DynamicLibrary _avcodec = Platform.isAndroid
? DynamicLibrary.open('libavcodec.so')
: DynamicLibrary.process();
final DynamicLibrary _avutil = Platform.isAndroid
? DynamicLibrary.open('libavutil.so')
: DynamicLibrary.process();
// Register all muxers/demuxers/protocols
typedef _AvRegisterAllC = Void Function();
typedef _AvRegisterAllDart = void Function();
// Note: av_register_all is deprecated in FFmpeg 4.0+ — registration is automatic.
// Only needed if you're using FFmpeg 3.x.
// Open an input file
typedef _AvformatOpenInputC = Int32 Function(
Pointer<Pointer<Void>> ps, // AVFormatContext**
Pointer<Utf8> url,
Pointer<Void> fmt, // AVInputFormat* (null = auto-detect)
Pointer<Pointer<Void>> options,
);
typedef _AvformatOpenInputDart = int Function(
Pointer<Pointer<Void>> ps,
Pointer<Utf8> url,
Pointer<Void> fmt,
Pointer<Pointer<Void>> options,
);
final avformatOpenInput = _avformat.lookupFunction<
_AvformatOpenInputC, _AvformatOpenInputDart>('avformat_open_input');
// Find stream info
typedef _AvformatFindStreamInfoC = Int32 Function(
Pointer<Void> ic, Pointer<Pointer<Void>> options);
typedef _AvformatFindStreamInfoDart = int Function(
Pointer<Void> ic, Pointer<Pointer<Void>> options);
final avformatFindStreamInfo = _avformat.lookupFunction<
_AvformatFindStreamInfoC, _AvformatFindStreamInfoDart>(
'avformat_find_stream_info');
// Close
typedef _AvformatCloseInputC = Void Function(Pointer<Pointer<Void>> s);
typedef _AvformatCloseInputDart = void Function(Pointer<Pointer<Void>> s);
final avformatCloseInput = _avformat.lookupFunction<
_AvformatCloseInputC, _AvformatCloseInputDart>('avformat_close_input');

This gets tedious fast. For any serious FFmpeg FFI work, use ffigen to auto-generate bindings from the header files:
# ffigen.yaml
name: FFmpegBindings
description: FFmpeg bindings
output: 'lib/src/ffmpeg_bindings.dart'
headers:
entry-points:
- 'ffmpeg/include/libavformat/avformat.h'
- 'ffmpeg/include/libavcodec/avcodec.h'
- 'ffmpeg/include/libavutil/avutil.h'
include-directives:
- 'ffmpeg/include/**'

dart run ffigen

This generates thousands of lines of bindings. You wrap the ones you need in a clean Dart API.
Practical recipes
Thumbnail extraction (using ffmpeg_kit)
Future<String?> extractThumbnail(String videoPath, {int atSecond = 1}) async {
final outputPath = '${(await getTemporaryDirectory()).path}/thumb_${DateTime.now().millisecondsSinceEpoch}.jpg';
final session = await FFmpegKit.execute(
'-i "$videoPath" -ss $atSecond -vframes 1 -q:v 2 "$outputPath"'
);
if (ReturnCode.isSuccess(await session.getReturnCode())) {
return outputPath;
}
return null;
}

`-ss 1` seeks to 1 second. `-vframes 1` extracts one frame. `-q:v 2` sets JPEG quality (2 = high quality, range is 1-31).
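For long videos, the same extraction runs much faster with input seeking, i.e. `-ss` placed before `-i`, as covered in the errors section below. A sketch with placeholder filenames:

```shell
# Input seeking: jumps near the target timestamp instead of decoding
# the whole file up to it. Much faster for a thumbnail deep in the video.
ffmpeg -ss 00:01:30 -i input.mp4 -vframes 1 -q:v 2 thumb.jpg
```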
Video compression
Future<String?> compressVideo(String inputPath, {int crf = 28}) async {
final outputPath = '${(await getTemporaryDirectory()).path}/compressed_${DateTime.now().millisecondsSinceEpoch}.mp4';
// CRF 28 = good balance of quality/size for mobile uploads
// -preset fast = reasonable encoding speed on mobile CPUs
// -movflags +faststart = puts metadata at the start for streaming
final session = await FFmpegKit.execute(
'-i "$inputPath" '
'-c:v libx264 -crf $crf -preset fast '
'-c:a aac -b:a 128k '
'-movflags +faststart '
'"$outputPath"'
);
if (ReturnCode.isSuccess(await session.getReturnCode())) {
return outputPath;
}
return null;
}

Video trimming (without re-encoding)
Future<String?> trimVideo(
String inputPath, {
required Duration start,
required Duration duration,
}) async {
final outputPath = '${(await getTemporaryDirectory()).path}/trimmed_${DateTime.now().millisecondsSinceEpoch}.mp4';
final startStr = _formatDuration(start);
final durationStr = _formatDuration(duration);
// -c copy = stream copy, no re-encoding. Fast but only cuts on keyframes.
final session = await FFmpegKit.execute(
'-i "$inputPath" -ss $startStr -t $durationStr -c copy "$outputPath"'
);
if (ReturnCode.isSuccess(await session.getReturnCode())) {
return outputPath;
}
return null;
}
String _formatDuration(Duration d) {
final hours = d.inHours.toString().padLeft(2, '0');
final minutes = (d.inMinutes % 60).toString().padLeft(2, '0');
final seconds = (d.inSeconds % 60).toString().padLeft(2, '0');
return '$hours:$minutes:$seconds';
}

`-c copy` is the key flag. It copies the video and audio streams without re-encoding — instant on any device. The tradeoff: the trim point must land on a keyframe, so the actual start time may be a fraction of a second off.
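To see where the cuts can actually land, you can list the keyframe timestamps with ffprobe. A sketch (`input.mp4` is a placeholder; on older ffprobe builds the entry is named `pkt_pts_time` rather than `pts_time`):

```shell
# Print the timestamp of every keyframe in the first video stream.
# -skip_frame nokey makes the decoder skip all non-key frames.
ffprobe -v error -select_streams v:0 \
  -skip_frame nokey -show_entries frame=pts_time \
  -of csv=p=0 input.mp4
```

If the listed keyframes are several seconds apart, a `-c copy` trim can start noticeably earlier or later than requested.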
Progress tracking
Future<void> compressWithProgress(
String inputPath,
String outputPath,
void Function(double progress) onProgress,
) async {
// First, get the video duration
final probeSession = await FFprobeKit.getMediaInformation(inputPath);
final info = probeSession.getMediaInformation();
// getDuration() returns the duration in seconds, as a string
final durationSec = double.tryParse(info?.getDuration() ?? '0') ?? 0;
if (durationSec <= 0) return;
// Run FFmpeg with statistics callback
await FFmpegKit.executeAsync(
'-i "$inputPath" -c:v libx264 -crf 28 -preset fast "$outputPath"',
(session) async {
// Complete callback
final code = await session.getReturnCode();
if (ReturnCode.isSuccess(code)) {
onProgress(1.0);
}
},
null, // Log callback
(statistics) {
// Statistics callback — called periodically during encoding
final timeMs = statistics.getTime(); // Current position in ms
if (timeMs > 0) {
final percent = (timeMs / (durationSec * 1000)).clamp(0.0, 1.0);
onProgress(percent);
}
},
);
}

Common errors
APK size explodes after adding FFmpeg
Cause: FFmpeg with full codec support adds 15-30MB per ABI. With arm64 + x86_64 + armeabi-v7a, you're looking at 50-90MB added to the APK.
Fix:
- Use app bundles (`.aab`) so Google Play delivers only the relevant ABI
- Choose a smaller FFmpeg variant (`ffmpeg_kit_flutter_min` instead of `full_gpl`)
- If you only need basic operations (trim, compress with H.264), the min variant is sufficient
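The corresponding build commands, for reference. `--split-per-abi` is the option for distribution channels that take raw APKs; it produces one APK per architecture instead of one fat APK:

```shell
# Preferred for Play Store: an app bundle, so each device downloads
# only the FFmpeg libraries for its own ABI
flutter build appbundle --release

# For direct APK distribution: one APK per ABI instead of a fat APK
flutter build apk --release --split-per-abi
```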
"FFmpeg returned non-zero exit code" with no useful error
Cause: The error is in the logs, not the return code.
Fix: Always read the logs:
final logs = await session.getLogsAsString();
print(logs); // The actual error message is here

"-ss before -i" vs "-ss after -i"
Cause: Position of -ss changes behavior.
- `-ss 5 -i input.mp4` — seeks in the input (fast, uses keyframes)
- `-i input.mp4 -ss 5` — decodes from the start and discards frames until 5s (slow, frame-accurate)
Fix: Put `-ss` before `-i` for speed. Put it after `-i` when you need frame-accurate seeking and are re-encoding anyway.
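The two placements side by side (filenames are placeholders):

```shell
# Input seeking: -ss before -i. Fast; with -c copy the cut snaps
# to the nearest keyframe.
ffmpeg -ss 5 -i input.mp4 -t 10 -c copy fast_cut.mp4

# Output seeking: -ss after -i. Decodes and discards everything up to 5s,
# so it's frame-accurate — worthwhile only when re-encoding anyway.
ffmpeg -i input.mp4 -ss 5 -t 10 -c:v libx264 -c:a aac exact_cut.mp4
```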
HEVC/H.265 playback fails on older Android
Cause: You transcoded to HEVC but the target device doesn't have hardware HEVC decoding (common on pre-2016 Android devices).
Fix: Use H.264 (libx264) as the output codec for maximum compatibility. Only use HEVC if you control the playback environment.
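For example, a one-liner transcoding an HEVC recording to H.264 while leaving the audio untouched (filenames are placeholders):

```shell
# HEVC in, H.264 out; the AAC audio stream is copied as-is
ffmpeg -i hevc_input.mov -c:v libx264 -crf 23 -preset fast -c:a copy h264_output.mp4
```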
iOS build fails: "Undefined symbols for architecture arm64"
Cause: The FFmpeg framework isn't linked properly. Usually a missing "Embed & Sign" setting, or the framework search paths are wrong.
Fix: In Xcode:
- Go to target → General → Frameworks, Libraries, and Embedded Content
- Ensure all FFmpeg frameworks are listed and set to "Embed & Sign"
- Check Build Settings → Framework Search Paths includes the directory containing the `.xcframework` files
Encoding is extremely slow on device
Cause: Video encoding is CPU-intensive. A 1080p re-encode on a mid-range phone takes roughly real-time (1 minute of video = 1 minute of encoding). That's with software encoding.
Fix:
- Use `-preset ultrafast` or `-preset veryfast` for faster (but larger) output
- Lower the resolution: `-vf scale=720:-2` (720p, maintain aspect ratio)
- Use stream copy (`-c copy`) when re-encoding isn't needed
- Run FFmpeg on a background isolate to keep the UI responsive
- Consider hardware encoding (`-c:v h264_mediacodec` on Android) — though support varies by device
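Several of these tips combined into one command (filenames are placeholders; `h264_mediacodec` availability varies by device, so a production app should fall back to the software path when the first command fails):

```shell
# Fast path: downscale to width 720, hardware H.264 encoder where available.
# MediaCodec encoders take a target bitrate (-b:v) rather than CRF.
ffmpeg -i input.mp4 -vf scale=720:-2 -c:v h264_mediacodec -b:v 2M -c:a copy output.mp4

# Software fallback: veryfast preset keeps encode times tolerable on mobile CPUs
ffmpeg -i input.mp4 -vf scale=720:-2 -c:v libx264 -preset veryfast -crf 28 -c:a copy output.mp4
```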
When to use direct FFI vs ffmpeg_kit
Use `ffmpeg_kit_flutter` when: You're running FFmpeg commands (trim, compress, extract, convert). This covers 95% of video processing use cases in mobile apps.
Use direct FFI when: You need to process individual frames in real-time (camera filters, custom video effects), build a non-standard pipeline (feed frames from one source and audio from another), or integrate FFmpeg into a processing chain that also involves other C libraries.
For most Flutter apps, ffmpeg_kit_flutter is the right choice. Direct FFI into libavcodec is a serious undertaking — the API surface is large, the memory management is complex, and the edge cases are numerous. Only go there if ffmpeg_kit can't do what you need.
This is Post 13 of the FFI series. Next: Computer Vision With OpenCV.