Why Flutter Can't Easily Do Liquid Glass
Apple’s new Liquid Glass UI is genuinely impressive. Panels refract what’s behind them. Content bleeds through with chromatic aberration. Depth, lighting, and motion all feed into the material in real time. It’s a physically-based glass simulation running on the GPU — not a static blur with rounded corners.
And Flutter can’t do it. Not easily, not well, and probably not without help from the platform.
What Liquid Glass actually does
Every frame, Liquid Glass samples the entire rendered scene behind the glass element — live composited output, not a screenshot. That sampled output runs through a physically-based shader that simulates refraction, chromatic dispersion, and specular highlights. Depth and motion modulate the result: tilt the device, the glass shifts. Light the room differently, the material responds. The shader reads directly from the framebuffer. No copies. No texture uploads. No latency.
In UIKit terms, this is UIGlassEffect backed by CALayer compositing, Metal shaders, and private rendering server integration. The compositor knows the exact pixel output of every layer behind the glass.
How Flutter renders
Flutter doesn’t use UIKit views. It paints its entire UI onto a single texture and hands it to the platform compositor as one flat surface.
```mermaid
block-beta
  columns 1
  block:Native:1
    columns 1
    a["Native iOS Rendering"]
    b["Live compositing
Per-layer blur/effects"]
    c["Compositor has access
to individual layers"]
  end
  block:Flutter:1
    columns 1
    d["Flutter Rendering"]
    e["Widget tree painted
to single texture"]
    f["Compositor sees
one flat surface"]
  end
```
To the iOS compositor, a Flutter app is one opaque rectangle. It doesn’t know there’s a button behind your pretend-glass panel. It doesn’t know text is scrolling underneath. It sees a texture blob. That’s all the information Liquid Glass has to work with.
The BackdropFilter hack
Flutter’s BackdropFilter applies a blur to the framebuffer pixels behind a widget. It gives you a basic frosted-glass effect:
```dart
// Requires: import 'dart:ui' show ImageFilter;
ClipRect(
  child: BackdropFilter(
    filter: ImageFilter.blur(sigmaX: 20, sigmaY: 20),
    child: Container(
      decoration: BoxDecoration(
        color: Colors.white.withOpacity(0.1),
        borderRadius: BorderRadius.circular(24),
      ),
    ),
  ),
);
```
Fine for iOS 7-era UIBlurEffect. Not fine for Liquid Glass.
Liquid Glass needs refraction that varies with what’s behind the glass: light bends differently through a text layer than through an image layer. BackdropFilter sees undifferentiated pixels. No scene graph. No per-surface depth to compute refraction angles.
It needs chromatic aberration — separating and shifting RGB channels by different amounts based on incidence angle. A Gaussian blur treats all channels the same. You’d need a custom shader, and that shader still only sees a flat pixel buffer.
It needs motion data piped directly into the shader at draw time. Flutter can read the accelerometer through platform channels, but shoving that data through a native → Dart → shader relay at 120Hz is a latency disaster.
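For concreteness, here is roughly what the Dart end of that relay looks like. A minimal sketch; the example/motion channel name is a hypothetical stand-in for whatever the native side would register:

```dart
import 'package:flutter/services.dart';

// Hypothetical channel; the iOS side would stream accelerometer samples.
const EventChannel motionChannel = EventChannel('example/motion');

void listenForTilt(void Function(double tiltX) onTilt) {
  // Hop 1: sensor hardware -> native event sink (platform thread).
  // Hop 2: platform-channel serialization -> Dart isolate.
  motionChannel.receiveBroadcastStream().listen((dynamic sample) {
    final tiltX = (sample as List<dynamic>).first as double;
    // Hop 3: Dart callback -> setState -> shader.setFloat on the next frame.
    onTilt(tiltX);
  });
}
```

Every hop crosses a thread or serialization boundary, so the sample is already stale by the time it could reach a uniform.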
Impeller won’t save you
Impeller, Flutter’s newer rendering engine, was meant to close the gap with native. Precompiled shaders, lower-level GPU access. It gives you FragmentProgram for custom fragment shaders:
```dart
// FragmentProgram.fromAsset loads a shader declared under the
// `shaders:` section of pubspec.yaml.
final program = await FragmentProgram.fromAsset(
  'shaders/liquid_glass.frag',
);
```
But your shader only gets whatever pixels Flutter passes in. Not the live composited output of the render tree behind your widget. That data is in the platform compositor. You’d need to render the entire scene to a separate texture, feed it to the shader as a sampler, and hope the phone GPU keeps up at 120Hz. It won’t.
Some developers have tried Scene + Texture widgets to capture render output and feed it back into shaders — effectively building a mini compositor inside Flutter. You’re reinventing what Apple’s compositor already does, doing it worse, with higher latency.
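Here is a sketch of one variant of that hack, using RepaintBoundary instead of a Texture widget to capture the subtree; the shaders/glass.frag asset is a hypothetical stand-in and would need a shaders: entry in pubspec.yaml:

```dart
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
import 'package:flutter/rendering.dart';

// The "background" subtree is wrapped elsewhere in
// RepaintBoundary(key: boundaryKey, child: ...).
final GlobalKey boundaryKey = GlobalKey();

// Rasterize the subtree to an image: the extra copy Apple's
// compositor never has to make.
Future<ui.Image> captureBackdrop() async {
  final boundary = boundaryKey.currentContext!.findRenderObject()!
      as RenderRepaintBoundary;
  return boundary.toImage(pixelRatio: 2.0);
}

// Feed the captured frame to a custom fragment shader as a sampler.
Future<Paint> glassPaint(ui.Image backdrop) async {
  final program = await ui.FragmentProgram.fromAsset('shaders/glass.frag');
  final shader = program.fragmentShader()
    ..setImageSampler(0, backdrop) // already at least one frame stale
    ..setFloat(0, backdrop.width.toDouble())
    ..setFloat(1, backdrop.height.toDouble());
  return Paint()..shader = shader;
}
```

Run that every frame and you pay a rasterization, a texture copy, and a shader pass before your actual UI draws.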
Platform views: the escape hatch that leaks
Flutter’s UiKitView embeds native UIKit views inside the widget tree. In theory, you render the background content natively and overlay Liquid Glass on top. In practice, the platform view is a separate texture that Flutter punches a hole in its own surface to display. Getting the effect to work across the Flutter-native boundary means rendering the native background to a Metal texture, passing the texture handle through a platform channel to Flutter, feeding it into a custom Impeller shader every frame, and synchronizing two rendering pipelines. At 120Hz, the texture upload alone eats your frame budget.
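The Dart half of that setup is deceptively small. A sketch, assuming a hypothetical glass-background view type that a native view factory would have to register:

```dart
import 'package:flutter/material.dart';

// Flutter punches a hole in its own texture so the native view shows
// through; everything drawn above it is back inside Flutter's flat surface.
Widget nativeBackdrop() => const UiKitView(
      viewType: 'glass-background', // hypothetical native view factory id
    );
```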
What it would take
- Compositor hooks: Flutter would need access to live CALayer output, not just a texture surface. That's deep platform embedding.
- A scene-graph-aware compositing pipeline in Impeller, with depth buffers and multi-pass rendering. Or skip Impeller and call Metal through FFI.
- Motion data at draw time: CADisplayLink-level integration, not async platform channels.
And the shaders themselves are proprietary Apple IP in Metal Shading Language. Replicating them in GLSL for Impeller’s pipeline would be reverse engineering.
The trade-off
Flutter owns its rendering pipeline. That ownership is why it can render consistently across platforms — but it also means you can’t borrow Apple’s. When Apple builds an effect directly into their compositor, Flutter can’t reach it. The Flutter team has an open issue tracking Liquid Glass support and has explicitly paused Cupertino design updates while they restructure how design libraries integrate with the framework. Even they aren’t shipping this yet.
For apps that need Liquid Glass today, the options are: build the glass-heavy screens in SwiftUI, or use a well-tuned BackdropFilter with gradient overlays and subtle animation — convincing frosted glass within Flutter’s limits, but not Liquid Glass.
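For the second option, a sketch of a tuned frosted panel: blur for the frost, a diagonal gradient standing in for a specular sheen, and a hairline border that reads as a glass edge. All standard Flutter APIs; the numbers are starting points to tune:

```dart
import 'dart:ui' show ImageFilter;
import 'package:flutter/material.dart';

Widget frostedPanel({required Widget child}) {
  return ClipRRect(
    borderRadius: BorderRadius.circular(24),
    child: BackdropFilter(
      filter: ImageFilter.blur(sigmaX: 20, sigmaY: 20),
      child: DecoratedBox(
        decoration: BoxDecoration(
          borderRadius: BorderRadius.circular(24),
          // Diagonal sheen approximates a specular highlight.
          gradient: LinearGradient(
            begin: Alignment.topLeft,
            end: Alignment.bottomRight,
            colors: [
              Colors.white.withOpacity(0.25),
              Colors.white.withOpacity(0.05),
            ],
          ),
          // Hairline border reads as a glass edge.
          border: Border.all(color: Colors.white.withOpacity(0.3)),
        ),
        child: child,
      ),
    ),
  );
}
```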
Flutter renders to a texture. Liquid Glass needs the live compositor layer tree. No Dart code bridges that gap.
References
- Adopting Liquid Glass — Apple
- BackdropFilter — Flutter
- Impeller rendering engine
- FragmentProgram — Flutter
- UiKitView — Flutter