1 of 17

3D Physically Based Rendering with Flutter

Nick Fisher

github.com/nmfisher

twitter.com/NickFisherAU

mstdn.social/@nickfisherau

nick-fisher.com

2 of 17

Who am I?

Independent developer (freelance + self-published apps)

Focus on ML, ASR, TTS, 3D

Mostly Dart / C++ / Python / PyTorch

Not a rendering engineer!

Follow me on Twitter / GitHub to see my daily work

https://twitter.com/NickFisherAU

https://github.com/nmfisher

3 of 17

What is 3D Rendering?

What do we need to represent a coloured cube in 3D?

  • Shape
    • 8 vertices + 6 faces (12 triangles)
  • Orientation
    • Location/rotation in world space
  • ‘Base color’ (before lighting)
  • Camera
    • Position/orientation determines how we are looking at the cube
    • Orthographic/perspective projection, etc.
  • Lighting model + light source + shader + normals

Renderers take all of that information and generate (rasterize) pixels, either directly into the framebuffer or into some secondary render target.
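
To make the first item (shape) concrete, here is a minimal sketch in plain Dart (no rendering library involved; the names and layout are illustrative only) of the vertex and index data for a unit cube:

// Unit cube centered at the origin: 8 corner positions, 3 coordinates (x, y, z) each.
const cubeVertices = <double>[
  -0.5, -0.5, -0.5,   0.5, -0.5, -0.5,   0.5, 0.5, -0.5,   -0.5, 0.5, -0.5, // back corners
  -0.5, -0.5,  0.5,   0.5, -0.5,  0.5,   0.5, 0.5,  0.5,   -0.5, 0.5,  0.5, // front corners
];

// 6 faces x 2 triangles x 3 indices = 36 indices into the vertex list above
// (winding order not tuned for back-face culling here).
const cubeIndices = <int>[
  0, 1, 2,  2, 3, 0, // back
  4, 5, 6,  6, 7, 4, // front
  0, 3, 7,  7, 4, 0, // left
  1, 2, 6,  6, 5, 1, // right
  3, 2, 6,  6, 7, 3, // top
  0, 1, 5,  5, 4, 0, // bottom
];

Orientation, base color, camera and lights are supplied as separate inputs alongside this geometry.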

4 of 17

Hardware/software rendering?

Renderers can be implemented purely on the CPU (software rendering).

This was the standard for 3D games up to the mid-90s (e.g. Doom/Quake), as hardware acceleration wasn't yet common in desktops.

As the games industry took off, meshes became more complex, realism expectations grew, display resolutions increased, textures got larger, etc.

5 of 17

Hardware/software rendering?

If rendering in real time, all of this rendering work (plus game logic) needs to happen 60 times per second.

A general-purpose CPU is not well suited to this, so hardware companies started selling cards with special-purpose chips (GPUs).

It was too cumbersome for developers to write rendering code for each individual manufacturer/model.

Various APIs were created (OpenGL/DirectX) so that developers could write against one API, and manufacturers could release drivers that (hopefully) translated those API calls into the correct hardware instructions.

6 of 17

“Physically Based” Rendering

Lighting is critical to the aesthetic (both realistic and non-realistic).

But historically, computers were slow, so lighting was cheaply approximated (Phong shading, lightmaps, texture baking, etc). Dynamic (runtime) lighting was too expensive.

Computers are now faster, so we can use a lighting model more closely based on reality, even in real time on mobile devices.

Metallic vs non-metallic (dielectric) surfaces, rough vs smooth surfaces, etc. all reflect light differently.
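
As a rough illustration (plain Dart, generic names; this is not Filament's actual material API), a metallic/roughness PBR workflow boils those surface properties down to a handful of per-material inputs:

// Illustrative only: the core inputs of a metallic/roughness PBR workflow.
class PbrSurface {
  final List<double> baseColor; // linear RGBA, before any lighting is applied
  final double metallic;        // 0.0 = dielectric (plastic, wood), 1.0 = metal
  final double roughness;       // 0.0 = mirror-smooth, 1.0 = fully diffuse

  const PbrSurface({
    this.baseColor = const [1.0, 1.0, 1.0, 1.0],
    this.metallic = 0.0,
    this.roughness = 0.5,
  });
}

// The same light looks very different on these two surfaces:
const brushedMetal = PbrSurface(baseColor: [0.91, 0.92, 0.92, 1.0], metallic: 1.0, roughness: 0.4);
const mattePlastic = PbrSurface(baseColor: [0.20, 0.50, 0.80, 1.0], metallic: 0.0, roughness: 0.8);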

7 of 17

Flutter + Filament

Filament is an open-source library from Google for physically-based rendering.

Written in C++, cross-platform, optimized for mobile (core contributions by Android rendering team).

Why wrap it inside Flutter?

  • C++ is unwieldy for many app developers
  • No UI support (widgets, text rendering, layout, gesture detection, asset loading, etc)
  • Cannot write a single codebase and build for multiple platforms
  • No cross-platform plugin support (e.g. audio)

8 of 17

How to integrate in Flutter?

Filament requires some kind of hardware-accelerated rendering “surface”, e.g.:

  1. Windows HWND (WGL)
  2. Linux X11 window (GLX)
  3. Android View (GLES)
  4. iOS (GLES/Metal)
  5. Manually created off-screen buffer

Flutter also renders into its own hardware-accelerated surface (except on web with the HTML renderer).

How to compose the two? Multiple options!

9 of 17

How to integrate in Flutter?

Texture widget

Regular Flutter widget, inserted into the Flutter widget tree (e.g. it can appear above/below other widgets)

Register a texture via native code, pass ID back to Flutter

Can be software/hardware texture

Can also be used to import raw texture feeds (e.g. camera)
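
A minimal sketch of the Dart side (the channel and method names here are assumptions for illustration, not an actual plugin API): native code registers a hardware texture with the engine's texture registry and returns its ID, which is then handed to the Texture widget.

import 'package:flutter/services.dart';
import 'package:flutter/widgets.dart';

// Hypothetical channel: the native side creates the render surface,
// registers it as a texture and returns its ID.
const _channel = MethodChannel('filament_view');

Future<int> createRenderTexture(int width, int height) async {
  final id = await _channel.invokeMethod<int>(
      'createTexture', {'width': width, 'height': height});
  return id!;
}

class FilamentView extends StatelessWidget {
  const FilamentView({super.key, required this.textureId});

  final int textureId;

  @override
  Widget build(BuildContext context) {
    // Composited by Flutter like any other widget, so it can sit
    // above or below the rest of the UI.
    return Texture(textureId: textureId);
  }
}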

PlatformView

Basically just a texture with some accessibility features

Not available on desktop

10 of 17

How to integrate in Flutter?

OS window composition

On most platforms, Flutter renders into a surface that is passed to the system compositor for overlaying other elements (e.g. minimize/maximize buttons, transparency, etc.).

If you have some way to compose two windows on your system, you can create a rendering surface underneath and then simply put the Flutter app over the top with a transparent background!

11 of 17

Which did I choose?

Texture widgets on iOS (CVPixelBuffer), Android (SurfaceTexture) & macOS (CVPixelBuffer/CVMetalTexture)

System composition on Windows (create an HWND, manually handle all window move/resize events; only works on Windows 8+) and on web (overlay the Flutter host above the <canvas>, instantiate a WebGL 2.0 context via C++/Emscripten).

I also tried importing a Texture on Windows; however, Flutter uses DirectX for rendering there, which Filament does not support.

ANGLE can be used to translate GLES calls to DirectX (Chrome uses it for hardware-accelerated rendering on Windows). This worked, but with some visible rendering artifacts; some shader customization work is needed to ensure compatibility.

12 of 17

Calling Filament from Flutter

We now have a rendering surface, but how do we actually execute Filament methods from Flutter?

FFI (Foreign Function Interface):

  • Write core logic in C++, with a glue layer exposed as extern “C”
  • Write a header file to expose C-compatible bindings
  • Use the ffigen package to automatically generate Dart bindings for the C code
  • Manually manage memory on the Dart side:

import 'dart:ffi';
import 'package:ffi/ffi.dart';

final ptr = calloc<Float>(1);       // allocate native memory (calloc from package:ffi)
do_some_stuff_with_filament(ptr);   // call the ffigen-generated binding
calloc.free(ptr);                   // the Dart side must free it

13 of 17

Calling Filament from Flutter

This works perfectly... except on Flutter Web, where FFI is still unsupported. I needed a lot of unseemly hacks to get it working (and some browsers/drivers still choke).

I'm hoping the Dart team will land these updates later in the year.

14 of 17

Why not platform channels?

  • That was my original approach!
  • Problem is, you end up writing 4x more code:
    • Dart (Flutter)
    • C (Linux)
    • C++ (Windows)
    • Swift (iOS/macOS)
    • Kotlin (Android)
  • Every time you add a single Filament binding call, you have to write 4 different method call handlers, all in different languages (sketched below).
  • Very painful, a lot of duplication
  • Also slower (marshalling overhead converting between Dart and Swift/Kotlin/etc. types)
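
For example, adding a single "set camera position" binding means a Dart-side call like the sketch below (channel and method names are hypothetical), plus a matching method-call handler on every platform:

import 'package:flutter/services.dart';

const _channel = MethodChannel('app.filament/viewer');

// Dart side only. Each platform additionally needs its own handler for
// 'setCameraPosition' - in Kotlin (Android), Swift (iOS/macOS), C++ (Windows)
// and C (Linux) - that unpacks the arguments and forwards them to Filament.
Future<void> setCameraPosition(double x, double y, double z) =>
    _channel.invokeMethod('setCameraPosition', {'x': x, 'y': y, 'z': z});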

15 of 17

Does this overlap with Impeller?

Skia (the current rendering backend used by the Flutter engine, also used by Chrome) requires runtime shader compilation, which can often cause jank/stutter in animations.

Impeller is Flutter’s new purpose-built rendering engine to address this issue. As part of that process, a flutter_gpu package is being developed to directly expose the GPU shader pipeline so developers can implement their own 3D renderer.

This may expose much of the same functionality, but there's no clear timeline for when it will be available (end of 2024?).

But! Filament also offers:

  • in-depth PBR model + material system
  • animation support
  • glTF/Filamesh/texture loading/etc

16 of 17

Let’s see it in action!

17 of 17

Questions?