Integration of Flutter Apps with Native Features – Our Approach to Adding an Android-Specific Camera Analysis Library

Hello. My name is Osugi, and I’m part of the Toyota Woven City Payment development group.
Our team develops the payment system used in Toyota Woven City, the city being built by Woven by Toyota, covering a wide range of payment-related functions from the backend to the web frontend and mobile applications.
So far, we’ve been using Flutter to develop a mobile app for a Proof of Concept (PoC). In this article, we summarize the trial and error we went through to overcome the challenges we faced when building a new feature: incorporating a camera analysis library that is only available natively on Android/iOS into the PoC app.
Introduction
Integrating native functions into a Flutter app doesn’t just add to the development workload—it also increases maintenance costs, making development more challenging.
In our project, considering the development timeline and available resources, we chose not to integrate native functions directly into the Flutter app. Instead, we developed the PoC app and a separate native app for camera analysis, linking the two to carry out the PoC. After completing the PoC, when we considered merging the Flutter app with the camera analysis app, we found that information on design guidelines and implementation methods for Flutter's native integration features was fragmented, with few systematic guidelines, especially for Android's complex UI configuration.
In this article, we’ll focus on Android and share design principles and practical methods for incorporating native UI into a Flutter app.
Hopefully, this will be helpful for engineers facing similar challenges.
App Overview
For the purposes of this article, we’ve simplified the app developed during the actual PoC. The app follows these specifications:
Specifications
- When you press the start button, the camera preview will be displayed.
- The camera analysis function runs on the preview image, and the analysis results are sent as notifications.
This simplified app is the subject of the rest of this article.
Data Integration Between Flutter and Native Android
We implemented data exchange between Flutter and Android native using MethodChannel and EventChannel, enabling camera control from Flutter and analysis result notifications from Android native.
MethodChannel is used for commands such as starting and stopping the camera, while EventChannel is used for sending analysis result notifications.
The sequence diagram below illustrates this process:
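To make the channel split concrete, here is a rough sketch of an Android-side EventChannel.StreamHandler that forwards analysis results to Flutter. The class name, payload format, and main-thread dispatch are illustrative assumptions, not the project's actual code:

```kotlin
import android.os.Handler
import android.os.Looper
import io.flutter.plugin.common.EventChannel

// Hypothetical sketch: a StreamHandler that the camera analyzer can call
// whenever a new analysis result is ready.
class AnalysisStreamHandler : EventChannel.StreamHandler {
    private var eventSink: EventChannel.EventSink? = null

    override fun onListen(arguments: Any?, events: EventChannel.EventSink?) {
        eventSink = events // Flutter started listening
    }

    override fun onCancel(arguments: Any?) {
        eventSink = null // Flutter stopped listening
    }

    // Called from the analysis callback; the EventSink must be invoked on the main thread.
    fun notifyResult(resultJson: String) {
        Handler(Looper.getMainLooper()).post {
            eventSink?.success(resultJson)
        }
    }
}
```

Registering this handler with `eventChannel.setStreamHandler(AnalysisStreamHandler())` would complete the Android side of the notification path.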
Next, let’s look at how to display the Android native camera preview UI on the Flutter side.
How to display native Android UI in a Flutter app
There are three main ways to display native Android UI in a Flutter app:
- Texture widget – Displays an image rendered on an Android native Surface within the Flutter widget tree.
- PlatformView – Embeds, displays, and controls Android native UI inside the Flutter widget tree.
- Intent – Launches a new Activity.
We’ll go over the characteristics of each method and how to implement them.
Texture Widget
The Texture widget displays an image rendered on an Android native Surface within the Flutter widget tree. In other words, it allows Flutter to draw native UI images directly to the GPU.
This approach works well for use cases where latency isn’t a major concern, such as camera previews and video playback. However, for UI animations requiring real-time performance, adjustments must be made on the native side. This means a solid understanding of both Flutter and Android native development is necessary.
Additionally, the Texture widget itself does not detect user interactions such as touch events, so these must be handled on the Flutter side using GestureDetector or similar.
That said, if it aligns with your requirements, it can be implemented relatively easily using the approach shown below.
Implementation Steps
First, obtain a TextureRegistry. For Flutter apps, use FlutterEngine.FlutterRenderer, which implements TextureRegistry. For Flutter plugins, retrieve it from FlutterPluginBinding.
// For Flutter apps
val textureRegistry = this.flutterEngine.renderer
// For Flutter plugin
val textureRegistry = this.flutterPluginBinding.textureRegistry
Next, create a textureEntry (a SurfaceTexture) from the textureRegistry, then set up a SurfaceProvider to provide a Surface to the CameraX Preview instance. Once this is done, you’re all set. This Surface acts as the drawing buffer mentioned earlier.
val textureEntry = textureRegistry.createSurfaceTexture()
val surfaceProvider = Preview.SurfaceProvider { request ->
    val texture = textureEntry?.surfaceTexture()
    texture?.setDefaultBufferSize(
        request.resolution.width,
        request.resolution.height
    )
    val surface = Surface(texture)
    request.provideSurface(surface, cameraExecutor) { }
}
val preview = Preview.Builder().build().apply {
    setSurfaceProvider(surfaceProvider)
}
// To meet the camera analysis requirements mentioned at the beginning of the article,
// set up a cameraProvider and bind both the Preview and the analysis use case.
try {
    camera = cameraProvider?.bindToLifecycle(
        this,
        CameraSelector.DEFAULT_BACK_CAMERA,
        preview,
        analysis, // Set the camera image analysis process here
    )
} catch (e: Exception) {
    Log.e(TAG, "Exception!!!", e)
}
Then, simply return the ID of the textureEntry associated with the Surface to Flutter as the return value of the MethodChannel.
fun onMethodCall(call: MethodCall, result: MethodChannel.Result) {
    when (call.method) {
        "startCamera" -> {
            result.success(textureEntry.id())
        }
        "stopCamera" -> {
            stopCamera()
        }
        else -> result.notImplemented()
    }
}
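One detail worth noting: when the camera stops, the SurfaceTexture entry should also be released so its GPU buffer is not leaked. A minimal sketch of stopCamera(), assuming textureEntry and cameraProvider are the objects created in the setup code above:

```kotlin
// Hypothetical cleanup sketch for the stopCamera() called above.
fun stopCamera() {
    cameraProvider?.unbindAll()  // detach the Preview and analysis use cases
    textureEntry?.release()      // free the SurfaceTexture registered with Flutter
    textureEntry = null
}
```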
To render the native SurfaceTexture on the Flutter side, simply pass the textureId obtained from the MethodChannel to the Texture widget, and the camera preview will appear in the Flutter app.
static const platform =
    MethodChannel('com.example.camera_preview_texture/method');

int? _textureId;

Future<void> onPressed() async {
  try {
    final result = await platform.invokeMethod<int>('startCamera');
    if (result != null) {
      setState(() {
        _textureId = result;
      });
    }
  } on PlatformException catch (e) {
    print(e.message);
  }
}

Widget build(BuildContext context) {
  if (_textureId == null) {
    return const SizedBox();
  }
  return SizedBox.fromSize(
    size: MediaQuery.of(context).size,
    child: Texture(
      textureId: _textureId!,
    ),
  );
}
For an implementation using the Texture widget, the mobile_scanner package serves as a great reference.
PlatformView
PlatformView allows embedding Android native UI into Flutter’s widget tree, making it possible to display and control it.
There are three rendering modes for PlatformView: Virtual Display (VD), Hybrid Composition (HC), and Texture Layer Hybrid Composition (TLHC) [1]. When using the PlatformView API, TLHC is selected by default. However, if the Android native UI tree contains a SurfaceView, it will fall back to VD or HC [2].
In addition, PlatformView improves frame rate synchronization between Flutter and Android native, which was not possible with the Texture widget. It also allows user interactions to be handled and supports displaying UI elements beyond just camera previews and videos.
Implementation Steps
In this sample code using PlatformView, the camera preview screen is implemented with Jetpack Compose. To use Jetpack Compose in a Flutter app, add the following dependencies and configuration to app/build.gradle:
android {
    ~
    ~
    buildFeatures {
        compose true
    }
    composeOptions {
        kotlinCompilerExtensionVersion = "1.4.8"
    }
}

dependencies {
    implementation("androidx.activity:activity-compose:1.9.3")
    implementation(platform("androidx.compose:compose-bom:2024.04.01"))
    implementation("androidx.compose.material3:material3")
}
Now, let’s dive into the details of the implementation.
Implementing PlatformView requires the following three steps:
1. Implement a NativeView that inherits PlatformView.
2. Implement a NativeViewFactory that inherits PlatformViewFactory.
3. Register the PlatformViewFactory to the FlutterEngine.
1. Implementing NativeView
For a general implementation, please refer to the official documentation.
One key difference from the official approach is that this implementation uses Jetpack Compose. Here, the CameraPreview composable (built with Jetpack Compose) is embedded into the Android native View tree using ComposeView.
class NativeView(
    context: Context,
    id: Int,
    creationParams: Map<String?, Any?>?,
    methodChannel: MethodChannel,
    eventChannel: EventChannel
) : PlatformView {
    private var nativeView: ComposeView? = null

    override fun getView(): View {
        return nativeView!!
    }

    override fun dispose() {}

    init {
        nativeView = ComposeView(context).apply {
            setContent {
                CameraPreview(methodChannel, eventChannel)
            }
        }
    }
}
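The empty dispose() above is fine for a short-lived sample, but PlatformView.dispose() is where Flutter expects the native view to release its resources. A sketch using ComposeView's standard cleanup API:

```kotlin
// Hypothetical sketch: tear down the composition when Flutter disposes the view.
override fun dispose() {
    nativeView?.disposeComposition() // release the composition held by ComposeView
    nativeView = null
}
```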
In the Jetpack Compose implementation, PreviewView from CameraX, which is a View, is composed using AndroidView. As a side note, AndroidView can also be used with a Fragment.
@Composable
fun CameraPreview(methodChannel: MethodChannel, eventChannel: EventChannel) {
    val context = LocalContext.current
    // LocalLifecycleOwner must be read in the composable scope, not inside a suspend function.
    val lifecycleOwner = LocalLifecycleOwner.current
    val preview = Preview.Builder().build()
    val previewView = remember {
        PreviewView(context)
    }

    suspend fun startCamera(context: Context) {
        val cameraProvider = context.getCameraProvider()
        cameraProvider.unbindAll()
        // To meet the camera analysis requirements mentioned at the beginning of the article,
        // set up the cameraProvider and bind both the Preview and the analysis use case.
        cameraProvider.bindToLifecycle(
            lifecycleOwner,
            CameraSelector.Builder().requireLensFacing(CameraSelector.LENS_FACING_BACK).build(),
            preview,
            analysis, // Set the camera image analysis process here
        )
        preview.surfaceProvider = previewView.surfaceProvider
    }

    suspend fun stopCamera(context: Context) {
        val cameraProvider = context.getCameraProvider()
        cameraProvider.unbindAll()
    }

    LaunchedEffect(Unit) {
        fun onMethodCall(call: MethodCall, result: MethodChannel.Result) {
            when (call.method) {
                "startCamera" -> {
                    CoroutineScope(Dispatchers.Main).launch {
                        startCamera(context)
                    }
                    result.success("ok")
                }
                "stopCamera" -> {
                    CoroutineScope(Dispatchers.Main).launch {
                        stopCamera(context)
                    }
                }
                else -> result.notImplemented()
            }
        }
        methodChannel.setMethodCallHandler(::onMethodCall)
    }

    AndroidView(factory = { previewView }, modifier = Modifier.fillMaxSize())
}
Next, implement the NativeViewFactory (step 2) and register it to the FlutterEngine (step 3), as follows.
class MainActivity : FlutterFragmentActivity() {
    ~
    ~
    override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)
        val methodChannel = MethodChannel(
            flutterEngine.dartExecutor.binaryMessenger,
            METHOD_CHANNEL
        )
        val eventChannel = EventChannel(
            flutterEngine.dartExecutor.binaryMessenger,
            EVENT_CHANNEL
        )
        flutterEngine
            .platformViewsController
            .registry
            .registerViewFactory(VIEW_TYPE, NativeViewFactory(methodChannel, eventChannel))
    }
}

class NativeViewFactory(
    private val methodChannel: MethodChannel,
    private val eventChannel: EventChannel
) : PlatformViewFactory(StandardMessageCodec.INSTANCE) {
    override fun create(context: Context, viewId: Int, args: Any?): PlatformView {
        val creationParams = args as Map<String?, Any?>?
        return NativeView(
            context,
            viewId,
            creationParams,
            methodChannel,
            eventChannel
        )
    }
}
Finally, here is the implementation on the Flutter side. PlatformViewsService.initSurfaceAndroidView() is an API for using either TLHC or HC. PlatformViewsService.initAndroidView() allows you to use either TLHC or VD. PlatformViewsService.initExpensiveAndroidView() forces the use of HC.
class CameraPreviewView extends StatelessWidget {
  final String viewType = 'camera_preview_compose';
  final Map<String, dynamic> creationParams = <String, dynamic>{};

  CameraPreviewView({super.key});

  Widget build(BuildContext context) {
    return PlatformViewLink(
      viewType: viewType,
      surfaceFactory: (context, controller) {
        return AndroidViewSurface(
          controller: controller as AndroidViewController,
          hitTestBehavior: PlatformViewHitTestBehavior.opaque,
          gestureRecognizers: const <Factory<OneSequenceGestureRecognizer>>{},
        );
      },
      onCreatePlatformView: (params) {
        return PlatformViewsService.initSurfaceAndroidView(
          id: params.id,
          viewType: viewType,
          layoutDirection: TextDirection.ltr,
          creationParams: creationParams,
          creationParamsCodec: const StandardMessageCodec(),
          onFocus: () {
            params.onFocusChanged(true);
          },
        )
          ..addOnPlatformViewCreatedListener(params.onPlatformViewCreated)
          ..create();
      },
    );
  }
}
By using PlatformView this way, you can integrate Android native UI into your Flutter app.
Intent
Intent is an Android feature (not specific to Flutter) that allows launching an Activity separate from the MainActivity where Flutter runs. With an Intent, you can navigate to another screen within your app, launch external apps, and exchange data between Activities.
The two methods mentioned above (Texture widget and PlatformView) have been reported to have performance issues [3]. To resolve these issues, a deep understanding of both Flutter and Android native is essential. In some cases, building a separate Android app might actually help keep development costs down.
However, this poses a different challenge.
- If your team only has Flutter engineers, you will need to catch up on Android development.
- If the app is developed as an external application, the interface between apps must include security measures and be designed with lifecycle considerations in mind.
For instance, the following measures may be necessary:
- Validate the data exchanged between activities.
- Restrict access so that only a specific app can call it.
- Ensure the called app functions correctly even if the calling app has been killed.
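As one way to approach the first two measures, the called Activity can verify the caller and validate the received extras before doing any work. The allowed package name below is an illustrative assumption, not the project's actual configuration:

```kotlin
import android.app.Activity
import android.os.Bundle

// Hypothetical sketch: verify the caller and validate input in the called Activity.
class SubActivity : Activity() {
    private val allowedCaller = "com.example.flutter_app" // assumed caller package

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // callingActivity is non-null only when started via startActivityForResult.
        val caller = callingActivity?.packageName
        val dummyData = intent.getStringExtra("EXTRA_DUMMY_DATA")
        // Reject unknown callers and malformed data before doing any work.
        if (caller != allowedCaller || dummyData.isNullOrBlank()) {
            setResult(Activity.RESULT_CANCELED)
            finish()
            return
        }
    }
}
```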
Now let’s take a look at how to use Intent in Flutter. First, we’ll go over how to call another Activity from a Flutter app.
Calling Activity (MainActivity where the Flutter app runs)
override fun onMethodCall(call: MethodCall, result: MethodChannel.Result) {
    if (call.method!!.contentEquals("startCamera")) {
        val dummyData = call.argument<String>("dummy_data") ?: return result.error(
            "ERROR",
            "data is invalid",
            null
        )
        // Option 1: screen transition within the same app
        val intent = Intent(this, SubActivity::class.java)
        // Option 2: launching an external app (use one option or the other)
        val packageName = "com.example.camera_preview_intent"
        val intent = activity.packageManager.getLaunchIntentForPackage(packageName) ?: return result.error(
            "ERROR",
            "unexpected error",
            null
        )
        intent.setClassName(packageName, ".SubActivity")
        // Store the data to send
        intent.putExtra("EXTRA_DUMMY_DATA", dummyData)
        intent.setFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP)
        activity.startActivityForResult(intent, REQUEST_CODE)
    }
}

override fun onListen(arguments: Any?, sink: EventChannel.EventSink?) {
    eventSink = sink
}

override fun onCancel(arguments: Any?) {
    eventSink = null
}

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?): Boolean {
    if (requestCode == REQUEST_CODE && resultCode == Activity.RESULT_OK && data != null) {
        val result = data.getStringExtra("RESULT_DATA") ?: ""
        eventSink?.success(result)
        return true
    }
    return false
}
Next, let’s implement the Activity that gets called from the Flutter app. Once a specific operation is completed, you can use an Intent to return data, as shown below.
Target Activity
val intent = Intent()
intent.putExtra("RESULT_DATA", resultData)
activity.setResult(Activity.RESULT_OK, intent)
finish()
By using Intent this way, you can avoid dealing with complex UI control on both the Flutter and native Android sides while still enabling data exchange between Flutter and native Android Activities.
However, security and data integrity must be carefully considered in this approach.
Summary
In this article, we've discussed how to incorporate native functionality into Flutter apps, with a focus on Android.
- Data communication between Flutter and Android native was achieved using MethodChannel and EventChannel.
- Here’s how to incorporate Android native UI into Flutter:
  - Texture widget
    - Great for camera previews and video displays, and relatively easy to implement.
    - However, it requires handling user interactions yourself and may have performance issues.
  - PlatformView
    - Lets you integrate native UI into Flutter’s widget tree while enabling user interaction control.
    - Supports embedding View, Fragment, and Jetpack Compose.
    - Performance can also be an issue.
  - Intent
    - Allows seamless screen transitions and launching of other apps, making it possible to directly display Android’s UI and exchange data.
    - However, security and data handling require careful attention.
As mentioned above, each method comes with its own strengths and limitations when integrating Android native features into a Flutter app. The best choice depends on your project’s specific needs.
Notes
The thumbnail of the Droid is reproduced or modified from work created and shared by Google and used according to terms described in the Creative Commons Attribution 3.0 License.