Model

Load Model: Tflite provides a loadModel method for loading our model. It takes two values: the model file path and the labels file path.

  Future loadModel() async {
    Tflite.close();
    await Tflite.loadModel(
      model: "assets/ssd_mobilenet.tflite",
      labels: "assets/ssd_mobilenet.txt",
    );
  }

Run Model: In this method we run the model with Tflite. Since we are working with a live camera stream, we use the detectObjectOnFrame method.

  runModel() async {
    recognitionsList = await Tflite.detectObjectOnFrame(
      bytesList: cameraImage.planes.map((plane) {
        return plane.bytes;
      }).toList(),
      imageHeight: cameraImage.height,
      imageWidth: cameraImage.width,
      imageMean: 127.5,
      imageStd: 127.5,
      numResultsPerClass: 1,
      threshold: 0.4,
    );
    setState(() {
      cameraImage;
    });
  }
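To make the detections visible, the results returned by detectObjectOnFrame can be drawn on top of the camera preview. The sketch below is a minimal, hypothetical helper: it assumes recognitionsList holds the results from runModel above, that each result contains the keys detectedClass, confidenceInClass, and rect (with normalized x, y, w, h values, as documented for the tflite package), and that the helper is called from the build method with the screen size so the boxes can be placed inside a Stack.

  // Hypothetical helper: builds one Positioned box per recognition.
  List<Widget> displayBoxesAroundRecognizedObjects(Size screen) {
    if (recognitionsList == null) return [];

    // The rect values are normalized (0..1), so scale them to the screen.
    double factorX = screen.width;
    double factorY = screen.height;

    return recognitionsList.map<Widget>((result) {
      return Positioned(
        left: result["rect"]["x"] * factorX,
        top: result["rect"]["y"] * factorY,
        width: result["rect"]["w"] * factorX,
        height: result["rect"]["h"] * factorY,
        child: Container(
          decoration: BoxDecoration(
            border: Border.all(color: Colors.red, width: 2.0),
          ),
          child: Text(
            "${result['detectedClass']} "
            "${(result['confidenceInClass'] * 100).toStringAsFixed(0)}%",
            style: TextStyle(color: Colors.red, fontSize: 14.0),
          ),
        ),
      );
    }).toList();
  }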

Initialize camera

Initializing Camera: Inside the main method, initialize the available cameras using availableCameras.

  List<CameraDescription> cameras;

  Future<void> main() async {
    WidgetsFlutterBinding.ensureInitialized();
    cameras = await availableCameras();
    runApp(MyApp());
  }

The camera package gives us support for live image streaming. First create a CameraController object; it takes two arguments, a CameraDescription and a ResolutionPreset. Initialize the cameraController and then start image streaming with the startImageStream method. startImageStream delivers the camera frames; we assign each frame to cameraImage and then run our model.

  CameraImage cameraImage;
  CameraController cameraController;

  initCamera() {
    cameraController = CameraController(cameras[0], ResolutionPreset.medium);
    cameraController.initialize().then((value) {
      setState(() {
        cameraController.startImageStream((image) {
          cameraImage = image;
          runModel();
        });
      });
    });
  }
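When the screen is closed, the stream and the controller should be released, otherwise the camera stays locked and the interpreter keeps its resources. A minimal sketch of the cleanup, assuming the cameraController and the Tflite setup shown above live inside a State class:

  @override
  void dispose() {
    // Stop the frame stream and release the camera,
    // then release the TFLite interpreter.
    cameraController?.stopImageStream();
    cameraController?.dispose();
    Tflite.close();
    super.dispose();
  }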

Android Configuration

In your android/app/build.gradle file, change the minimum Android SDK version to 21 (or higher):

  minSdkVersion 21

In the same file, add the following setting inside the android block so that the model file is not compressed:

  aaptOptions {
    noCompress 'tflite'
    noCompress 'lite'
  }

Add the model and label files to the assets folder and also declare them in pubspec.yaml (see the snippet below).
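For reference, the assets entry in pubspec.yaml could look like the following sketch; the file names match the ssd_mobilenet files passed to loadModel above, so adjust them if your model files are named differently:

  flutter:
    assets:
      - assets/ssd_mobilenet.tflite
      - assets/ssd_mobilenet.txt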

Object Detection App With Flutter and TensorFlow Lite

Let’s learn how to build a Flutter app that detects objects on a live camera feed. In this blog we will build an app that can detect objects and, using deep learning, determine what each object is. The tflite package gives us access to TensorFlow Lite, an open-source deep learning framework for on-device inference. To integrate it into our Flutter app, we need to install the tflite package and provide two files: model.tflite and labels.txt. model.tflite is the trained model and labels.txt is a text file containing all the labels. Many websites let us train a model on our own dataset and deploy it to TensorFlow Lite, and we can download these two files directly from there. You can read my blog on Object Detection App with Flutter and TensorFlow Lite to train your model with your own dataset.
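As a rough sketch, the dependencies section of pubspec.yaml could look like the following; the version numbers are only illustrative, so use whatever versions of the camera and tflite packages are current when you set up the project:

  dependencies:
    flutter:
      sdk: flutter
    camera: ^0.10.0
    tflite: ^1.1.2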