Firmatech - Smart Aquaculture AI Case Study

Public case study for Smart Aquaculture AI: end-of-study research turned into a Junction Algiers hackathon MVP with AI services, pond monitoring, alerts, and operator interfaces.

Overview

Firmatech is a smart aquaculture hackathon MVP grounded in earlier end-of-study work on smart fish farming. The case study focuses on AI, research, testing, data analytics, and the system architecture.

Problem

Fish farms need early signals for water quality, growth, feeding, and disease risk. Manual inspection is slow, records are easily fragmented, and raw AI outputs only become useful once they reach dashboards, alerts, and reports.

Context

The public organization splits the hackathon work into a mobile app, a dashboard, a TypeScript API, a FastAPI AI backend, fake-data simulation, and notifications. Earlier end-of-study thesis work covers aquaculture monitoring, IoT sensors, mobile UX, ML, and deep learning.

My Role

AI/Research/Testing contributor on the Junction Algiers team, also credited as AI Module & Data Analytics. The case study credits the whole team and avoids solo-ownership claims for the mobile app, dashboard, or general backend.

Goals

  • Turn smart fish farming research into a working end-to-end MVP
  • Monitor pond conditions such as pH, oxygen, temperature, salinity, nitrate, suspended solids, and water level
  • Expose AI services for fish counting, weight/biomass estimation, disease detection, and report generation
  • Support farmer and admin workflows through mobile, dashboard, alerts, and reports
  • Use fake sensor/image data to demo realistic flows without live farm hardware
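The monitored parameters above lend themselves to simple range checks that feed the alert workflows. A minimal sketch, assuming hypothetical safe ranges (the MVP's real thresholds and alert rules live in the API layer and are not reproduced here):

```python
# Minimal water-quality alert check. The parameter names and safe
# ranges below are illustrative placeholders, not the project's
# actual thresholds.
SAFE_RANGES = {
    "ph": (6.5, 8.5),
    "oxygen_mg_l": (5.0, 12.0),
    "temperature_c": (24.0, 30.0),
}

def check_measurement(reading: dict) -> list[str]:
    """Return alert messages for any out-of-range parameter."""
    alerts = []
    for param, (low, high) in SAFE_RANGES.items():
        value = reading.get(param)
        if value is None:
            continue
        if value < low or value > high:
            alerts.append(f"{param} out of range: {value} (safe {low}-{high})")
    return alerts
```

Each returned message would map to one alert record and, for urgent parameters, one push notification.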

Technical Decisions

  • Use FastAPI so fish-counting, disease, weight, and report endpoints stay separate from UI concerns
  • Use Roboflow-hosted models for detection and disease classification
  • Use OpenCV plus centroid tracking so repeated detections do not inflate fish counts
  • Use an empirical length-to-weight formula for hackathon biomass estimation, with calibration assumptions documented as prototype constraints
  • Use Gemini through Agno to generate daily farm reports from tank measurements and fish details
  • Use a TypeScript/Express API with MongoDB/Mongoose for users, tanks, feeds, measurements, alerts, fish details, and reports
  • Use Expo/React Native for the farmer mobile app, Next.js for the admin dashboard, and Expo/Firebase-style notifications for urgent events
  • Use a cron-based simulator to feed measurements and dummy fish images into the demo pipeline at repeatable intervals
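The empirical length-to-weight decision follows the standard W = a · L^b relationship. A sketch of that step, with hypothetical coefficients standing in for the species- and calibration-dependent constants the project documents as prototype assumptions:

```python
# Empirical length-to-weight estimate, W = a * L**b. The coefficients
# a and b here are hypothetical stand-ins; real values depend on the
# fish species and camera calibration, which the MVP flags as
# prototype constraints.
def estimate_weight_g(length_cm: float, a: float = 0.015, b: float = 3.0) -> float:
    """Estimate one fish's weight in grams from its body length in cm."""
    return a * length_cm ** b

def estimate_biomass_g(lengths_cm: list[float]) -> float:
    """Sum per-fish estimates into a tank-level biomass signal."""
    return sum(estimate_weight_g(length) for length in lengths_cm)
```

In the weight flow, `length_cm` comes from a detected bounding box after a pixel-to-centimeter conversion, which is itself a calibration assumption.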

Architecture

The simulator produces sensor measurements and images. The TypeScript API stores users, tanks, feeds, measurements, reports, alerts, and fish details in MongoDB. The FastAPI backend handles Roboflow inference, OpenCV processing, tracking, and Gemini/Agno report generation. Expo and Next.js clients consume the API, while the notification layer sends push or email alerts.
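The simulator end of that flow can be approximated as a small payload generator. Field names and value ranges here are illustrative, not the repo's actual schema:

```python
import random

# Illustrative measurement generator for the demo pipeline. The field
# names and value ranges are placeholders, not the simulator's real
# schema; a cron job would POST each payload to the TypeScript API
# at a fixed interval.
def fake_measurement(tank_id: str, rng=None) -> dict:
    rng = rng or random.Random()
    return {
        "tankId": tank_id,
        "ph": round(rng.uniform(6.0, 9.0), 2),
        "oxygenMgL": round(rng.uniform(4.0, 12.0), 2),
        "temperatureC": round(rng.uniform(22.0, 32.0), 1),
        "waterLevelCm": round(rng.uniform(80.0, 120.0), 1),
    }
```

Seeding the generator makes demo runs repeatable, which matters when the same flow is shown to judges more than once.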

Architecture Map

Mermaid
flowchart LR
  subgraph Research["Research foundation"]
    Thesis["End-of-study research: IoT, mobile UX, ML"]
  end

  subgraph Inputs["Pond inputs"]
    Sensors["Sensor measurements: pH, oxygen, temperature, water level"]
    Camera["Camera/image inputs: count, weight, disease"]
    Simulator["Fake data simulator: cron jobs and dummy images"]
  end

  subgraph Core["Core platform"]
    Api["TypeScript Express API: users, tanks, feeds, measurements"]
    Db[("MongoDB / Mongoose")]
    Scheduler["Scheduled reporting and measurement aggregation"]
  end

  subgraph AIServices["AI services"]
    FastAPI["FastAPI AI backend"]
    Roboflow["Roboflow models: YOLOv8 + disease classifier"]
    Vision["OpenCV + centroid tracker for counting and weight flow"]
    Gemini["Gemini + Agno farm report generation"]
  end

  subgraph Clients["Operator surfaces"]
    Mobile["Expo mobile app: farmer pond overview"]
    Dashboard["Next.js admin dashboard: monitoring and roles"]
    Notify["Expo / FCM + email alerts"]
  end

  Thesis --> Simulator
  Sensors --> Simulator
  Camera --> Simulator
  Simulator --> Api
  Simulator --> FastAPI
  Mobile <--> Api
  Dashboard <--> Api
  Api <--> Db
  Api --> Scheduler
  Scheduler --> FastAPI
  Api --> Notify
  FastAPI --> Roboflow
  FastAPI --> Vision
  FastAPI --> Gemini
  FastAPI --> Api

Case Study Screenshots

Ten views:

  • Backend data model for users, water tanks, measurements, reports, alerts, feeding, and fish details.
  • Mobile farm dashboard with pond counts, fish totals, alerts, status, and navigation.
  • Push-notification evidence for critical pond events, including a temperature alert.
  • Feed stock alert flow using Expo/Firebase-style mobile notifications.
  • Fish counting flow: simulated frames, Roboflow/YOLOv8 inference, and centroid tracking to avoid duplicate counts.
  • Weight and biomass flow: detect length, convert pixels to centimeters, then apply the empirical formula.
  • Disease detection flow with Roboflow classification for healthy fish, Aeromonas, Streptococcus, Tilapia Lake Virus, and uncertain cases.
  • Daily report flow that formats tank measurements and generates operator-readable recommendations through Gemini/Agno.
  • Fish detection training curves covering loss, precision, recall, mAP50, and mAP50-95.
  • Disease model training curves for classification and detection quality review.

Key Features

  • Fish counting endpoint for video frames with detection and duplicate-count protection
  • Weight endpoint that estimates length, converts pixels to centimeters, and returns fish weight/biomass signals
  • Disease endpoint that classifies fish health and returns sickness status/type
  • Farm report endpoint that turns tank measurements into daily recommendations
  • Water tank, measurement, feed, alert, report, user, and admin API models
  • Mobile dashboard with pond overview, alerts, multi-pond navigation, notifications, and profile flows
  • Admin dashboard for pond monitoring, analytics, feeding operations, user roles, and authentication
  • Simulation service for pH, water level, temperature, monitoring values, weight images, and disease images
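The duplicate-count protection in the counting endpoint can be sketched as a nearest-centroid matcher: a detection close to a known track is the same fish, anything farther away is a new one. The distance threshold and single-frame matching below are simplified assumptions, not the MVP's exact tracker:

```python
import math

# Simplified centroid tracker. A detection within `max_dist` pixels of
# an existing track is treated as the same fish; anything farther away
# starts a new track, so the running count only grows for new fish.
# Real trackers also handle track expiry and one-to-one assignment.
class CentroidCounter:
    def __init__(self, max_dist: float = 50.0):
        self.max_dist = max_dist
        self.tracks: dict = {}  # track id -> last known (x, y)
        self.next_id = 0

    def update(self, centroids: list) -> int:
        """Ingest one frame's detection centroids; return total unique fish."""
        for cx, cy in centroids:
            match = None
            best = self.max_dist
            for tid, (tx, ty) in self.tracks.items():
                dist = math.hypot(cx - tx, cy - ty)
                if dist < best:
                    best, match = dist, tid
            if match is None:
                match = self.next_id
                self.next_id += 1
            self.tracks[match] = (cx, cy)
        return self.next_id
```

Feeding the same two fish across consecutive frames keeps the count at two, which is exactly the failure mode naive per-frame counting gets wrong.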

Challenges

  • Making the demo credible without live pond hardware by simulating sensor and camera input
  • Connecting model outputs to product actions instead of isolated AI experiments
  • Keeping the MVP understandable across several repositories and runtimes
  • Presenting model metrics and prototype assumptions accurately, especially for calibration, image quality, and disease data coverage
  • Coordinating team-owned surfaces while keeping the public portfolio role precise and fair

Results / Outcomes

  • Produced a smart aquaculture MVP with mobile, dashboard, API, AI backend, simulation, notifications, and visual documentation
  • Covers AI engineering beyond notebooks: routed services, model workflows, tracking logic, report generation, and integration
  • Connects research, AI, backend architecture, data simulation, and product surfaces
  • Uses public repo assets and sanitized screenshots so the case study is visual without exposing private Drive content

What I Learned

  • AI value is clearer when every model output has an operator workflow
  • Hackathon projects need architecture diagrams because the repo split can hide the actual product shape
  • Simulation is useful for demos, but the case study should label assumptions and avoid production claims
  • Team-role wording matters: it should show contribution and technical depth without erasing other contributors