My Role: Collaborating with a shrimp and tuna fishing industry expert, I contributed throughout this project’s lifecycle, from initial design to deployment. I trained the models using YOLO with a tracking algorithm and developed the back-end logic and cloud functions to handle front-end requests. An outsourced programmer developed the front-end.
## Overview
This solution automates shrimp farm monitoring, replacing manual sampling with a strategically positioned sensor tower. The tower combines computer vision, ultrasonic sensors, and IoT water measurement devices, capturing images and gathering data via cameras, hydrophones, and shallow-water marine probes, technology adapted from tuna biomass recognition and identification. This data feeds an AI platform that provides real-time monitoring and automated control of industrial shrimp ponds.
The system automatically extracts data on key parameters such as salinity, pH, water and ambient temperature, turbidity, and dissolved oxygen. This data is then transmitted to a cloud-based platform where predictive models and predefined parameters trigger alerts, communicated via SMS, email, or a web application. These alerts enable early disease detection, efficient feed management, and optimized growth and welfare analysis, including average weight and length, molting cycles, and gender distribution.
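As a rough illustration of how these predefined parameters can trigger alerts, the sketch below checks a single reading against configurable limits. The threshold values, field names, and the `send_sms`/`send_email` helpers mentioned in the comments are placeholders, not the project's actual cloud function.

```python
# Hypothetical sketch of threshold-based alerting (not the production cloud function).
from dataclasses import dataclass

# Illustrative limits only; real limits depend on species and farm protocol.
THRESHOLDS = {
    "dissolved_oxygen_mg_l": (4.0, None),    # alert if below 4 mg/L
    "ph":                    (7.0, 9.0),     # alert if outside 7.0-9.0
    "water_temp_c":          (26.0, 32.0),
    "salinity_ppt":          (10.0, 35.0),
    "turbidity_ntu":         (None, 100.0),  # alert if above 100 NTU
}

@dataclass
class Alert:
    pond_id: str
    parameter: str
    value: float
    message: str

def evaluate(pond_id: str, reading: dict) -> list[Alert]:
    """Compare one sensor reading against the thresholds and return any alerts."""
    alerts = []
    for param, (low, high) in THRESHOLDS.items():
        value = reading.get(param)
        if value is None:
            continue
        if (low is not None and value < low) or (high is not None and value > high):
            alerts.append(Alert(pond_id, param, value,
                                f"{param}={value} out of range for pond {pond_id}"))
    return alerts

# Dispatch would go through SMS/email/web push in the real system, e.g.:
# for alert in evaluate("pond-07", {"dissolved_oxygen_mg_l": 3.2, "ph": 8.1}):
#     send_sms(alert.message); send_email(alert.message)   # hypothetical helpers
```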
A high-performance computer within the sensor tower performs real-time computer vision tasks. On the farm, data from interconnected devices is unified and sent to the cloud platform. A web application is also under development to consolidate data from all ponds into dashboards, along with alert notifications, creating a fully automated system.
At the core of this solution are pre-trained models and computer vision techniques that extract relevant shrimp data from images. Combined with sensor data, this information undergoes consolidation, cleaning, and analysis to provide actionable insights. This advanced technology seamlessly integrates into existing operations, empowering shrimp farmers to enhance efficiency and sustainability.
## Key Features
### Real-time Monitoring
- Computer vision analysis of shrimp behavior
- Early disease/stress detection
- Automated biomass estimation (sketch below)
- Continuous water quality monitoring
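A minimal sketch of the biomass estimation idea, assuming a standard length-weight relationship W = a * L^b. The coefficients, the sampled-area scaling, and the example numbers are illustrative placeholders, not calibrated values from this project.

```python
# Illustrative biomass estimate from vision-measured lengths (placeholder coefficients).
def shrimp_weight_g(length_cm: float, a: float = 0.01, b: float = 3.0) -> float:
    """Length-weight relationship W = a * L**b; a and b must be calibrated per species."""
    return a * length_cm ** b

def estimate_biomass_kg(lengths_cm: list[float],
                        sampled_area_m2: float,
                        pond_area_m2: float) -> float:
    """Scale the weight of shrimp seen in the sampled camera footprint up to the whole pond."""
    sampled_weight_g = sum(shrimp_weight_g(l) for l in lengths_cm)
    return sampled_weight_g * (pond_area_m2 / sampled_area_m2) / 1000.0

# Example: 42 measured shrimp in a 2 m^2 camera footprint of a 5,000 m^2 pond.
print(f"Estimated biomass: {estimate_biomass_kg([11.5] * 42, 2.0, 5000.0):.0f} kg")
```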
### Smart Automation
- Automated IoT device control (sketch below)
- Dynamic feed optimization
- Predictive maintenance
- Environmental parameter adjustment
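A simplified sketch of how automated device control can work, using dissolved oxygen to drive an aerator with hysteresis. The thresholds and the MQTT topic in the closing comment are assumptions, not the deployed control logic.

```python
# Hypothetical aerator control with hysteresis; thresholds and device API are placeholders.
LOW_DO = 4.0    # mg/L: switch aerators on below this
HIGH_DO = 6.0   # mg/L: switch aerators off above this

def control_aerator(do_mg_l: float, aerator_on: bool) -> bool:
    """Return the new aerator state given the latest dissolved-oxygen reading.

    Two thresholds (hysteresis) avoid rapid on/off cycling around a single limit.
    """
    if do_mg_l < LOW_DO:
        return True
    if do_mg_l > HIGH_DO:
        return False
    return aerator_on  # within the band: keep the current state

# In the real system the returned state would be pushed to the IoT controller, e.g.:
# mqtt_client.publish("pond-07/aerator", "ON")   # hypothetical topic
print(control_aerator(3.4, aerator_on=False))    # -> True
```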
### Analytics & Alerts
- Real-time SMS/email alerts
- Predictive analytics (forecast sketch below)
- Historical trend analysis
- Data-driven insights
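As an example of the kind of predictive analytics involved, the sketch below fits an ARIMA model to hourly dissolved-oxygen history and forecasts the next few hours, assuming the `statsmodels` and `pandas` packages. The (1, 1, 1) order and the synthetic data are placeholders.

```python
# Sketch of a short-horizon dissolved-oxygen forecast with ARIMA (illustrative order).
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def forecast_do(history: pd.Series, steps: int = 12) -> pd.Series:
    """Fit ARIMA(1,1,1) to hourly DO readings and forecast the next `steps` hours.

    The (1,1,1) order is a placeholder; in practice it is chosen from the data.
    """
    fitted = ARIMA(history, order=(1, 1, 1)).fit()
    return fitted.forecast(steps=steps)

# Example with synthetic hourly data (real input would come from the pond sensors):
idx = pd.date_range("2024-01-01", periods=96, freq="h")
history = pd.Series([5.5 + 0.5 * ((h % 24) / 24) for h in range(96)], index=idx)
print(forecast_do(history))
```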
## Technical Architecture
### Data Acquisition
- Sensor-equipped buoys
- Underwater cameras
- Ultrasound devices
- Water quality sensors
### AI & Computer Vision
- YOLO object detection (pipeline sketch below)
- DeepSORT tracking
- Custom CNN for health assessment
- LSTM/ARIMA prediction models
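A condensed sketch of the detection-plus-tracking loop, assuming the `ultralytics` YOLO and `deep-sort-realtime` packages. The weights file, video source, and tracker settings are placeholders rather than the project's actual configuration.

```python
# Sketch of YOLO detection feeding DeepSORT tracking on pond video.
import cv2
from ultralytics import YOLO
from deep_sort_realtime.deepsort_tracker import DeepSort

model = YOLO("shrimp_yolo.pt")             # hypothetical custom-trained weights
tracker = DeepSort(max_age=30)

cap = cv2.VideoCapture("pond_camera.mp4")  # or an RTSP stream from the tower camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    # DeepSORT expects ([left, top, width, height], confidence, class) tuples.
    detections = []
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        detections.append(([x1, y1, x2 - x1, y2 - y1], float(box.conf[0]), int(box.cls[0])))
    tracks = tracker.update_tracks(detections, frame=frame)
    for track in tracks:
        if track.is_confirmed():
            # Each confirmed track gives a stable ID usable for counting and for
            # measuring per-shrimp length over time.
            print(track.track_id, track.to_ltrb())
cap.release()
```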
### Infrastructure
- LoRaWAN network (payload example below)
- Cloud-based AI platform
- IoT control systems
- Web/mobile interface
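LoRaWAN payloads are small, so sensor readings are typically packed into a few bytes and decoded server-side. The decoder below assumes a hypothetical 10-byte frame layout for illustration; the real device firmware format may differ.

```python
# Hypothetical decoder for a compact sensor uplink; the byte layout is an assumption.
import struct

def decode_uplink(payload: bytes) -> dict:
    """Unpack a 10-byte frame: temp, pH, DO, salinity, turbidity (big-endian uint16, scaled)."""
    temp, ph, do, sal, ntu = struct.unpack(">HHHHH", payload)
    return {
        "water_temp_c":          temp / 100,   # e.g. 2875 -> 28.75 C
        "ph":                    ph / 100,
        "dissolved_oxygen_mg_l": do / 100,
        "salinity_ppt":          sal / 100,
        "turbidity_ntu":         ntu / 10,
    }

print(decode_uplink(bytes.fromhex("0b3b031902bc060403e8")))
```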
## Performance Metrics
| Model | Accuracy / Error | Conditions |
|---|---|---|
| Object Detection | >85% | Clean water |
| Object Detection | <20% | High turbidity |
| Health Assessment | 90% | Controlled conditions |
| Biomass Estimation | ±5% | Standard conditions |
## Note
In a pond with dense algae and green water offering near-zero visibility, the accuracy of conventional computer vision (which relies on visible light) drops dramatically. Precise values are hard to give without experimentation in that environment, but to illustrate the severity:
"Accuracy drops below 20% at turbidity levels above 100 NTU."
Justification:
- 20% accuracy: With zero visibility, shrimp detection is likely to be mostly random, resulting in very low accuracy. 20% is a conservative estimate; it could be even lower.
- 100 NTU: Turbidity values above 100 NTU indicate very murky water. Ponds with dense algae and green water can easily exceed this value.
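One practical consequence is that vision-derived metrics can be gated on the turbidity reading. The sketch below uses the 100 NTU figure above as an assumed cut-off; the function and field names are illustrative.

```python
# Minimal sketch of gating vision-derived metrics on turbidity (100 NTU assumed cut-off).
MAX_NTU_FOR_VISION = 100.0

def usable_vision_metrics(metrics: dict, turbidity_ntu: float) -> dict:
    """Drop camera-based estimates when the water is too murky to trust them."""
    if turbidity_ntu > MAX_NTU_FOR_VISION:
        # Fall back to sensor-only data and flag the gap for the dashboard.
        return {"vision_available": False,
                "reason": f"turbidity {turbidity_ntu} NTU > {MAX_NTU_FOR_VISION}"}
    return {"vision_available": True, **metrics}

print(usable_vision_metrics({"avg_length_cm": 11.2, "count": 37}, turbidity_ntu=145.0))
```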
## Video
This is the location where images and videos were captured for the datasets, and where field tests were conducted.

