Background
The classic computer vision setup rests on a lot of assumptions: plentiful computing resources, high-bandwidth connections and an abundant supply of power. There’s also an implicit assumption that your models are being deployed on dry land. Throw away all of these expectations, and the problem gets a lot harder.
We were approached by Pebl as a co-partner for a UK Research and Innovation grant to investigate how AI can help them monitor remote seaweed farms. This was the kind of project that we live for: apart from offering some interesting challenges, we loved the clear positive impact of this work. Pebl’s work on seaweed farming has the explicit goals of empowering coastal communities, protecting coastal environments and encouraging sustainable farming practices.
Close Collaboration
A big part of proof of concept work is working out what is possible, what is useful, and where those two things meet. As the technical experts, we have a solid understanding of what can be achieved, but it’s essential to listen to the customer and embrace their domain expertise when planning our work.
Through conversations with the customer, we were able to start enumerating the challenges they faced in monitoring the seaweed farm. Two key concerns emerged: regularly checking on the farm to ensure that it’s intact and functioning, and monitoring for foreign objects that might pose risks to it.
Eventually, we settled on two strands of work. In the first, we would look at automated monitoring of the farm, checking that the farm’s buoys were intact and in their expected locations. In the second, we would look specifically at tracking passing ships to record any interactions they have with the farm. The latter is useful for insurance purposes: if there is a collision, you can see exactly what happened and who is at fault. Think of it as a dashcam for seaweed farms.
Throughout this project, we needed to balance open-ended research tasks with building something of tangible value. We achieved this by rapidly building and sharing prototypes with the customer and iterating on their feedback through open communication.
The Technical Details
The monitoring system we built would have to operate strapped to a buoy, with no external source of power and no means of live communication with shore. You don’t get closer to the edge than that.
The hardware setup was simple but robust: a battery-powered Raspberry Pi with a Pi Camera Module. This represented a solid trade-off between flexibility (we could run a range of surprisingly powerful models) and power consumption. Of course, back on land, we were able to make use of GPUs and a more conventional MLOps setup when it came to training and data labelling.
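On the device itself, grabbing frames from the camera is straightforward. Here’s a minimal capture sketch using the picamera2 library; the resolution is an illustrative choice, not our production configuration.

```python
# A minimal frame-capture sketch using the picamera2 library.
# The resolution is illustrative, not our production configuration.
from picamera2 import Picamera2

camera = Picamera2()
camera.configure(camera.create_still_configuration(main={"size": (1280, 720)}))
camera.start()

def capture_frame():
    """Grab a single frame as a numpy array, ready for further processing."""
    # Depending on the configured format, the channel order may need
    # converting (e.g. with cv2.cvtColor) before OpenCV processing.
    return camera.capture_array()
```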
While computer vision has become somewhat synonymous with convolutional neural networks, our initial approaches were much simpler. For the first challenge, monitoring the farm, we found we could cut computational cost substantially by eschewing machine learning in favour of some classic computer vision algorithms. The distinctive shape and colour of the buoys against an ocean backdrop meant that a Canny edge detector and a Hough transform were sufficient for background removal, followed by some careful hand-tuned colour filtering to detect the buoys themselves.
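To give a flavour of that pipeline, here’s a minimal sketch using OpenCV. Note that the circular Hough transform applies a Canny edge detector internally (param1 sets its upper threshold), and the HSV colour bounds are illustrative placeholders rather than the hand-tuned values we used in production.

```python
# A minimal sketch of the classic pipeline described above, using OpenCV.
import cv2
import numpy as np

def detect_buoys(frame: np.ndarray) -> list[tuple[int, int, int]]:
    """Return (x, y, radius) buoy candidates from a BGR frame."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(grey, (9, 9), 2)

    # The circular Hough transform picks out round, buoy-like shapes;
    # it runs a Canny edge detector internally (param1 is the upper
    # Canny threshold), separating candidates from the ocean backdrop.
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
        param1=150, param2=30, minRadius=5, maxRadius=60,
    )
    if circles is None:
        return []

    # Colour filtering rejects candidates that aren't buoy-coloured.
    # These HSV bounds are placeholders for the hand-tuned ranges.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 120, 120), (25, 255, 255))

    buoys = []
    for x, y, r in np.round(circles[0]).astype(int):
        patch = mask[max(y - r, 0):y + r, max(x - r, 0):x + r]
        if patch.size and patch.mean() > 64:
            buoys.append((x, y, r))
    return buoys
```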
Detecting ships was more challenging. For this we did resort to some heavier-duty machine learning. We found that YOLOv8 could run comfortably on the Pi, and was accurate enough to begin recording soon after a vessel came into view.
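As a rough illustration, detection with a pretrained model via the ultralytics package looks something like the sketch below. The nano variant and confidence threshold are illustrative choices, not our deployed configuration.

```python
# A hedged sketch of ship detection with a pretrained YOLOv8 model
# via the ultralytics package. Confidence threshold is illustrative.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # nano variant: small enough for a Raspberry Pi

def ship_in_frame(frame, confidence: float = 0.4) -> bool:
    """Return True if a boat is detected in the given frame."""
    results = model.predict(frame, conf=confidence, verbose=False)
    for result in results:
        for box in result.boxes:
            # Class 8 is "boat" in the COCO label set YOLOv8 ships with.
            if int(box.cls) == 8:
                return True
    return False
```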
A single device was capable of running either of these algorithms on its own, or both simultaneously.
One of the nice things about this kind of setup is its simplicity. It strips away a lot of the complexity that often comes with bigger cloud-based systems: no microservices, no container registries, no infrastructure as code. This is a scenario where simpler is better. On device, our models ran in a single Python Poetry environment, and the monitoring service ran entirely in a single script.
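A stripped-down version of that script might look like the following, assuming the helpers sketched above live in hypothetical buoy_detection, ship_detection and camera modules; the capture interval and output paths are illustrative.

```python
# A stripped-down sketch of the single-script monitoring service.
# The capture interval and output paths are illustrative.
import time
from datetime import datetime, timezone
from pathlib import Path

import cv2

from buoy_detection import detect_buoys   # hypothetical module (sketch above)
from ship_detection import ship_in_frame  # hypothetical module (sketch above)
from camera import capture_frame          # hypothetical module (sketch above)

CAPTURE_INTERVAL_SECONDS = 60
OUTPUT_DIR = Path("/home/pi/monitoring")

def main() -> None:
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        frame = capture_frame()
        timestamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")

        # With no live link to shore, results are written to local
        # storage and collected when the device is serviced.
        buoys = detect_buoys(frame)
        (OUTPUT_DIR / f"{timestamp}_buoys.txt").write_text(f"{len(buoys)}\n")

        if ship_in_frame(frame):
            cv2.imwrite(str(OUTPUT_DIR / f"{timestamp}_ship.jpg"), frame)

        time.sleep(CAPTURE_INTERVAL_SECONDS)

if __name__ == "__main__":
    main()
```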
Back on land, we were able to throw a bit more computing power at training our models: we tracked experiments with MLflow and managed dataset labelling with Labelbox. This allowed us to fine-tune the model for the specific environment, and laid the groundwork for future work on discriminating between known and unknown vessels.
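As a sketch of what a tracked fine-tuning run might look like: the code below assumes labels exported from Labelbox into YOLO format and a hypothetical ships.yaml dataset config; the hyperparameters are illustrative, not the values we actually trained with.

```python
# A hedged sketch of tracking a YOLOv8 fine-tuning run with MLflow.
# "ships.yaml" is a hypothetical dataset config built from labels
# exported from Labelbox; hyperparameters are illustrative.
import mlflow
from ultralytics import YOLO

with mlflow.start_run(run_name="yolov8n-ship-finetune"):
    params = {"base_model": "yolov8n.pt", "epochs": 50, "imgsz": 640}
    mlflow.log_params(params)

    model = YOLO(params["base_model"])
    results = model.train(data="ships.yaml", epochs=params["epochs"],
                          imgsz=params["imgsz"])

    # Log the headline detection metric so runs are easy to compare.
    mlflow.log_metric("mAP50", results.box.map50)
    mlflow.log_artifact("runs/detect/train/weights/best.pt")
```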
Outcomes
We were able to validate the idea that remote monitoring of seaweed farms using edge devices is possible: even on low-power devices, without running energy-hungry machine learning models, useful monitoring data could be collected. We also showed that, where conditions allowed, we could squeeze some pretty powerful models onto these devices to give deeper insights.
As well as making the most of the data and equipment already in place, we were able to advise on what would be needed to take seaweed farm monitoring to the next level, showing the client exactly where higher-resolution devices and different camera placements would yield the greatest increase in utility.
At the end of our journey with Pebl, we left them with a prototype they could readily take forward into a full-fledged product. The monitoring system we developed has gone from initial prototype to a staple of their seaweed farm operations, and has been rolled out to ~30 sites across the UK.
What Pebl Had to Say
“We selected Fuzzy Labs as a partner due to their experience with machine learning on the edge, particularly computer vision, and their expertise in productionising artificial intelligence. Working with the Fuzzy Labs crew was highly collaborative and their approach to knowledge sharing, throughout the project, was exceptional - they are not just really smart engineers!” - Christian Berger, Co-Founder @ PEBL