Robotics student develops innovative model for tracking climate change across ecosystems
03 July 2025

Cas Penfold, a third-year Robotics BSc student, has just completed her dissertation, a project that examines the importance of botanical surveys and the limitations of current methods for conducting them. It offers an innovative solution: a dog-mounted computer vision system that could make environmental monitoring both cheaper and faster than current options.
The project explores how monitoring plant species is vital not only for identifying harmful plants, but also for detecting and tracking the impact of climate change across different ecosystems. Cas has applied her robotics skillset to create an alternative way to monitor these species, increasing the number of surveys that can take place.
We chatted to Cas to discover more about this unique and visionary project and how robotics can help us to track the impact of climate change.
Can you tell us more about your project?
Botanical surveys are crucial for environmental monitoring; they help detect and track general ecosystem health and reactions to environmental changes such as temperature, time of year and presence of pollutants. Surveys also monitor invasive species and identify harmful plant species that may cause damage to livestock, other animals or plants in the area.
This process is currently done by hand, which is slow, tedious and costly. Alternatively, detection (sniffer) dogs can be used to identify certain plant species; this has a high initial cost for specialised training ($15,000-30,000 Australian dollars per dog) but is much faster and more effective, with lower ongoing costs. This method isn't widely used in the UK.
My project aimed to determine whether it was possible to collect camera footage from a pet dog with only basic training and process it through a custom-trained computer vision model. I focused on botanical surveys, using footage from the dog to identify which plant species are present in the area. This could make conducting widespread botanical surveys much more accessible and cost-effective, as dogs that are walked off the lead in rural environments are currently an abundant resource.
How does the model work?
The computer vision model was trained on thousands of hand-labelled training images of sunflowers to detect their presence in frames of the collected videos. A bespoke harness with a gimbal and a small action camera was created to capture footage from the dog while he explores the environment.
The tests were considered very successful: the computer vision model identified the sunflowers in the footage from the dog's camera with a high level of accuracy, which demonstrates the potential of this kind of technology. However, further improvements and testing are needed before it could be ready for deployment.
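The dissertation's exact pipeline isn't detailed here, but the idea of turning per-frame detections into a survey result can be sketched in plain Python. Everything below is an illustrative assumption: `summarise_detections` and its parameters are hypothetical names standing in for whatever post-processing the real model uses, with per-frame confidence scores assumed to come from a trained detector run over the dog-camera footage.

```python
def summarise_detections(frame_scores, threshold=0.5, min_frames=3):
    """Aggregate per-frame detector confidences (hypothetical post-processing).

    frame_scores: one confidence score per video frame, as a detector might
    produce when run over footage from the dog's camera.
    A species is reported as present only if enough frames clear the
    confidence threshold, which damps one-off false positives.
    """
    hits = sum(1 for score in frame_scores if score >= threshold)
    return {
        "frames_checked": len(frame_scores),
        "confident_frames": hits,
        "species_present": hits >= min_frames,
    }

# Example: a short clip where a sunflower is clearly visible in three frames.
summary = summarise_detections([0.1, 0.8, 0.9, 0.2, 0.7])
print(summary)  # species_present: True (3 frames at or above 0.5)
```

Requiring several confident frames rather than a single detection is a common way to make noisy per-frame output usable for presence/absence surveying.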
How have you found your time studying Robotics at Falmouth?
It’s been really nice! The lecturers were all lovely and go above and beyond to help; they're always friendly and approachable.
Obviously, the coursework was a lot at times, but it was usually very easy to get hold of someone who was happy to explain things, show you an easier solution or help you structure larger assignments. The assignments were all fun, allowing a lot of creative freedom to explore any topic or concept you like – even working with animals!
What do you hope to do next after finishing your course at Falmouth?
I'm taking a little break from academia, but in the future, when I get the chance, I'd like to pursue either a robotics master’s or another degree in mechanical engineering, because I'm really interested in physical systems and I love combining different mediums to make things more efficient, accessible and cost-effective.