Humans are a step closer to seeing what the world looks like through the eyes of animals, thanks to technology developed by researchers from The University of Queensland and the University of Exeter.
PhD candidate Cedric van den Berg from UQ’s School of Biological Sciences said that, until now, it had been difficult to understand how animals saw the world.
“Most animals have completely different visual systems to humans, so for many species it is unclear how they see complex visual information or colour patterns in nature, or how this drives their behaviour,” he said.
“The Quantitative Colour Pattern Analysis (QCPA) framework helps solve that problem.
[Image: A field of bluebells from the perspective of a human (left) and a bee (right).]
“The framework is a collection of software and hardware, combining innovative image processing techniques with digital visualisation and analytical tools.
“Collectively, these tools greatly improve our ability to analyse complex visual information through the eyes of animals.”
The QCPA is designed to analyse calibrated digital images from both aquatic and terrestrial habitats.
These images can be captured using both off-the-shelf cameras and purpose-built camera systems.
“You can even access most of its capabilities by using a $100 smartphone to capture footage,” Mr van den Berg said.
It took four years to develop and test the technology, including the development of an extensive interactive online platform to provide researchers, teachers and students with user guides and tutorials.
UQ’s Dr Karen Cheney said that the framework could be applied to a wide range of environmental conditions and visual systems.
“The flexibility of the framework allows researchers to investigate the colour patterns and natural surroundings of a wide range of organisms such as insects, birds, fish and flowering plants,” she said.
“For example, we can now truly understand the impacts of coral bleaching for camouflaged reef creatures in a new and informative way.
“We’re helping people to cross the boundaries between human and animal visual perception.
“It’s really a platform that anyone can build on, so we’re keen to see what future breakthroughs are ahead.”