Collective Construction for Autonomous Agents:
Presently I'm concentrating on developing multimodal vision systems and mechatronic sensor arrays to study the building behaviour of termites. Our lab spends three or four weeks of every year in Namibia researching the mound-building of two species of Macrotermes. These ingenious insects construct elaborate temperature- and humidity-regulated mounds in order to cultivate a delicate fungus that acts as their primary food source. From them, we hope to derive algorithms for autonomous construction that will allow robots to build elaborate, robust, multi-functional structures in hostile environments such as disaster zones or outer space.
To investigate and model these behaviours, we use a variety of tools and techniques. Precise, high-speed RGB-D reconstruction of termite building is accomplished using the Intel RealSense and (coming soon) a low-cost photometry array (prototype pictured). Additionally, the environmental factors that may play a significant role in termite excavation and deposition are recorded using humidity- and temperature-sensing arrays. At the moment, I'm designing and building a Droplet robot, which will allow us to test algorithms on a small, low-cost swarm of robots that react to humidity and attempt to regulate their environment.
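As a purely illustrative sketch of the kind of rule such a swarm could test (not the Droplet firmware itself; the function names, set point, thresholds, and toy environment update are all assumptions), each robot might compare its local humidity reading with a set point and deposit or excavate accordingly, a simple bang-bang regulator:

```python
# Hypothetical humidity-reactive rule: deposit material to close off airflow
# when the local pocket is too humid, excavate to admit air when too dry.

def deposition_action(humidity: float, set_point: float = 0.6,
                      band: float = 0.05) -> str:
    """Return the robot's action given its local relative-humidity reading."""
    if humidity > set_point + band:
        return "deposit"   # seal the local pocket to lower humidity
    if humidity < set_point - band:
        return "excavate"  # open the structure to raise humidity
    return "idle"

def step(humidity: float, actions: list[str], gain: float = 0.01) -> float:
    """Toy shared environment: each action nudges the global humidity."""
    delta = sum(-gain if a == "deposit" else gain if a == "excavate" else 0.0
                for a in actions)
    return min(1.0, max(0.0, humidity + delta))
```

Running a small swarm of five such robots from an initial humidity of 0.8 drives the toy environment back into the dead band around the set point, which is the collective-regulation behaviour the real swarm would be probing.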
How do we integrate robots into society in ways that support, rather than disrupt, our most vulnerable populations? In particular, the proliferation of social robots in care services is raising questions about ethics, privacy, and the nature of future human societies.
Together with researchers in Australia and the UK, I am exploring the roles that robots should, and even more critically, should not play in care delivery. Although there is a burgeoning literature on robots in social and care settings, the majority of this commentary and evidence revolves around their technical efficacy, their acceptability to consumers, or the legal ramifications of such innovations. Yet there remains a serious lack of attention within public policy and public management to the actual implementation of robots in care settings. See my Scientific American blog post for some additional thoughts: https://blogs.scientificamerican.com/observations/can-robots-tighten-the...
Principal Collaborators: Helen Dickinson, Gemma Carey
Other advisors and collaborators: Mary-Anne Williams, Sascha Griffiths
Agile Outdoor Robots:
The Self-Organizing Research Group has many swarm robots ensconced in its laboratory space, including fish-, ant-, termite-, and cell-inspired robots. However, all of these require a controlled environment to operate effectively. Starting in summer 2017, we've embarked on a project to bring swarm robots outdoors. Our REU (Research Experience for Undergraduates) team is developing a robust outdoor platform we can use to conduct experiments in less hospitable arenas. Equipped with RGB cameras, depth sensors, and ultrasonic rangers, these robots will not only navigate autonomously over rugged outdoor terrain; eventually they will also communicate and cooperate to solve more complex problems.
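A Braitenberg-style avoidance controller gives a flavour of the reactive layer such a platform might start from. This is a sketch under assumed conventions (three forward-facing ultrasonic rangers reporting distances in metres, ROS-style positive-left angular velocity); the function name, gains, and thresholds are illustrative, not the REU team's code:

```python
# Minimal reactive avoidance over (left, centre, right) ultrasonic distances.

def avoid(left: float, centre: float, right: float,
          stop_dist: float = 0.3, slow_dist: float = 1.0):
    """Return (linear, angular) velocity commands in (m/s, rad/s)."""
    if min(left, centre, right) < stop_dist:
        # Too close to drive: spin in place, away from the nearer obstacle.
        return 0.0, (-1.0 if left < right else 1.0)
    # Slow down as the obstacle ahead approaches; steer toward the side
    # with more clearance (positive angular = turn left).
    linear = 0.5 * min(1.0, centre / slow_dist)
    angular = 0.8 * (left - right) / max(left, right)
    return linear, angular
```

In open space this commands straight-line cruising; an obstacle near the left ranger produces a rightward steering correction, and anything inside the stop distance triggers a turn in place.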
In 2010, my colleagues and I at the Centre of Excellence in Cognitive Interaction Technology embarked on a project to develop low-cost polarisation-sensitive vision systems for UAVs, to enable them to navigate using polarised signals in the natural world in the same way locusts, bees, and other insects do. This led to ongoing and fruitful collaborations: at present, Martin How at the University of Bristol, Keram Pfeiffer at the University of Würzburg, and I are working on cross-platform integrated vision software that will enable researchers to quickly and accurately reconstruct and analyse polarised images.
Principal Collaborators: Martin How, Keram Pfeiffer, Uwe Homberg
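The core per-pixel computation in this kind of polarised-image analysis is standard: given intensities measured through linear polarisers at 0°, 45°, 90°, and 135°, the linear Stokes parameters yield the angle and degree of linear polarisation. A minimal sketch of that calculation (not the software itself; the function names are mine):

```python
import math

def linear_stokes(i0, i45, i90, i135):
    """Return (S0, S1, S2): total intensity and the two linear components."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    return s0, i0 - i90, i45 - i135

def aop_dolp(i0, i45, i90, i135):
    """Angle of polarisation (radians) and degree of linear polarisation."""
    s0, s1, s2 = linear_stokes(i0, i45, i90, i135)
    aop = 0.5 * math.atan2(s2, s1)              # AoP = 0.5 * atan2(S2, S1)
    dolp = math.hypot(s1, s2) / s0 if s0 > 0 else 0.0
    return aop, dolp
```

For fully linearly polarised light at angle θ, the four Malus-law intensities cos²(θ − φ) recover AoP = θ and DoLP = 1, which is a convenient sanity check for any such pipeline.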
Byrun: a dynamic legged biped
Byrun is conceived as a dynamic robot for human interaction and communication, and a development platform for research and experimentation. As part of the R&D team at Engineered Arts Ltd, I was involved with hardware development and bipedal balancing and locomotion. See http://www.engineeredarts.co.uk for details.
Socibot: a low-cost sociable assistive robot
Socibot is an integrated, commercial, open `social hardware' platform with unique intuitive communication abilities, designed expressly for public spaces, business settings and care facilities. As a control engineer at Engineered Arts, I was part of the team developing Socibot's interaction capabilities and interfacing. See http://www.engineeredarts.co.uk for details.
BIONA (BIO-inspired Navigation Algorithms) was a project funded by the DFG (Deutsche Forschungsgemeinschaft) to develop flight stabilisation and navigation techniques based on insect behaviour. Insects such as bees, wasps, locusts, and dragonflies lack halteres or other force-based sensors, and must rely heavily on visual cues to stabilise their flight and orient themselves. I was particularly interested in the role played by the ocelli in flight stabilisation, as these low-resolution wide-field sensors contribute to fast, low-computation steering loops in the insect brain. Working with neurobiologists, ethologists, and computer vision experts, I developed control strategies patterned after the neuro-muscular linkages in these insects, and deployed these on small UAVs to test their efficacy compared to standard IMU-based stabilisation.
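To illustrate the flavour of such a fast, low-computation steering loop, here is a toy model (my illustration, not the project's controllers; the sensor model and gain are assumptions): each lateral ocellus samples sky brightness to one side of the body axis, a roll tilts one toward the bright sky and the other toward the darker ground, and the signal difference drives a proportional roll correction.

```python
import math

def ocellar_roll_rate(roll: float, gain: float = 2.0) -> float:
    """Commanded roll rate (rad/s) from simulated left/right ocellus signals.

    Model each ocellus intensity as the sine of the elevation of its viewing
    direction (nominally 45 deg up, displaced by the roll angle).
    """
    left = math.sin(math.radians(45) + roll)   # tilts skyward on a left roll
    right = math.sin(math.radians(45) - roll)  # tilts groundward on a left roll
    return -gain * (left - right)              # steer to equalise the signals

def settle(roll: float = 0.3, dt: float = 0.01, steps: int = 500) -> float:
    """Closed-loop check: integrate the correction from an initial roll."""
    for _ in range(steps):
        roll += ocellar_roll_rate(roll) * dt
    return roll
```

A single subtraction and multiply per cycle is the point: unlike an IMU-based estimator, this loop needs no integration or filtering, which is what makes ocellar pathways so fast in the insect brain.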