Research

Current Projects

Collective Construction for Autonomous Agents:

Presently I'm concentrating on developing multimodal vision systems and mechatronic sensor arrays to study the building behaviour of termites. Our lab spends three or four weeks of every year in Namibia, researching the mound-building of two species of Macrotermes. These ingenious insects construct elaborate temperature- and humidity-regulated mounds in order to cultivate a delicate fungus that acts as their primary food source. From them, we hope to develop algorithms for autonomous construction that will allow robots to build elaborate, robust, multi-functional structures in hostile environments, such as disaster zones or outer space.

Systems and software for tracking termites in 3D

To investigate and model these behaviours, we use a variety of tools and techniques. Precise, high-speed RGB-D reconstruction of termite building is accomplished using the Intel RealSense and (coming soon) a low-cost photometry array (prototype pictured). In addition, the environmental factors that may play a significant role in termite excavation and deposition are recorded using humidity and temperature sensing arrays. At the moment I'm designing and building a Droplet robot, which will allow us to test algorithms on a small, low-cost swarm of robots that react to humidity and attempt to regulate their environment.
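To give a flavour of the capture side, the sketch below shows how aligned depth and colour frames can be grabbed from a RealSense camera with Intel's pyrealsense2 SDK. It is a minimal illustration rather than our actual recording pipeline; the resolutions and frame rate are placeholder values.

import numpy as np
import pyrealsense2 as rs

# Configure depth + colour streams (resolutions and frame rate are illustrative).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)   # map depth pixels onto the colour image

try:
    frames = align.process(pipeline.wait_for_frames())
    depth = np.asanyarray(frames.get_depth_frame().get_data())   # uint16 depth units
    colour = np.asanyarray(frames.get_color_frame().get_data())  # BGR colour image
finally:
    pipeline.stop()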

Primary Collaborators: Justin Werfel, J. Scott Turner, Paul Bardunias, Radhika Nagpal, Rupert Soar
Other advisors and collaborators: Sebastian Oberst, Theo Evans, Jochen Zeil

Responsible Social Robots

How do we integrate robots into society in ways that support, rather than disrupt, our most vulnerable populations? In particular, the proliferation of social robots in care services is raising questions about ethics, privacy, and the nature of future human societies.

Together with researchers in Australia and the UK, I am exploring the roles that robots should, and even more critically, should not play in care delivery. Although there is a burgeoning literature on robots in social and care settings, the majority of this commentary and evidence tends to revolve around their technical efficacy, their acceptability to consumers, or the legal ramifications of such innovations. Yet there remains a serious lack of attention within public policy and public management to the actual implementation of robots in care settings.

Principal Collaborators: Helen Dickinson, Gemma Carey
Other advisors and collaborators: Mary-Anne Williams, Sascha Griffiths

Agile Outdoor Robots:

The Self-Organizing Research Group has many swarm robots ensconced in its laboratory space, including fish-, ant-, termite-, and cell-inspired robots. However, all of these require a controlled environment in order to operate effectively. Starting in summer 2017, we've embarked on a project to bring swarm robots outdoors. Our REU (Research Experience for Undergraduates) team is developing a robust outdoor platform we can use to conduct experiments in less hospitable arenas. Equipped with RGB cameras, depth sensors, and ultrasonics, these robots will not only navigate autonomously over rugged outdoor terrain, but will eventually communicate and cooperate to solve more complex problems.
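As a rough illustration of the kind of reactive navigation layer the platform starts from, here is a toy obstacle-avoidance loop driven by three ultrasonic range readings. The sensor and motor functions are hypothetical stand-ins, not the drivers on our robots, and the distances and gains are assumed values.

import time

SAFE_DISTANCE_M = 0.5  # assumed stop-and-turn threshold, in metres

def read_ultrasonics():
    # Hypothetical stand-in for the platform's sonar driver:
    # returns (left, centre, right) ranges in metres.
    return 1.2, 0.8, 0.6

def set_wheel_speeds(left, right):
    # Hypothetical stand-in for the differential-drive motor interface.
    print(f"wheels L={left:+.2f} R={right:+.2f}")

def avoid_step():
    left, centre, right = read_ultrasonics()
    if centre < SAFE_DISTANCE_M:
        # Obstacle ahead: pivot in place toward the more open side.
        if left > right:
            set_wheel_speeds(-0.3, 0.3)   # pivot left
        else:
            set_wheel_speeds(0.3, -0.3)   # pivot right
    else:
        # Path clear: drive forward, steering gently toward the more open side.
        bias = 0.2 * (right - left)
        set_wheel_speeds(0.5 + bias, 0.5 - bias)

for _ in range(5):        # a few iterations in place of a real control loop
    avoid_step()
    time.sleep(0.05)      # roughly 20 Hz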

Insect-Inspired Vision:

In 2010, my colleagues and I at the Centre of Excellence in Cognitive Interaction Technology embarked on a project to develop low-cost polarisation-sensitive vision systems for UAVs, to enable them to navigate using polarised signals in the natural world, in the same way locusts, bees, and other insects do. This led to ongoing and fruitful collaborations; presently Martin How at the University of Bristol, Keram Pfeiffer at the University of Würzburg, and I are working on cross-platform integrated vision software that will enable researchers to quickly and accurately reconstruct and analyse polarised images (see canopy_movie.mp4). I am also an advisor on the related DFG-funded project No. HO 950/25-1 (Sky compass signaling of central-complex neurons in locusts exposed to the natural sky).
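The software itself is still in development, but the per-pixel arithmetic it builds on is the standard Stokes-parameter calculation sketched below in NumPy, which turns four intensity images taken through a linear polariser at 0°, 45°, 90° and 135° into degree- and angle-of-polarisation maps. The function name and interface are illustrative, not part of the released tool.

import numpy as np

def polarisation_maps(i0, i45, i90, i135):
    """Per-pixel degree and angle of linear polarisation from four images
    captured through a linear polariser at 0, 45, 90 and 135 degrees."""
    i0, i45, i90, i135 = (np.asarray(im, dtype=float) for im in (i0, i45, i90, i135))
    s0 = 0.5 * (i0 + i45 + i90 + i135)          # total intensity
    s1 = i0 - i90                               # horizontal vs vertical component
    s2 = i45 - i135                             # diagonal component
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree of linear polarisation
    aop = 0.5 * np.arctan2(s2, s1)              # angle of polarisation, in radians
    return dolp, aop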

Principal Collaborators: Martin How, Keram Pfeiffer, Uwe Homberg

 

Previous Projects

Humanoid robots, bipeds, social robots

Byrun: a dynamic legged biped
Byrun is conceived as a dynamic robot for human interaction and communication, and as a development platform for research and experimentation. As part of the R&D team at Engineered Arts Ltd, I worked on hardware development and on bipedal balancing and locomotion. See http://www.engineeredarts.co.uk for details.

Socibot: a low-cost sociable assistive robot
Socibot is an integrated, commercial, open 'social hardware' platform with unique intuitive communication abilities, designed expressly for public spaces, business settings, and care facilities. As a control engineer at Engineered Arts, I was part of the team developing Socibot's interaction capabilities and interfacing. See http://www.engineeredarts.co.uk for details.

BIONA
BIONA (BIO-inspired Navigation Algorithms) was a project funded by the DFG (Deutsche Forschungsgemeinschaft) to develop flight stabilisation and navigation techniques based on insect behaviour. Insects such as bees, wasps, locusts, and dragonflies lack halteres or other force-based sensors, and must rely heavily on visual cues to stabilise their flight and orient themselves. I was particularly interested in the role played by the ocelli in flight stabilisation, as these low-resolution wide-field sensors contribute to fast, low-computation steering loops in the insect brain. Working with neurobiologists, ethologists, and computer vision experts, I developed control strategies patterned after the neuro-muscular linkages in these insects, and deployed them on small UAVs to test their efficacy compared to standard IMU-based stabilisation.
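As an illustration of how simple such an ocellar steering loop can be, the toy function below maps the brightness imbalance between the two lateral ocelli onto a corrective roll command. The gain and sign convention are illustrative assumptions, not the controller we actually flew.

def ocellar_roll_command(left_lum, right_lum, gain=0.8):
    """Toy ocelli-inspired stabiliser: if the left ocellus sees more sky
    (i.e. is brighter) than the right, the body has rolled to the right,
    so command a corrective roll back to the left."""
    total = left_lum + right_lum
    if total <= 0.0:
        return 0.0                                # no usable light cue
    imbalance = (left_lum - right_lum) / total    # normalised, in [-1, 1]
    return -gain * imbalance                      # corrective roll-rate command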

 

Bio-inspired visual navigation methods for UAVs

 
