Collective problem-solving is a prominent signature of animal societies. It's a hallmark of our own species, but we're not the only ones: ants, like us, cooperate to solve complex problems such as transporting objects and building structures, and they do so using simple rules of thumb. Scientists from L. Mahadevan's group in SEAS, Physics, and OEB and Venki Murthy's group in MCB have recently uncovered some of the secrets of ant cooperation. Their findings, published in eLife, show that ants may rely on simple interaction rules to cooperate on a shared task. The researchers also reproduced this cooperative behavior in robotic ants by encoding the same rules in them.
The authors studied the behavior of the infamous black carpenter ants, those banes of homeowners that make their nests by digging tunnels into wood in complete darkness. The researchers exploited this digging behavior by trapping a group of carpenter ants in a chamber made of agarose gel sandwiched between two glass sheets and then video-recording the digging process under infrared light, which is invisible to ants. The ants made tunnels and escaped each time, and in most cases they worked together on a single escape tunnel rather than each digging on its own. The authors quantified the movement and excavation work of the ants using the open-source deep learning-based tool SLEAP and custom-made image processing algorithms. From the movement data, the authors found that the ants settled into a cycle: digging at the chamber boundary, transporting the debris toward the interior, and returning to the boundary after dropping the debris. This periodic behavior became the basis for a simulation.
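The dig–transport–return loop described above can be pictured as a tiny three-state cycle. The sketch below is purely illustrative (the phase names and the `cycle` helper are hypothetical, not the authors' code); it only shows the repeating structure the tracking data revealed.

```python
from enum import Enum, auto

class Phase(Enum):
    """Hypothetical labels for the periodic behavior seen in the tracking data."""
    DIG = auto()        # excavate at the chamber boundary
    TRANSPORT = auto()  # carry debris toward the chamber interior
    RETURN = auto()     # walk back to the boundary after dropping the debris

# Each phase hands off to the next, closing the loop back to digging.
NEXT = {
    Phase.DIG: Phase.TRANSPORT,
    Phase.TRANSPORT: Phase.RETURN,
    Phase.RETURN: Phase.DIG,
}

def cycle(phase: Phase, n: int) -> Phase:
    """Advance an ant's behavioral phase n steps around the loop."""
    for _ in range(n):
        phase = NEXT[phase]
    return phase
```

Because the loop has period three, an ant that starts digging is digging again three steps later.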
In the simulation, multiple ant-inspired agents interacted among themselves and with a digital environment resembling the agarose gel chamber from the experiment. Through this, the researchers were able to identify the minimal rules of interaction between the agents and the environment needed for successful cooperation.
In the mathematical model, agents cooperate by relying on information they acquire from sensing their proximity to others and to the environment, similar to how ants use their antennae and pheromones (Trible et al.). Tuning the interaction strength of the agents in the simulations enhanced or diminished the efficiency of their collective excavation. Highly interacting agents formed a cluster and bumped into each other constantly, barely engaging with the environment, which resulted in little or no excavation. When the agents interacted very little, on the other hand, they behaved independently and exploratorily, each digging a separate tunnel or wandering as if foraging; they made no progress toward the task either.
However, when the level of interaction among the agents hit a sweet spot, they engaged in what the authors called "exploitatory behavior," resulting in efficient cooperative excavation and, ultimately, a glorious escape. This allowed the researchers to characterize the different regimes of functional behavior in terms of interaction strength and excavation rate.
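A stripped-down agent-based sketch can illustrate one side of this trade-off: the stronger the attraction between agents, the more time they spend clustering and the less they spend digging. Everything below (grid size, coupling parameter, movement rules) is an assumption for illustration, not the authors' model, and it only captures the high-coupling suppression of excavation, not the full regime diagram with its cooperative sweet spot.

```python
import random

def run_colony(coupling, n_agents=10, size=41, steps=200, seed=0):
    """Toy agent-based sketch (hypothetical, not the published model).

    Agents sit on a grid filled with "gel"; each step an agent either
    moves toward its nearest neighbor (with probability `coupling`) or
    digs an adjacent gel cell.  Returns the number of cells excavated.
    """
    rng = random.Random(seed)
    gel = {(x, y) for x in range(size) for y in range(size)}
    # Start the agents near the center, mimicking a trapped group.
    agents = [(size // 2 + rng.randint(-2, 2), size // 2 + rng.randint(-2, 2))
              for _ in range(n_agents)]
    for pos in agents:
        gel.discard(pos)  # agents begin in open space
    dug = 0
    for _ in range(steps):
        for i, (x, y) in enumerate(agents):
            if rng.random() < coupling and len(agents) > 1:
                # Attraction: take one step toward the nearest other agent,
                # but only through already-open (non-gel) space.
                nx, ny = min((a for j, a in enumerate(agents) if j != i),
                             key=lambda a: abs(a[0] - x) + abs(a[1] - y))
                step = (x + (nx > x) - (nx < x), y + (ny > y) - (ny < y))
                if step not in gel:
                    agents[i] = step
            else:
                # Dig: clear a random adjacent gel cell and move into it.
                nbrs = [(x + dx, y + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                gel_nbrs = [c for c in nbrs if c in gel]
                if gel_nbrs:
                    target = rng.choice(gel_nbrs)
                    gel.discard(target)
                    dug += 1
                    agents[i] = target
    return dug
```

With a fixed random seed, a weakly coupled colony excavates far more cells than a strongly coupled one, echoing the clustered, low-excavation regime described above.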
Next, the researchers wanted to see whether physical agents could be built around the rules of cooperation gleaned from the ants and the simulations. To do so, they designed robotic ants, which they called RAnts, that mimic the functional capabilities of the ants at a minimal level. The RAnts could walk around, move obstacles, and locate each other using light sensors. When fed the interaction rules derived from the earlier experiments, the robots exhibited cooperative excavation remarkably similar to that of real ants and reproduced the different behavioral regimes observed in the simulations, all through simple local interaction rules, without a plan or planner, i.e., in the absence of a central controller.
This work represents one of the early attempts to derive a theoretical understanding of collaborative behavior in the animal world from experimentation, and then to replicate that knowledge synthetically in a real-world system.
Altogether, by combining experiments on collective excavation with a theoretical model and a synthetic robotic system, this study illuminates how spontaneous cooperation can arise from slowly decaying signals in the environment that serve as a communication channel. Once this understanding is refined, it could be applied to robots designed for cleaning industrial waste sites, clearing sewage blockages, and many other tasks that would be dangerous for human workers.
This work is not the end of the story. Ants in the wild engage in many different phases of functional cooperative behavior, and much remains to be understood about how they move between these phases and arrive at such displays of cooperative intelligence.