New paper on vision-based flocking!

13 February 2020

Our new paper on vision-based flocking is out! In it, Renaud Bastien and Pawel Romanczuk formulate a mathematical framework for social interactions in groups of agents based solely on the information available in each individual's visual input. Following a bottom-up approach, they then derive a minimal vision-based model: agents see only in black and white – something is there or it is not – and respond only to blobs and edges in their visual field. Surprisingly, this minimal mechanism reproduces different modes of collective movement observed both in nature and in “classical” phenomenological flocking models. The work offers a change of perspective for modeling collective movement, potentially builds links to sensory neuroscience, and provides new inspiration for swarm robotics.
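The paper's actual equations are more involved, but the core idea – steering from nothing more than blobs and edges in a binary visual field – can be sketched roughly as follows. This is an illustrative toy, not the model from the paper: the response function, the coefficients `a_v`/`a_e`, and the agent geometry below are all assumptions made for the sake of the example.

```python
import numpy as np

def visual_field(pos, heading, others, body_radius=0.5, n_bins=360):
    """Binary (black & white) visual field of one agent: V[k] = 1 if any
    other agent occludes viewing direction phis[k], else 0."""
    phis = np.linspace(-np.pi, np.pi, n_bins, endpoint=False)
    V = np.zeros(n_bins)
    for other in others:
        d = other - pos
        dist = np.linalg.norm(d)
        if dist <= body_radius:
            continue  # skip self / overlapping positions
        bearing = np.arctan2(d[1], d[0]) - heading
        half_width = np.arcsin(body_radius / dist)  # angular size of the blob
        diff = (phis - bearing + np.pi) % (2 * np.pi) - np.pi
        V[np.abs(diff) <= half_width] = 1.0
    return phis, V

def steer(pos, heading, others, n_bins=360, a_v=1.0, a_e=1.0):
    """Illustrative vision-based response (assumed form, not the paper's):
    integrate the blob field V and its edges against cos/sin of the viewing
    angle to get a speed change and a turn rate."""
    phis, V = visual_field(pos, heading, others, n_bins=n_bins)
    dphi = 2 * np.pi / n_bins
    edge = np.abs(np.roll(V, -1) - np.roll(V, 1)) / 2  # marks blob boundaries
    accel = a_e * np.sum(edge * np.cos(phis)) - a_v * dphi * np.sum(V * np.cos(phis))
    turn = a_e * np.sum(edge * np.sin(phis)) - a_v * dphi * np.sum(V * np.sin(phis))
    return accel, turn
```

Note how the agent never uses positions or velocities of its neighbors directly: a neighbor straight ahead projects a symmetric blob and produces no turning, while a neighbor off to one side yields a nonzero turn rate. Iterating such purely visual updates over many agents is what gives rise to the collective modes described above.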

R. Bastien & P. Romanczuk, Science Advances, 5 February 2020, Vol. 6, No. 6, eaay0792. DOI: 10.1126/sciadv.aay0792

Also check out the Twitter threads by Renaud Bastien and Pawel Romanczuk, which provide more information, animations, and a download link for the simulator.