Project Details

Neural representation of insect spatial memory

Applicant Dr. Jerome Beetz
Subject Area Sensory and Behavioural Biology
Cognitive, Systems and Behavioural Neurobiology
Term since 2025
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 553749470
 
How animals navigate across the globe has fascinated people for centuries. Animals that always start their journeys from a fixed origin, such as a nest, may benefit from memorizing the locations of profitable food sources around it. Because spatial memory is so important for survival, understanding its neural mechanisms is a central goal in neuroscience. A major challenge in memory research is that the behavioral output used to quantify memory formation is often delayed with respect to the underlying neural processes. For example, in a spatial-memory task, an animal could acquire a memory at any time point from the beginning of training. Although scientists can test whether the animal has learned, the exact time point at which the memory was acquired remains unclear.

This shortcoming can be overcome in honeybees. Aside from humans, honeybees are the only known species that conveys spatial information to conspecifics through a form of 'symbolic communication'. Upon returning from a foraging bout, a bee often advertises the distance and direction of a profitable food source by performing a 'dance' in the hive. This vector information is extracted and stored by hive mates (recruits) that follow the dancing bee in complete darkness. When leaving the hive, recruits retrieve the vector information that guides them to the food source. Given this link between dancing behavior and spatial memory, the dance offers a unique opportunity to get a glimpse into the bee's mind and shed light on the neural processes underlying spatial memory.

By combining behavioral approaches with neural recordings from the bee brain, we aim to uncover the neural mechanisms underlying spatial memory in insects. In essence, we want to understand how recruits extract spatial information from the dance and translate this information into the visual cues that guide them during flight. Additionally, using extracellular long-term recordings from the brains of bees flying stationarily in a virtual-reality (VR) setup, we will study how vector information is processed during goal-directed navigation. By building a 3-D model of the bee's natural habitat and presenting it in the VR setup, we can perturb the visual scene and explicitly test the influence of visual landmarks on foraging behavior. Despite anatomical differences between vertebrate and invertebrate brains, the molecular mechanisms of learning and the neural processing of direction are remarkably conserved. This allows us to gain fundamental insights into the neural mechanisms of spatial memory.
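To make the notion of dance-derived vector information concrete: the dance encodes a goal as a distance and a direction relative to the hive. The sketch below is purely illustrative and not part of the project's methods or code; all function and variable names are hypothetical. It converts such a vector into Cartesian goal coordinates and computes the turn a simulated recruit would need to make, given a path-integration estimate of its own position and heading.

```python
import math

def vector_to_goal(distance_m: float, direction_rad: float) -> tuple[float, float]:
    """Convert a dance-derived goal vector (distance, direction relative to the
    hive's reference direction) into Cartesian coordinates, hive at the origin."""
    return (distance_m * math.cos(direction_rad),
            distance_m * math.sin(direction_rad))

def steering_error(goal_xy, position_xy, heading_rad) -> float:
    """Angle (rad) the simulated forager would need to turn so that its current
    heading points at the goal, given its path-integration position estimate."""
    dx = goal_xy[0] - position_xy[0]
    dy = goal_xy[1] - position_xy[1]
    bearing_to_goal = math.atan2(dy, dx)
    # Wrap the difference into (-pi, pi] so the turn takes the shorter direction.
    return (bearing_to_goal - heading_rad + math.pi) % (2 * math.pi) - math.pi

# Hypothetical example: a dance advertising a feeder 300 m away at 40 degrees;
# the recruit is 100 m out along the reference direction, heading straight ahead.
goal = vector_to_goal(300.0, math.radians(40.0))
turn = steering_error(goal, position_xy=(100.0, 0.0), heading_rad=0.0)
print(f"goal at ({goal[0]:.1f} m, {goal[1]:.1f} m); turn by {math.degrees(turn):.1f} deg")
```

This is only a geometric caricature of vector guidance; how such quantities are actually represented and read out by neurons in the bee brain is precisely what the project investigates.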
DFG Programme Emmy Noether Independent Junior Research Groups
Major Instrumentation Two tetrode setups
Instrumentation Group 3440 Electrophysiological measurement systems (except 300-309 and 340-343)
 
 
