Neuroscientists illuminate how brain cells ‘navigate’ in the light and dark: Brain mechanism identified that tracks angular head motion during navigation

To navigate successfully in an environment, you need to continuously track the speed and direction of your head, even in the dark. Researchers at the Sainsbury Wellcome Centre at UCL have discovered how individual cells, and networks of cells, in an area of the brain called the retrosplenial cortex encode this angular head motion in mice to enable navigation both during the day and at night.

“When you sit on a moving train, the world passes your window at the speed of the motion of the carriage, but objects in the external world are also moving around relative to one another. One of the main aims of our lab is to understand how the brain uses external and internal information to tell the difference between allocentric and egocentric-based motion. This paper is the first step in helping us understand whether individual cells actually have access to both self-motion and, when available, the resultant external visual motion signals,” said Troy Margrie, Associate Director of the Sainsbury Wellcome Centre and corresponding author on the paper.

In the study, published today in Neuron, the SWC researchers found that the retrosplenial cortex uses vestibular signals to encode the speed and direction of the head. However, when the lights are on, the coding of head motion is significantly more accurate.

“When the lights are on, visual landmarks are available to better estimate your own speed (at which your head is moving). If you can’t very reliably encode your head turning speed, then you very quickly lose your sense of direction. This might explain why, particularly in novel environments, we become much worse at navigating once the lights are turned out,” said Troy Margrie.

To understand how the brain enables navigation with and without visual cues, the researchers recorded from neurons across all layers in the retrosplenial cortex as the animals were free to roam around a large arena. This enabled the neuroscientists to identify neurons in the brain called angular head velocity (AHV) cells, which track the speed and direction of the head.
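As a purely illustrative aside (not part of the study, and using invented data and variable names), an AHV cell's tuning can be thought of as the mean firing rate computed within bins of angular head velocity. A minimal Python sketch of that idea:

```python
import numpy as np

# Hypothetical example of an AHV tuning curve: relate a cell's firing rate
# to angular head velocity. All data, bin edges and parameters below are
# invented for illustration; this is not the study's analysis code.
rng = np.random.default_rng(0)

# Simulated angular head velocity trace (deg/s), one sample per 100 ms
ahv = rng.uniform(-360, 360, size=6000)

# Simulated spike counts whose rate grows with rightward (positive) turns
rate = 5 + 0.02 * np.clip(ahv, 0, None)   # underlying rate in Hz
spikes = rng.poisson(rate / 10)           # counts per 100 ms bin

# Tuning curve: mean firing rate within angular-velocity bins
bins = np.linspace(-360, 360, 25)
idx = np.digitize(ahv, bins)
tuning = [spikes[idx == i].mean() * 10 for i in range(1, len(bins))]

for left, right, r in zip(bins[:-1], bins[1:], tuning):
    print(f"{left:7.1f} to {right:7.1f} deg/s : {r:5.2f} Hz")
```

A cell whose tuning curve rises (or falls) systematically with angular velocity, rather than staying flat, would be classed as velocity-tuned in this toy scheme.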

Sepiedeh Keshavarzi, Senior Research Fellow in the Margrie Lab and lead author on the paper, then also recorded from these same AHV neurons under head-fixed conditions, which allowed specific sensory and motor information to be removed. By comparing very precise angular head rotations in the dark and in the presence of a visual cue (vertical gratings) with the results of the freely moving condition, Sepiedeh was able to determine that while vestibular inputs alone can generate angular head velocity signals, their sensitivity to head motion speed is vastly improved when visual information is available.
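To make that comparison concrete, here is a minimal, hypothetical Python sketch (synthetic data only, not the paper's analysis) of the intuition that adding a visual cue reduces noise in the velocity signal a cell receives, tightening the relationship between its firing rate and head speed:

```python
import numpy as np

# Illustrative sketch: how sensitively a neuron's firing rate tracks angular
# head speed with vestibular input alone ("dark") versus vestibular plus a
# visual cue ("light"). The only assumption modelled is that visual input
# lowers the noise on the speed signal; all numbers are invented.
rng = np.random.default_rng(1)
speed = np.abs(rng.uniform(-360, 360, size=5000))   # angular head speed, deg/s

def simulated_rate(speed, noise_sd):
    """Firing rate that scales with speed, corrupted by sensory noise."""
    return 2 + 0.03 * speed + rng.normal(0, noise_sd, size=speed.shape)

rate_dark = simulated_rate(speed, noise_sd=6.0)     # vestibular only: noisier
rate_light = simulated_rate(speed, noise_sd=2.0)    # vestibular + visual cue

for label, rate in [("dark", rate_dark), ("light", rate_light)]:
    r = np.corrcoef(speed, rate)[0, 1]
    print(f"{label:5s}: correlation between rate and head speed = {r:.2f}")
```

In this toy model the "light" condition shows a stronger rate-speed correlation simply because the signal is less noisy, which is the qualitative pattern the study reports for real AHV cells.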

“While it was already known that the retrosplenial cortex is involved in the encoding of spatial orientation and self-motion guided navigation, this study allowed us to look at integration at both a network and cellular level. We showed that a single cell can see both kinds of signals: vestibular and visual. What was also critically important was the development of a behavioural task that enabled us to determine that mice improve their estimation of their own head angular speed when a visual cue is present. It’s pretty compelling that the coding of head motion and the mouse’s estimates of their motion speed both significantly improve when visual cues are available,” commented Troy Margrie.

The next steps will be to explore the pathways that bring vestibular and visual information to the retrosplenial cortex, and where these signals are relayed onward. We now know, for example, that there is a strong feedback loop with the primary visual cortex, which also receives motor signals relating to running speed. Future experiments designed to isolate and manipulate specific types of neural activity will inform us how the cortex disambiguates self-motion-generated signals from allocentric ones, a process that is critical to how we navigate through a complex visual world.

This research was funded by the Sainsbury Wellcome Centre Core Grant from the Gatsby Charitable Foundation (GAT3361) and the Wellcome Trust (090843/F/09/Z and 214333/Z/18/Z).

Story Source:

Materials provided by Sainsbury Wellcome Centre. Note: Content may be edited for style and length.
