How the Brain Turns Sensory Input Into Action

Summary: Neuroscientists have uncovered how sensory input is transformed into motor action across multiple brain regions in mice. The study shows that decision-making is a distributed process across the brain, where neurons link sensory evidence to actions.

Researchers found that after learning a task, mice process information across numerous brain regions, offering new insights into brain-wide neural dynamics. This work could help design more distributed neural networks for artificial intelligence systems.

Key Facts:

  • Decision-making in the brain is a global, distributed process involving many regions.
  • Learning enhances the brain’s ability to integrate sensory input across multiple areas.
  • The study provides insights that may aid in developing advanced AI neural networks.

Source: Sainsbury Wellcome Centre

Neuroscientists have revealed how sensory input is transformed into motor action across multiple brain regions in mice.

The research, conducted at the Sainsbury Wellcome Centre at UCL, shows that decision-making is a global process, distributed across the brain and coordinated by learning.

The findings could aid artificial intelligence research by providing insights into how to design more distributed neural networks.

“This work unifies concepts previously described for individual brain areas into a coherent view that maps onto brain-wide neural dynamics. We now have a complete picture of what is happening in the brain as sensory input is transformed through a decision process into an action,” explained Professor Tom Mrsic-Flogel, Director of the Sainsbury Wellcome Centre at UCL and corresponding author on the paper.

The study, published today in Nature, outlines how the researchers used Neuropixels probes, a state-of-the-art technology enabling simultaneous recordings across hundreds of neurons in multiple brain regions, to study mice taking part in a decision-making task.

The task, developed by Dr Ivana Orsolic at SWC, allowed the team to distinguish between sensory processing and motor control.

The researchers also revealed the contribution of learning by studying animals trained on the task and comparing them with naïve animals.

“We often make decisions based on ambiguous evidence. For example, when it starts to rain, you have to decide how frequent the raindrops need to be before you open your umbrella. We studied this same kind of ambiguous evidence integration in mice to understand how the brain processes perceptual decisions,” explained Dr Michael Lohse, Sir Henry Wellcome Postdoctoral Fellow at SWC and joint first author on the paper.
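
The press release does not specify a computational model, but one standard account of this kind of decision is a leaky evidence accumulator that commits when accumulated evidence crosses a bound. The sketch below is purely illustrative; every parameter value is an assumption, not a figure from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the authors' model): integrate noisy momentary
# evidence with a leak, and commit to a decision when a bound is reached.
leak = 0.05       # fraction of accumulated evidence lost per time step
bound = 5.0       # decision threshold ("open the umbrella")
n_steps = 1000
evidence = rng.normal(loc=0.02, scale=1.0, size=n_steps)  # weak positive drift

x = 0.0
for t, e in enumerate(evidence):
    x = (1 - leak) * x + e      # leaky integration of the evidence stream
    if x >= bound:
        print(f"decision at step {t}")
        break
else:
    print("no decision within the trial")
```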

Mice were trained to stand still while they watched a visual pattern moving on a screen. To receive a reward, the mice had to lick a spout when they detected a sustained increase in the speed of movement of the visual pattern. The task was designed so that the speed of the movement was never constant; instead, it fluctuated continuously.

The timing of the increase in the average speed also changed from trial to trial so that the mice could not simply remember when the sustained increase occurred. Thus, the mice had to constantly pay attention to the stimulus and integrate information to work out whether the increase in the speed had happened.
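
As a rough illustration of this task structure, the sketch below generates a fluctuating speed trace whose mean steps up at a random time, then detects the change with a running average. The speed statistics, change size, and detection rule are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch of the task structure (parameter values assumed).
n_steps = 2000
baseline, increase = 1.0, 0.5        # mean speed before / after the change
change_t = rng.integers(500, 1500)   # change time varies trial to trial

speed = rng.normal(baseline, 0.4, n_steps)  # speed is never constant
speed[change_t:] += increase                # sustained rise in the mean

# A single frame is ambiguous, so a detector must pool evidence over time:
# compare a running average against the baseline rather than any one sample.
window = 100
running_mean = np.convolve(speed, np.ones(window) / window, mode="valid")
detected = np.argmax(running_mean > baseline + increase / 2) + window - 1
print(f"true change at {change_t}, detected near {detected}")
```

Because change_t is redrawn on every run, a strategy that memorises a fixed change time fails; only continuous integration of the stimulus works.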

“By training the mice to stand still, we could perform much cleaner data analysis, and the task allowed us to look at how neurons track random fluctuations in speed before the mice made an action.

“In trained mice, we found that there is no single brain region that integrates sensory evidence or orchestrates the process. Instead, neurons that are sparsely but broadly distributed across the brain link sensory evidence to action initiation,” explained Dr Andrei Khilkevich, Senior Research Fellow in the Mrsic-Flogel lab and joint first author on the paper.

The researchers recorded from each mouse multiple times and collected data from over 15,000 cells across 52 brain regions in 15 trained mice. To look at learning, the team also compared the results to recordings from naïve mice.

“We found that when mice don’t know what the visual stimulus means, they represent the information only in the brain’s visual system and a few midbrain regions. After they have learned the task, cells integrate the evidence all over the brain,” explained Dr Lohse.

In this study, the team only looked at naïve animals and those that had fully learned the task, but in future work they hope to uncover how the learning process occurs by tracking neurons over time to see how they change as mice begin to understand the task.

The researchers are also looking to explore whether specific areas in the brain act as causal hubs in establishing these links between sensations and actions.

Additional questions raised by the study include how the brain incorporates an expectation of when the speed of the visual pattern will increase, so that animals react to the stimulus only when the information is relevant. The team plan to study these questions further using the dataset they have collected.

Funding: This study was funded by Wellcome awards (217211/Z/19/Z and 224121/Z/21/Z) and by the Sainsbury Wellcome Centre’s Core Grant from the Gatsby Charitable Foundation (GAT3755) and Wellcome (219627/Z/19/Z).

About this neuroscience research news

Author: April Cashin-Garbutt
Source: Sainsbury Wellcome Centre
Contact: April Cashin-Garbutt – Sainsbury Wellcome Centre
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Brain-wide dynamics transforming sensation into action during decision-making” by Tom Mrsic-Flogel et al. Nature


Abstract

Brain-wide dynamics transforming sensation into action during decision-making

Perceptual decisions rely on learned associations between sensory evidence and appropriate actions, involving the filtering and integration of relevant inputs to prepare and execute timely responses.

Despite the distributed nature of task-relevant representations, it remains unclear how transformations between sensory input, evidence integration, motor planning and execution are orchestrated across brain areas and dimensions of neural activity.

Here we addressed this question by recording brain-wide neural activity in mice learning to report changes in ambiguous visual input. After learning, evidence integration emerged across most brain areas in sparse neural populations that drive movement-preparatory activity.

Visual responses evolved from transient activations in sensory areas to sustained representations in frontal-motor cortex, thalamus, basal ganglia, midbrain and cerebellum, enabling parallel evidence accumulation. In areas that accumulate evidence, shared population activity patterns encode visual evidence and movement preparation, distinct from movement-execution dynamics.

Activity in the movement-preparatory subspace is driven by neurons integrating evidence and collapses at movement onset, allowing the integration process to reset. Across premotor regions, evidence-integration timescales were independent of intrinsic regional dynamics, and thus depended on task experience.
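
A minimal sketch of this reset dynamic, assuming a simple leaky integrator stands in for the movement-preparatory signal (all parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sketch of integration-and-reset (parameters assumed):
# evidence drives a preparatory signal that collapses to zero at movement
# onset, re-arming the integrator for the next decision.
leak, bound = 0.02, 4.0
prep = 0.0
onsets = []
for t in range(5000):
    prep = (1 - leak) * prep + rng.normal(0.01, 0.3)  # integrate noisy evidence
    if prep >= bound:        # movement onset: preparatory activity collapses
        onsets.append(t)
        prep = 0.0           # reset allows a fresh round of integration
print("movement onsets:", onsets)
```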

In summary, learning aligns evidence accumulation to action preparation in activity dynamics across dozens of brain regions. This leads to highly distributed and parallelized sensorimotor transformations during decision-making.

Our work unifies concepts from decision-making and motor control fields into a brain-wide framework for understanding how sensory evidence controls actions.
