How the Brain Learns and Integrates Prior Knowledge with New Information

Summary: The brain builds a hierarchy of knowledge, connecting lower-order sensory details to higher-order concepts, shaping our perception of the world. A new study reveals how visual experience influences the brain’s feedback connections, allowing us to integrate context and recognize patterns based on past experiences.

Researchers found that visual input refines these connections, enhancing our ability to interpret complex stimuli and update our understanding of the environment.

Key facts:

  • Feedback connections in the brain link higher-order concepts with lower-order sensory details, aiding in perception.
  • Visual experience fine-tunes these connections, enabling more effective integration of contextual information.
  • Understanding how prior knowledge and new sensory input are combined can offer insights into conditions like autism and schizophrenia.

Source: Champalimaud Centre for the Unknown

How do we learn to make sense of our environment?

Over time, our brain builds a hierarchy of knowledge, with higher-order concepts linked to the lower-order features that comprise them. For instance, we learn that cabinets contain drawers and that Dalmatian dogs have black-and-white patches, and not vice versa.

This interconnected framework shapes our expectations and perception of the world, allowing us to identify what we see based on context and experience.

“Take an elephant”, says Leopoldo Petreanu, senior author of the la Caixa-funded study.

“Elephants are associated with lower-order attributes such as colour, size, and weight, as well as higher-order contexts like jungles or safaris. Connecting concepts helps us understand the world and interpret ambiguous stimuli. If you’re on a safari, you may be more likely to spot an elephant behind the bushes than you would otherwise.

“Similarly, knowing it’s an elephant makes you more likely to perceive it as grey even in the dim light of dusk. But where in the fabric of the brain is this prior knowledge stored, and how is it learned?”

The brain’s visual system consists of a network of areas that work together, with lower areas handling simple details (e.g. small regions of space, colours, edges) and higher areas representing more complex concepts (e.g. larger regions of space, animals, faces).

Cells in higher areas send “feedback” connections to lower areas, putting them in a position to learn and embed real-world relationships shaped by experience. For instance, cells encoding an “elephant” might send feedback to cells processing features like “grey”, “big” and “heavy”.

The researchers therefore set about investigating how visual experience influences the organisation of these feedback projections, whose functional role remains largely unknown.

“We wanted to understand how these feedback projections store information about the world”, says Rodrigo Dias, one of the study’s first authors.

“To do this, we examined the effects of visual experience on feedback projections to a lower visual area called V1 in mice. We raised two groups of mice differently: one in a normal environment with regular light exposure, and the other in darkness. We then observed how the feedback connections, and cells they target in V1, responded to different regions of the visual field”.

In mice raised in darkness, the feedback connections and the V1 cells directly below them both represented the same areas of visual space. First author Radhika Rajan picks up the story:

“It was amazing to see how well the spatial representations of higher and lower areas matched up in the dark-reared mice.

“This suggests that the brain has an inherent, genetic blueprint for organising these spatially aligned connections, independent of visual input”.

However, in normally-reared mice, these connections were less precisely matched, and more feedback inputs conveyed information from surrounding areas of the visual field.

Rajan continues, “We found that with visual experience, feedback provides more contextual and novel information, enhancing the ability of V1 cells to sample information from a broader area of the visual scene”.
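
To make the distinction between “retinotopically matched” and “surround” feedback inputs concrete, here is a minimal sketch of how such a classification could be computed from receptive-field centres. The coordinates, the 10-degree radius, and the variable names are illustrative assumptions for this sketch, not the study’s actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Receptive-field (RF) centres in degrees of visual angle: the V1 neuron
# sits at the origin and the feedback boutons are scattered around it.
# All numbers are made up purely to illustrate the idea of the analysis.
v1_center = np.array([0.0, 0.0])
fb_centers = rng.normal(loc=0.0, scale=15.0, size=(300, 2))

# Call an input "retinotopically matched" if its RF centre lies within
# 10 degrees of the V1 neuron's RF centre, and "surround" otherwise.
MATCH_RADIUS_DEG = 10.0  # illustrative threshold, not from the paper
distances = np.linalg.norm(fb_centers - v1_center, axis=1)
surround_fraction = np.mean(distances > MATCH_RADIUS_DEG)

print(f"fraction of feedback inputs carrying surround information: "
      f"{surround_fraction:.2f}")
```

On this toy description, the paper’s finding corresponds to the surround fraction being larger in normally-reared mice than in dark-reared mice.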

This effect depended on the origin within the higher visual area: feedback projections from deeper layers were more likely to convey surround information than those from superficial layers.

Moreover, the team discovered that in normally-reared mice, deep-layer feedback inputs to V1 become organized according to the patterns they “prefer” to see, such as vertical or horizontal lines.

“For instance”, Dias says, “inputs that prefer vertical lines avoid sending surround information to areas located along the vertical direction. In contrast, we found no such bias in connectivity in dark-reared mice”.

“This suggests that visual experience plays a crucial role in fine-tuning feedback connections and shaping the spatial information transmitted from higher to lower visual areas”, notes Petreanu.

“We developed a computational model that shows how experience leads to a selection process, reducing connections between feedback and V1 cells whose representations overlap too much. This minimises redundancy, allowing V1 cells to integrate a more diverse range of feedback”.
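
The selection principle Petreanu describes can be illustrated with a toy model: feedback inputs whose receptive fields overlap too strongly with the target V1 cell are pruned, leaving inputs that carry less redundant, more contextual information. The sketch below assumes Gaussian receptive fields and an arbitrary overlap threshold; it is a minimal illustration of redundancy reduction, not the authors’ published model.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_rf(center, size=40, sigma=4.0):
    """2-D Gaussian receptive field on a size x size grid of visual space."""
    y, x = np.mgrid[0:size, 0:size]
    rf = np.exp(-((x - center[0])**2 + (y - center[1])**2) / (2 * sigma**2))
    return rf / rf.sum()

def rf_overlap(rf_a, rf_b):
    """Normalised overlap (cosine similarity) between two receptive fields."""
    return float(np.sum(rf_a * rf_b) /
                 (np.linalg.norm(rf_a) * np.linalg.norm(rf_b)))

# One V1 neuron at the centre of the grid, and many candidate feedback
# inputs scattered around it (hypothetical numbers, for illustration only).
v1_rf = gaussian_rf(center=(20, 20))
fb_centers = rng.uniform(5, 35, size=(200, 2))
fb_rfs = [gaussian_rf(c) for c in fb_centers]

# "Experience-dependent" selection step: drop feedback inputs whose
# receptive field overlaps too much with the target V1 neuron, so the
# surviving inputs mostly carry surround (contextual) information.
OVERLAP_THRESHOLD = 0.5  # illustrative value, not from the paper
kept = [rf for rf in fb_rfs if rf_overlap(rf, v1_rf) < OVERLAP_THRESHOLD]

print(f"feedback inputs before selection: {len(fb_rfs)}")
print(f"feedback inputs after selection:  {len(kept)}")
```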

Perhaps counterintuitively, the brain might encode learned knowledge by connecting cells that represent unrelated concepts and are therefore less likely to be activated together by real-world patterns.

This could be an energy-efficient way to store information, so that when encountering a novel stimulus, like a pink elephant, the brain’s preconfigured wiring maximises activation, enhancing detection and updating predictions about the world.

Identifying this brain interface where prior knowledge combines with new sensory information could be valuable for developing interventions in cases where this integration process malfunctions.

As Petreanu concludes, “Such imbalances are thought to occur in conditions like autism and schizophrenia. In autism, individuals may perceive everything as novel because prior information isn’t strong enough to influence perception.

“Conversely, in schizophrenia, prior information could be overly dominant, leading to perceptions that are internally generated rather than based on actual sensory input. Understanding how sensory information and prior knowledge are integrated can help address these imbalances”.

About this learning and visual neuroscience research news

Author: Hedi Young
Source: Champalimaud Centre for the Unknown
Contact: Hedi Young – Champalimaud Centre for the Unknown
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Visual experience reduces the spatial redundancy between cortical feedback inputs and primary visual cortex neurons” by Leopoldo Petreanu et al. Neuron


Abstract

Visual experience reduces the spatial redundancy between cortical feedback inputs and primary visual cortex neurons

Highlights

  • Visual experience reduces the receptive field overlap between LM inputs and V1 neurons
  • LM inputs from L5 convey more surround information to V1 neurons than those from L2/3
  • The tuning-dependent organization of LM inputs from L5 requires visual experience
  • Spatial redundancy minimization explains visual experience effects on LM inputs

Summary

The role of experience in the organization of cortical feedback (FB) remains unknown. We measured the effects of manipulating visual experience on the retinotopic specificity of supragranular and infragranular projections from the lateromedial (LM) visual area to layer (L)1 of the mouse primary visual cortex (V1).

LM inputs were, on average, retinotopically matched with V1 neurons in normally and dark-reared mice, but visual exposure reduced the fraction of spatially overlapping inputs to V1. FB inputs from L5 conveyed more surround information to V1 than those from L2/3.

The organization of LM inputs from L5 depended on their orientation preference and was disrupted by dark rearing.

These observations were recapitulated by a model where visual experience minimizes receptive field overlap between LM inputs and V1 neurons.

Our results provide a mechanism for the dependency of surround modulations on visual experience and suggest how expected interarea coactivation patterns are learned in cortical circuits.
