Computational models explore how regions of the visual cortex jointly represent visual information

Computer-discovered images shape real brain responses in new participants. a: We tested whether images that influenced our computer-simulated brain models would have the same effect on real human brains. We showed these images to six new people in an fMRI scanner. b–c: Results from the simulated brain models. d–e: Results from the real participants' brain activity. Notice how the real brain responses closely matched the computer predictions. Credit: Nature Human Behaviour (2025). DOI: 10.1038/s41562-025-02252-z

Understanding how the human brain represents the information picked up by the senses is a longstanding objective of neuroscience and psychology. Most past studies of the visual cortex, the network of regions in the brain’s outer layer known to process visual information, have examined the contribution of individual regions rather than their collective representation of visual stimuli.

Researchers at Freie Universität Berlin recently carried out a study aimed at shedding new light on how regions across the human visual cortex collectively encode and process visual information, by simulating their contribution using computational models. Their findings, published in Nature Human Behaviour, highlight specific rules that could govern the relations between these different regions of the visual cortex.

“Most of us take seeing for granted, but the process is surprisingly complex,” Alessandro Gifford, first author of the paper, told Medical Xpress. “When we look at the world, it’s not just our eyes doing the work—it’s our brain, specifically an area at the back called the visual cortex. Think of the visual cortex as a team of specialists. Each member of the team (or brain region) handles a different aspect of what we see—one might focus on shapes, another on motion, another on faces.”

The various regions of the visual cortex are known to work together, much like the instruments in an orchestra, to represent and process visual information. So far, however, most researchers have studied each region individually rather than examining how the regions jointly represent visual stimuli.

“This is like trying to understand a symphony by listening to how each instrument contributes to the full piece,” explained Gifford. “Our study set out to take a different approach. We wanted to understand not just what each region does individually, but how they relate to one another—how similar or different their ‘visual languages’ are. To explore how these different regions of the brain ‘talk’ about visual information, we needed a lot of data—more than what’s currently possible to collect from real human brains.”

Instead of analyzing neuroimaging data showing what happens in the brain when people are processing visual information, Gifford and his colleagues developed computer models of brain regions known to play a part in the processing of visual information. These models act as “digital twins,” simulating how regions of the visual cortex would collectively respond when a person is shown various images.
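The article does not describe the modeling pipeline in detail, but a common way to build such “digital twins” is a linear encoding model that maps features of each image (for example, taken from a pretrained vision network) onto a region’s measured voxel responses. The sketch below illustrates that idea with placeholder data; the array shapes, the RidgeCV regression and the twin_a/twin_b names are assumptions for illustration, not details from the study.

    import numpy as np
    from sklearn.linear_model import RidgeCV

    # Illustrative shapes only; the paper does not publish this exact pipeline.
    n_images, n_features = 1000, 512        # training images and image-feature dimensionality
    n_voxels_a, n_voxels_b = 300, 150       # voxel counts for two example regions

    rng = np.random.default_rng(0)

    # Stand-ins for real data: image features (e.g., from a pretrained vision
    # network) and the recorded fMRI responses of each region to those images.
    X = rng.standard_normal((n_images, n_features))
    y_region_a = rng.standard_normal((n_images, n_voxels_a))
    y_region_b = rng.standard_normal((n_images, n_voxels_b))

    # One linear encoding model ("digital twin") per region: a mapping from
    # image features to that region's voxel responses.
    twin_a = RidgeCV(alphas=np.logspace(-2, 4, 7)).fit(X, y_region_a)
    twin_b = RidgeCV(alphas=np.logspace(-2, 4, 7)).fit(X, y_region_b)

    # Once fitted, the twins predict responses to arbitrary new images
    # without any further brain recordings.
    new_features = rng.standard_normal((1, n_features))
    pred_a = twin_a.predict(new_features)   # simulated response of region A
    pred_b = twin_b.predict(new_features)   # simulated response of region B

Because the fitted twins respond instantly to any image, they can be queried with far more stimuli than could ever be shown to a participant in a scanner, which is what makes the subsequent image search feasible.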

Subsequently, they tried to better understand the patterns underpinning the collective functioning of these artificial models of visual cortex regions. To do this, they used neural control algorithms, computational techniques that can control or optimize the activity of artificial neural networks or brain models.

“We asked the algorithms: ‘Can you find images that make two brain regions respond in the same way? And others that make them respond very differently?'” said Gifford. “By testing many images, we could map out how much two regions share—or don’t share—the same way of seeing the world. Finally, to make sure this wasn’t just a quirk of our simulation, we tested those same images on real people’s brains in an MRI scanner—and they behaved just as predicted.”
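The article does not specify how the control algorithms search for such images. One simple, hypothetical way to do it with the simulated regions from the previous sketch is to score a large pool of candidate images by how similarly they drive the two “digital twins” and keep the extremes; the candidate pool, the mean-response summary and the selection rule below are illustrative assumptions, not the authors’ method.

    import numpy as np
    from scipy.stats import zscore

    # Continues the previous sketch: twin_a and twin_b are the fitted models.
    n_candidates = 5000
    rng = np.random.default_rng(1)
    candidate_features = rng.standard_normal((n_candidates, 512))

    # Predicted response of each region to every candidate image, summarized
    # as the mean across that region's voxels and put on a common scale.
    resp_a = zscore(twin_a.predict(candidate_features).mean(axis=1))
    resp_b = zscore(twin_b.predict(candidate_features).mean(axis=1))

    # Images that drive both regions in the same way (smallest disagreement) ...
    aligning_images = np.argsort(np.abs(resp_a - resp_b))[:10]
    # ... and images that drive them apart (largest disagreement).
    disentangling_images = np.argsort(np.abs(resp_a - resp_b))[-10:]

The selected images (here just indices into the candidate pool) are the kind of stimuli that could then be shown to new participants to test whether real brains respond as the simulation predicts.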

When the team compared the patterns observed in their computer simulations with fMRI scans of people viewing the same images that the models had processed, they found the two to be very similar. Their analyses also showed that the relations between different visual regions, both in the simulations and in the fMRI scans, were far from random.
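One way such a comparison can be made (assumed here for illustration; the paper’s own analysis may differ) is to compute a region-by-region similarity matrix from the simulated responses, another from the recorded fMRI responses, and then correlate the two matrices.

    import numpy as np

    def region_similarity_matrix(responses_per_region):
        # responses_per_region: one (n_images, n_voxels) array per region.
        # Each region is summarized by its image-wise response profile
        # (mean response to each image), and the profiles are correlated.
        profiles = np.stack([r.mean(axis=1) for r in responses_per_region])
        return np.corrcoef(profiles)

    # Placeholder data standing in for the model predictions and the new
    # participants' measurements: 4 regions, 50 test images each.
    rng = np.random.default_rng(2)
    simulated = [rng.standard_normal((50, n)) for n in (300, 150, 200, 250)]
    measured = [rng.standard_normal((50, n)) for n in (300, 150, 200, 250)]

    # Second-order comparison: correlate the off-diagonal entries of the
    # model-derived and brain-derived similarity matrices.
    m_model = region_similarity_matrix(simulated)
    m_brain = region_similarity_matrix(measured)
    iu = np.triu_indices_from(m_model, k=1)
    agreement = np.corrcoef(m_model[iu], m_brain[iu])[0, 1]
    print(f"model-vs-brain agreement: {agreement:.2f}")

With placeholder random data the agreement is near zero; the study’s point is that with real predictions and real measurements the two matrices line up closely.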

“The regions’ response similarities and differences seem to follow three main rules, which we broadly refer to as distance, category and hierarchy,” said Gifford. “Firstly, we found that brain regions that are physically closer tend to ‘think’ more alike. Secondly, regions that specialize in the same kinds of things (like faces or scenes) are more in sync. Finally, some regions deal with raw details, like edges or light, while others interpret higher-level things like objects or actions. These levels shape the regions’ response similarity.”

Collectively, the rules identified by the researchers appear to limit the range of visual representations that the brain can produce, much as the design of a musical instrument constrains the music it can play. In the future, the findings of this study could thus help to shed light on the “space” of possible visual experiences the brain enables and on the complex interactions underpinning those experiences.

In their next studies, Gifford and his colleagues would also like to better capture the speed with which the brain makes sense of visual information. So far, they have looked at the relations between brain regions in terms of individual “snapshots,” because they relied on fMRI scans. These scans are good at showing where in the brain activity takes place, but poor at resolving the timing of specific events.

“In our next studies, we want to explore how these relationships evolve over time as we perceive something,” added Gifford. “Moreover, here’s a more mind-bending idea: What if we could push the brain’s visual system outside its usual patterns? Could special kinds of images—or gentle electrical stimulation—make your brain ‘see’ in a way it normally wouldn’t? Maybe even unlock new kinds of visual experiences? It’s speculative, but it could teach us a lot about the limits—and possibilities—of human perception.”

Source: https://medicalxpress.com/news/2025-07-explore-regions-visual-cortex-jointly.html
