How your brain retrieves knowledge about the world

To make sense of the world, we group individual objects, people, and events into categories or concepts. A concept such as “telephone” consists mainly of visual features, such as shape and color, and of sounds, such as ringing. In addition, there are actions, that is, how we use a telephone.

However, the concept of a telephone does not arise in our brain only when a telephone is actually in front of us; it also appears when the word is merely mentioned. Hearing the word can activate the same brain regions as actually seeing, hearing, or using a telephone: the brain seems to simulate the telephone’s characteristics. Until now, however, it was unclear whether the entire concept is retrieved in every situation or only individual features, such as its sound or the associated actions, and whether only the brain regions that process the corresponding feature become active. When we think of a telephone, do we retrieve all of its features, or only the ones needed at that moment?

Researchers at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig have now found an answer: it depends on the situation. When study participants thought about the sounds associated with the word “telephone”, auditory regions of the cerebral cortex became active, the same regions involved in actual hearing. When they thought about using a telephone, somatomotor regions, which control body movements, came into play. The researchers also found regions that process sounds and actions together. One of these so-called multimodal regions is the left Inferior Parietal Lobule (IPL), which was activated when both features were requested at the same time. In addition, they discovered that features based on sensory impressions and actions cannot be the only criteria by which we understand and classify words. This became apparent when participants had to distinguish real words from invented words: a region that responded neither to actions nor to sounds, the so-called Anterior Temporal Lobe (ATL), sprang into action.

Image Credits: Radiopedia.org

Based on these results, the scientists developed a hierarchical model of how conceptual knowledge is represented in the human brain. According to this model, information is passed from one level to the next and becomes more abstract with each step. The lowest level consists of modality-specific regions that process individual sensory impressions or actions. These pass their information on to multimodal regions such as the IPL, which integrate several related perceptions at once, such as sounds and actions. At the highest level sits the amodal ATL, which represents features detached from any sensory experience. The more abstract a feature, the higher the level at which it is processed and the further it is removed from actual sensory impressions. “This shows that our concepts of things, people, and events are composed, on the one hand, of the sensory impressions and actions associated with them and, on the other hand, of abstract symbol-like features,” explains lead author Philipp Kuhnke. The study was published in the journal Cerebral Cortex. “Which features are activated depends strongly on the respective situation or task,” Kuhnke adds.
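To make the structure of this three-level model concrete, here is a minimal sketch in Python. It is not the researchers’ code: the region names come from the article, but the data structures and the helper function retrieve_concept are assumptions introduced purely for illustration, showing how information could flow from modality-specific regions through a multimodal hub to an amodal level.

```python
# Illustrative sketch of the hierarchical model described above.
# NOT the researchers' code: region names follow the article, but the
# data structures and return format are assumptions for illustration only.

MODALITY_SPECIFIC_REGIONS = {
    "sound": "auditory cortex",       # lowest level: one region per feature type
    "action": "somatomotor cortex",
}


def retrieve_concept(word, task_relevant_features):
    """Toy walk through the three levels for a single word.

    task_relevant_features: the feature types the current task calls for,
    e.g. {"sound"}, {"action"} or {"sound", "action"}.
    """
    # Level 1 (modality-specific): only regions whose feature type the
    # task requires become active.
    level1 = {
        feature: region
        for feature, region in MODALITY_SPECIFIC_REGIONS.items()
        if feature in task_relevant_features
    }

    # Level 2 (multimodal, e.g. left IPL): integrates whatever
    # modality-specific information is currently active.
    level2 = {"IPL": sorted(level1)}

    # Level 3 (amodal ATL): represents the concept independently of any
    # single sensory modality; here just the bare word label.
    level3 = {"ATL": word}

    return {"modality_specific": level1, "multimodal": level2, "amodal": level3}


if __name__ == "__main__":
    # Thinking only about a telephone's sound vs. about its sound and its
    # use activates different lower levels under the same hierarchy.
    print(retrieve_concept("telephone", {"sound"}))
    print(retrieve_concept("telephone", {"sound", "action"}))
```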


In a follow-up study, also published in Cerebral Cortex, the researchers further showed that multimodal and modality-specific regions work together during retrieval, depending on the situation. The multimodal IPL interacted with auditory regions when sound features were retrieved and with somatomotor regions when action features were retrieved. The strength of this interaction between modality-specific and multimodal regions was reflected in the participants’ behavior, that is, in how strongly they associated the words with actions and sounds. The researchers examined these relationships using word tasks that participants performed in a functional magnetic resonance imaging (fMRI) scanner while naming objects associated with sounds or actions. The words fell into four categories: 1) objects associated with sounds and actions, such as “guitar”; 2) objects associated with sounds but not actions, such as “spirals”; 3) objects associated with actions but not sounds, such as “napkins”; and 4) objects associated with neither sounds nor actions, such as “satellites”.
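The four word categories form a simple two-by-two design: sound association (yes/no) crossed with action association (yes/no). The sketch below lays out that design with the example words from the text; it is only an illustration, not the study’s stimulus list, and the helper function category is hypothetical.

```python
# Sketch of the 2x2 stimulus design described above: sound association
# (yes/no) crossed with action association (yes/no). Example words come
# from the text; this is not the study's actual stimulus list.

STIMULI = {
    # word:       (sound, action)
    "guitar":     (True,  True),   # 1) associated with sounds and actions
    "spirals":    (True,  False),  # 2) sounds, but no actions
    "napkins":    (False, True),   # 3) actions, but no sounds
    "satellites": (False, False),  # 4) neither sounds nor actions
}

CELL_LABELS = {
    (True, True): "sound + action",
    (True, False): "sound only",
    (False, True): "action only",
    (False, False): "neither",
}


def category(word):
    """Return the design cell a word belongs to (hypothetical helper)."""
    return CELL_LABELS[STIMULI[word]]


if __name__ == "__main__":
    for word in STIMULI:
        print(f"{word}: {category(word)}")
```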

Reference: Max Planck Institute for Human Cognitive and Brain Sciences.

