Bi-directional hypothesis of language and action
The bi-directional hypothesis of language and action proposes that the sensorimotor and language comprehension areas of the brain exert reciprocal influence over one another.[1] On this hypothesis, areas of the brain involved in movement and sensation, as well as movement itself, influence cognitive processes such as language comprehension; conversely, language comprehension influences movement and sensation. Proponents of the bi-directional hypothesis conduct and interpret linguistic, cognitive, and movement studies within the framework of embodied cognition and embodied language processing. Embodied language processing developed from embodied cognition, and proposes that sensorimotor systems are not only involved in the comprehension of language but are necessary for understanding the semantic meaning of words.
Development of the bi-directional hypothesis
The theory that sensory and motor processes are coupled to cognitive processes stems from action-oriented models of cognition.[2] These theories, such as the embodied and situated cognitive theories, propose that cognitive processes are rooted in areas of the brain involved in movement planning and execution, as well as areas responsible for processing sensory input, termed sensorimotor areas or areas of action and perception.[3] According to action-oriented models, higher cognitive processes evolved from sensorimotor brain regions, thereby necessitating sensorimotor areas for cognition and language comprehension.[2] With this organization, it was then hypothesized that action and cognitive processes exert influence on one another in a bi-directional manner: action and perception influence language comprehension, and language comprehension influences sensorimotor processes.
Although the relationship between language and action was studied in a unidirectional manner for many years, the bi-directional hypothesis was first described and tested in detail by Aravena et al.[1] Using the Action-Sentence Compatibility Effect (ACE), a task commonly employed to study the relationship between action and language, they tested the effects of performing simultaneous language comprehension and motor tasks on neural and behavioral signatures of movement and language comprehension.[1] They proposed that the two tasks cooperate bi-directionally when compatible, and interfere bi-directionally when incompatible.[1] For example, when the movement implied by the action language stimuli is compatible with the movement being performed by the subject, performance of both tasks was hypothesized to be enhanced.[1] This study provided neural evidence for the bi-directional hypothesis,[1] and the development of the hypothesis is ongoing.
Effects of language comprehension on systems of action
Language comprehension tasks can exert influence over systems of action, at both the neural and the behavioral level. That is, language stimuli influence both electrical activity in sensorimotor areas of the brain and actual movement.
Neural activation
Language stimuli influence electrical activity in sensorimotor areas of the brain that are specific to the bodily association of the words presented. This is referred to as semantic somatotopy: activation of sensorimotor areas specific to the bodily association implied by the word. For example, when processing the meaning of the word “kick,” the regions in the motor and somatosensory cortices that represent the legs become more active.[4][5] Boulenger et al.[5] demonstrated this effect by presenting subjects with action-related language while measuring neural activity using fMRI. Subjects were presented with action sentences associated either with the legs (e.g. “John kicked the object”) or with the arms (e.g. “Jane grasped the object”). The medial region of the motor cortex, known to represent the legs, was more active when subjects were processing leg-related sentences, whereas the lateral region of the motor cortex, known to represent the arms, was more active with arm-related sentences.[5] This body-part-specific increase in activation was exhibited about 3 seconds after presentation of the word, a time window thought to indicate semantic processing.[6] In other words, this activation was associated with subjects comprehending the meaning of the word. The effect held true, and was even intensified, when subjects were presented with idiomatic sentences.[5] Abstract language implying more figurative actions was used, associated either with the legs (e.g. “John kicked the habit”) or the arms (e.g. “Jane grasped the idea”).[5] Increased neural activation of leg motor regions was demonstrated with leg-related idiomatic sentences, whereas arm-related idiomatic sentences were associated with increased activation of arm motor regions.[5] This activation was larger than that produced by more literal sentences (e.g. “John kicked the object”), and was also present in the time window associated with semantic processing.[5]
Action language not only activates body-part-specific areas of the motor cortex, but also influences neural activity associated with movement. This has been demonstrated during an Action-Sentence Compatibility Effect (ACE) task, a common test used to study the relationship between language comprehension and motor behavior.[7] This task requires the subject to perform movements to indicate understanding of a sentence, such as moving to press a button or pressing a button with a specific hand posture, that are either compatible or incompatible with the movement implied by the sentence.[7] For example, pressing a button with an open hand to indicate understanding of the sentence "Jane high-fived Jack" would be considered a compatible movement, as the sentence implies an open-handed posture. Motor potentials (MPs) are event-related potentials (ERPs) stemming from the motor cortex, and are associated with the execution of movement.[8] Enhanced MP amplitudes have been associated with the precision and quickness of movements.[1][8][9] Re-afferent potentials (RAPs) are another form of ERP, used as a marker of sensory feedback[10] and attention.[11] Both MPs and RAPs have been demonstrated to be enhanced during compatible ACE conditions.[1] These results indicate that language can have a facilitatory effect on the excitability of neural sensorimotor systems. This has been referred to as semantic priming,[12] indicating that language primes neural sensorimotor systems, altering excitability and movement.
Movement
The ability of language to influence the neural activity of motor systems also manifests itself behaviorally by altering movement. Semantic priming has been implicated in these behavioral changes, and has been used as evidence for the involvement of the motor system in language comprehension. The Action-Sentence Compatibility Effect (ACE) is indicative of these semantic priming effects. Understanding language that implies action may invoke motor facilitation, or prime the motor system, when the action or posture being performed to indicate language comprehension is compatible with the action or posture implied by the language. Compatible ACE tasks have been shown to lead to shorter reaction times.[1][7][13] This effect has been demonstrated for various types of movement, including hand posture during button pressing,[1] reaching,[7] and manual rotation.[13]
Language stimuli can also prime the motor system simply by describing objects that are commonly manipulated. In a study performed by Masson et al., subjects were presented with sentences that implied non-physical, abstract action with an object (e.g. "John thought about the calculator" or "Jane remembered the thumbtack").[14] After presentation of the language stimuli, subjects were cued to perform either a functional gesture, one typically made when using the object described in the sentence (e.g. poking for calculator sentences), or a volumetric gesture, one indicative of whole-hand posture (e.g. a horizontal grasp for calculator sentences).[14] Target gestures were either compatible or incompatible with the described object, and were cued at two different time points, early and late. Response latencies for compatible functional gestures were significantly lower at both time points, whereas latencies for compatible volumetric gestures were significantly lower only in the late-cue condition.[14] These results indicate that descriptions of abstract interactions with objects automatically (at the early time point) generate motor representations of functional gestures, priming the motor system and increasing response speed.[14] The specificity of the enhanced motor responses to the gesture-object interaction also highlights the importance of the motor system in semantic processing, as the enhancement depended on the meaning of the word.
A study by Olmstead et al.,[15] described in detail elsewhere, demonstrates more concretely the influence that the semantics of action language can have on movement coordination. Briefly, this study investigated the effects of action language on the coordination of rhythmic bimanual hand movements. Subjects were instructed to swing two pendulums, one with each hand, either in-phase (pendulums at the same point in their cycle, phase difference of roughly 0 degrees) or anti-phase (pendulums at opposite points in their cycle, phase difference of roughly 180 degrees).[15] Robust behavioral studies have revealed that these two states, with phase differences of 0 and 180 degrees, are the two stable relative phase states, i.e. the two coordination patterns that produce stable movement.[16] The pendulum-swinging task was performed as subjects judged sentences for plausibility; subjects indicated whether or not each presented sentence made logical sense.[15] Plausible sentences described actions that could be performed by a human using the arms, hands, and/or fingers ("He is swinging the bat"), or actions that could not be performed ("The barn is housing the goat").[15] Implausible sentences used similar action verbs ("He is swinging the hope"). Plausible, performable sentences led to a significant change in the relative phase shift of the bimanual pendulum task.[15] The coordination of the movement was altered by the action language stimuli, as the relative phase that produced stable movement differed significantly from that in the non-performable-sentence and no-language conditions.[15] This development of new stable states has been taken to imply a reorganization of the motor system used to plan and execute the movement,[15] and supports the bi-directional hypothesis by demonstrating an effect of action language on movement.
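The relative phase measure used in such bimanual coordination studies can be sketched computationally. The following Python example is an illustrative sketch only, not the analysis pipeline of the cited study: the function name and the position-velocity normalization are assumptions. It estimates the continuous relative phase between two oscillatory position signals from their phase-plane angles, recovering roughly 0 degrees for in-phase and 180 degrees for anti-phase swinging:

```python
import numpy as np

def relative_phase(x1, x2, dt):
    """Continuous relative phase (degrees) between two oscillatory signals.

    Each signal's phase is taken as its angle in the normalized
    position-velocity (phase) plane; the relative phase is the
    difference, wrapped to (-180, 180].
    """
    def phase(x):
        v = np.gradient(x, dt)                 # velocity by finite differences
        x0 = (x - x.mean()) / np.ptp(x)        # normalize amplitude
        v0 = v / (np.abs(v).max() + 1e-12)     # normalize velocity
        return np.arctan2(-v0, x0)             # phase angle at each sample
    dphi = np.degrees(phase(x1) - phase(x2))
    return (dphi + 180.0) % 360.0 - 180.0      # wrap to (-180, 180]

# Two 1 Hz "pendulums" swinging in-phase vs. anti-phase for 10 s
t = np.arange(0, 10, 0.01)
inphase = relative_phase(np.sin(2 * np.pi * t), np.sin(2 * np.pi * t), 0.01)
antiphase = relative_phase(np.sin(2 * np.pi * t), np.sin(2 * np.pi * t + np.pi), 0.01)

print(round(float(np.mean(np.abs(inphase)))))    # prints 0
print(round(float(np.mean(np.abs(antiphase)))))  # prints 180
```

A shift of the stable relative phase away from these attractor values, as reported for performable action sentences, would appear here as a systematic offset in the wrapped phase difference.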
Effects of systems of action on language comprehension
The bi-directional hypothesis of action and language proposes that altering the activity of motor systems, either through altered neural activity or actual movement, influences language comprehension. Neural activity in specific areas of the brain can be altered using transcranial magnetic stimulation (TMS), or by studying patients with neuropathologies leading to specific sensory and/or motor deficits. Movement is also used to alter the activity of neural motor systems, increasing overall excitability of motor and pre-motor areas.
Neural activation
Altered neural activity of motor systems has been demonstrated to influence language comprehension. One study demonstrating this effect was performed by Pulvermüller et al.[17] TMS was used to increase the excitability of either the leg region or the arm region of the motor cortex.[17] The authors stimulated the left motor cortex, known to be more closely involved in language processing in right-handed individuals, and the right motor cortex, and also applied a sham stimulation in which stimulation was prevented by a plastic block placed between the coil and the skull.[17] During the stimulation protocols, subjects were shown 50 arm words, 50 leg words, 50 distractor words (no bodily relation), and 100 pseudo-words (not real words).[17] Subjects were asked to indicate recognition of a meaningful word by moving their lips, and response times were measured.[17] Stimulation of the leg region of the left motor cortex significantly reduced response times for recognition of leg words as compared to arm words, whereas the reverse was true for stimulation of the arm region.[17] Stimulation of the right motor cortex, as well as sham stimulation, did not produce these effects.[17] Therefore, somatotopically specific stimulation of the left motor cortex facilitated word comprehension in a body-part-specific manner, where stimulation of the leg and arm regions led to enhanced comprehension of leg and arm words, respectively.[17] This study has been used as evidence for the bi-directional hypothesis of language and action, as it shows that manipulating motor cortex activity alters language comprehension in a semantically specific manner.[17]
A similar experiment has been performed on the articulatory motor cortex, the mouth and lip regions of the motor cortex used in the production of words.[18] Two categories of words were used as language stimuli: words that involve the lips for production (e.g. "pool") or the tongue (e.g. "tool").[18] Subjects listened to the words, were shown pairs of pictures, and were asked to indicate with a button press which picture matched the word they heard.[18] TMS was applied prior to presentation of the language stimuli to selectively facilitate either the lip or the tongue region of the left motor cortex; these two TMS conditions were compared to a control condition in which TMS was not applied.[18] Stimulation of the lip region of the motor cortex led to significantly decreased response times for lip words as compared to tongue words.[18] In addition, during recognition of tongue words, reduced reaction times were seen with tongue TMS as compared to lip TMS and no TMS.[18] Although the same effect was not seen with lip words, the authors attribute this to the complexity of tongue movements as opposed to lip movements, and the increased difficulty of tongue words.[18] Overall, this study demonstrates that activity in the articulatory motor cortex influences the comprehension of single spoken words, and highlights the importance of the motor cortex in speech comprehension.[18]
Lesions of sensory and motor areas have also been studied to elucidate the effects of sensorimotor systems on language comprehension. One example is the patient JR, who has a lesion in areas of the auditory association cortex implicated in processing auditory information.[19] This patient shows significant impairments in conceptual and perceptual processing of sound-related language and objects.[19] For example, processing the meaning of words describing sound-related objects (e.g., "bell") was significantly impaired in JR as compared to non-sound-related objects (e.g., "armchair").[19] These data suggest that damage to sensory regions involved in processing auditory information specifically impairs processing of sound-related conceptual information,[19] highlighting the necessity of sensory systems for language comprehension.
Movement
Movement has been shown to influence language comprehension. This has been demonstrated by priming motor areas with movement, increasing the excitability of motor and pre-motor areas associated with the body part being moved.[20] It has been demonstrated that motor engagement of a specific body part decreases neural activity in language processing areas when processing words related to that body part.[20] This decreased neural activity is a feature of semantic priming, and suggests that activation of specific motor areas through movement can facilitate language comprehension in a semantically-dependent manner.[20] An interference effect has also been demonstrated. During incompatible ACE conditions, neural signatures of language comprehension have been shown to be inhibited.[1] Combined, these pieces of evidence have been used to support a semantic role of the motor system.
Movement can also inhibit language comprehension tasks, particularly tasks of verbal working memory.[21] When subjects were asked to memorize and verbally recall four-word sequences of either arm or leg action words, performing complex, rhythmic movements after presentation of the word sequences was demonstrated to interfere with memory performance.[21] This performance deficit was body-part specific: movement of the legs impaired recall of leg words, and movement of the arms impaired recall of arm words.[21] These data indicate that sensorimotor systems exert cortically specific "inhibitory causal effects" on memory for action words,[21] as the impairment was specific to the motor engagement and the bodily association of the words.
Organization of neural substrates
Relating cognitive functions to brain structures is the task of cognitive neuroscience, a field that attempts to map cognitive processes, such as language comprehension, onto neural activation of specific brain structures. The bi-directional hypothesis of language and action requires that action and language processes have overlapping brain structures, or shared neural substrates, thereby necessitating motor areas for language comprehension. The neural substrates of embodied cognition are often studied using cognitive tasks of object recognition, action recognition, working memory, and language comprehension. These networks have been elucidated with behavioral, computational, and imaging studies, but their exact organization remains under investigation.
Circuit organization
It has been proposed that the control of movement is organized hierarchically: movements are not produced by commanding single neurons individually, but are represented at a gross, more functional level.[22] A similar concept has been applied to the control of cognition, resulting in the theory of cognitive circuits.[23] This theory proposes that there are strongly connected functional units of neurons in the brain that act coherently during cognitive tasks.[23] These functional units, or "thought circuits," have been referred to as the "building blocks of cognition".[23] Thought circuits are believed to have originally formed from basic anatomical connections that were strengthened by correlated activity through Hebbian learning and plasticity.[23] Formation of these neural networks has been demonstrated with computational models using known anatomical connections and Hebbian learning principles.[24] For example, sensory stimulation through interaction with an object activates a distributed network of neurons in the cortex. Repeated activation of these neurons, through Hebbian plasticity, may strengthen their connections and form a circuit.[23][25] This sensory circuit may then be activated during the perception of known objects.[23]
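The circuit-formation mechanism described above can be illustrated with a toy Hebbian simulation. The following Python sketch is not a reproduction of the cited neurocomputational models; the network size, learning rate, decay, and stimulus pattern are arbitrary assumptions. It shows how repeated co-activation of a subset of units under a simple Hebbian rule with weight decay leaves that subset far more strongly interconnected than the rest of the network:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20                        # units in a toy cortical sheet
W = np.zeros((n, n))          # connection weights
assembly = np.arange(8)       # units repeatedly co-activated by a stimulus

eta, decay = 0.1, 0.02        # learning rate and passive weight decay
for _ in range(50):
    x = np.zeros(n)
    x[assembly] = 1.0                      # stimulus drives the assembly
    x += rng.random(n) < 0.1               # sparse random background activity
    x = np.clip(x, 0.0, 1.0)
    W += eta * np.outer(x, x)              # Hebbian: "fire together, wire together"
    W *= 1.0 - decay                       # decay keeps weights bounded
np.fill_diagonal(W, 0.0)                   # ignore self-connections

within = W[np.ix_(assembly, assembly)].mean()          # inside the circuit
outside = W[np.ix_(assembly, np.arange(8, n))].mean()  # assembly -> rest
print(within > outside)   # prints True: co-activated units wire into a circuit
```

The strongly connected block that emerges plays the role of a "kernel," while the weakly coupled remaining units resemble the peripheral "halo" discussed below.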
This same concept has been applied to action and language, as understanding the meaning of action words requires an understanding of the action itself. During language and motor-skill development, one likely learns to associate an action word with an action or a sensation.[2][23] This action or sensation, and the correlated sensorimotor areas involved, are then incorporated into the neural representation of the concept.[23][24] This leads to semantic somatotopy, the activation of motor areas related to the meaning and bodily association of action language.[4][5] These networks may be organized into "kernels," areas highly activated by language comprehension tasks, and "halos," brain areas in the periphery of the network that show slightly increased activation.[23][24] It has been hypothesized that language comprehension is housed in a left perisylvian neuronal circuit, forming the "kernel," while sensorimotor regions are peripherally activated during semantic processing of action language, forming the "halo".[23][24]
Many studies demonstrating a role of the motor system in semantic processing of action language have been used as evidence for a shared neural network between action and language comprehension processes.[1][5][7][12][13][14][15][17][18][19][21] For example, facilitated activity in language comprehension areas (evidence of semantic priming) during movement of a body part associated with an action word has been used as evidence for this shared network.[20] A more specific method for identifying whether a brain area is necessary for a cognitive task is to demonstrate impaired performance of that task following a functional change to the area of interest.[26] A functional change may involve a lesion, altered excitability through stimulation, or utilization of the area for another task.[21] According to this theory, there is only a finite amount of neural real estate available for each task: if two tasks share a neural network, they compete for the associated neural substrates, and the performance of each is inhibited when they are performed simultaneously.[26] Using this logic, proponents of the bi-directional hypothesis postulated that verbal working memory for action words would be impaired by movement of the concordant body part.[21] This has been demonstrated by the selective impairment of memorization of arm and leg words when coupled with arm and leg movements, respectively.[21] The finding implies that the neural network for verbal working memory is specifically tied to the motor systems associated with the body part implied by the word.[21][23] This semantic somatotopy has been suggested to provide evidence that action language shares a neural network with sensorimotor systems, thereby supporting the bi-directional hypothesis of language and action.
See also
References
- 1. Template:Cite journal
- 2. Template:Cite book
- 3. Template:Cite journal
- 4. Template:Cite journal
- 5. Template:Cite journal
- 6. Humphries, C., Binder, J. R., Medler, D. A., & Liebenthal, E. (2007). "Time-course of semantic processes during sentence comprehension: an fMRI study." NeuroImage, 36: 924–932.
- 7. Template:Cite journal
- 8. Template:Cite journal
- 9. Template:Cite journal
- 10. Template:Cite journal
- 11. Template:Cite journal
- 12. Template:Cite journal
- 13. Template:Cite journal
- 14. Template:Cite journal
- 15. Template:Cite journal
- 16. Template:Cite book
- 17. Template:Cite journal
- 18. Template:Cite journal
- 19. Template:Cite journal
- 20. Template:Cite journal
- 21. Template:Cite journal
- 22. Template:Cite journal
- 23. Template:Cite journal
- 24. Pulvermüller, F., & Garagnani, M. (2014). "From sensorimotor learning to memory cells in prefrontal and temporal association cortex: a neurocomputational study of disembodiment." Cortex, 57: 1–21.
- 25. Doursat, R., & Bienenstock, E. (2006). "Neocortical self-structuration as a basis for learning." 5th International Conference on Development and Learning (ICDL 2006).
- 26. Shallice, T. (1988). From Neuropsychology to Mental Structure. Cambridge University Press.
External links
- Template:TED talk
- A Brief Guide to Grounded Cognition
- Brain Language Laboratory
- Cambridge Neuroscience
- Research for this Wikipedia entry was conducted as part of a Locomotion Neuromechanics course (APPH 6232) offered in the School of Biological Sciences at Georgia Tech