Discover the most talked about and latest scientific content & concepts.

Concept: Gesture


Cross-species comparison of great ape gesturing has so far been limited to the physical form of gestures in the repertoire, without questioning whether gestures share the same meanings. Researchers have recently catalogued the meanings of chimpanzee gestures, but little is known about the gesture meanings of our other closest living relative, the bonobo. The bonobo gestural repertoire overlaps by approximately 90% with that of the chimpanzee, but such overlap might not extend to meanings. Here, we first determine the meanings of bonobo gestures by analysing the outcomes of gesturing that apparently satisfy the signaller. Around half of bonobo gestures have a single meaning, while half are more ambiguous. Moreover, all but 1 gesture type have distinct meanings: their distributions of intended meanings differ from the average distribution across all gesture types. We then employ a randomisation procedure in a novel way to test the likelihood that the observed between-species overlap in the assignment of meanings to gestures would arise by chance under a set of different constraints. We compare a matrix of bonobo gesture meanings with the corresponding matrix for chimpanzees against 10,000 randomised matrices, each constrained to the original data at 1 of 4 different levels. We find that the similarity between the 2 species is much greater than would be expected by chance. Bonobos and chimpanzees share not only the physical form of the gestures but also many gesture meanings.
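The randomisation procedure described above can be sketched as a permutation test. In the toy version below, the gesture-by-meaning matrices and the single constraint used (permuting which meaning profile attaches to which gesture type) are illustrative assumptions, not the study's actual data or its 4 constraint levels:

```python
import random

# Hypothetical gesture-by-meaning matrices (rows: gesture types,
# columns: meanings); a 1 means the gesture is used with that meaning.
# The real study used observed bonobo and chimpanzee assignments.
bonobo = [
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]
chimp = [
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
]

def similarity(a, b):
    """Count matrix cells where the two species agree."""
    return sum(x == y for ra, rb in zip(a, b) for x, y in zip(ra, rb))

observed = similarity(bonobo, chimp)

# Null model: shuffle which meaning profile attaches to which gesture
# type (one illustrative constraint; the study tested 4 levels).
rng = random.Random(42)
n_iter = 10_000
count_ge = 0
for _ in range(n_iter):
    shuffled = bonobo[:]       # copy the row list
    rng.shuffle(shuffled)      # permute gesture-to-profile assignment
    if similarity(shuffled, chimp) >= observed:
        count_ge += 1

# Fraction of random matrices at least as similar as the observed one
p_value = count_ge / n_iter
```

A small p_value indicates that the observed between-species similarity is unlikely to arise by chance under the chosen constraint.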

Concepts: Human, Hominidae, Chimpanzee, Ape, Gesture, Bonobo


Referential communication occurs when a sender elaborates its gestures to direct the attention of a recipient to its role in pursuit of the desired goal, e.g. by pointing at or showing an object, thereby informing the recipient what it wants. If the gesture is successful, the sender and the recipient focus their attention simultaneously on a third entity, the target. Here we investigated the ability of domestic horses (Equus caballus) to communicate referentially with a human observer about the location of a desired target, a bucket of food out of reach. In order to test six operational criteria of referential communication, we manipulated the recipient's (the experimenter's) attentional state in four experimental conditions: frontally oriented, backward oriented, walking away from the arena, and frontally oriented with other helpers present in the arena. The rate of gaze alternation was higher in the frontally oriented condition than in all the others. The horses appeared to use both indicative (pointing) and non-indicative (nods and shakes) head gestures in the relevant test conditions. Horses also elaborated their communication by switching from a visual to a tactile signal and demonstrated perseverance in their communication. The results of the tests revealed that horses used referential gestures to manipulate the attention of a human recipient so as to obtain an unreachable resource. These are the first such findings in an ungulate species.

Concepts: Horse, Gesture, Wild horse, Odd-toed ungulate, Equus, Equidae, Donkey, Domestication of the horse


Co-speech gestures have been shown to interact with working memory (WM). However, no study has investigated whether there are individual differences in the effect of gestures on WM. Combining a novel gesture/no-gesture task and an operation span task, we examined the differences in WM accuracy between individuals who gestured and individuals who did not gesture in relation to their WM capacity. Our results showed individual differences in the gesture effect on WM. Specifically, only individuals with low WM capacity showed reduced WM accuracy when they did not gesture. Individuals with low WM capacity who did gesture, as well as high-capacity individuals (irrespective of whether they gestured or not), did not show the effect. Our findings show that the interaction between co-speech gestures and WM is affected by an individual's WM capacity.

Concepts: Gesture, Gestures


Co-speech gestures have a close semantic relationship to speech in adult conversation. In typically developing children, co-speech gestures that give additional information to speech facilitate the emergence of multi-word speech. Individuals with Autism Spectrum Disorder (ASD) are known to have difficulty integrating audio-visual information, which may affect development of the speech-gesture system. A longitudinal observational study was conducted with four children with ASD, aged 2;4 to 3;5 years. Participants were video-recorded for 20 min every 2 weeks while attending an intervention programme. Recording continued for up to 8 months, thus affording a rich analysis of gestural practices from pre-verbal to multi-word speech across the group. All participants combined gesture with either speech or vocalisations. Co-speech gestures providing additional information to speech were either absent or rare. Findings suggest that children with ASD do not make use of the facilitating communicative effects of gesture in the same way as typically developing children.

Concepts: Scientific method, Autism, Pervasive developmental disorder, Asperger syndrome, Autism spectrum, PDD-NOS, Gesture, Gestures


Referential and iconic gesturing provide a means to flexibly and intentionally share information about specific entities, locations, or goals. The extent to which nonhuman primates use such gestures is therefore of special interest for understanding the evolution of human language. Here, we describe novel observations of wild female bonobos (Pan paniscus) using referential and potentially iconic gestures to initiate genito-genital (GG) rubbing, which serves important functions in reducing social tension and facilitating cooperation. We collected data from a habituated community of bonobos at Luikotale, DRC, and analysed n = 138 independent gesture bouts made by n = 11 females. Gestures were coded in real time or from video. In addition to meeting the criteria for intentionality, in form and function these gestures resemble pointing and pantomime, two hallmarks of human communication, in the ways in which they indicated the relevant body part or action involved in the goal of GG rubbing. Moreover, the gestures led to GG rubbing in 83.3% of gesture bouts, which in turn increased tolerance in feeding contexts between the participants. We discuss how biologically relevant contexts in which individuals are motivated to cooperate may facilitate the emergence of language precursors to enhance communication in wild apes.

Concepts: Human, Female, Primate, Hominidae, Gesture, Sign language, David McNeill, Bonobo


Eavesdropping involves the acquisition of information from third-party interactions, and can serve to indirectly attribute reputation to individuals. There is evidence of eavesdropping in dogs, indicating that they can develop a preference for people based on their cooperativeness towards others. In this study, we tested dogs' eavesdropping abilities one step further. In a first experiment, dogs could choose between cooperative demonstrators (the donors) who always gave food to an approaching third person (the beggar); here, the only difference between donors was whether they received positive or negative reactions from the beggar (through verbal and gestural means). Results showed that dogs preferentially approached the donor who had received positive reactions from the beggar. By contrast, two further conditions showed that neither the beggar's body gestures nor the verbal component of the interaction was sufficient on its own to affect the dogs' preferences. We also ran two further experiments to test for the possibility that dogs' choices were driven by local enhancement. When the donors switched places before the choice, dogs chose at random. Similarly, in a nonsocial condition in which donors were replaced by platforms, subjects chose at chance levels. We conclude that dogs' nonrandom choices in the present protocol relied on the simultaneous presence of multiple cues, such as the place where donors stood and several features of the beggar's behavior (gestural and verbal reactions, and eating behavior). Nonetheless, we did not find conclusive evidence that dogs discriminated the donors by their physical features, which is a prerequisite of reputation attribution.

Concepts: Person, Interaction, Experiment, Choice, Preference, Gesture, Affect display, Gestures


In animal communication, signallers and recipients are typically different: each signal is given by one subset of individuals (members of the same age, sex, or social rank) and directed towards another. However, there is scope for signaller-recipient interchangeability in systems where most signals are potentially relevant to all age-sex groups, such as great ape gestural communication. In this study of wild bonobos (Pan paniscus), we aimed to discover whether their gestural communication is indeed a mutually understood communicative repertoire, in which all individuals can act as both signallers and recipients. While past studies have only examined the expressed repertoire, the set of gesture types that a signaller deploys, we also examined the understood repertoire, the set of gestures to which a recipient reacts in a way that satisfies the signaller. We found that most of the gestural repertoire was both expressed and understood by all age and sex groups, with few exceptions, suggesting that during their lifetimes all individuals may use and understand all gesture types. Indeed, as the number of overall gesture instances increased, so did the proportion of individuals estimated to both express and understand a gesture type. We compared the community repertoire of bonobos to that of chimpanzees, finding an 88% overlap. Observed differences are consistent with sampling effects generated by the species' different social systems, and it is thus possible that the repertoire of gesture types available to Pan is determined biologically.
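A repertoire-overlap figure like the 88% above can be illustrated as a simple set computation; the gesture labels below and the overlap definition used (share of the smaller repertoire that is shared) are assumptions for illustration, not the study's actual repertoires or metric:

```python
# Hypothetical repertoires as sets of gesture-type labels; the real
# study compared full bonobo and chimpanzee community repertoires.
bonobo_rep = {"arm raise", "beckon", "big loud scratch", "stomp", "touch"}
chimp_rep = {"arm raise", "beckon", "big loud scratch", "stomp", "shake"}

# Gesture types present in both repertoires
shared = bonobo_rep & chimp_rep

# Overlap as the share of the smaller repertoire that is shared
# (one of several possible definitions; an assumption here).
overlap = len(shared) / min(len(bonobo_rep), len(chimp_rep))
```

With these toy sets, 4 of 5 gesture types are shared, giving an overlap of 0.8.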

Concepts: Human, Sociology, Hominidae, Chimpanzee, Ape, Gesture, Bonobo, Common Chimpanzee


It is well established that great apes communicate via intentionally produced, elaborate and flexible gestural means. Yet relatively little is known about the most fundamental steps into this communicative endeavour, the communicative exchanges of mother-infant dyads and gestural acquisition, perhaps because the majority of studies have concerned captive groups or only single communities in the wild. Here, we report the first systematic, quantitative comparison of communicative interactions of mother-infant dyads in two communities of wild chimpanzees, focusing on a single communicative function: initiation of carries for joint travel. Over 156 days of observation, we recorded 442 actions, 599 cases of intentional gesture production, 51 multi-modal combinations and 80 vocalisations in the Kanyawara community, Kibale National Park, Uganda, and the Taï South community, Taï National Park, Côte d'Ivoire. Our results showed that (1) mothers and infants differed in the signal frequency and modality employed to initiate joint travel, (2) concordance rates of mothers' gestural production were relatively low both within and between communities, (3) infant communicative development is characterised by a shift from mainly vocal to gestural means, and (4) chimpanzee mothers adjusted their signals to the communicative level of their infants. Since neither genetic channelling nor ontogenetic ritualization explains our results satisfactorily, we propose a revised theory of gestural acquisition, social negotiation, in which gestures are the output of social shaping, shared understanding and mutual construction in real time by both interactants.

Concepts: Human, Hominidae, Chimpanzee, Gorilla, Ape, Gesture, Sign language, David McNeill


A novel hand-biometric authentication method is proposed, based on measurements of the user's stationary hand gestures drawn from hand sign language. The gesture measurements can be acquired sequentially with a low-cost video camera, and the hand signs carry an additional level of contextual information that can be exploited for biometric authentication. As an analogue: instead of typing the password 'iloveu' as text, which is relatively vulnerable over a communication network, a signer can encode a biometric password as the sequence of hand signs 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and passed to a classification model that decides whether the signer is who he claims to be, by examining his hand shape and the postures used in making the signs. The underlying assumption is that everybody has slight but unique behavioural characteristics in sign language, as well as a distinctive hand shape. Simple and efficient image processing algorithms are used for hand sign recognition, including intensity profiling, colour histograms and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulation investigating the efficacy of this novel biometric authentication model shows up to 93.75% recognition accuracy.
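One ingredient named above, the colour histogram, can be sketched together with a simple nearest-neighbour matcher. The tiny synthetic frames and enrolled templates below are hypothetical stand-ins; the paper's actual pipeline (intensity profiling, dimensionality analysis, its specific classifiers) is not reproduced here:

```python
import math

# Hypothetical stand-ins for hand-sign frames: tiny RGB images as
# nested lists of (r, g, b) tuples. Real input would come from a
# low-cost video camera.
def color_histogram(image, bins=4):
    """Coarse per-channel colour histogram, normalised to sum to 1."""
    hist = [0.0] * (bins * 3)
    n = 0
    for row in image:
        for (r, g, b) in row:
            for c, v in enumerate((r, g, b)):
                hist[c * bins + min(v * bins // 256, bins - 1)] += 1
            n += 1
    return [h / (3 * n) for h in hist]

def nearest_sign(query_hist, enrolled):
    """1-nearest-neighbour match against enrolled (label, hist) pairs."""
    return min(enrolled, key=lambda lh: math.dist(query_hist, lh[1]))[0]

# Enrol one template per letter of a signed password (hypothetical data).
skin = [[(200, 150, 120)] * 4] * 4   # mostly skin-toned frame
dark = [[(30, 30, 40)] * 4] * 4      # mostly dark background frame
enrolled = [("i", color_histogram(skin)), ("l", color_histogram(dark))]

# A new frame whose colours fall close to the 'i' template
query = [[(198, 148, 118)] * 4] * 4
matched = nearest_sign(color_histogram(query), enrolled)
```

A real system would combine such appearance features with the behavioural cues mentioned in the abstract before accepting the claimed identity.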

Concepts: Algorithm, Language, Computer graphics, Machine learning, Gesture, Sign language, Computer vision, Gestures


Research suggests that speech-accompanying gestures influence cognitive processes, but it is not clear whether the gestural benefit is specific to the gesturing hand. Two experiments tested the "(right/left) hand-specificity" hypothesis for self-oriented functions of gestures: gestures with a particular hand enhance cognitive processes involving the hemisphere contralateral to the gesturing hand. Specifically, we tested whether left-hand gestures enhance metaphor explanation, which involves right-hemispheric processing. In Experiment 1, right-handers explained metaphorical phrases (e.g., "to spill the beans," where beans represent pieces of information). Participants kept one hand (right or left) still while they were allowed to gesture spontaneously (or not) with their other, free hand. Metaphor explanations were better when participants chose to gesture when their left hand was free than when they did not. An analogous effect of gesturing was not found when their right hand was free. In Experiment 2, different right-handers performed the same metaphor explanation task but, unlike Experiment 1, they were encouraged to gesture with their left or right hand or not to gesture at all. Metaphor explanations were better when participants gestured with their left hand than when they did not gesture, but the right-hand gesture condition did not significantly differ from the no-gesture condition. Furthermore, we measured participants' mouth asymmetry during additional verbal tasks to determine individual differences in the degree of right-hemispheric involvement in speech production. Left-over-right-side mouth dominance, indicating stronger right-hemispheric involvement, positively correlated with the left-over-right-hand gestural benefit on metaphor explanation. These converging findings supported the "hand-specificity" hypothesis.

Concepts: Hand, Gesture, Sign language, David McNeill, Gestures