Cornelia Ebert

Speech-accompanying gestures in formal semantics and pragmatics

The project investigates the role of gestures in communication from a formal semantic and pragmatic point of view. It examines what the potential meaning contribution of a gesture is, how that meaning is conveyed, and how this information combines with that of the speech signal. Gestures have not yet been investigated systematically in formal semantics, or in theoretical linguistics more generally, even though they are a subject of investigation in other fields such as psychology, neuroscience, semiotics, and robotics.

In 2012, I was awarded the Caroline von Humboldt Prize for this highly interdisciplinary project, which bridges the gap between traditional fields of gesture research and theoretical linguistics. The prize included a scholarship to visit Humboldt-University as a scientist in residence for a two-month period, during which the groundwork for my present research and output was laid. With the prize money, I was also able to pursue my project goals by initiating several workshops and conducting experiments and corpus work on gestures.

I would like to thank the Humboldt-University of Berlin for choosing me as one of the award winners and thus enabling me to pursue my research on gestures in formal semantics and pragmatics. This work also led to our current project The Pragmatic Status of Iconic Meaning in Spoken Communication: Gestures, Ideophones, Prosodic Modulations (PSIMS) at Leibniz-ZAS, co-supervised by Manfred Krifka, Susanne Fuchs, and me.

Project Description

The project aims to develop universal formal semantic models of multimodal meaning contributions that can deal with speech and gesture input. The developed models are evaluated via experiments and detailed corpus work. A thorough inspection of gestures with proven and reliable semantic tools will not only bring new results for traditional gesture studies, but will also enable formal linguists to test all kinds of semantic phenomena, theoretical claims, and well-established theories against an entirely new empirical domain. The integration of gesture information into established formal-linguistic frameworks will be the test and proving ground for these theories. Furthermore, a semantic investigation of gestures will provide new insights into a wealth of as yet unresolved problems within formal semantics, such as issues around 'multidimensional meanings' or information structure. I am convinced that it also has the potential to make a serious impact on long-standing linguistic-philosophical debates, e.g. about the function of demonstration and the interpretation of demonstratives.
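To give a flavour of what a multidimensional treatment might look like (a minimal illustrative sketch, not a claim about the project's final analysis): in a Potts-style two-dimensional format, the contribution of a co-speech gesture can tentatively be kept apart from the at-issue spoken content, e.g. for an utterance of "Peter brought a bottle" accompanied by a gesture depicting a round shape:

\[
\Bigl\langle\; \underbrace{\exists x\,[\mathrm{bottle}(x) \wedge \mathrm{bring}(\mathrm{peter},x)]}_{\text{at-issue: speech}}\;,\;\; \underbrace{\mathrm{round}(x)}_{\text{non-at-issue: gesture}} \;\Bigr\rangle
\]

How exactly the gestural dimension accesses the variable introduced in the speech dimension is itself one of the open questions alluded to above.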

By broadening the traditional focus of formal linguistics on spoken language to include gesture, this research directly raises the question of what counts as language and what counts as extra-linguistic context. An answer to this question has a bearing on the overarching theory of language. While language is traditionally seen as involving conventionalised arbitrary (non-iconic) signs and a strict grammar, these characteristics do not hold for gestures, which are for the most part non-conventionalised, non-combinatorial, and often iconic. On the other hand, language systems that are based on visual rather than auditory input, such as sign languages, are also known to meet the just-mentioned requirements only to some extent. While sign languages have a fixed grammar and their signs are conventionalised, these signs are not always arbitrary, but are often highly iconic. And although sign languages are by no means as well researched as spoken languages, there has lately been growing interest in them, also from a formal-semantic perspective and with respect to the question of what role iconicity plays for them and for language in general. This role of iconicity vs. the arbitrariness of signs in language will be further elucidated by also taking gestures into account, i.e. visual input in an otherwise auditory-based system.

Workshops

  • Workshop Week of Signs and Gestures, June 12-14, 2017, Stuttgart (with Daniel Hole and Fabian Bross)
  • Workshop Perspectives in Gesture & Sign Language Research, September 22-23, 2015, Stuttgart (with Daniel Hole and Fabian Bross)
  • Workshop Embodied meaning goes public -- gestures, signs, and other visible linguistic effects of simulation processes, December 5-6, 2014, Stuttgart (with Daniel Hole and Fabian Bross)
  • Workshop Demonstration and Demonstratives, April 11-12, 2014, Stuttgart (with Daniel Hole)
  • Workshop Interface Issues of Gestures and Verbal Semantics, March 13, 2013, Potsdam (with Hannes Rieser)