Chirolens: Translating Hand Gestures in Museum Paintings

Chirolens display of gesture translations in Rubens, The Miracles of St. Francis Xavier (detail), 1617-1618, oil on canvas, Kunsthistorisches Museum, Vienna (photo: CReA Lab).

FWF Project (2022-2025)

Temenuzhka Dimova

 

The Chirolens project investigates the perception and understanding of hand gestures in museum paintings. In a collaboration between the Lab for Cognitive Research in Art History (CReA Lab) and the Vienna Research Center for Visual Computing (VRVis), we developed an Augmented Reality (AR) application that displays verbal translations of depicted gestures directly on the surface of exhibited artworks. The goal is to make the narrative speech of the depicted characters tangible, creating a new form of mediation and experience for museum visitors.

The painted hands form a codified visual language that is difficult to describe in a separate medium. The immersive potential of AR fits perfectly with the dynamics of this language and offers a unique and insightful educational perspective. The translations of hand gestures are established by combining iconographic codifications, quotations from historical sources, and contextual meanings. The goal of Chirolens is to offer concise sentences that deliver contextual information without overwhelming the viewer with details.

Most hand gestures in painting are codified and have a long iconographic history. Yet visitors often overlook this visual language, because interpreting it correctly requires historical knowledge. Chirolens opens direct glimpses into these contextualized pictorial codifications by displaying a translation of each chirogram directly in front of the artwork. The user wears an AR headset that projects the translations onto the pictorial surface: for a selection of paintings, the gestures of the depicted figures are annotated in advance, and the device displays short sentences near the corresponding hands, appearing as contextualized “subtitles.” Designed to be discreet and unobtrusive, the device can be used alongside other mediation tools, and may even spark visitors’ curiosity and encourage deeper engagement with labels and wall texts.

The Chirolens app, equipped with a mobile eye tracking function, will be tested with visitors at the Kunsthistorisches Museum in Vienna to examine whether and how a stronger consideration of gestures influences art perception. Building on the core competencies of the CReA Lab, including mobile eye tracking in museums, studies of visiting practices, research on art perception among Deaf sign language users, and the history of hand gestures in painting, the project aims to inspire further developments at the intersection of AR technologies and museum education.

 

Funded by the Austrian Science Fund (FWF), ESP 37-G