EmoSign-GH: A multimodal dataset for understanding emotions in Ghanaian Sign Language

Abstract

Unlike spoken languages, where the use of prosodic features to convey emotion is well studied, indicators of emotion in sign language remain poorly understood, creating communication barriers in critical settings. Sign languages present unique challenges, as facial expressions and hand movements simultaneously serve both grammatical and emotional functions. To address this gap, we introduce EmoSign-GH, the first sign video dataset containing sentiment and emotion labels for 200 Ghanaian Sign Language (GSL) videos. We also collect open-ended descriptions of emotion cues. Annotations were done by 3 Deaf GSL signers with professional interpretation experience. Alongside the annotations, we include baseline models for sentiment and emotion classification. This dataset not only addresses a critical gap in existing sign language research but also establishes a new benchmark for understanding model capabilities in multimodal emotion recognition for sign languages.

Download paper

More about the publication

Empowering lives, inspiring innovation.

Ready to collaborate? Whether you are a researcher, a student, or an industry partner, join us in turning quantum theory into real-world impact.
