What is it about?
This study utilises a machine learning (ML) method known as Association Rules Mining (ARM) to produce a statistical model that can serve both as a source of linguistic insight and as a resource for future sign language processing approaches. In doing so, the research explores emerging patterns of non-manual articulation in relation to grammatical classes in Irish Sign Language (ISL). Specifically, it examines correlations between head movement, body movement, eyebrows, eye gaze, eye aperture, and cheek movement on the one hand, and the grammatical classes listed in the Auslan corpus annotation guidelines (Johnston 2019) on the other.
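To make the idea of Association Rules Mining concrete, the sketch below mines simple "if antecedent then consequent" rules from co-occurrence data using the standard support and confidence measures. The transactions, feature labels, and thresholds are hypothetical illustrations, not data or parameters from the study itself; each toy transaction stands in for the set of annotations observed on one sign token.

```python
from itertools import combinations

# Toy transactions: each set holds the annotations observed on one sign
# token. Labels (brow_raise, interrogative, ...) are hypothetical examples,
# not values from the ISL corpus.
transactions = [
    {"brow_raise", "head_tilt", "interrogative"},
    {"brow_raise", "interrogative"},
    {"brow_raise", "head_tilt", "interrogative"},
    {"eye_squint", "verb"},
    {"brow_raise", "noun"},
    {"head_tilt", "interrogative"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def mine_rules(transactions, min_support=0.3, min_confidence=0.7):
    """Enumerate rules A -> B over item pairs meeting both thresholds.

    support(A -> B)    = support({A, B})
    confidence(A -> B) = support({A, B}) / support({A})
    """
    items = set().union(*transactions)
    rules = []
    for a, b in combinations(sorted(items), 2):
        sup = support({a, b}, transactions)
        if sup < min_support:
            continue
        for lhs, rhs in ((a, b), (b, a)):
            conf = sup / support({lhs}, transactions)
            if conf >= min_confidence:
                rules.append((lhs, rhs, sup, conf))
    return rules

for lhs, rhs, sup, conf in mine_rules(transactions):
    print(f"{lhs} -> {rhs}  support={sup:.2f}  confidence={conf:.2f}")
```

On this toy data the miner surfaces rules such as head_tilt -> interrogative, i.e. a non-manual feature predicting a grammatical class: exactly the kind of pattern the study extracts, at corpus scale, from ISL annotations.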
Why is it important?
In recent years, virtual assistants and voice user interfaces have become an everyday part of modern living. Invisible to the user are the artificial intelligence and natural language processing technologies, the vast datasets, and the linguistic insights that underpin such tools. These technologies have chiefly targeted widely used spoken languages, leaving sign language users at a disadvantage. One important reason why sign languages remain unsupported is that the underlying technologies require a comprehensive description of the language, and, owing to the complex multimodal nature of sign languages, no description suitable for computational processing yet exists. As a result, sign language users do not enjoy the same level of access to technology that spoken language users do.
Read the Original
This page is a summary of: Exploiting Association Rules Mining to inform the use of non-manual features in sign language processing, Sign Language & Linguistics, January 2025, John Benjamins, DOI: 10.1075/sll.00092.smi.