Historically, sign language technologies have been developed by hearing scientists with little knowledge of the intricacies of sign languages. These projects also generally have not included input from Deaf researchers or community representatives. This has led to unusable technologies, such as ‘sign language gloves.’ Dublin City University is co-ordinating a sensitive and inclusive project to tackle this issue.
The SignON project team is researching and developing a smartphone app that uses sign language machine translation (SLMT) to add to the existing ways deaf, hard of hearing and hearing people can communicate with each other—across several signed and spoken languages. The objective of this research project is the fair, unbiased and inclusive spread of information and digital content in European society.
DCU & ADAPT-led project
DCU is the coordinating partner of SignON, an EU-funded Horizon 2020 project exploring SLMT, led by DCU and ADAPT researcher Prof Andy Way. DCU’s extensive experience in managing and collaborating on multidisciplinary research, as well as in organising multidisciplinary research-industry events, will ensure smooth and productive work throughout the project.
The SignON consortium comprises 16 European partner organisations, and includes Deaf community representation, expertise on sign linguistics, and technical expertise in design, machine translation, sign language recognition, 3D animation, avatar synthesis, and automated speech recognition.
Input from deaf and hard of hearing communities
SignON uses a co-creation approach to research, and this part of the process is coordinated by the European Union of the Deaf (EUD). It involves ongoing engagement with Deaf, hard of hearing, and hearing groups through surveys, focus groups, round table discussions, workshops, and cultural events. The co-creation work takes place alongside a communications strategy developed and led by deaf communication experts at Vlaams GebarentaalCentrum (VGTC; the Flemish Sign Language Centre).
Co-creation activities help inform the SignON research and development process, define use-cases, and prioritise specific features and user requirements for the SignON app. These are integrated into the service and app by the technical team, and to complete the co-creation cycle, these user communities regularly test the latest versions of the SignON service.
An example of these co-creation activities is a ‘Think-In’ that took place at Deaf Village Ireland in Dublin, in November 2022, in collaboration with the ADAPT Education and Public Engagement (EPE) team. Think-Ins are public fora that bring people together to discuss and deliberate the issues facing people in the digital age. The SignON Think-In was hosted by the EUD team and Dr Elizabeth Mathews from the DCU School of Inclusive & Special Education, and we were joined by members of the Deaf community, and some hearing guests, to discuss the implications and possibilities of SLMT.
Collaboration between SignON and the ADAPT EPE Team was valuable because it brought a new method of community engagement to SignON, and it helped the ADAPT EPE team develop the Think-In format to be more inclusive and accessible to Deaf and hard of hearing participants.
Throughout the co-creation process in SignON, Deaf and hard of hearing participants have provided valuable input in surveys, roundtables, and workshops. They have shared important perspectives on the Deaf community’s distrust of hearing researchers, the importance of human interpreters in specific contexts, the potential for this technology, fears around its misuse, and the importance of Deaf leadership and decision-making in projects like this. Recently, we introduced art-science engagement formats, which helped facilitate more speculative discussions about the future of sign language technologies, such as how AI technologies may ‘perceive’ sign languages.
One of the key outputs of the process has been the definition of use-cases for the SignON app. An example of an acceptable use-case is a scenario where a deaf signer from Ireland is travelling to Spain for work. She needs to check into her hotel, but she is in a hurry to get to a conference. The hotel clerk on duty only speaks Spanish, but the deaf guest doesn’t know Spanish, and doesn’t have time to go back and forth writing in English.
In a case like this, automated translation may be acceptable and useful. The deaf signer would open the app, sign in Irish Sign Language (ISL), and the app would produce text in Spanish. The hotel clerk can reply by speaking Spanish, and the app will translate that to ISL signed by an avatar.
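The hotel scenario above can be read as a three-stage pipeline: recognise the input (signed video or speech), translate the meaning into the target language, then render the output as text, speech, or a signing avatar. The sketch below illustrates that flow only; every function name and behaviour here is a hypothetical stand-in, not the actual SignON architecture or API.

```python
# Illustrative sketch of the two-way translation flow described above.
# All component names and behaviours are hypothetical stand-ins.

def recognise_sign(utterance, source_lang):
    """Stand-in for sign language recognition: signed video -> meaning."""
    # A real system would run a recognition model on video frames;
    # here we simply pass the meaning through with its source language.
    return {"meaning": utterance, "source": source_lang}

def translate(message, target_lang):
    """Stand-in for machine translation into the target language."""
    # Toy lookup table in place of a trained MT model.
    phrasebook = {
        ("I have a reservation", "es"): "Tengo una reserva",
        ("Room 12 is ready", "isl"): "ROOM 12 READY",  # ISL gloss for the avatar
    }
    return phrasebook.get((message["meaning"], target_lang), message["meaning"])

def render(text, mode):
    """Stand-in for output: on-screen text, speech audio, or a signing avatar."""
    return f"[{mode}] {text}"

# Deaf guest signs in ISL; the clerk reads Spanish text.
signed = recognise_sign("I have a reservation", source_lang="isl")
print(render(translate(signed, "es"), mode="text"))      # [text] Tengo una reserva

# Clerk replies in Spanish speech; the guest sees an ISL avatar.
spoken = {"meaning": "Room 12 is ready", "source": "es"}
print(render(translate(spoken, "isl"), mode="avatar"))   # [avatar] ROOM 12 READY
```

The point of the sketch is the separation of concerns: recognition, translation, and rendering are independent stages, which is what lets one app cover several signed and spoken languages in either direction.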
Multilingual speech processing on smartphones
The expected impacts of SignON include improved multilingual speech processing and sign language recognition on mobile devices; a sign language technology with the potential for wide uptake by Deaf and hard of hearing people; support for under-resourced languages; greater autonomy for people communicating between signed and spoken languages; and the advancement of machine translation research through the addition of sign language corpora. Several evaluation methods are in place to monitor progress towards these potential impacts.
Read more about the project here: SignON: Sign Language Translation Mobile Application and Open Communications Framework
Aoife Brady is an EU Research Project Manager in ADAPT
Shaun O'Boyle works in the Faculty of Engineering and Computing in DCU
Emma Clarke is the Education & Public Engagement Officer in ADAPT