
News

First-ever Multi-language Online Sign Language Game

CUHK, Google and Japanese counterparts launch SignTown

The Centre for Sign Linguistics and Deaf Studies of CUHK’s Department of Linguistics and Modern Languages joined hands with Google, the Nippon Foundation (Japan) and Kwansei Gakuin University to set up Project Shuwa, and officially launched the world’s first multi-language online sign language game, SignTown, on 23 September, the International Day of Sign Languages. Supported by AI-based sign language recognition and a theoretical framework from sign linguistics, the beta version of SignTown, released in May this year, has accumulated over 8,500 users in Hong Kong and Japan.

SignTown places players in a fictional town where sign language is the official medium of communication. Players make signs in front of the camera to complete tasks drawn from daily life, such as packing their bags for a trip, finding a hotel to stay in, or ordering food at a café, while the AI-powered recognition model gives immediate feedback on their signing accuracy. Cute hand-shaped characters scattered throughout the game also explain the concepts of ‘sign language’, ‘deaf people’ and ‘deaf culture’. In this way, both hearing and deaf people can learn Hong Kong and Japanese sign languages, as well as deaf culture, in a fun and relaxing manner.

Working with the Hong Kong and Japanese sign languages, Project Shuwa has constructed the first machine-learning-based model that can recognize three-dimensional sign language movements, tracking and analyzing hand and body movements as well as facial expressions using a standard camera. The new sign language game SignTown was built upon this novel technology.
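The recognition pipeline described above can be sketched in miniature: a camera-facing tracker produces three-dimensional landmarks, which are normalized and matched against known signs. The toy example below assumes 21 hand landmarks per frame (a convention used by common hand-tracking libraries) and substitutes a simple nearest-neighbour matcher for the project's actual machine-learning model; all names and templates here are illustrative, not part of Project Shuwa.

```python
import numpy as np

def normalize_landmarks(points: np.ndarray) -> np.ndarray:
    """Centre 3-D landmarks on the wrist (assumed to be landmark 0) and
    scale to unit size, so matching is invariant to where the hand sits
    in the frame and how close it is to the camera."""
    centred = points - points[0]
    scale = np.linalg.norm(centred, axis=1).max()
    return centred / scale if scale > 0 else centred

def classify_sign(points: np.ndarray, templates: dict) -> str:
    """Return the sign whose stored reference landmarks (one 21x3 array
    per sign) lie closest to the normalized query hand shape."""
    query = normalize_landmarks(points)
    return min(
        templates,
        key=lambda sign: np.linalg.norm(query - normalize_landmarks(templates[sign])),
    )

# Illustrative use: two made-up reference signs, then a query that is a
# translated-and-scaled copy of one of them.
rng = np.random.default_rng(0)
templates = {"hello": rng.normal(size=(21, 3)), "thanks": rng.normal(size=(21, 3))}
query = templates["hello"] * 2.0 + 5.0   # same shape, different position/size
print(classify_sign(query, templates))   # → hello
```

Normalizing away translation and scale before matching is the key step: it lets a single reference template cover many camera positions, which is roughly why a standard webcam suffices for this kind of tracking.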

The project’s next step is to build a sign language dictionary that not only incorporates a search function, but also provides a virtual platform for AI-assisted sign language learning and documentation. The team’s ultimate goal is to develop an automatic translation model that can recognize natural conversations in sign language and convert them into spoken language using the cameras of everyday computers and smartphones.