At the ninth China Collegiate Computing Contest Mobile Application Innovation Contest, held in Hangzhou, Zhejiang province, in November, a sign language learning app named ArtfulSign (Miaoshou in Chinese) emerged as the top winner.
Among 240 participating teams, ArtfulSign won the championship, the Most Innovative Award, and the Social Responsibility Innovation Award.
Developed by three hearing-impaired students from Beijing Union University, the app is designed to help hearing individuals learn sign language to communicate with those who have hearing loss.
The most notable feature of ArtfulSign is its use of AI models in sign language teaching.
Each sign language word is demonstrated through real-person video examples. After learning, users can activate the camera, and the app will recognize and assess the accuracy of their sign language gestures.
AI-powered video analysis and machine learning enable real-time interaction and feedback, enhancing the learning experience.
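The article does not describe ArtfulSign's actual implementation, but a minimal sketch of how camera-based sign recognition could work on iOS might use Apple's Vision framework to extract hand-joint keypoints from each frame and then score them against a reference. The scoring function below is purely hypothetical and stands in for whatever trained model the team built.

```swift
import Vision
import CoreVideo
import CoreGraphics

// A minimal sketch, not ArtfulSign's actual pipeline: extract hand-joint
// keypoints from a camera frame with Apple's Vision framework, then compare
// them against a stored reference pose. A production app would feed keypoint
// sequences into a trained classifier instead of this simple distance check.
struct HandKeypointExtractor {
    private let request = VNDetectHumanHandPoseRequest()

    // Returns normalized hand-joint coordinates detected in one video frame.
    func keypoints(from pixelBuffer: CVPixelBuffer) throws -> [CGPoint] {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try handler.perform([request])

        guard let observation = request.results?.first as? VNHumanHandPoseObservation else {
            return []
        }
        // Keep only joints the detector is reasonably confident about.
        let joints = try observation.recognizedPoints(.all)
        return joints.values
            .filter { $0.confidence > 0.3 }
            .map { $0.location }
    }
}

// Hypothetical accuracy score (0...1) comparing the learner's keypoints with
// a reference pose of the same length; 1 means a near-perfect match.
func assessSign(learner: [CGPoint], reference: [CGPoint]) -> Double {
    guard !learner.isEmpty, learner.count == reference.count else { return 0 }
    let meanDistance = zip(learner, reference)
        .map { Double(hypot($0.x - $1.x, $0.y - $1.y)) }
        .reduce(0, +) / Double(learner.count)
    return max(0.0, 1.0 - meanDistance)
}
```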
The competition was organized by the Teaching Steering Committee for Computer Majors in Higher Education under the Ministry of Education and co-hosted by Apple Inc and Zhejiang University.
On Nov 19, after the announcement of the winners, Apple Inc CEO Tim Cook posted a message on Sina Weibo praising the achievements of the student developers. "Wonderful to see how the students in this year's App Contest are creating apps that make a positive impact," he wrote, sharing a photo of the three members of the ArtfulSign team: Zhao Yuan, Wang Yueran, and Tan Chenglong.
Upon seeing Cook's post, Zhao typed the phrase "breaking the fourth wall" on her phone to describe her feelings at the time.
Tan signed that it also felt "unreal, like a dream".
Wang, who communicates with the help of a hearing aid, said he was "very excited and proud" of the app's innovative performance.
Wang, a 21-year-old senior from East China's Shandong province, majors in computer science and technology. He is primarily responsible for ArtfulSign's architecture design and front-end development.
According to him, most sign language teaching software on the market uses static images or dynamic animations, which results in one-way output with poor interactivity and no adaptive learning paths.
ArtfulSign, by contrast, structures the learning process as answering questions and unlocking stages on a map, which not only adds fun but also adapts to each user's individual progress and level.
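The app's internal data model is not public; the following is only an illustrative guess at how such a question-and-map progression could be represented, with all type and field names invented for the example.

```swift
// Illustrative sketch only; ArtfulSign's real data model is not public.
// One way to represent a question-based learning path with unlockable stages.
struct SignQuestion {
    let prompt: String          // e.g. "Sign the word for 'thank you'"
    let referenceVideo: String  // demonstration clip shown to the learner
}

struct MapStage {
    let title: String
    let questions: [SignQuestion]
    var isUnlocked: Bool
    var bestScore: Double       // 0...1, from the AI accuracy assessment
}

// Record a cleared stage and unlock the next one once the score is high enough.
func advance(stages: inout [MapStage], clearedIndex: Int, score: Double) {
    guard stages.indices.contains(clearedIndex), score >= 0.8 else { return }
    stages[clearedIndex].bestScore = max(stages[clearedIndex].bestScore, score)
    let next = clearedIndex + 1
    if stages.indices.contains(next) {
        stages[next].isUnlocked = true
    }
}
```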
Tan, 23, from Changchun, Northeast China's Jilin province, utilizes his expertise in visual communication to handle the user interface and user experience design.
One of his thoughtful touches is allowing users to have their own sign language names. He has created a chart of signs for the 26 English letters, enabling users to spell out their names in sign language.
The interface Tan designed for side-by-side teaching allows users to see both the real person's demonstration and their own actions simultaneously.
"It's like sitting in front of a classroom podium, listening to a teacher," he wrote.
The responsibility of forming the team and planning the development process falls on the shoulders of team leader Zhao, a 25-year-old from Jingjiang, East China's Jiangsu province, and a second-year graduate student in software engineering. She also oversees the most difficult and time-consuming part of the team's work — building and training the sign language AI recognition model.
To enhance the AI's accuracy in recognition, each sign language vocabulary item needs to be trained with nearly 1,000 corresponding video clips. However, due to a shortage of readily available videos, the team has taken on the task of filming them themselves.
So far, the ArtfulSign team has produced thousands of standard sign language video clips.
Yao Dengfeng, a professor at the Special Education College of Beijing Union University and the mentor of the ArtfulSign team, specializes in assistive technology software engineering. According to him, while computational linguistics has made significant progress, research in computational sign language is still lagging.
For example, training AI for language translation requires linguistically annotated data. Chinese annotations typically follow a subject-verb-object structure to help the AI understand the content. Annotating sign language videos, however, is far more complex, as it involves various elements such as hand gestures, head movements, body actions, facial expressions, and eye movements.
"Currently, there are computer based automatic annotation tools for Chinese and English, but none for sign language," explained Yao, with the assistance of a hearing aid. "As a result, AI developers need to annotate manually. One hour of video data requires 100 hours of annotation time."
Due to his own hearing impairment, Yao is committed to achieving barrier-free information access for people with disabilities.
However, there are very few researchers in China who study sign language as a natural language from a linguistic perspective; instead, it is often treated merely as a tool by those in the field of special education.
"Sign language is a truly beautiful and elegant language. We also have many sign language songs and dances," Yao said.
Zhao hopes everyone can learn some basic sign language. "Whether it's out of love and interest, a desire to help those with hearing impairments, or simply for situations where speaking is difficult due to illness, sign language can be very useful," she wrote.
She and the other two team members have spent six months developing ArtfulSign and are still working to optimize the interface and stabilize the AI model.
They plan to officially launch the product on the Apple App Store next spring. "We hope that one day ArtfulSign will become as common as accessible ramps, allowing both the hearing-impaired and hearing people to use sign language. Ultimately, there will be fewer barriers between us," she wrote.
Contact the writer at guiqian@i21st.cn