[Award] We received a Regular Session Award at Entertainment Computing 2025

Chihiro Sakai (M2) and colleagues from the Cybernetics and Reality Engineering Laboratory received the Best Oral Presentation Award at Entertainment Computing 2025.

Entertainment Computing 2025 (EC2025) is a symposium organized by the Special Interest Group on Entertainment Computing (SIGEC) of the Information Processing Society of Japan (IPSJ). Its purpose is to promote “technical research for creating new entertainment,” “evaluation and elucidation of ‘fun’,” and “applied research in education, welfare, and sports.” It serves as a forum for presenting and discussing research results that drive the development of the entertainment computing field. The symposium was held from August 25 to 27, 2025, at the Nihon University College of Humanities and Sciences Campus (Setagaya-ku, Tokyo).

The research presented by Ms. Sakai, titled “Improvement and Usability Evaluation of a Robot System for People with Physical Disabilities to Realize Gaze Visualization and Visual Search Support,” uses an eye-gaze-controllable robot to help individuals with severe physical disabilities understand their surroundings and, through gaze visualization, communicate their intentions and interests to others. The evaluation method, a field study in which individuals with physical disabilities actually used the robot at an event, together with the work’s social significance, was highly regarded, leading to the Best Oral Presentation Award at this symposium. The award is expected to further advance the research and its application in society.

Awardees/Authors:
Chihiro Sakai (2nd-year Master’s Course), Ory Yoshifuji (Ory Laboratory Inc.), Yutaro Hirao, Monica Perusquía-Hernández, Hideaki Uchiyama, and Kiyoshi Kiyokawa

Photo: Chihiro Sakai

Research theme:
“Improvement and Usability Evaluation of a Robot System for People with Physical Disabilities to Realize Gaze Visualization and Visual Search Support”

People with physical disabilities, who account for approximately 50% of disability certificate holders in Japan, may have a restricted field of vision or difficulty speaking, primarily due to nerve damage or muscle weakness. These limitations hinder environmental awareness and communication, making it impossible for them to engage in “pointing-based” communication with caregivers. To address this problem, the author developed a robot system in 2024 that realizes gaze visualization and visual search support for individuals with physical disabilities. The robot allows users to look at whatever they want to see in their surroundings and enables a pseudo-“pointing” gesture by visualizing their gaze, informing caregivers of their focus of attention. This study describes modifications to the gaze visualization module to clarify the object of interest, structural improvements to enhance the robot’s robustness, and a usability evaluation and analysis conducted during an event exhibition. When the improved robot was exhibited at “Global ALS Day in Nagoya,” an ALS awareness event, feedback suggesting the system’s effectiveness, along with proposals for improvement, was obtained from individuals with physical disabilities, caregivers, and event visitors.

Awardee’s voice
“I am very honored to receive the Best Oral Presentation Award. I would like to express my gratitude to Mr. Ory Yoshifuji for the invitation to the event and his valuable advice, and to the professors of the Cybernetics and Reality Engineering Laboratory for their guidance on my writing and research. I hope to achieve further progress in the future.”

Link:
Entertainment Computing 2025 website: https://ec2025.entcomp.org/