(2005, fall ~ current)
Castle Quest is an iterated version of CopyCat. The game contains thirty signed sentences composed from a vocabulary of colors and locations, such as "Alligator on blue wall".
For Castle Quest, we borrowed the main character, "Abe", from the game Abe's Amazing Adventure. In our game, Abe is a friend of Iris, the mascot of AASD.
Abe's role is to save kittens from villains. While traversing the platform, Abe comes across thirty different tasks: sentences describing the graphical situation on the screen.
As a user test, a Wizard of Oz (WOz) experiment was performed along with pre/post language assessments and a focus group discussion.
During the user test, each student participated in a language assessment before and after playing the game. The assessment is composed of three sessions, each designed to test a different skill in linguistic development: receptive skill, expressive skill, and short-term memory. Each session has thirty sentences.
Focus Group Discussion
The focus group session aimed to gather the children's opinions in an active way by inviting them to be "Game Designers". Since we found that the children preferred sharing their opinions with each other over answering one-on-one interview questions from a researcher, the focus group discussion session was designed to reinforce the participatory design process with the target user group.
(2004, fall ~ 2005, summer)
CopyCat is designed both as a platform to collect gesture data for our ASL recognition system and as a practical application that helps deaf children acquire language skills while they play the game. The system uses a video camera and wrist-mounted accelerometers as the primary sensors. In CopyCat, the user and the game's character, Iris the cat, communicate in ASL. With the help of ASL linguists and educators, the game is designed with a limited, age-appropriate phrase set. For example, the child will sign to Iris, "you go play balloon" (glossed from ASL). If the child signs poorly, Iris looks puzzled, and the child is encouraged to attempt the phrase again. If the child signs clearly, Iris frolics and plays with a red balloon. If the child cannot remember the correct phrase to direct Iris, she can click on a button bearing the picture of the object with which she would like Iris to play. The system then shows a short video of a teacher demonstrating the correct ASL phrase, and the child can mimic the teacher to communicate with Iris.
Gesture-based interaction expands the possibilities for deaf educational technology by allowing children to interact with the computer in their native language. An initial goal of the system, suggested by our partners at the Atlanta Area School for the Deaf, is to elicit phrases which involve three and four signs from children who normally sign in phrases with one or two signs. This task encourages more complex sign construction and helps develop short term memory. In the current game there are 8 phrases per level, and the child must correctly sign each phrase before moving on to the next level.
To date, CopyCat has used a "Wizard of Oz" approach where an interpreter simulates the computer recognizer. This method allows research into the development of an appropriate game interface as well as data collection to train our hidden Markov model (HMM) based ASL recognition system. Preliminary off-line tests of the recognition system have shown promising results for user-independent recognition of data from our ten-year-old subjects, and we hope to perform experiments with a live recognition system soon. In addition, our pilot studies have allowed us to create a compelling game for the students, who often ask to continue playing even after they have completed all levels of the game.
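To give a sense of how HMM-based sign recognition scores a candidate phrase, here is a minimal sketch of classifying an observation sequence against per-sign HMMs with the forward algorithm. The two-state models, the two-symbol observation alphabet, and the sign labels below are invented for illustration; the actual CopyCat recognizer trains HMMs on continuous video and accelerometer features rather than a discrete toy alphabet.

```python
import math

def forward_log_prob(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi: initial state probabilities (length N)
    A:  N x N state-transition matrix
    B:  N x M emission matrix (M = size of the observation alphabet)
    """
    N = len(pi)
    # Initialization: alpha[i] = pi[i] * B[i][obs[0]]
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    # Induction: sum over predecessor states, weight by emission prob.
    for t in range(1, len(obs)):
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(N)) * B[j][obs[t]]
            for j in range(N)
        ]
    total = sum(alpha)
    return math.log(total) if total > 0 else float("-inf")

def classify(obs, models):
    """Return the sign whose HMM assigns `obs` the highest likelihood."""
    return max(models, key=lambda sign: forward_log_prob(obs, *models[sign]))

# Two toy HMMs over a 2-symbol alphabet (think: coarsely quantized hand
# positions). Model "GO" prefers symbol 0; model "PLAY" prefers symbol 1.
models = {
    "GO":   ([0.9, 0.1], [[0.7, 0.3], [0.2, 0.8]], [[0.9, 0.1], [0.4, 0.6]]),
    "PLAY": ([0.9, 0.1], [[0.7, 0.3], [0.2, 0.8]], [[0.1, 0.9], [0.6, 0.4]]),
}

print(classify([0, 0, 1], models))  # sequence dominated by symbol 0 -> "GO"
```

In a live recognizer, the same scoring step runs over a feature stream segmented from the camera and accelerometers, with one trained HMM per sign and a grammar constraining which sign sequences form valid phrases.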
User-Centered Development of a Gesture-Based American Sign Language (ASL) Game,
Seungyon Lee, Valerie Henderson, Helene Brashear, Thad Starner, Harley Hamilton, Steven Hamilton,
Proceedings of the Instructional Technology and Education of the Deaf Symposium, NTID (National Technical Institute for the Deaf), Rochester, NY. June 2005
CopyCat: Learning American Sign Language (ASL) through a Gesture-Based Computer Game,
Seungyon Lee, Valerie Henderson, Helene Brashear,
RESNA (Rehabilitation Engineering & Assistive Technology Society of North America), Student Design Competition Finalist, Atlanta, GA. June 2005
Development of American Sign Language (ASL) Game for Deaf Children,
Valerie Henderson, Seungyon Lee, Harley Hamilton, Thad Starner, Helene Brashear, Steven Hamilton,
Proceedings of IDC (Interaction Design & Children), Boulder, CO. June 2005
A Gesture-based American Sign Language (ASL) Tutor for Deaf Children,
Seungyon Lee, Valerie Henderson, Harley Hamilton, Thad Starner, Helene Brashear, Steven Hamilton,
Proceedings of CHI (Computer-Human Interaction), Portland, OR. April 2005