Sign languages used by the deaf are complete visual-spatial languages, governed by their own rules for syntax, grammar and punctuation (Stokoe, 1960, 2001). As such, they are not direct translations of any specific spoken language. Signs express concepts rather than individual words, much as groups of words in a written language are combined to express concepts. Vocabulary and grammar are expressed using hand gestures, facial expressions and body movements. These languages are the basis of Deaf culture, and there is no written (text-based) equivalent for them.
Sign language content on the web is limited due to the dominance of text-based web technologies and the lack of easy-to-use tools for creating and linking web pages without text. Although some websites provide sign language content, most of it consists of static 2D screenshots of simple finger signs illustrating the characters of the alphabet. Other sites are primarily text-based descriptions of sign language with pointers to resources for learning sign language. In the rare cases where signed web content does exist, navigational elements and hyperlinks are still comprised of text. As a result, there is very little opportunity for members of the deaf community to communicate with each other in their native sign language over the web.
One solution, proposed by Richards et al. (2004), is SignLink Studio, a web-based editor developed to allow web authors to create web pages using sign language video.
Hyperlinks can be added to video material so that users can navigate between multiple videos without the need for text (although text is optional). Screenshots and prototype examples can be seen at www.aslpah.ca (also featured on Canadian Heritage's culture.ca website in May 2006).
Hyperlinking alone, however, will not make signed resources on the web accessible; the information will only be useful if it can be searched in some way. As the corpora of sign language videos and signed websites grow, a method of organizing them using sign language will be necessary. Access to these materials is currently hindered by the lack of sign language metadata that can provide a summary of the resources and facilitate navigation and retrieval in the native language of both the resources and the users.
In this paper, we present an overview of the SignLink Studio system and preliminary results of a study of SignLink Studio with deaf web users. We note the unique semantic challenges of creating non-textual metadata, and comment upon areas for future research and development.