Showing Off Seesign


Seesign in use

There’s a new project here at Local Games Lab ABQ: Seesign, Celestina Martinez’ tool for visually identifying signs (as in sign language, not road signs). She built Seesign with Nomen, a new platform from Field Day Lab, my ARIS co-conspirators.

Martinez’ goal with Seesign is to replace the very user-unfriendly dictionaries sign language students must use to look up unfamiliar signs with something obvious and easy to use. The usual approach is to search by the English word you think might correspond to the sign you saw, but the affordances of dictionaries are just not very useful for organizing visual as opposed to textual information, and this is terribly inefficient for learners who need to look up signs they don’t recognize. With Seesign, all you have to do is mark the features you do notice: hand shapes, movement type, etc.


Seesign’s Main Screen

You do this by tapping the relevant choices on Seesign’s main screen (above). The possible matches to your choices are numbered and linked at the top right of the screen, and are narrowed in scope with each additional feature you mark. At any point, you can tap on this number to see those potential matches (below; in the actual app these are animated GIFs).


Likely Matches to a Choice on Main Screen

Nomen makes it quite simple for novices without much technical background to produce tools for this and similar identification tasks. More on that later, but for now, note that Ms. Martinez does not have a technical background and put this app together, with only minor assistance from me, part-time over this last year. She began work on Seesign last fall as part of an assignment in my course, Things That Make Us Smart, and has continued this semester as an independent study. She also submitted it to the 2016 UNM App Contest, but did not win any of the big prizes (robbed IMO).

Why Seesign Is Cool

Ms. Martinez describes it a bit better than I do, but basically, her app was born of the frustration of learning sign language using common tools. The usual approach to looking up signs is to search by the English word you think might correspond to what you saw. It is not hard to see that this is terribly inefficient: it works fine if you are trying to say something you know in English, but not at all for identifying signs you see. The affordances of dictionaries are just not very useful for organizing visual as opposed to textual information. Reverse-look-up visual dictionaries do exist, but they are uncommon, expensive, and very limited in terms of

  • The words they provide – very few
  • How signs are represented – static images, very basic
  • The mechanics of looking up – tons of pages to rifle through.

Seesign improves on all of these greatly with its obvious, streamlined interface for marking features and viewing the matching results. Not only can the user mark as little or as much as they recognize (maybe you notice the type of movement, but not which hand shapes were used), but Nomen is also flexible in how it organizes and displays matches, showing both “likely” matches (signs that match all marked features) and “possible” matches (signs that match at least one marked feature). And since it is a web app, unlike my long-time favorite ARIS, Seesign is accessible on almost every internet-connected device with a screen: phones, tablets, laptops, desktops.
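The “likely” versus “possible” distinction can be sketched in a few lines of set logic. This is a hypothetical illustration only: the sign names and features below are invented, and Nomen’s actual data model is surely different.

```python
# Hypothetical sketch of Seesign-style feature matching; the signs and
# features below are invented for illustration, not taken from the app.

# Each catalogued sign is described by the set of visual features a user
# might mark on the main screen.
signs = {
    "hello": {"flat-hand", "outward-movement"},
    "thank-you": {"flat-hand", "chin-start", "outward-movement"},
    "yes": {"fist", "nodding-movement"},
}

def match(marked):
    """Split the catalogue into "likely" matches (every marked feature
    present) and "possible" matches (at least one marked feature present)."""
    likely = sorted(name for name, feats in signs.items() if marked <= feats)
    possible = sorted(name for name, feats in signs.items() if marked & feats)
    return likely, possible

likely, possible = match({"flat-hand", "nodding-movement"})
# No invented sign has both features, but all three share at least one,
# so likely is empty while possible contains the whole catalogue.
```

The subset test (`<=`) gives the strict “all features” narrowing, while the intersection test (`&`) keeps the broader pool visible, which is why marking more features shrinks the likely list without hiding partial matches.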

So far, Martinez has developed two iterations of her basic prototype (one each semester). She currently only has 10 words catalogued, but has worked extensively at

  1. Developing a useful categorization scheme, one that classifies words in a way that brings out the visually distinct features of signs and fits them into a small number of readily apparent choices, and
  2. Producing a visual format for illustrating and giving feedback on the user’s marked choices: animated GIFs.

I’ve been really impressed with Ms. Martinez’ work this last year, both in her general ability to start with an idea and really dig into developing it into something tangible (not just the app but the “soft” work too: finding a model to work with, entering the UNM app contest, etc.), and in the acuity with which she has developed, through thoughtful iteration, the experience of her app, especially in the two areas above. She has also displayed real attention to detail. Nomen is, strictly speaking, authorable without writing code, but it is still essentially a working prototype of the basic functionality Field Day hopes to turn into something far more polished in the next year, and its formatting guidelines in particular are strict and unforgiving. Ms. Martinez’ build compiled on her very first upload, a feat I doubt I will ever manage.

This work is worth looking at for several reasons, some of which go far beyond her specific innovations:

  • It is a good idea: its usefulness to sign language learners is clear.
  • It begins to describe a model for extending learning outside the classroom based on interests developed within.
  • It is an example of how accessible tools can open new areas of development by subject matter enthusiasts instead of technologists.
  • It is an example of how a local games lab can encourage and support development in small but meaningful ways.
  • It is an opportunity to discuss how events like app contests might focus innovation and enhance collaboration, and, having seen this contest play out, what might be learned from which apps are being tried, why, and how.
  • It is the first documented use of Nomen (there’s another one I’ve been working on I hope to post about soon), a new accessible and open tool that could see myriad uses by others across many fields of inquiry and contexts of interaction.

In follow-up posts, I’d like to discuss these more general ideas. If you beat me to it or have other resources to point to, then by all means…