2012 CSUN Conference
San Diego, CA
February 27 – March 3, 2012
Title: Perkins School Talking Campus Model
Abstract: A multi-sensory way-finding and orientation aid for Perkins’ campus, this model provides spatial information in multiple formats, including tactile, visual, auditory, large print and refreshable braille.
Touch Graphics, Inc.
330 West 38th Street, Suite 900
New York, NY 10018
Center for Inclusive Design and Environmental Access
School of Architecture and Planning
University at Buffalo, State University of New York
Buffalo, NY 14214
In 2011, Perkins School for the Blind challenged Touch Graphics to create an interactive way-finding and orientation system that would be welcoming and accommodating to individuals of all abilities. In response, the company created a multi-sensory campus model that users can interact with in a variety of ways, depending on their capabilities and preferences. Unlike most architectural models, this one is robust enough for public touching because it is 3D printed in solid ABS plastic. A patent-pending coating system enables a computer connected to the model to detect touches on its surface, and then respond with spoken descriptions, visual highlighting, and braille and large-print captions. Information about buildings, roads, water features, and other landscape elements is provided in a sequence of “layers” that give the name, description, and walking directions to any location on campus. The talking model is located at the entrance to Perkins’ new Grousbeck Center for Students and Technology, a showcase for the latest in assistive technology (figure 1). Visitors are invited to explore the model before heading out on a campus tour, and Perkins students also enjoy interacting with the system, especially the multi-player game, in which up to four players can demonstrate their familiarity with the campus layout while competing and having fun.
Figure 1: The model positioned near the entrance to Grousbeck Center for Students and Technology. The 80-cell refreshable braille display is mounted on a sliding tray and is normally stowed under the platform.
By overlaying information presented in multiple formats, we make it possible for almost anyone to benefit from the information the talking model provides. Those who rely entirely or mostly on touch and hearing to acquire spatial information about the world experience the relative forms, sizes, and groupings of buildings through manual exploration, with speech or refreshable braille feedback. Students using wheelchairs can comfortably “belly up” to the touch-sensitive rounded edge of the pedestal, and they can move around to the other sides to explore the entire model. And visual learners like the bright, high-contrast video projection that enlivens the model’s monochromatic gray-painted surface.
As each building, street, parking lot or other feature is touched, that part of the model is placed in a “spotlight” while the rest of the model dims. During recitation of walking directions, a bright yellow dotted line appears tracing the recommended route to any destination. Also, users can choose from several visual themes, including a high-contrast rendered diagram with no trees and simplified forms (see figure 2), or an aerial photograph that produces the uncanny feeling that you are looking at a miniature version of reality.
As users touch the model, a sensor device concealed in its pedestal evaluates finger pressure on each of 60 shaped probes that demarcate the extents of the audio zones. The sensor sends data via USB to a computer, which runs a program that determines which zone is being pressed and with how much pressure, and then announces information about the place that was just touched in a sequence of chunks, or layers. Tapping the same spot again skips the audio playback head to the beginning of the next layer, and a rapid double tap pauses and unpauses audio playback.
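The tap-to-advance and double-tap-to-pause behavior described above can be sketched as a small state machine. The sketch below is illustrative only: the class name, the 0.3-second double-tap window, and the zone/layer data structure are assumptions for the example, not details of the actual Touch Graphics software.

```python
from dataclasses import dataclass

DOUBLE_TAP_WINDOW = 0.3  # seconds; assumed threshold, not taken from the source


@dataclass
class TalkingModelController:
    """Hypothetical sketch of the layered tap interaction described in the text."""
    layers_per_zone: dict          # zone id -> ordered list of audio layers
    current_zone: object = None    # zone most recently announced
    layer_index: int = 0           # which layer of that zone is playing
    paused: bool = False
    last_tap_time: float = -1.0

    def handle_tap(self, zone, now):
        # A rapid double tap on the same zone toggles pause instead of
        # changing layers.
        if zone == self.current_zone and (now - self.last_tap_time) < DOUBLE_TAP_WINDOW:
            self.paused = not self.paused
            self.last_tap_time = now
            return "pause" if self.paused else "resume"
        self.last_tap_time = now
        if zone == self.current_zone:
            # Same spot tapped again: skip to the start of the next layer.
            self.layer_index = (self.layer_index + 1) % len(self.layers_per_zone[zone])
        else:
            # A new zone was touched: begin with its first layer (its name).
            self.current_zone = zone
            self.layer_index = 0
        self.paused = False
        return self.layers_per_zone[zone][self.layer_index]
```

For example, with a zone whose layers are name, description, and walking directions, successive slow taps walk through the layers in order, while two taps in quick succession pause playback.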
Figure 2: the model illuminated with a high-contrast rendered map image.
Figure 3: The model illuminated by a Google aerial view of the campus.
In addition to this always-available map interaction scheme, a Main Menu of options can be navigated by pressing illuminated buttons clustered on each side of the model. Menu options are:
The presentation will include findings from user testing carried out by staff from the IDeA Center at University at Buffalo. This study focused on ways that using the Perkins Talking Campus Model builds comprehension about the physical layout of the campus, and leads to more confident, efficient movement in and around this complicated outdoor travel environment.
Figure 4: A cross section through the exhibit showing the downward-facing video projector mounted above the acoustical ceiling tiles.
See http://www.touchgraphics.com/research/perkinscampusmodel.html for additional information, images and videos.
Landau, S. & Eveland, Z. (2011). Shaped Capacitive Touch Sensor, Devices and Methods of Use. U.S. Patent Application Serial No. 61/509,394.