Michael J. Astrauskas, Arizona State University
This entry is part of the CFHSS’s VP Equity Issues series on diversity, creativity and innovation / diversité, innovation et créativité
To see a whiteboard at the front of the class, students with severe visual impairment typically use a monocular for far-sight viewing. This provides a greatly magnified but very narrow view of the board. In addition to the monocular, they might need to use their glasses for note-taking (i.e., near-sight viewing). As these students take notes in class, they must repeatedly switch back and forth between the whiteboard and their notes, incurring a so-called board-note-board (BNB) delay. This delay limits the speed at which they can take notes. In fast-paced classes, it prevents them from taking comprehensive notes, which often improve retention, and puts these students at a disadvantage compared to their fully sighted peers.
In the fall of 2007, David Hayden, an undergraduate student who is legally blind, added a mathematics major to his computer science major. He had previously compensated for his limited pace of note-taking by spending extra time with his textbooks outside of class. However, when he reached his senior year, he found that his math courses depended solely on the lectures (where a single lecture would often fill a dozen whiteboards) and had no textbooks to fall back on. In compliance with the Americans with Disabilities Act (ADA), ASU arranged for David to receive copies of his classmates’ notes. However, David found that notes taken by other students were “as foreign as a textbook, and less legible.” Unable to keep up, David reluctantly dropped his math courses.
David then heard of CUbiC’s research on creating assistive technologies for people who are blind or visually impaired. He volunteered his time in the lab and began brainstorming with John Black about a solution to his note-taking problem. Based on those ideas, he built the first prototype of a system he called the “Note-Taker,” using a commercial camcorder, a USB-controlled pan-tilt device, a clamp from Walmart, and a Tablet PC. He wrote Tablet PC software that let him view live, zoomed-in video on the tablet’s screen (using onscreen buttons to control the camera) while taking digital notes with a stylus in Microsoft OneNote. He then re-registered for the same three senior math courses he had dropped earlier and, using the Note-Taker, earned an “A” in all three classes.
Watch the video “Blind Ambition.”
David and John then submitted a grant proposal to the National Science Foundation’s Research in Disabilities Education program and were funded for two years to further develop the Note-Taker. A development team then joined David to build a second-generation pan/tilt/zoom camera prototype using robotics components and an industrial camera with 36x optical zoom. The team developed an easy-to-use, gesture-based touch interface that allowed the user to control the pan, tilt, and zoom of the camera while taking notes in Microsoft OneNote. This prototype was entered in the “Touch and Tablet” category of the 2010 Microsoft Imagine Cup competition and won first prize in the world finals in Warsaw, Poland.
The team was encouraged by Microsoft to enter the Imagine Cup again in 2011 in the more competitive Software Design category.
With additional feedback from user studies, the software was further improved and a third-generation pan/tilt/zoom camera prototype was designed. In April 2011, the development team behind this prototype claimed the top spot in the Microsoft Imagine Cup US finals. The team then represented the United States in the 2011 Microsoft Imagine Cup worldwide finals, held in July 2011 in New York City, where it competed against teams from over 66 nations and finished as the worldwide runner-up in this prestigious competition.
During the first year of the grant period, the team included Andrew Kelley (CS), Michael Rush (CS), Michael J. Astrauskas (EE), and Liqing Zhou (Product Design). During the second year, the team included Qian Yan (Product Design), Shashank Srinivas (CS), and Parth Pandya (CS).
Michael J. Astrauskas is with the Center for Cognitive Ubiquitous Computing (CUbiC) at Arizona State University.