AvenueASL

American Sign Language (ASL) has evolved into the third most widely used language in America, surpassed only by English and Spanish (Welles, 2004). Currently, more than 500 colleges and universities in the U.S. offer ASL instruction as a world language (Wilcox, 2004).

Between 1992 and 2006, the rapid increase in demand for postsecondary ASL instruction and linguistic study created a wide range of instructional challenges, including

  1. assessing, measuring, and documenting learner progress; and
  2. providing formative feedback through an efficient, effective, and technically valid means (Miller & Hooper, 2008).

Project Overview

The most widespread practice for assessing ASL fluency involves evaluating video recordings of interviews with individual students (Newell & Caccamise, 1992). Prior to 2006, approximately 1,800 ASL students per semester at a large Midwestern university completed mid-semester and final exams by renting a video camera from the program office, recording a 15- to 20-minute conversation with a fellow student, and submitting the videotape for evaluation (Miller, Hooper, & Rose, 2005). Instructors then reviewed the video (a process often lasting 45 minutes per videotape, due in large part to fast-forwarding through incomplete edits, false starts, and ‘redos’ of the exam), assessed the performance, and recorded a single-digit evaluation score with brief textual feedback comments on a note card. Ultimately, these assessment and examination practices proved burdensome for both students and instructors (Hooper, Miller, Rose, & Veletsianos, 2007). Instructors noted that evaluation time averaged three to five weeks per course, delaying learner feedback and limiting valuable reflection opportunities for students. Students noted that ‘meaningless’ scores on a 10-point scale and abbreviated feedback comments provided little to no guidance in improving actual ASL signing expression. Furthermore, the feedback delay made it difficult for instructors to modify classroom instruction based on evaluated deficiencies in learner performance.

To address the aforementioned evaluation and feedback challenges, the AvenueASL e-assessment environment was designed to establish

(a) a platform for students to capture, submit, and archive ASL video performances,

(b) a setting for instructors to evaluate and report student performance and feedback,

(c) a portfolio where students can monitor their personal performance and feedback, and

(d) an administration component to manage and coordinate performance evaluation and feedback data.

More than 2,000,000 practice and assessment tasks have been completed in the environment, constituting what is believed to be the largest collection of ASL assessment performances captured in an online environment.

To date, the AvenueASL environment and its assessment and feedback strategies have been used by seven postsecondary ASL departments and eleven K-12 ASL curricula, reaching more than 5,000 students and 200 teachers nationwide each year. Over 500,000 student performances were captured and evaluated during the 2009-2010 academic year.

We believe the success of the AvenueASL e-assessment environment is the result of an unrestrained project evolution based on iterative cycles of theory, design, and implementation research. Rather than accepting the preliminary instructional problem as a concrete and steadfast blueprint for development, we recommend that designers embrace the unexpected, both in design and research, and challenge prevailing notions of technology use in e-assessment.

Research

Miller, C. (2011). Aesthetics and e-assessment: The interplay of emotional design and learner performance. Distance Education, 32(3), 307-337.

Miller, C., Hokanson, B., Doering, A., & Brandt, T. (2010). Role-based design: Designing for Experience (article 4 of 4). Educational Technology, 50(6), 1-10.

Miller, C., Doering, A., & Scharber, C. (2010). No such thing as failure, only feedback: Designing innovative opportunities for e-assessment and technology-mediated feedback. Journal of Interactive Learning Research, 21(1), 65-92.

Miller, C., Hooper, S., Rose, S., & Montalto-Rook, M. (2008). Transforming e-assessment in American Sign Language: Pedagogical and technological enhancements in online language learning and performance assessment. Learning, Media and Technology, 33(3), 155-168.

Miller, C., & Hooper, S. (2008). Avenue ASL: Transforming curriculum through design, theory, and innovation. TechTrends, 52(3), 27-32.

Hooper, S., Miller, C., Rose, S., & Veletsianos, G. (2007). The effects of digital video quality on learner comprehension in an American Sign Language assessment environment. Sign Language Studies, 8(1), 42-58.

Miller, C., Hooper, S., & Rose, S. (2005). Avenue ASL: Developing an environment for capturing, evaluating, and monitoring American Sign Language learner performance. Advanced Technology for Learning, 2(3), 140-147.

Awards

2007 – 1st Place – AECT Outstanding Achievement in Innovative Instructional Design, Funded Projects >$100,000 Design and Development Showcase.

Acknowledgements

AvenueASL research and design was supported in part by grants from the Fund for the Improvement of Postsecondary Education (FIPSE), US Department of Education.


Additional References

Newell, W., & Caccamise, F. (1992). Sign Communication Proficiency Interview (SCPI) manual: Rating scale. Faribault, MN: Minnesota Resource Center for the Deaf and Hard of Hearing.

Welles, E. (2004). Foreign language enrollments in United States institutions of higher education. ADFL Bulletin, 35(3), 7-26.

Wilcox, S. (2004). American Sign Language as a foreign language. Retrieved January 3, 2006, from http://www.unm.edu/~wilcox/ASLFL/aslfl.html


Copyright © 2011 Regents of the University of Minnesota and LT Media Lab. All rights reserved.
The University of Minnesota is an equal opportunity educator and employer.