Alternative Assessment in the Cloud

By Paul Baepler | April 3, 2011

(Originally published in the March–April 2011 issue of Assessment Update, available electronically to subscribers in March.)

With the advent of cloud computing—which has given us easy access to a spectrum of new Web-based applications—and the rise of emergent technologies such as podcasting and YouTube, instructors face a new set of assessment opportunities. Digital literacies will become increasingly important for students as they enter the workforce, and higher education institutions will find these skills essential as calls mount to prepare students for this Internet-mediated future (Hull & Nelson, 2009). In a recent study, the Pew Internet and American Life Project noted that by 2020, large institutions, including colleges and universities, will face significant exogenous pressure to become more cooperative, efficient, and responsive as a result of expectations created by a networked society (Anderson & Rainie, 2010). Faculty who embrace this imperative and who want to design assignments that take advantage of these new technologies will likely need to craft new assessments and learn new methods for creating feedback channels. Until recently, before cloud computing helped to enable a participatory Internet culture, this was a tall order.

Imagine a course redesign scenario in which you replace a written essay assignment with a video project. How would you encourage peer assessment or deliver situated feedback at specific moments during the video? If this were a written product, you might write on the paper itself along the margins. Perhaps you would turn on Microsoft® Word's Track Changes and add comments at various critical places. How could you give this same level of feedback at precise moments in a video that a student has submitted for your class? Could you as easily intersperse your impressions within the video itself as you would when commenting on a paper? This may seem like an instance in which a new technology has outpaced our ability to assess it easily. But Web 2.0 technologies are beginning to change this. We can annotate video easily, deliver classroom assessment techniques online, and provide even more nuanced feedback on written essays.

Video in the Cloud

YouTube served over 12 billion videos in a single month in 2009, and with other social video sites such as Vimeo and Viddler (not to mention commercial sites like Hulu™) in the mix, video consumption is growing at an amazing pace. With the advent of inexpensive portable video recorders like the Flip video camera and the ubiquity of digital cameras in mobile devices, the capacity to create video productions is also ramping up substantially. YouTube recently enabled an annotation system that allows a user to overlay comments on a video, typically in the form of a speech bubble. Because the source video is not altered, this method could be repurposed to deliver simple formative feedback. YouTube also offers a convenient “spotlight” function that allows a user to outline an area of the screen to isolate and highlight a single aspect of a set of frames, much like shining a spotlight on a stage. This strategy lets the reviewer draw attention to a particular area of the image.

But there are several downsides to YouTube’s method of annotating a video, which, after all, really was not built for an educational audience. For instance, the speech bubbles in which comments are presented are juvenile in appearance, resembling comic-book balloons. Also, the process of creating each comment is not particularly efficient and takes several steps to accomplish. This repetitive process might discourage instructors from leaving extensive commentary. Additionally, this method of stamping the note onto the viewable image partially obstructs the screen. It not only distracts the reviewer from other action within the frame, but—particularly if the comment is lengthy—it might also be misconstrued as disrespectful because it obscures part of the scene. Finally, because the commentary appears independently throughout the video, the student would need to play the entire movie to locate your remarks. If this had been a written assignment, it would be as if the writer had to reread her entire essay just to get to your commentary. While this might encourage deeper reflection, it might also be unnecessarily tedious and, ultimately, ignored.

Fortunately, a new free annotation tool, VideoANT, has emerged that resolves many of these issues (Hosack, 2010). VideoANT takes the URL of a YouTube video or any published .mov or .swf file and provides the reviewer with three windows. One window presents the video, another depicts a linear timeline, and the third—which takes up the entire right side of the monitor—provides a dedicated space for comments. As the video plays, the reviewer simply clicks in the comment box on the right to make an annotation, and a marker is simultaneously dropped onto the timeline to indicate that an annotation can be found at that location. When the author receives the URL of this new “VideoANT” in an e-mail, they can immediately read through the organized comments; selecting any comment jumps the timeline to that point in the video. This capability obviates the need to review the entire video, and the student can concentrate exclusively on the pinch points that might need revision. For summative review, an instructor could easily cut and paste from a rubric directly into VideoANT, embedding the scoring directly in the context of the assignment. Because the annotation appears in its own pane, longer comments do not occlude the viewing area. A student can read a comment and see the entire frame simultaneously. This elegant pairing of annotations with image and sound creates a straightforward platform for embedding written feedback in a multimedia text.
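The underlying idea is simple: each comment is paired with a point on the video’s timeline. As a rough sketch of that pairing (purely illustrative—the sample annotations and data layout are hypothetical and are not VideoANT’s actual format or API), time-coded comments can be modeled as a sorted list:

```python
# Illustrative sketch only: models time-coded video annotations, the general
# idea behind tools like VideoANT. Not VideoANT's actual data format.

def format_time(seconds):
    """Render a timestamp in seconds as m:ss for a timeline marker."""
    return f"{seconds // 60}:{seconds % 60:02d}"

# Hypothetical remarks an instructor might leave on a student video.
annotations = [
    {"time": 154, "comment": "This transition is abrupt; add a bridge."},
    {"time": 12, "comment": "State your thesis on screen as well as aloud."},
    {"time": 301, "comment": "Strong closing image."},
]

# Sorting by timestamp lets the student step through remarks in order,
# jumping the playhead to each marker instead of rewatching the whole video.
ordered = sorted(annotations, key=lambda a: a["time"])
for note in ordered:
    print(f"[{format_time(note['time'])}] {note['comment']}")
```

The same marker-on-a-timeline model explains why a student can skip straight to each remark rather than replaying the whole submission.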

CATs in the Cloud

As more instructors redesign their courses to incorporate just-in-time teaching practices, peer instruction, and personal response systems, in-person classroom time becomes more precious. Many activities that once happened during lecture—basic content review, for instance—have been repositioned to take place outside the classroom. It makes sense, then, to rethink how we administer simple classroom assessment techniques (CATs). Again, cloud computing tools can easily be marshaled in service of “outside the classroom assessment techniques.” For example, many of Angelo and Cross’s (1993) CATs can be administered using Google Docs.

Google Docs is a suite of online tools that mirrors the basic functionality of Microsoft® Office. Authors can share documents, spreadsheets, and presentations online. With Google Forms, one of the most powerful applications in the suite, you can create and publish an online form with almost no training. Once the form is created, its URL can be sent to students, and their responses will be collected and organized in a Google spreadsheet. This easy-to-use tool opens the door to informal online assessment.

Imagine that you want students to conduct “Exam Evaluations” outside class (Angelo & Cross, 1993, p. 359). Perhaps you have two or three reflective questions you want students to answer after each midterm for a few bonus points. These questions could be administered through a Google Form, and students could answer them within 24 hours of the exam. Answers can be time-stamped to guarantee compliance with your instructions, and all the responses can be scanned quickly in a single Google spreadsheet. This method gives students time to reflect on their experience outside the often frenetic circumstances at the end of a midterm, and consolidating all student responses makes scoring extremely efficient. “Classroom opinion polls,” “muddiest point,” “background knowledge probes,” and other CATs can also be redesigned to work outside the classroom (Angelo & Cross, 1993, pp. 258, 154, 121). Students’ electronically stored responses can be reviewed easily after the course to help instructors refine future classes.
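Because Forms records each submission in a time-stamped spreadsheet row, checking the 24-hour window can even be automated. The sketch below assumes a hypothetical CSV export of the response sheet with “Timestamp” and “Name” columns; the exam date and student rows are invented for illustration:

```python
# Sketch: checking a Google Forms response export against a 24-hour window.
# The column names, exam date, and sample rows below are hypothetical.
import csv
import io
from datetime import datetime, timedelta

EXAM_END = datetime(2011, 3, 14, 15, 0)    # when the midterm ended (example)
DEADLINE = EXAM_END + timedelta(hours=24)  # the 24-hour response window

# Stand-in for a CSV downloaded from the responses spreadsheet.
sample_export = """Timestamp,Name,What surprised you most on this exam?
3/14/2011 18:22:05,Ana,The emphasis on chapter 4
3/16/2011 9:05:41,Ben,The essay question
"""

results = []
for row in csv.DictReader(io.StringIO(sample_export)):
    submitted = datetime.strptime(row["Timestamp"], "%m/%d/%Y %H:%M:%S")
    status = "on time" if submitted <= DEADLINE else "late"
    results.append(f"{row['Name']}: {status}")

print("\n".join(results))
```

A quick scan like this flags late submissions without the instructor reading every timestamp by hand.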

Feedback in the Cloud

Despite the precipitous rise of new media, traditional literacies remain integral to higher education. Students will continue to write essays, and faculty will continue to provide feedback in the margins and at the end of papers. The question becomes: How might new media be brought to bear on this old and important exchange?

One answer is for teachers to respond to essays with screencasts. Screencasts are short recordings of what happens on your computer screen. Traditionally, screencasts have been used for computer training to illustrate how to use a particular piece of software. This same technology, however, can be used to comment on seminar papers, providing audio and visual support for written comments.

This multimodal assessment process could begin much as a traditional written assessment would. The instructor would read the essay and make comments directly in the student’s Word document. Then, with the annotated paper displayed on the screen, he could start a screencasting application such as Jing®. Such programs (both free and commercial screencasting packages are available) offer the simplicity of a tape recorder, with “record,” “stop,” “play,” and “rewind” buttons. Having begun the recording, the instructor can highlight passages and record spoken comments, clarifying and expanding upon the written remarks. When finished, he can save the file and send the student both the URL of the screencast and the Word document.

In trials using asynchronous audio feedback (essentially a screencast without video), students not only reported that it was more effective than text-based feedback alone but also applied the feedback to their revisions at a significantly higher rate. While written comments certainly provide good direction for revision, the instructor’s voice can convey greater nuance as well as an impression of the teacher’s demeanor. Students reported a greater sense of instructional presence when feedback was delivered orally, and instructors reported a significant decrease in the time spent actually delivering feedback (Ice, Curtis, Phillips, & Wells, 2007).


The construct of a distinct demographic of learners—so-called “digital natives”—who possess sophisticated knowledge of information technologies has been called into question. Recent research suggests that the role technology plays in our current students’ lives is varied and complicated. Although many students are immersed in a digital world, it is most frequently in the form of consuming multimedia rather than creating it (Bennett, Maton, & Kervin, 2008). This understanding of the range of talents among students suggests that we will increasingly need to develop new ways of assessing digital literacies among a diversely skilled student body. At the same time, we should recognize that students are accustomed to “consuming” digital products, and this ease with absorbing information in digital forms extends to processing assessment information and feedback. These are the dual challenges that cloud computing technologies can help us face: assessing digital literacies, particularly in multimedia work, and enhancing traditional feedback through digital means.

References
Angelo, T., & Cross, P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco, CA: Jossey-Bass.

Anderson, J., & Rainie, L. (2010, March 31). The impact of the Internet on institutions in the future. Retrieved from

Bennett, S., Maton, K., & Kervin, L. (2008). The ‘digital natives’ debate: A critical review of the evidence. British Journal of Educational Technology, 39(5), 775–786.

Hosack, B. (2010). VideoANT: Extending online video annotation beyond content delivery. TechTrends, 54(3), 45–49.

Hull, G., & Nelson, M. E. (2009). Literacy, media, and morality: Making the case for an aesthetic turn. In M. Prinsloo & M. Baynham (Eds.), The future of literacy studies (pp. 199–227). Basingstoke, UK: Palgrave Macmillan.

Ice, P., Curtis, R., Phillips, P., & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students’ sense of community. Journal of Asynchronous Learning Networks, 11(2), 3–25. Retrieved from

Paul Baepler is senior educational technology consultant at the Collaborative for Academic Technology Innovation at the University of Minnesota-Twin Cities.
