Jossey-Bass

Assessment Is Not a Spectator Sport: Experiencing Authentic Assessment in the Classroom

By Sarah M. Keeling, Kara M. Woodlee, and Michelle A. Maher | October 10, 2013

As we have done in each of the past five years, we offered the course EDHE 839: Assessment in Higher Education last spring to students enrolled in the student affairs master’s degree program at the University of South Carolina. The history of the course on this campus is in some ways a faithful index to the fortunes of the assessment movement as a whole. Designed and first offered almost two decades ago by a veteran faculty member who also served as a university administrator, the course was received by students and faculty colleagues with trepidation. Yes, the program should have an assessment course, but not a required one. The course attracted a few hardy souls. Most, however, steered clear, asking the question Peter Ewell (2002) noted that he heard frequently in the early years of the assessment movement: “When is assessment going to go away?”

With the veteran faculty member’s retirement, EDHE 839 fell into disrepair. The teaching notes left behind were difficult to understand for those unfamiliar with assessment lingo. The third author, a junior faculty member at the time, eventually agreed to restructure the course and offer it on a limited basis. Time passed. Then, seemingly suddenly, the assessment movement was everywhere on campus, bolstered by the urgent need to respond to a myriad of stakeholder concerns about what—or if—students were actually learning during their college years. Everyone on campus was now an assessment expert or on the way to becoming one. EDHE 839 enjoyed newfound popularity and was promoted to a required course, avoidable only if one enrolled instead in EDHE 737: Legal Aspects of Higher Education. The EDHE 839 student roster filled to overflowing.

In this article, the first and third authors (course co-instructors) and the second author (a recent EDHE 839 student) consider the latest class of students’ assessments of the assessment course. Because it was a key course component and because it offers lessons for assessment practitioners in and beyond the classroom, we pay special attention to the use of student teams engaged in authentic assessment practice as part of class instruction.

Student Assessment Teams in Practice

Active learning became an EDHE 839 cornerstone because, to apply a common phrase, the practice of assessment is not a spectator sport. Prior to the start of the semester, as course instructors (first and third authors), we identified campus student affairs offices willing to host a three- or four-member student team throughout the semester as it conducted an assessment activity within that office. We met individually with personnel from each office to review their planned assessment project. We took care to note that the project must fit the semester timespan. Further, we asked that office personnel plan a project that, in terms of out-of-class student involvement, neither overwhelmed nor failed to engage students. Offices were always grateful for team assistance on assessment tasks that sorely needed to be addressed but had been left undone due to already full work schedules.

During the first EDHE 839 class session, we described the use of this instructional strategy to students, explaining, as does the course syllabus, the following:

As part of a team, you will become an “assessment expert” for a USC–Columbia campus student affairs office. Each team will assist office area personnel by engaging in an authentic assessment project within that office. Each team will produce a project prospectus, a final project report, and a project presentation.

In general, the use of student assessment teams over the years has been rewarding for students, office personnel who host student teams, and course instructors. In a survey administered at semester’s end, students answer the question: “If asked about your experience with assessment during a job interview, what would you highlight in your response?” The following replies emphasize the benefits derived from the use of assessment teams:

  • “I would highlight that I got to work hands-on in a real world assessment project, how I got to conduct focus groups and give recommendations.”
  • “I would highlight my experience and practical knowledge gained through the team project. I really liked working with my office.”
  • “I would highlight the team project that I was able to conduct with three other team members and the focus groups that we conducted. I appreciated all the hands-on work that this class provided and I would certainly highlight it during a job interview.”

However, we have also noticed that student team use in an assessment class is not without its own particular pitfalls.

In the first course session, students provided information about their educational background and their current understanding of assessment. This initial student information proved important in understanding some of the challenges that student teams encountered later. Unlike, for example, students enrolling in engineering or mathematical graduate degree programs, those enrolling in student affairs graduate programs are drawn from a wide variety of disciplinary backgrounds. In the class of twenty-four students, educational backgrounds in humanities and social sciences were common, but also enrolled were undergraduate degree recipients from elementary education, forensic sciences, chemistry, and vocal performance. Previous assessment experiences on a college campus spanned from little to none (most common) to fairly extensive, as with the student who relayed, “I have a graduate assistantship, and many of my projects are assessments. I have done a focus group, compiled quantitative and qualitative data from several surveys, and written reports based on the data.” These differences emerged early and then often in teamwork throughout the semester. As one student relayed:

This disparity in [team members’ assessment] knowledge led to frustration within the assessment project teams, because it was difficult to get all team members on the same page when it came to fulfilling the requirements of the team project. If there was a team member who was more experienced, he or she was the group leader by default, and took on more responsibilities in the team.

Further, in theory, the office projects we planned seemed well designed and clearly focused. In practice, however, complications arose. Not all offices were clear about their project’s scope or their expectations for the students who worked with them. For example, one office discovered that a large database required updating before the assessment project could begin. The student team was asked to do this, which delayed that team’s progress and diverted student focus from the stated learning outcomes associated with team projects. A negotiation between office personnel, team members, and instructors resolved the situation, but precious time was lost in the process.

Sometimes assessment outcomes were not clearly defined by offices. Data could not be collected until assessment outcomes were clearly stated, and some offices realized that, upon closer consideration, intended assessment outcomes were, in fact, not their actual desired outcomes. Further, not all offices had a consistently designated contact person for team member interaction. Thus, some teams felt “handed off” from one staff member to another as schedules changed, resulting in communication complications. Also, different assessment projects required the use of different instruments. Thus, some teams became well versed in the design of focus group protocols while others spent their time mastering survey design.

Finally, as assessment and research are complementary but not synonymous activities (Schuh and Upcraft 1998), as instructors, we found ourselves conflicted about how much class time to devote to assessment versus research content. While a basic overview of research design was part of the course in terms of its relationship to facilitating assessment activities, for some students, it was not sufficient. We have learned that a fair amount of consideration should be given to the sequencing of research and assessment courses within individual students’ programs of study, a consideration complicated by the variation in students’ prior educational experiences.

Reflections and Recommendations (Lessons Learned)

In the spirit of continuous improvement, we reflect on EDHE 839 student survey responses and our own course experiences to consider how to strengthen this team-based course. We began this article by noting that at the start of the assessment movement many asked, “When will assessment go away?” It isn’t going away, and neither is the need for student affairs professionals, and all educators, to engage in assessment to inform their practice. However, through its use in and beyond the classroom, we have learned more about authentic assessment and about ourselves as assessment practitioners. We offer reflections that we believe may be applicable to both the teaching and the doing of assessment.

First, knowledge of the assessment process and facility with key assessment skills are likely to be uneven in the classroom and across the campus. Graduate students arrive in class with an assortment of disciplinary backgrounds that may or may not lend themselves to a quick grasp of assessment practices. In similar fashion, the campus staff and faculty who make up offices, departments, colleges, and universities have varying levels of assessment knowledge and skill. Just because some claim to “do assessment” doesn’t mean, upon close inspection, that they are referring to the same set of actions, drawing from the same knowledge base, or that what is produced will be put to the same use for the same reason. Recognizing this in early discussions about undertaking assessment can smooth the path ahead. Dialogue such as “When you use the term survey, can you give me a concrete example of what you mean?” can situate those in a community of assessment practice on much firmer ground than simply assuming that all intuitively hold a common understanding.

Second, like every act of discovery, assessment is a fluid process, prone to making unexpected left turns. Graduate students of science are often befuddled to find that “real” science, unlike prescribed undergraduate laboratory assignments, is chaotic, messy, frustrating, and unpredictable. As Delamont and Atkinson (2001, 88) observed in their study of novice doctoral students in field sciences, “Failure is a normal outcome of routine work.” While judicious use of the iterative assessment cycle may provide a roadmap for students of assessment, authentic assessment is also experienced as chaotic, messy, frustrating, and unpredictable, especially for those new to the process. Textbooks make it look easy, while experience teaches that effective assessment practitioners learn to be flexible and creative, resilient in the face of unexpected but almost inevitable setbacks.

We look forward to the continued offering of EDHE 839 and to each year watching our student assessment practitioners grow and mature alongside the assessment movement itself. We extend this discussion of EDHE 839 experiences to all who anticipate doing likewise with their students and their campus assessment partners.

References

Delamont, S., and P. Atkinson. 2001. “Doctoring Uncertainty: Mastering Craft Knowledge.” Social Studies of Science 31 (1): 87–107.

Ewell, P. 2002. “Perpetual Movement: Assessment after Twenty Years.” Retrieved from http://www.teaglefoundation.org/teagle/media/library/documents/resources/2002_ewell.pdf

Schuh, J. H., and M. L. Upcraft. 1998. “Facts and Myths about Assessment in Student Affairs.” About Campus 3 (5): 2–9.

Sarah M. Keeling is student services manager in the School of Library and Information Science and Michelle A. Maher is associate professor in higher education administration at the University of South Carolina. Kara M. Woodlee, formerly a master’s student and graduate assistant at the University of South Carolina, is an academic advisor at Indiana University–Purdue University Columbus.

Copyright © 2000-2015 by John Wiley & Sons, Inc. or related companies. All rights reserved.