
Assessing the Impact of an Assessment Communication Campaign

By Marlene Clapp, Evangeline Kuzmech, and Brian Sousa. November 2015
If you have ever read through student comments submitted in your institutional surveys, you have probably come across a comment like this one: “I don't even know why I bother filling out these surveys. I don't fully believe that anyone is reading them.” In the course of assessment work, a lot is asked of students, particularly for add-on assessments like surveys, focus groups, and standardized exams. If students are expected to keep responding to requests and participating in assessment efforts, they need to know what happens to the feedback they contribute. Communicating the results of assessment efforts and their use has been identified as a “key strategy for getting students on board with the assessment process” (Bresciani, Gardner, and Hickmott 2009, 80). At UMass Dartmouth, a campaign was initiated in the fall of 2013 to help raise student awareness of how the feedback they provide through surveys and discussion groups is acted upon at the university. Changes informed by that feedback are identified and then publicized through posters, flyers, table tents, campus TV ads, ads in the student newspaper, and the university intranet.

UMass Dartmouth's campaign—called “You Spoke. We Listened.”—was modeled after similar efforts at other institutions, including Marquette University and UNC Wilmington. Marquette is among several institutions spotlighted by the National Institute for Learning Outcomes Assessment (NILOA) for their assessment communication efforts (see http://www.learningoutcomesassessment.org/FeaturedWebsiteCategory.html#communication).

During the campaign's first year, the 2013–2014 academic year, eleven changes that had taken place on campus were publicized. Seven more changes were highlighted during the second year of the campaign, 2014–2015.

While other institutions had conducted similar campaigns, it was important for us to understand the campaign's impact on our own campus. We hoped that the campaign would not only increase student awareness but also foster a culture of improvement at the university, building student trust in assessment work and motivation to participate in assessment efforts. Near the end of the first year of the campaign, we conducted an assessment impact study to explore student experiences with the campaign and to learn more about the extent to which the campaign had brought about the desired effects (Rossi and Freeman 1993). Because all UMass Dartmouth students were exposed to the campaign, it was not possible to employ certain standard comparison strategies (e.g., randomized control and treatment groups). Thus, the impact assessment took the form of a simple before-and-after study, one research design option for impact assessments of interventions involving total coverage. The study involved 18 key informant interviews as well as supplemental observational data and confirmation survey data (Crabtree and Miller 1999). In keeping with a comparison method used for before-and-after studies, student key informants were asked to reflect on their attitudes toward participating in assessment work before the campaign had been initiated (Rossi and Freeman 1993).

Two principal findings emerged. First, the study revealed varying degrees of campaign impact on student motivation to provide feedback on surveys and in discussion groups. When asked to reflect on how the campaign may have affected their response to requests to participate in surveys or discussions, roughly equal numbers of key informants indicated that the campaign had little to no effect and that it had a positive effect. Additionally, there were mixed findings from a student poll that asked about the extent to which the campaign had influenced student motivation to respond to requests to participate in campus surveys or discussions. Students who responded to the poll were evenly divided in rating the impact of the campaign as small, moderate, or large. Second, study findings indicated that students perceived the campaign as demonstrating that university administration is listening to their concerns, but a clear connection between the campaign and the feedback students provide, particularly through surveys and discussion groups, was lacking. During one interview, a key informant offered that “students just don't know what [the campaign] is. … Who spoke? Where'd they get this information?”

A primary recommendation derived from the impact assessment findings was to involve students more in the campaign to boost its visibility and influence. During the second year of the campaign, student liaisons were recruited to help with campaign efforts. The liaisons were to help inform the campus community about the “You Spoke. We Listened.” campaign and to provide input on the purpose, design, and recruitment methods of planned student surveys and discussion groups. Specifically, they helped guide the development of campaign materials and provided feedback on assessment measures such as a sophomore survey and a campus climate survey. They also helped publicize the campaign at tabling events, where they asked other students to cast a vote for their favorite among the latest publicized changes. In addition, they suggested developing a small informational handout to help orient students to the campaign, guided the handout's design, and distributed it to students.

At the end of the second campaign year, the two most active liaisons were asked to reflect on their experiences with the campaign. It was important to understand not only how the liaisons' work had benefited the campaign but also how working with the campaign had benefited the liaisons. For example, the liaisons were asked to reflect on what, if any, contribution their work as a liaison had made to their learning inside or outside the classroom. Both liaisons described how the work had strengthened their communication skills. One liaison commented, “One way [working on the campaign] has helped me improve as a student has been in exploring how word choice can influence clarity in a written work. I believe I have begun to focus more on word choice and clarity in my own academic writing after seeing how important it is to survey writing.” Similarly, the other liaison offered, “In my English class, we learned that presentation is key: without a proper presentation, the value and meaning of writing is pointless. The same could be said for my work as a student liaison. As I had to put my own input into what would later become posters, projections, and flyers, it was critical that a student could easily understand the many changes made by the University.” Both liaisons also mentioned how their work with the campaign had helped them forge stronger ties to the university. One liaison remarked, “I have … begun to forge a stronger connection with the UMass Dartmouth community by developing relationships with professionals and with students through discussing the changes.” Likewise, the other liaison stressed, “I learned more about UMass Dartmouth than I could have ever imagined. As a freshman, new to the college experience, I was introduced to a campaign that focused on changes made throughout the campus.”

The student liaisons were also asked to articulate any suggestions they had for the campaign moving forward. Both spoke to the need for further work to communicate the purpose of the campaign and increase its visibility. One liaison offered, “In upcoming years, I believe the campaign should focus less on communicating changes and more on communicating how the surveys influenced changes.” Similarly, the other liaison remarked, “I first-hand witnessed the varying reactions students and faculty had when walking by our tables throughout campus; some were pleasantly surprised by the changes, while others were never even aware of them. This itself shows that more needs to be done in regard to advertising the campaign. … With more advertising, and more interest in the program, more people will be able to see that their voice does in fact count, and every opinion matters.”

After two years, it is apparent that UMass Dartmouth's assessment communication campaign has had some impact on campus, but there is more that can be accomplished. Efforts to build campaign awareness, understanding, and influence need to continue. One strategy suggested by the current student liaisons is to recruit additional students as liaisons to help inform the campus community about the campaign. Because the student liaisons indicated that they benefited from their work with the campaign, involving additional students could not only aid campaign efforts but also foster similar positive effects for the new liaisons themselves. Another strategy may simply be to allow the campaign more time to percolate throughout campus. Culture has been described as the “learned product of group experience” (Schein 1985, 7). As such, progress toward the campaign's objective of fostering a “culture of improvement” at the university may emerge only over time. One liaison offered, “I can envision the future of this campaign to enlighten and bring about an ideology of change throughout the campus in every aspect, which will only lead to more satisfaction for the students in times to come.”

Biographies

  • Marlene Clapp, formerly senior institutional research analyst with the University of Massachusetts Dartmouth, is director of Institutional Effectiveness at Massachusetts Maritime Academy.

  • Evangeline Kuzmech is an undergraduate student and peer health coordinator.

  • Brian Sousa is a resident assistant at the University of Massachusetts Dartmouth.

