Oliver Wendell Holmes famously said, “Once the mind has been expanded, it will never again return to its original size.” As teachers, we like to think of this maxim in relation to our students’ learning. In the writing program at Berkeley City College (BCC), however, we have learned to apply it to ourselves. Through program assessment, we discovered that our thinking about what our students could do was limited, and we learned, with both humility and excitement, to apply this knowledge in order to revamp our program and thereby help our students succeed.
In spring 2011, the BCC writing program embarked on our first portfolio-based program assessment, which guided us down paths we never expected. It changed how we think about helping our students learn, progress, and succeed as writers and students; consequently, it led to a drastic redesign of our writing program. The process brought us together as colleagues, taught us to challenge our assumptions, and gave us the tools to continually analyze and refine our work for the good of our students.
In designing our first portfolio assessment, we began by reviewing our program learning outcomes, which describe the reading, writing, and research skills that students completing the program should be able to demonstrate.
We determined that the most efficient way to assess how well students were achieving these learning outcomes was to score student portfolios, each consisting of a summary of a college-level reading, an in-class essay written in response to a common prompt, and a research paper. Because our pre-transfer English and English as a Second Language (ESL) composition/reading classes were intended to prepare students for freshman composition, we asked all students taking freshman composition and all students in pre-transfer composition classes to participate in the portfolio assessment. (Note that “transfer” is used throughout this article in the sense of “transferable” to a four-year institution.) The entire department, consisting of English and ESL teachers, scored the portfolios using a rubric of our own design. Readers were normed; every portfolio was scored independently by two readers, who did not know the identity or course level of its author and could not see each other’s scores; and any discrepancy between the two scores was resolved by a third reader. The portfolio assessment thus served both as a culminating assignment for students and as a tool for program assessment.
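The double-blind reading procedure can be sketched as a short program. This is a hypothetical illustration only, not the department’s actual process: the reader functions, the rubric, and the rule that any disagreement counts as a discrepancy are all assumptions for the sake of the sketch.

```python
import random

def run_portfolio_reading(portfolios, readers, seed=0):
    """Blind, double-read scoring with third-reader adjudication.

    `portfolios` maps an anonymous ID to its (anonymized) contents;
    `readers` is a list of normed scoring functions (contents -> score).
    Readers see only the contents, never the author's name or course level.
    """
    rng = random.Random(seed)
    results = {}
    for pid, contents in portfolios.items():
        r1, r2 = rng.sample(readers, 2)      # two independent readers
        s1, s2 = r1(contents), r2(contents)  # neither sees the other's score
        if s1 == s2:
            results[pid] = s1
        else:
            # Discrepancy: a third reader, distinct from the first two, decides.
            r3 = rng.choice([r for r in readers if r not in (r1, r2)])
            results[pid] = r3(contents)
    return results
```

For example, if both readers return the same rubric score, that score stands; if they disagree, the returned score is whatever the remaining third reader assigns.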
The initial assessment yielded the first set of unexpected results in the area of research skills. We weren’t surprised to find that pre-transfer students didn’t perform well in terms of research skills, but we were dismayed to discover that the students in freshman composition didn’t fare as well as we had hoped they would. As we discussed possible reasons for this, we noted that students in the classes leading up to freshman composition did not practice research skills and that, even in freshman composition classes, students wrote only one research paper at the end of the term. We saw for the first time that we were asking students to learn a critical and difficult academic skill but only giving them one opportunity to practice that skill. This led to our first program change. We decided that we would have students in freshman composition classes write two short research papers instead of one long one; in a more radical move, we would have students in pre-transfer English and ESL classes also write two short research papers. Many teachers in the writing program were nervous about how to accomplish this, considering the other demands of writing classes. But because we were committed to helping students learn this critical skill, a small group of interested faculty began working on curriculum redesign to guide instructors in successfully making the change.
In addition, the first assessment yielded an even more surprising—and ultimately more significant—set of results. At the time of the first assessment, the BCC writing program offered a writing class one level below freshman composition and another class two levels below, as is typical of writing programs in California. As mentioned previously, all students enrolled in pre-transfer classes submitted portfolios, along with students in freshman composition courses. As we had expected, there was a notable gap in performance between the students in freshman composition and the other students. However, to our surprise, the students in the course two levels below freshman composition achieved scores almost as high as those in the course one level below. In fact, judged by the grading standards we set for the course one level below, 90.3 percent of the students in the course two levels below would have earned a C or better on the portfolio at that level, and 40.4 percent would have earned an A or B. Many of us were aware of statewide research indicating that students who begin their college careers taking English classes several levels below transfer tend not to complete the transfer-level course (Hern 2010). Yet our portfolio results, involving more than five hundred students each semester, clearly indicated that most of our students who began taking English classes two levels below transfer could have succeeded if they had been placed in a higher-level course.
There was one more significant and unpredictable (or at least unpredicted) result from our first portfolio assessment. We discovered that “basic skills” students were capable of learning the types of research and rhetorical skills that we had mistakenly assumed we shouldn’t be teaching them, though they slightly lagged behind other students in terms of mechanics and clarity. We had collectively assumed that we would be setting them up for failure if we taught research and rhetorical skills at the same level as in freshman composition. We learned that we had been wrong, and we immediately acted to correct our mistake.
Our first step in addressing the results of the portfolio assessment was to bolster the teaching of research in all of our classes. As a department, we designed a model curriculum: a semester-long schedule that would allow students to learn the skills in reading, writing, and research reflected in our student learning outcomes (SLOs). Instructors knew that they were not required to adhere to the schedule, as long as they met the basic requirements in the course outline and taught the SLOs. However, most instructors in freshman composition, as well as pre-transfer composition and reading classes, adopted the model schedule; all took responsibility for addressing the SLOs and agreed to include at least two research papers in their assignments.
When we realized that we had been holding our students back unnecessarily and that students could have—and clearly should have—learned the rhetorical and research skills that would be most useful to them in their academic careers, we designed a new course that we hoped would ultimately replace the one two levels below transfer. This course mirrored the freshman composition class and added a computer lab component staffed by the instructor and three instructional assistants. In the lab, students received individualized instruction and support so that they could successfully edit and proofread their work.
The results of the next portfolio assessment, structured in the same way as the first, were so striking that we felt we needed to collect data for another semester to confirm them. Students in the new class, though they would previously have been placed two levels below freshman composition, significantly outperformed students who had been placed in the class one level below. We held back on sharing our results widely until we could confirm whether they would be replicated; they have now been replicated consistently for three semesters. Most recently, in spring 2013, the average score across all skill areas for all portfolios in freshman composition (transfer level) was 81.61. The average score in the class one level below transfer was 63.48. The average score in the new, experimental class (for students who would have been placed two levels below transfer) was 70.44. These results mirrored those of the previous two semesters. Notably, the students in this class performed well in the very skill areas in which we had thought “basic skills” students could not be successful—research and rhetorical skills. As a result of these findings, we have replaced all basic skills classes with the newly designed class, which mirrors freshman composition but adds three hours of lab time with embedded instructional support.
But that’s not all. As we continued to develop our curriculum, we noticed another interesting change. In our first administration of the portfolio assessment (spring 2011), the top 7 percent of scorers were exclusively students in freshman composition. In the most recent administration (spring 2013), of the top 7 percent of scorers, 10 percent were students in the newly designed class, and 2 percent were in the class one level below freshman composition. Of the students who would have earned an A on the portfolio if they had been in freshman composition (the top 20 percent), 17 percent (24 out of 139) were in basic skills and ESL classes. (One of the more surprising and significant statistics to emerge from the latest assessment: while 5.75 percent of the students who completed the portfolio assessment were in the new course, these students made up 5.76 percent of those who would have earned an A in freshman composition; in other words, students in the new course were represented among the top scorers in almost exactly the same proportion as in the assessment population overall.)
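The proportions reported above can be checked with simple arithmetic. The sketch below uses only the numbers stated in the text; no additional raw data are assumed.

```python
# Check of the reported figures, using only numbers stated in the text.
top_a_total = 139          # students scoring in the top 20 percent ("A" range)
basic_skills_in_top = 24   # of those, students in basic skills and ESL classes

# 24 of 139 rounds to the reported 17 percent.
assert round(100 * basic_skills_in_top / top_a_total) == 17

# Proportional representation of the new course among top scorers:
overall_share = 5.75       # percent of all portfolio completers in the new course
a_range_share = 5.76       # percent of A-range scorers from the new course
assert abs(overall_share - a_range_share) < 0.1   # essentially identical shares
```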
This led us to question whether students who have already shown that they would have earned an A in the culminating project in freshman composition should be required to repeat the work of that class. We are currently developing an alternative approach to freshman composition for these students, a “competence-based” class that allows students to focus on skills they need to master, as indicated by portfolio assessment results, without having to repeat work unnecessarily.
The scope of this essay does not allow for a discussion of all of the aspects of our work that have surprised us and others. For example, the English and ESL departments have worked closely throughout the portfolio process, leading to changes in both curricula as well as to fruitful discussions concerning the ways in which our curricula can effectively dovetail. Like the English curriculum, the ESL curriculum has changed to include instruction in research strategies, and students in the newly developed advanced ESL reading and writing class, like their counterparts in pre-transfer English classes, have performed very well relative to students in freshman composition.
In addition to all the curriculum changes discussed here, our assessment process has led to the development of a writing, reading, and research skills rubric that is now ubiquitous among both teachers and students in English and ESL classes. Students in this program know which skills they aim to achieve, and if they move from one level to the next, they have a realistic sense of their accomplishments as well as the skills that they will work to improve.
Future projects that have emerged as a result of analyzing our findings include the development of a website on which to share materials and a joint project with one of our librarians to help us determine what we mean by “academically acceptable” sources.
Through the portfolio assessment process in the writing program at BCC, we’ve learned to question our assumptions concerning students’ capabilities. We’ve come together as English and ESL teachers with a common purpose and direction. Together, we’ve significantly restructured—and will continue to restructure—our programs.
Hern, K. 2010. “Exponential Attrition and the Promise of Acceleration in Developmental English and Math.” http://www.rpgroup.org/sites/default/files/Hern%20Exponential%20Attrition.pdf.
I wish to gratefully acknowledge my wonderful colleagues, the English and ESL teachers at Berkeley City College who did this work with me—especially Cleavon Smith, Gabe Winer, Meridith Paige, Scott Hoshida, and Laura Zink.
Jennifer Lowood is assessment coordinator and English department co-chair at Berkeley City College.
© John Wiley & Sons, Professional Development Subscription Content