(Full article originally published in the March-April issue of Assessment Update, available electronically to subscribers at publication.)
The third rail in a train system is the exposed electrical conductor that carries hundreds of volts of electricity. Stepping on or touching the third rail usually results in severe electric shock and often death. The term has become a metaphor in politics for an issue so “charged” that anyone who dares broach it will suffer political death. Social Security and health care reform are two issues often called the “third rail of politics.”
Notwithstanding the truth inherent in the metaphor, I believe the third rail has gotten a bum rap. After all, the third rail is where the power resides. Without the third rail carrying those hundreds of volts of electricity, the train would not run. So, while dangerous, it is also powerful.
This article describes the three rails of assessment that one small private college laid down to guide and power our assessment program. Three parallel tracks were laid at the same time: two to guide us and keep us going in the right direction, and a third rail to power us along the way.
Getting on track
Faculty and administrators at Lancaster Bible College & Graduate School (LBC) realized that while we had some assessment going on across campus, it was fractured and disorganized. We had no real direction and very little power to keep us going. Both of our accrediting agencies, the Middle States Commission on Higher Education and the Association for Biblical Higher Education, required that we complete an assessment progress letter within two years of the reaffirmation of our accreditation. So one of the primary goals we set for ourselves was to begin to create a unified and consistent assessment program.
As part of our accelerated efforts to get on track, we identified three main areas of focus for our immediate attention: enhancing institution-level assessment, enhancing unit-level assessment, and encouraging a culture of assessment. Since we had less than two years to make as much progress as possible, we started laying all three tracks at the same time.
I. Enhancing institution-level assessments. To create a sustained process of assessment, LBC administrators committed both personnel and funding. As a result of our accreditation self-study process, we recognized the need to establish the Office of Institutional Research & Assessment (OIRA) and to appoint a director. The OIRA has its own budget, with a significant portion allocated to assessment instruments. After its first year in operation, OIRA’s budget was increased by 7.5%, with almost all of that increase going to additional instruments.
Besides overseeing all assessment on campus, the director of OIRA (DIRA) is directly responsible for measuring the accomplishment of LBC’s Mission, Core Values, Institutional Goals, and Core Knowledge & Skills. To that end, the director serves on the Committee for Institutional Effectiveness and Planning to ensure that assessment is an integral part of the overall planning process. To assist the DIRA, the Committee for Institutional Research & Assessment was established, comprising individuals representing all four major areas of the college.
Plans & Procedures: Now we have an assessment plan that outlines our current assessment of student learning outcomes at five levels: course, program, division, core knowledge and skills, and institutional. A section of the plan is devoted to each of these levels and includes a delineation of assessment goals, procedures, monitoring process, and use of results. The plan also includes a schedule for institutional and divisional level norm- and criterion-referenced testing.
While our freshmen and seniors had been taking three “value-added” assessments for years, very little was being done with the data, and no one was really overseeing the process. In the years since our last accreditation visit, the college has participated in seven comparative assessment studies, and the results have been systematically shared with the appropriate departments. We recognize that this level of assessment cannot be sustained, but we wanted to jump-start our institution-level assessment as quickly as possible.
We have also enhanced our institution-level assessment by creating an IPEDS Custom Comparison Group for the first time. The group consists of 51 other Bible colleges that hold membership in the Association for Biblical Higher Education and represents our peers more accurately than the Carnegie Specialized Institutions group to which we had previously been compared. Likewise, we continue to develop other comparison groups through the Integrated Postsecondary Education Data System (IPEDS) to focus on specific criteria we wish to assess.
II. Enhancing unit-level assessments. Another of the three tracks that needed to be laid was the enhancement of unit-level assessments on both “sides” of the campus: academic and student support. For several years prior to our team visit, LBC’s academic programs had been using the five-column outcomes assessment grid proposed by Nichols and Nichols (2000). These Comprehensive Outcomes Assessment Plans (COAPs) track the assessment of learning outcomes across five columns: “Expanded Statement of Institutional Purpose,” “Program Intended Educational Outcomes,” “Means of Program Assessment and Criteria for Success,” “Summary of Data Collected,” and “Use of Results.”

We have adapted the grid to include a sixth column that takes the “Use of Results” one step further by providing a place to indicate whether an Action Plan or a Strategic Initiative is needed; this additional step makes it even easier for a unit to close the assessment and planning loop. Action Plans and Strategic Initiatives are follow-up activities identified from the findings of a unit’s outcomes assessment. We define an Action Plan as a follow-up activity the unit can accomplish with its available resources; a Strategic Initiative is needed when a follow-up activity requires additional resources that must be approved by another administrative body on campus.
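The six-column grid and the Action Plan/Strategic Initiative distinction can be pictured as a simple data structure. The sketch below is purely illustrative (the names `OutcomeRow`, `FollowUp`, and `needs_follow_up` are my own, not part of any system the college describes); it only shows how the sixth column is decided by whether a follow-up activity needs outside resources.

```python
from dataclasses import dataclass
from enum import Enum

class FollowUp(Enum):
    NONE = "none"
    ACTION_PLAN = "action plan"                    # achievable with the unit's own resources
    STRATEGIC_INITIATIVE = "strategic initiative"  # needs resources approved by another body

@dataclass
class OutcomeRow:
    """One row of a hypothetical six-column COAP grid."""
    institutional_purpose: str            # col 1: Expanded Statement of Institutional Purpose
    intended_outcome: str                 # col 2: Program Intended Educational Outcomes
    means_and_criteria: str               # col 3: Means of Assessment and Criteria for Success
    data_summary: str = ""                # col 4: Summary of Data Collected
    use_of_results: str = ""              # col 5: Use of Results
    follow_up: FollowUp = FollowUp.NONE   # col 6: the added Action Plan / Strategic Initiative column

def needs_follow_up(row: OutcomeRow, requires_external_resources: bool) -> FollowUp:
    """Classify column 6: no follow-up without a use of results; otherwise the
    distinction turns on whether extra resources must be approved elsewhere."""
    if not row.use_of_results:
        return FollowUp.NONE
    return (FollowUp.STRATEGIC_INITIATIVE if requires_external_resources
            else FollowUp.ACTION_PLAN)
```

The point of the sixth column is exactly the single boolean in `needs_follow_up`: the same finding routes either to the unit itself or up to another administrative body.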
On the student support side of campus (i.e., all non-academic units), a six-year assessment cycle was put into place similar to the one for academic units. Each support unit maintains a COAP that is to be continuously updated and given oversight by the appropriate vice president. The evaluation cycle calls for a formal review of the COAP every two years, and a full program review every six years.
III. Encouraging a culture of assessment. The 1981 movie Chariots of Fire chronicles the story of several promising young British athletes as they prepared for the 1924 Olympics. One of the most memorable lines in the movie comes when the Scotsman Eric Liddell, speaking to a crowd in the pouring rain after winning a race in the run-up to the Olympics, asks: “So where does the power come from, to see the race to its end? It comes from within.” So too does the power of assessment come from within, not from without. Hence, the most powerful aspect of our endeavors, yet the most dangerous, is this third rail of enhancing a culture of assessment in all faculty and staff.
Academic Units: “I’m already doing assessment; I give tests and grades.” “Assessment is just a fad that will soon pass.” “I’m about to retire and I really don’t care to change what I’m doing in my classes.” How do you address those sometimes verbalized, sometimes unspoken, sentiments? This is the dangerous part of the third rail of assessment. You need to respond with care and caution, giving respect to that third rail, knowing you must maintain a close working relationship with these individuals and are dependent upon them. Telling them to “Just do it; our accrediting associations require it” is like stepping on that third rail—you will most certainly be electrocuted.
One of the first steps in approaching the third rail with caution is getting the backing of the administration. That was easy for us. While acknowledging that the ultimate goal and benefit of assessment is enhancement of student learning, the requirement of a progress letter to our accrediting associations added an incentive for our administration to give full backing to a concerted effort to enhance a culture of assessment across campus.
First, the DIRA conducted a full-day workshop for faculty as part of an annual two-day training event. An overview of assessment was given, including the faculty’s responsibilities in assessing student learning for the benefit of students. Emphasis was placed on course-embedded assessment, and several examples were given. Each of the three college divisions was assigned a workshop activity for the afternoon, and each faculty member was then assigned to develop one course-embedded information literacy assessment for a course. Of the full-time faculty, 87% completed the assignment.
To assist them in their self-directed efforts to learn more about assessment, faculty have access to a variety of tools through an assessment web page. Besides the aforementioned faculty workshop materials, faculty also have online access to information about classroom assessment techniques, rubrics, and information literacy, as well as tools to assist them in program evaluation. Faculty are encouraged to highlight their achievements in the assessment of student outcomes by documenting them in their Faculty Portfolio; the portfolio template was recently modified so that assessment is acknowledged as a vital part of the instructional process. In addition, a curriculum mapping database has been developed to help faculty ensure that course objectives are tied to higher-level outcomes such as program and division objectives, general education goals, and Lancaster’s Core Knowledge & Skills.
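At its core, a curriculum mapping database like the one described is a lookup from course objectives to the higher-level outcomes they support. The minimal sketch below is hypothetical (the course codes, outcome names, and both helper functions are invented for illustration); it shows the two queries such a map makes easy: finding objectives not yet tied to anything, and finding which objectives feed a given outcome.

```python
# Hypothetical curriculum map: each course objective lists the higher-level
# outcomes (program goals, gen-ed goals, core skills) it supports.
curriculum_map = {
    "BIB101.obj1": ["ProgramGoal.Interpretation", "CoreSkill.CriticalThinking"],
    "BIB101.obj2": ["GenEd.WrittenCommunication"],
    "THE201.obj1": ["ProgramGoal.Interpretation"],
}

def unmapped_objectives(cmap: dict) -> list:
    """Course objectives not yet tied to any higher-level outcome."""
    return [obj for obj, outcomes in cmap.items() if not outcomes]

def objectives_supporting(cmap: dict, outcome: str) -> list:
    """Course objectives that feed a given program, gen-ed, or core-skill outcome."""
    return sorted(obj for obj, outcomes in cmap.items() if outcome in outcomes)
```

Running both queries over the sample map would surface any orphaned course objectives and show which courses carry each institutional outcome, which is the gap-finding work such a database does for faculty.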
New instructors are started off on the right track: the DIRA now leads one session of the weekly Connecting with Community workshops for new faculty members. As part of this session on institution-wide assessment, all new faculty are given a copy of Classroom Assessment Techniques by Angelo and Cross (1993). In addition, the chair of the Teacher Education Department leads two sessions on classroom-specific assessment and gives each new faculty member a copy of Linda Suskie’s Assessing Student Learning (2004).
Student Support Units: The DIRA has conducted workshops with the staff of all student support units to help them understand and track their role in supporting student learning outcomes. These workshops focused on creating or enhancing each unit’s Comprehensive Outcomes Assessment Plan (COAP). Subsequently, support unit staff submitted a COAP with at least the first three columns completed, thus demonstrating that they had a clear and current unit mission statement linked to broader college goals, specific and measurable unit goals supporting that mission, and several criteria by which to measure those goals. Following the workshops, 80% of all support units submitted COAPs with at least one outcome assessed by both direct and indirect means (fourth column) and included some suggested use of the results (fifth column). Some Action Plans and Strategic Initiatives were proposed (sixth column) based upon these results. When our accreditation team visited, only 38% of support units had posted COAPs; two years later, the figure was 88%.
While we acknowledge that we still have many miles of these three tracks of assessment to lay, we are now able to look back and see the substantial progress we have made. As we continue to evaluate and fine-tune our assessment procedures, we realize that the tracks we have already laid sometimes need adjusting. We have already seen that two of our institution-level assessments were not a good fit and are taking steps to replace them. Some of our academic units are still struggling with ways to measure their outcomes and some of our student support units have yet to analyze or even collect meaningful data. As for enhancing a culture of assessment, we are still having some “cultural differences” with some of our faculty and staff. So we recognize the danger of that third rail, yet we don’t let it frighten us away. For we know that the power of assessment comes from within—from within our people and their desire to enhance the learning and the lives of our students.
Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques (2nd ed.). San Francisco, CA: Jossey-Bass.
Nichols, J. O., & Nichols, K. W. (2000). The departmental guide and record book for student outcomes assessment and institutional effectiveness. New York, NY: Agathon Press.
Suskie, L. (2004). Assessing student learning. Bolton, MA: Anker.
Dale L. Mort is associate vice president for institutional effectiveness at Lancaster Bible College in Lancaster, Pennsylvania.
© John Wiley & Sons, Professional Development Subscription Content