
The Voluntary Framework of Accountability (VFA)

By Peter T. Ewell | September 20, 2010
In the wake of the Spellings Commission hearings and subsequent report, most of the major Washington higher education associations scrambled to demonstrate that they were constructively on board the accountability bandwagon by creating performance reporting templates for their member institutions. Probably the most widely recognized of these was the Voluntary System of Accountability (VSA), developed jointly by the Association of Public and Land-grant Universities (APLU) and the American Association of State Colleges and Universities (AASCU). Among the VSA’s several “cousins” were the University and College Accountability Network (UCAN), produced by the National Association of Independent Colleges and Universities (NAICU), and Transparency by Design (TBD), developed by a consortium of adult-serving institutions. All of these are voluntary, and all contain at least some common comparative measures of performance such as graduation rates. For two of them, these common measures include standardized tests. In the case of the VSA, institutions are offered a choice of the Collegiate Learning Assessment (CLA), the ACT Collegiate Assessment of Academic Proficiency (CAAP), or the ETS Proficiency Profile. For TBD, the ETS Proficiency Profile is the probable choice.

Prominently missing from this array of voluntary reporting templates were the nation’s public two-year institutions. Three years later, this absence is being addressed by the Voluntary Framework of Accountability (VFA), currently under development by the American Association of Community Colleges (AACC), the Association of Community College Trustees, and the College Board.

Community colleges are among the most distinctive types of institutions in American postsecondary education, and this affects their basic attitude toward accountability and public reporting. First, they serve a variety of different functions simultaneously, including providing a) the first two years of a baccalaureate degree, b) associate degree instruction in many vocational fields that also carries transfer credit, c) terminal occupational certification that has immediate workplace value (both associate level and certification) but does not carry transfer credit, d) remedial and developmental instruction to render students college-ready, e) noncredit instruction such as literacy training and English as a Second Language (ESL), and f) contract training for employers and local businesses. This multi-mission character poses significant challenges to the application of traditional conceptions of institutional effectiveness, which are usually predicated on a unitary institutional mission that defines what it means to be “effective.”

Second, community college leaders have long claimed that established measures of student progression like the Graduation Rate Survey (GRS) required by the Integrated Postsecondary Education Data System (IPEDS) are not appropriate to their institutions because they are based on first-time, full-time students—a fraction of the entering population at most two-year colleges. Moreover, such measures do not recognize that many students come to these institutions with no intention of earning a degree. These areas of distinctiveness have frequently made community college leaders wary of traditional performance measures or, indeed, of any common set of performance measures at all. And this wariness has been quite visible in the development of the VFA.

To begin developing the reporting framework, the sponsors established an advisory committee of community college presidents and researchers. Initial meetings of this group revealed substantial differences among its members with respect to the basic purpose of the effort. Consistent with the VSA, many felt that the reason to develop such an initiative was to respond proactively to stakeholder expectations about accountability. As a result, they felt that the effort should be centered on a limited number of comparative benchmarks of performance. But just as many—largely drawn from the ranks of presidents and senior administrators—believed that the primary purpose was to guide institutional improvement. As a result, they wanted to avoid measures that looked at comparative performance and believed that the report should be customized for individual institutions. Early drafts of the Statement of Purpose for the effort reflected this tension, and it was not until its main features were presented at the AACC conference in April that the choice of accountability as the primary purpose became clear. As Eileen Baccus, the chair of one of the working groups, succinctly put it at this meeting, “The VFA is designed to show responsiveness to ‘those who are on our backs.’”

This initial tension has also been apparent in the technical design of the template’s measures. This work was assigned to three working groups: Communications and College Engagement, charged with developing ways to get large numbers of institutions to participate; Workforce, Economic, and Community Development, charged with examining workforce and community impact measures; and Student Persistence and Outcomes, charged with developing measures of student progression and learning. Many of the indicators the latter two working groups initially suggested were familiar, including college readiness, success in completing remedial and college-level courses, various “credit accumulation” milestones (e.g., earning fifteen hours of college-level work), and degree or certificate attainment. But conspicuously missing from this initial list were any externally benchmarked measures of student learning outcomes—the most prominent ingredient of the VSA. Instead, the working group on Student Persistence and Outcomes proposed a reporting method through which institutions would describe their own learning outcomes, followed by a depiction of the methods used to gather evidence of the achievement of those outcomes, without reporting specific results at all. To help guard against graduation rates being misconstrued, the working group also proposed an overall success indicator based on the extent to which students reported having achieved the goals they had in attending.

Given the apparent resolution of the purpose question, this recommendation stimulated considerable pushback from some members of the Advisory Committee, who believed that the VFA should set the accountability bar at least as high as the public four-year institutions had done in the VSA. Accordingly, the working groups were asked to go back to the drawing board to incorporate true cohort-based graduation rate measures as well as externally benchmarked measures of student learning outcomes. Among the testing measures to be considered were the CLA and the ACT CAAP, as well as generic skills examinations in the ACT WorkKeys battery. But unlike the design of the VSA, nonstandardized assessment methods were also to be encouraged—for example, electronic portfolios or student work samples evaluated using a common scoring scheme—so long as they could support comparative analysis across institutions. The entire scheme will be pilot-tested by a diverse group of community colleges this fall.

As of this writing, the final decision about whether to include comparative learning outcomes measures in the VFA has not been made. If the answer is affirmative, the VFA will emerge as a strong counterpart to the VSA, helping to demonstrate the accountability and responsiveness of the nation’s public colleges and universities. But whatever the VFA’s eventual force and content, the higher education sector arguably most critical to achieving the nation’s goal of regaining global competitiveness in educational attainment—the community college sector—is assuming collective responsibility for performance.

Peter T. Ewell is vice president of the National Center for Higher Education Management Systems in Boulder, Colorado.
