March/April • Volume 43 • Issue 3




Effective Program Review: The Lessons I Have Learned

Theodore C. Wagenaar, Miami University

I have completed about 80 program reviews at many types of schools over the past 38 years. Every visit is a reminder of how similar the issues are for most schools. I share here the commonalities and draw out the implications for successful academic planning.

It’s Not All About You

I keep rough track of the personal pronouns people use during reviews. Overwhelmingly, they use “I,” “me,” and “my” instead of the collective versions. This pronoun usage reflects how most faculty members think about the curriculum: their personal preferences and their own courses count far more than a collective responsibility for students’ learning across the major. So I often see something like an esoteric course on holistic health even when a more fundamental course such as stratification is missing. I sometimes see required courses offered at the same time because that is when the faculty members involved wish to offer them. I sometimes see only a few sections of the introductory course when dozens of upper-level specialized courses are offered. I sometimes see course “hogging,” in which a faculty member refuses to let someone else teach “my” courses.

Very few schools schedule a closing meeting for me with the faculty as a group (although most schedule closing meetings with administrators), so I always ask for such a meeting. The failure to schedule such a meeting underscores the view of faculty as individuals rather than a collective. I use closing meetings to highlight the issues most in need of collective action and begin to address those areas.

Lessons: Things work better within a group when that group works collectively toward shared goals instead of pursuing individual ones. Program decisions should evolve out of careful deliberation that reflects everyone’s input and is centered on solid program goals. Use ASA resources on assessment and program review.

Students Matter

I am continually struck by how often students are left out of the equation in program review. Self-study documents include a lot about budgets, schedules, and the faculty. But fewer than a fifth of the schools I have visited have surveyed their current students, so they know little about such things as how students view scheduling, the quality of advising, and how prepared students feel for careers. About two-thirds of the schools do alumni surveys, but most of these are institutionally rather than departmentally driven. Alumni are asked about critical thinking, their graduate school and/or employment status, and the like. They are not asked about the integration of theory and methods in their other sociology courses, or the degree to which they gained a cumulative study-in-depth experience. I ask to meet with students in a required class so that I can talk with a cross-section of majors, but usually I just have lunch with a few invited students (who are almost always highly positive about the program). I typically spend an hour or less with students. The institutional protocols I have seen include little about direct feedback from students or their involvement in the process.

Students are why we and our programs exist, and they should play a greater role in program review. In fact, they should be part of the review team. The self-study and external reviewer’s report should be shared with them and their input sought. They should be asked about their experiences at multiple points in their time with us, and we should more explicitly incorporate their responses in our academic planning.

Lessons: Our students should be our first concern. Give careful consideration to what they should learn and in what order. Academic planning should start with their needs. Engage them in analyses of their experiences in their programs. Include at least one student in program review (and perhaps departmental) meetings.

Faculty Matter

There is a great disconnect between administrators who develop assessment strategies and the faculty members who implement them. I have yet to see an assessment program that came from the faculty. What typically happens is that 1) administrators attend conferences where they learn about the latest assessment strategies and then impose them at their institutions soon after, and/or 2) institutions face accreditation demands for assessment that must be met promptly. In both cases, faculty engagement in developing the process is nonexistent or minimal, which in turn yields low faculty engagement (even cynicism) in implementing the process. I ask faculty members about their involvement in the program review process or what they think about the self-study; responses almost always indicate that the self-study was written by the chair with little faculty involvement. At dozens of schools I have seen lists of goals copied verbatim from the ASA document Liberal Learning, so I ask faculty members what they think about the goals and I ask students to list a few of the overriding goals in the program. Both exercises show that these goals are often window dressing for program review rather than commitments that are actually implemented.

Lessons: Faculty members should see program review as part of their academic responsibility to their students. Administrators should initiate conversations about assessment with faculty and encourage the faculty to take the lead on implementation. Faculty should take charge of the process both within their departments and across campus. Program goals should be collectively developed, implemented, and assessed.

Process Matters

I always ask to see the previous self-study, the external reviewer’s report, the administrative response to these two documents, and the department’s response to the administrative response. Fewer than half of the schools have been able to furnish the previous self-study and external reviewer’s report. Of these, only a few have supplied the administrative response and the department’s response to it, probably because neither is commonly part of the process. I also ask during interviews with faculty members and administrators what has been done differently since, and because of, the previous program review. Blank stares. Program review is typically a “burst” activity: every five to eight years it becomes a task to be done as quickly as possible to meet whatever minimal criteria are externally established, and then it is ignored upon completion. This approach defeats the purpose of effective assessment: a process of continual improvement based on data and collective decision making. Program review should be one step in a process of regular reflection, data gathering, discussion, and implementation. Only once have I been invited to help a department implement the recommendations in its program review. Clearly the focus is more on completing a task than on engaging in improvement.

Of course, process is tightly connected to structure. At one school, the provost outlined the review process, which involved considerable data gathering, portfolio review, and comparisons with peer and “aspirational” institutions. I asked about the structural support for this process, but there was none. No assessment office, no one to assist with data gathering, no campus-wide faculty-led process in place. Appropriate structures need to be in place to make assessment part of the regular workflow and part of the reward structure.

Lessons: Construct program review as a process of continual reflection and improvement. Make it part of the institutional culture. Make data gathering ongoing. Link structural support with effective assessment.

Program review should be viewed as an opportunity to solidify the ongoing reflection that faculty members engage in collectively, with improved student learning as the ultimate goal.

Theodore C. Wagenaar is Professor of Sociology at Miami University (Ohio) and a member of the ASA Departmental Resources Group.
