Review committees come in all shapes and sizes, and individual reviewers fall somewhere on the spectrum between first-time team members and people who have served on more review committees than can be conveniently counted. The expectation that review committees will get it right every time is unrealistic, so let’s talk about best practices for when a committee makes a misstep. Here are some examples from my experience.
Before we start, I want to offer a big “Thank you” to the reviewers out there. Whether you have made one visit or many, you have worked many extra hours and given up a considerable amount of personal time to contribute to this peer-driven review process.
The first example comes from the very first time I served on a visiting committee, and it was me who got it wrong. The standard was general education student learning outcomes, and the institution had collected a huge amount of data. I have always enjoyed working with large data sets, and I went a bit overboard describing everything the institution could learn from theirs. I believe they did receive a recommendation for good reason, but the guidance I provided went well beyond what was required for compliance with the standard. I don’t know what happened at the institution after that visit, but I hope they were able to sift through my comments and zero in on what the standard actually required.
The second example is pulled from my last reaffirmation of accreditation before launching Southeastern Accreditation Consultants. It was again the on-site visit, and again general education student learning outcomes. During the visit, the committee members assigned to this standard spent no more than 15 minutes, across two different meetings, digging into the institution’s practices. In one of those meetings I heard a committee member describe what their own institution had done following a monitoring report on general education student learning outcomes, and wonder aloud why we were not doing the same. Committee members are cautioned against this approach, but it does happen. The follow-up report after the visit clarified the case for compliance with the standard.
Third is a recent example. A client institution was on probation for program student learning outcomes and, after a considerable amount of work, their On-Site Visit went very well, the Monitoring Report was accepted, and the sanction of probation was lifted. A year or so later their Compliance Certification was submitted, and the Off-Site Review Committee found them non-compliant for program student learning outcomes. The Compliance Certification used the same narrative as the Monitoring Report and the same structure to present the data, just updated with more recent assessment cycles. The identified problem was that the committee could not find evidence of one element of the process. The bigger problem was that the piece the committee could not find is not required by the standard. We are collaborating on the Focused Report now, and I am confident that we will resolve this with the On-Site Review Committee.
If you are still reading, you are probably wondering when the best practices will come up. Well, from my experience, here are the best practices for when a committee gets it wrong.
- Don’t take it personally, especially after the off-site review. Remember that the Off-Site Committee does not have the opportunity to ask you questions.
- Dissect the committee’s comments and deliver a very surgical reply in the Focused Report or the Follow-up Report. Surgical, not snarky.
- Talk it over with your VP and get their perspective. Remember that your institution’s VP may not be the one working with the Off-Site Review Committee. This is why I recommend that the advisory visit happen after the off-site review. Get some fresh eyes on the Focused Report before you submit.
- It is hard, but don’t stare at your narrative while hunting only for the piece the committee said was missing. Find a different way to present it. Maybe it was not clear the first time around.