Mixed Signals on 8.2.a?

Standard 8.2.a (Student Outcomes: educational programs) is one of the most common areas of non-compliance in the SACSCOC region. That makes sense, since it is a complex standard that touches nearly every corner of an institution's academic division. It does not help that institutions get mixed signals about what is required to make a case for compliance with this problematic standard. Let's take a look.

8.2 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of seeking improvement based on analysis of the results in the areas below:

a. Student learning outcomes for each of its educational programs.

Of course, 8.2 goes on to include general education outcomes and academic and student services outcomes, but I want to look at 8.2.a and the mixed signals. First, this standard has several components. Each institution has to (1) identify student learning outcomes for each program, (2) assess the extent to which students achieve those outcomes, (3) use the data to inform changes seeking improvement in student attainment of the outcomes, and (4) provide evidence that the changes were made. Let’s take these in order.

Identifying student learning outcomes is not difficult, but enforcing a common format and focus sometimes eludes us. This is not usually a compliance issue, but there is a difference between "The student will demonstrate the application of X" and "The student will apply X to Y." In my experience, coming up with program student learning outcomes is not a problem for an institution.

Assessing the extent to which students attain the outcomes is rife with opportunities for institutions to over-complicate and go astray. Complex assessment protocols that generate large quantities of data tend, in my experience, to be short on the final stages of the process. There is only so much time for assessment, and if all of it is spent generating data, none is left for analysis. My experience has been that, when pressed, most institutions can produce data to support a case for compliance.

Using the data to inform changes seeking improvement is where you may get mixed signals. One SACSCOC VP I worked with quite a bit has always made it a point to tell committees that institutions do not have to show that improvement happened; they do have to show that changes were made seeking improvement. That is how I have always approached this standard. Others may say that you should look at year-over-year improvement: what was the impact of the change you made last year on student performance this year? First, that is not required by the standard, and second, for most institutions it rests on a statistically invalid assumption. It assumes that the group of students assessed in one year is comparable to the group assessed the next year. Unless you have a very large cohort each year, you simply cannot make that assumption.

Finally, the standard says that institutions must "…provide evidence of seeking improvement…" as part of the case for compliance. I have heard others say that simply explaining the change is sufficient. As a reviewer, I always wanted to see evidence that the indicated change actually happened. The standard seems clear to me, and I advise all clients to include evidence of the implementation of each change. Consider that anywhere else in the Principles of Accreditation, when you say that you do something, you must provide evidence. Wherever an institutional policy is required, you must provide evidence that you follow that policy. This should be no different.

What is the key? Straightforward outcomes, streamlined data collection, protocols that facilitate meaningful faculty reflection, and evidence that changes were made. I can help with any or all of that.

At Southeastern Accreditation Consultants, we’re ready to collaborate and support your accreditation and strategic planning efforts. From reviewing narratives to building your documentation, we offer individualized services to best meet your needs. Contact us to get started.

Published by Douglas A. Wymer

Throughout an academic career spanning nearly 20 years, Dr. Wymer participated in many site visits (both substantive change and reaffirmation visits) for the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) and he has been a visiting team member for the Accrediting Commission for Community and Junior Colleges with the Western Association of Colleges and Schools. In addition to serving as a team member, Dr. Wymer has served as a visiting committee chair for SACSCOC. After earning a B.S. in Biology (with a minor in Chemistry) from what was then Shorter College, an M.S. in Entomology from Clemson University, and a Ph.D. in Environmental Science from Tennessee Technological University, Dr. Wymer started a rewarding career in academia. He earned tenure and achieved the rank of Associate Professor of Environmental Sciences at The University of West Alabama and served in a number of administrative roles at UWA including Department Chair and Assistant Dean. He served as a Department Head at Pensacola State College and, after a year in that position, was promoted to Dean of Baccalaureate Studies and Academic Support. In 2016 he became the Vice President of Academic Affairs at Lake-Sumter State College, where he served for four years before launching Southeastern Accreditation Consultants.