5.D.2 The institution learns from its operational experience and applies that learning to improve its institutional effectiveness, capabilities, and sustainability, overall and in its component parts.
Miller College systematically reviews its courses through the multiple lenses of external standards, student performance, and feedback from Program Advisory Committees, alumni, and external partners such as mentor teachers and intern supervisors, using the feedback loops detailed in 5.D.1. Because all students at Miller College transfer credits from other institutions, a critical component of continuous improvement is the evaluation of transfer credits to ensure that students’ 100- and 200-level coursework provides adequate preparation for the 300- and 400-level courses offered by Miller College. To ensure transfer credit meets these standards, the Registrar applies established equivalency processes, aided by transfer guides generated and sent to colleges throughout southwest Michigan and beyond. These transfer guides are updated by the instructional staff and the Deans of each school, and advisors working with students to develop individual program plans consult the Deans regularly. All documentation submitted for transfer credit is evaluated first by the Registrar, guided by a review protocol that also involves Deans and admissions advisors in evaluating course descriptions and syllabi to determine equivalence with previously vetted transfer courses and alignment with Miller College courses.
In cases where evaluations of transfer credit and assessments of student performance in introductory classes reveal inadequacies in incoming student preparation, Miller College syllabi and curricula are revised to better support students’ achievement of each program’s Learning Outcomes. For example, early in the College’s history, dissatisfaction with students’ writing skills in entry-level courses prompted a comprehensive assessment of student writing that led to the creation of LBAR 300: Junior Seminar as a General Education requirement across programs.
Data from end-of-course evaluations inform hiring decisions and continuous improvement of academic offerings. For example, information gathered from Fall 2010 end-of-course evaluations in Junior Seminar informed substantial reforms to that core course’s curriculum, aimed at more effective coverage of essential knowledge and skills foundational to success in subsequent General Education and degree-specific coursework. A major comparison/contrast essay assignment was replaced with a more rigorous writing task that both better aligns with the kinds of argumentative research writing expected in subsequent courses and provides deeper opportunities for mastery of APA citation and document formatting standards. The course’s incorporation of distance learning strategies was better scaffolded to help ensure student success in LBAR 300 and to prepare students for subsequent online and hybrid coursework.
The creation and ongoing refinement of Junior Seminar is one example of Miller College’s regular practice of academic program review. This review takes place on multiple levels, one of which is the annual verification of catalog accuracy before the catalog goes to press. Because the catalog lists and describes all programs and courses, any additions, deletions, or modifications must be identified and incorporated in the revision. The Deans of each school therefore begin this examination early in the year so that all changes from every program are captured and reflected in the new catalog. Any change to a program must first be approved by the Academic Affairs Committee prior to inclusion in College literature. Changes range in depth from the simple correction of a typographical error to the redesign of a program of study; depending on the complexity of the recommended change, approval is then sought from the President’s Council and the Board of Trustees.
Every Dean and faculty member participates in such review activities, and students, too, take part in the program review process. Students evaluate instruction and curriculum through end-of-course evaluation surveys (as described in 5.D.1). Key student program review questions include the following samples:
• The course content matched the course objectives.
• This course was clearly relevant to my program of study.
• The relevance of the subject matter to real world issues was made apparent.
• This course contributed significantly to my professional training.
• There was a close agreement between the stated course objectives and what was actually taught.
Miller College continues to learn about the efficacy of its programs of study after students graduate. Exit and alumni surveys inform continuous improvement efforts by offering insights into the long-term outcomes of learning activities embedded within the programs of study. For example, a greater emphasis on teaching contracts, unions, and teacher evaluation systems was added to the EDUC 499: Senior Seminar curriculum in response to alumni feedback on topics they wished they had learned more about during their undergraduate experience. In short, students are Miller College’s bottom-line check for continuous improvement.