

College of Education and Health Professions

XI. Fairness, Accuracy, Consistency, and Elimination of Bias

Quality Assurance System
Educator Preparation Programs

XI. Fairness, Accuracy, Consistency, and Elimination of Bias (CAEP Self-Study Material: Standards 1.1, 3.3, 3.4, 5.2; Advanced)

The unit uses the following strategies to ensure fairness, accuracy, consistency, and elimination of bias throughout its assessment system:

1) The unit ensures that assessments are aligned with the unit’s conceptual framework and that relevant standards are reflected in syllabi, key program assessments, and critical course-based assessments.

2) Initial undergraduate and initial graduate MAT candidates are informed of all program requirements when they first meet with their education advisors and before they submit their application for admission to the program. Orientations are provided regarding the requirements, policies, and procedures for programs and lab experiences, and individual and group advising sessions are held. Advanced candidates are informed of the requirements in online orientation sessions designed to explain procedures for program matriculation. Information about the conceptual framework, dispositions, and program and other requirements is available on the College’s website and is discussed with candidates by their advisors and course instructors. Initial undergraduate and initial graduate MAT candidates receive a copy of the “Student Teaching/Internship Handbook” at the beginning of the student teaching or internship experience.

3) Rubrics for the key program and critical course-based assessments are shared with the candidates before they are used. Thus, candidates know what they will be assessed on, what is expected of them, and the level of proficiency associated with each scoring decision.

4) All curriculum or program changes must be submitted for approval and follow the outlined approval process, which begins with review at the program level and continues through the department, college, and university levels. One purpose of this process is to ensure that proposed changes are reviewed for fairness, accuracy, consistency, and freedom from bias.

5) The MAP/GMAP and Dispositions/Graduate Dispositions rubrics used to assess candidates are discussed with the candidates by advisors, instructors, university supervisors, and CSU Advise, and they are also shared with the cooperating teachers. Training in the use of the MAP/GMAP rubrics is provided to all full-time and part-time faculty, and program faculty participate in inter-rater reliability training to ensure consistent scoring. Rubrics used for program-specific assessments are discussed with candidates each semester by program faculty members.

Guidelines: 1) every faculty member should complete inter-rater reliability training; 2) faculty should participate in refresher workshops on the use of the rubrics at regular intervals to ensure that scoring remains consistent; 3) each candidate should be reviewed by more than one faculty member; and 4) an independent review of the scoring process should be conducted to evaluate the reliability and validity of the instruments over time.
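As an illustrative note, and not a description of the unit’s specific procedures, inter-rater agreement from such training is commonly quantified with a chance-corrected statistic such as Cohen’s kappa, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement between two raters and p_e is the agreement expected by chance. For example, if two raters assign the same rubric level on 85% of work samples and chance agreement is 60%, then κ = (0.85 − 0.60) / (1 − 0.60) ≈ 0.63, a level often characterized as substantial agreement.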

6) Data are triangulated wherever possible to enhance the reliability of findings. For example, many of the same questions are asked on the follow-up surveys and on the Center for Quality Teaching and Learning surveys for both the initial and advanced programs. Also, for the initial programs, the student teacher, cooperating teacher, and university supervisor each independently complete surveys at the end of the semester.
