Standard 5


The unit assessment system is divided into three parts: data on candidates as they progress through the program, data on program assessments, and data on unit assessments (see the 5.1 Assessment Overview file).

Data on candidates are organized in the Education Data Management System (EDMS). EDMS contains data on content test scores, admission data, KPTP scores, field placements, advisors, contact information, professional conduct and disposition issues, completer achievements, completer jobs, and program completers (see the 5.1 EDMS Manual file).

The unit collects and summarizes data across all CAEP standards:
Standard 1 – data on content test and PLT scores are obtained directly and electronically from ETS. These data are downloaded into our department database and into the WU Banner system. KPTP data are obtained from the state, and these scores are downloaded into the department database. Data on the assessments relevant to each licensure program are maintained on our secure ‘S’ drive.

Standard 2 – data on field placements are obtained from faculty each semester and compiled into one file for analysis. These data are downloaded into EDMS. School demographic data are generally collected every other year from the state and focus on the four school districts in our immediate area.

Standard 3 – the unit has been collecting data on candidates (admission scores and retention data) and has conducted some analysis of trends (see the 5.1 EPP Case Studies file).

Standard 4 – the unit has been collecting data on impact on P-12 student learning. We collect academic gain scores based on candidates’ KPTP data, and we collect data on a non-academic measure (the P-12 children survey). The student teaching summary evaluation, completed by university supervisors and mentor teachers, includes an item on impact, and the follow-up survey of principals of alumni and the candidate self-assessment completed by student teachers also include items about impact on learning.

In 2016 we conducted an analysis of our assessment system by examining the ‘inputs’ and ‘outputs’ of the various assessments we administer (see the 5.1 Assessment Inputs Outputs file). We compiled a list of our assessments and reviewed how that information is collected, what happens to it after it is collected, and how it is reported or summarized. While much of this was already known, it was helpful to review this process and look for ways to make sure it is as efficient and accurate as possible. Based on this review we have implemented a few changes. For example, ETS Praxis test information is now downloaded directly from ETS and input electronically into our database. These data are also much more complete than what we were getting previously.
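The electronic import step described above can be illustrated with a minimal sketch. This is a hypothetical example only: the field names, file layout, and table name are assumptions for illustration, not the actual ETS export format or the department's real database schema.

```python
# Hypothetical sketch of loading an ETS score export (CSV) into a local
# database table. Column names and file layout are assumed, not the real
# ETS file format.
import csv
import io
import sqlite3

# Sample rows standing in for a downloaded ETS Praxis score file.
SAMPLE_CSV = """candidate_id,test_code,test_name,score,test_date
1001,5001,Elementary Education,168,2016-09-15
1002,5161,Mathematics,155,2016-10-02
"""

def load_scores(conn: sqlite3.Connection, csv_text: str) -> int:
    """Insert score records from CSV text; return the number of rows loaded."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS content_test_scores (
               candidate_id TEXT, test_code TEXT, test_name TEXT,
               score INTEGER, test_date TEXT)"""
    )
    rows = [
        (r["candidate_id"], r["test_code"], r["test_name"],
         int(r["score"]), r["test_date"])
        for r in csv.DictReader(io.StringIO(csv_text))
    ]
    conn.executemany(
        "INSERT INTO content_test_scores VALUES (?, ?, ?, ?, ?)", rows
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
print(load_scores(conn, SAMPLE_CSV))  # number of score records loaded
```

A direct electronic import like this avoids the transcription errors and missing fields that hand-entered records can introduce, which is consistent with the more complete data noted above.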

Integration with the Banner system – at our NCATE visit in 2011 we were encouraged to integrate our assessments into the WU Banner system. A faculty member present at that time, who also served as interim chair of the Education Department, was a strong advocate for that approach and was able to move these efforts forward given her knowledge of and experience with Banner. However, that key faculty member left WU, and over the past several years issues have surfaced with getting information into and out of Banner. For example, to get assessment information summarized and reported out of Banner we must go through the Office of Strategic Analysis and Reporting, a process that can take considerable time. In addition, there are data our department needs to collect on candidates or programs that do not fit well into the Banner system. At present we do generate several reports from the Banner system using the Argos reporting tool: a program completers report, a report on declared majors, grade reports for licensure programs, and reports on content test scores by licensure program.

EDMS holds data on content test scores, admission (CORE/ACT) scores, admission GPAs, field placements, KPTP scores, and employment. The Banner system maintains student information including transcripts, race/ethnicity, gender, and age.

The unit has a clearly identified assessment calendar. We conduct assessment retreats twice a year; there is a Unit Assessment Committee and an advisory board, and regular UTEC meetings where data are shared and discussed (see the 5.5 Minutes of Meetings file). The Title II, PEDS, CAEP Part C, and Washburn University annual assessment reports require us to collect, analyze, and report data annually. These reports focus on completers, content test scores, and field experiences (see the 5.1 Education Elementary and Education Secondary annual WU report files). The unit submits a program review to WU every five years and annual reports each year to CAS. All data on licensure programs are maintained on the secure S drive. We conduct course evaluations each semester, an advising survey once a year, a follow-up survey of building principals, and a program completer self-assessment each semester.

The unit is required to submit a comprehensive unit program review to WU every five years. This review focuses on programs, program effectiveness, faculty, unit strengths and weaknesses, and administrative structure. The College of Arts and Sciences, where the unit is based, requires annual reports from each department. These reports address the overall condition of the unit; faculty issues, needs, concerns, and notable achievements in scholarship and service; the adequacy and availability of adjunct faculty; student issues, concerns, and notable achievements; staff issues, including the adequacy and appropriateness of staff support; finances and areas of concern in the departmental budget; department goals; and facilities.

Data on licensure programs are kept and analyzed separately. EDMS allows us to examine, by licensure program, four assessments common to all licensure programs: content test scores, the KPTP, student teaching summary evaluations, and course grades.


We have made efforts to ensure that our data are valid and reliable. The following files all show information and data regarding the validity and reliability of our assessments: 5.2 Alignment of Instruments, 5.2 Assessments Relevant, 5.2 Assessments reliability and validity, 5.2 Inter-rater reliability, and 5.2 Evidence of Disposition in InTASC.

Annual WU assessment reports focus on five Student Learning Outcomes (SLOs): content knowledge, ability to plan and implement, diversity, assessment, and impact on student learning. We report on these each June for the elementary, P-12, and secondary programs; examples of these reports are in the evidence.

Information relevant to content validity for the Professional Reference Form and the Professional Conduct and Dispositions evaluation is provided with standard 3 evidence.


We do collect and monitor data on candidates and programs including trend data for several assessments. The WU annual assessment reports, Title II, PEDS and CAEP Part C reports all require us to review and report data each year.

The following five goals were established at our last WU Program Review in 2014:

Goal 1: Design and formulate curriculum review and modification practices to ensure that learning opportunities, both formal and informal, are focused on ensuring students have the skills and knowledge to contribute to P-12 learning, including meeting the articulated college- and career-ready goals. Goal 1 is focused on curriculum and is assessed directly by student performance on external licensure exams and indirectly through surveys of graduates and school-based employers.

Goal 2: Reinvigorate and design clinical partnerships to support our mission to prepare teachers for rural, suburban, or urban teaching. Building on the research and our past practices with Professional Development Schools, we will establish significant community partnerships that will ensure the preparation of educational professionals is a shared endeavor. Goal 2 is assessed by the number of partnerships and the quality of the partnerships which will be evaluated through surveys and analysis of the diverse placements which include an analysis of mentor teacher qualifications (see standard 2 files).

Goal 3: Develop and implement strategies to recruit and retain candidates who reflect the diversity of the community in which we live. Goal 3 is assessed through an analysis of the demographics of entering cohorts, retention, and graduation rates. We have established a recruitment committee, developed a recruitment plan, and have been collecting data and monitoring the percentage of candidates from diverse backgrounds. We have conducted some research on admitted candidates and candidates from diverse backgrounds (see standard 3 files) regarding GPAs and CORE or ACT scores.

Goal 4: Design and formulate the ability to understand the impact our candidates have on P-12 student learning. This goal is assessed using employer feedback, KPTP data, Student Teaching Summary Evaluations, and candidate self-assessments. The unit has worked to expand on these data by conducting a research study with USD 501 in spring 2017.

Goal 5: Design and formulate a quality assurance system to ensure access to meaningful and substantive measures of student learning. We have an established assessment system and have conducted an analysis of where specific data comes from and what happens to it. We monitor candidate performance and unit operations.

We meet as a unit twice a year for assessment retreats where data are reviewed and discussed. This is often the point at which program changes are discussed and approved.

The 5.3 Program Changes file provides information on changes made within our program over the past several years. Unfortunately, we have not always documented these changes.

As noted in Standard 3, we conduct analysis of admissions data, including ACT scores and GPAs. We have also investigated admission of diverse candidates as a part of our work related to standard 3. The 3.2 ACT Admit Summary and the 3.2 GPA Admission Summary files provide evidence of this analysis.

We review data on candidates, programs, and unit operations at least twice a year at the assessment retreats. Data are also often shared via email and in department faculty, advisory board, and UTEC meetings. We review data in all licensure programs. Four of the six assessments associated with specific licensure programs are common across all programs (assessments 1, 3, 4, and 5). The 5.3 Program Changes file shows changes made to affect candidates and programs in a positive way. However, we need to improve this process: unfortunately, we cannot document that 80% or more of the changes are linked back to specific data. We will address this in our selected improvement plan.


We do have two sources of data regarding external benchmarks – ETS scores and KPTP data from the state (see the two 5.4 External Benchmarks files). In each instance our completers compare very well to other completers. We are also working on a study, linked to standard 4, to collect data on the impact our completers have on student learning as teachers in the USD 501 school district. Kansas does not provide these data, but the IHEs in the state have been working on a process to collect them (see KS Data files).

We make data on candidates and programs available on our department website under the Data Summaries link. We share data with our advisory board, the Unit Assessment Committee, and UTEC, and through the WU annual assessment reports. Data are shared with candidates primarily through discussions in classes.

We collect and monitor data on CAEP's eight reporting measures:
1 Impact on P-12 learning and development
We collect academic gain scores and data on non-academic measures; these are reported in Standard 4.

2 Results of completer surveys

We have trend data on the results of our candidate self-assessments completed by candidates at the end of student teaching.

3 Graduation rates

The vast majority of candidates who enroll in student teaching complete the program, although there can be a candidate or two in a given semester who do not finish student teaching. We do our best to track candidates after they complete the program, and our trend data indicate that 75% of them are in teaching positions.

4 Ability of completers to meet licensing (certification) and any additional state requirements

Standard 1 provides evidence of the pass rates on content tests. We do not require candidates to take and pass the content tests as a graduation requirement, but taking and passing the content test(s) is required to apply for and obtain a teaching license in Kansas.

5 Indicators of teaching effectiveness

Every candidate completing student teaching is required to develop and submit a teacher work sample – the Kansas Performance Teaching Portfolio. The KPTPs are scored by trained reviewers through KSDE. In addition, during student teaching candidates are evaluated by both mentor teachers and university supervisors. Both of these sources of data provide evidence of teaching effectiveness.


6 Results of employer surveys, including retention and employment milestones

We work to maintain records of employment and awards of our completers. However, this is an ongoing challenge, and we do not have good records of employment milestones.

We have conducted a follow-up survey of program completers for many years. This survey is sent to school principals based on the data we have obtained on where our completers are teaching. We do have some trend data that shows a high level of consistency in the scores obtained (see evidence 5.5).

7 Ability of completers to be hired in education positions for which they have prepared

Our analysis of the teaching jobs obtained by program completers over several semesters indicates that 75.6% of our completers have been hired for teaching positions; the most recent year was higher than this.

8 Student loan default rates and other consumer information

Data on student loan default rates are obtained from external sources. The most recent data put this rate just under 12%.

We involve partners in several different ways. We have documentation of involvement and input through the Unit Assessment Committee, advisory board, and UTEC meetings (see the 5.5 Minutes of Meetings file). We conduct a follow-up survey of principals regarding completers (see the 5.5 follow up survey data file) and have both formal and informal meetings with mentor teachers involved in methods courses or student teaching.

5 Evidence Summary
5.1 Assessment Overview and Calendar
5.1 Assessment Retreat minutes
5.1 Assessment_Calendar
5.1 Assessment Inputs Outputs
5.1 EDMS Manual
5.1 Education Elementary-Annual Assessment Report_2015
5.1 Education-Secondary&P-12-Annual Assessment Report_2015
5.1 EPP Case Studies
5.2 Assessment Validity Reliability
5.2 Assessments Relevant actionable
5.2 Evidence of Disposition Statement.In TASC
5.2 Inter Rater Reliability Summary
5.3 Program Changes
5.4 External Benchmarks ETS scores
5.4 External Benchmarks kptp
5.4 Internal Consistencies for KS Survey Scale for Alumni and Employer Survey
5.4 KS Alumni Survey InTASC Alignment
5.4 KS Educator Employer Survey Report for State of Kansas
5.4 KS Employer Survey InTASC Alignment
5.5 Follow Up Survey Results - Principals
5.5 Minutes of Meetings
