Selected Improvement Plan

Rationale:
As a unit, we have selected Standard 5 as the area for improvement. This selection is based on our self-assessment as well as on the belief that quality assurance is the foundation on which everything else in our program rests. We feel we have documented that Standards 1-4 have been met.

Our self-assessment revealed some areas for improvement in our overall assessment system. These have primarily focused on ensuring that data is collected and stored in a central location for easy access by all faculty and staff; faculty have commented that specific data is not always easy to find. In addition, our self-assessment indicated some weaknesses in documenting and tracking data-based program changes. The 5.3 Program changes file shows changes made to affect candidates in a positive way as well as changes in licensure programs. However, we have not always documented that these changes are based on specific data, nor have we consistently followed up on these decisions. Unfortunately, we cannot document that 80% or more of the changes are linked back to specific data. We also need to do a better job of testing innovations and the effects of selection criteria on subsequent progress and completion. We need to ensure that data-driven changes are ongoing, are based on systematic assessment of performance and/or innovations, and result in overall positive trends of improvement for the unit, candidates, and P-12 students.

Some of the issues we face have arisen through changes in leadership within the department. From 2011 to 2015 we had three different interim department chairpersons in that four-year span, and each chair brought their own views of assessment. We now have a stable department chair who has been in the position for two years.

Goals:
  1. Continue to develop and refine the assessment system to make it as efficient as possible and to ensure that data is readily available to all faculty and staff.
  2. Ensure that all EPP assessments are valid and reliable and use the most appropriate measures to document this.
  3. Track and document data-based program changes within licensure programs and the unit as a whole.
  4. Work to ensure that all EPP assessments correspond to the CAEP Assessment Rubric.

Strategies and Timeline:

1. We developed the department database, EDMS, but we will continue to expand and refine it. This database has provided a central location for data that had previously been stored in separate folders on our secure ‘S’ drive. Staff have had training on the database; however, we want to expand the data collected and work to access these data in ways not previously possible. We need to continue work on EDMS to include the specific assessments associated with each licensure program and to include data on advanced candidates. We also need to continue work on accessing the data for reports, and we continue to work on integrating data with the WU Banner system. We have worked with the SAR (WU Strategic Analysis and Reporting) office to develop reports from the Banner system, including content test reports and data on program completers and current candidates. (A rough sketch of the kind of consolidated report we are working toward appears after this list.)

2. We are confident that our assessments are valid. We will continue to work to document that these assessments are also reliable. Our efforts have focused first on the Student Teaching Summary Evaluation and the Professional Conduct and Dispositions Evaluation, both of which are completed by mentor teachers and university supervisors. Our reliability work showed that the only issue with these assessments was the interpretation of advanced vs. target on our rubric. We have started to develop inter-rater reliability training that will be conducted with mentor teachers and university supervisors.

3. Data is collected for each licensure program each year. Four of the six assessments are common to all programs. Data on content test scores, KPTP scores, and Student Teaching Summary Evaluations (assessments 1, 3, and 4 in program reviews) are routinely collected as part of department business, and data on two of these (content test scores and KPTP scores) have already been incorporated into EDMS. One issue concerns the assessments that are specific to licensure programs (assessments 2 and 6); these still need to be addressed. We had a report for course grades (assessment 5), but it proved not to be as accessible or useful as hoped, and some courses have changed with the new program reviews, so we need to address this as well. This will require coordination with the SAR office.

4. Work has begun to ensure that EPP assessments are in line with the CAEP Assessment Rubric. We have conducted some analysis of EPP assessments and surveys. This analysis has focused on EPP assessments used for all candidates as well as those that are specific to different licensure programs.
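The sketch below illustrates, in rough terms, the kind of consolidated reporting described in strategy 1: candidate records and content test scores pulled from separate tables into a single report grouped by licensure program. It is a minimal illustration only; the table names, column names, and sample rows are placeholders and do not reflect the actual EDMS or Banner structures.

    import sqlite3

    # Minimal sketch of combining data from separate tables into one report.
    # Table names, columns, and sample rows are hypothetical placeholders,
    # not the actual EDMS or Banner schema.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.executescript("""
        CREATE TABLE candidates (
            candidate_id INTEGER PRIMARY KEY,
            licensure_program TEXT,
            completion_term TEXT            -- NULL for current candidates
        );
        CREATE TABLE content_tests (
            candidate_id INTEGER,
            test_name TEXT,
            score INTEGER
        );
        INSERT INTO candidates VALUES (1, 'Elementary', 'Spring 2017'),
                                      (2, 'Elementary', NULL),
                                      (3, 'Secondary Biology', 'Fall 2016');
        INSERT INTO content_tests VALUES (1, 'Elementary Content', 172),
                                         (3, 'Biology Content', 165);
    """)

    # Candidate counts by program (completers vs. current candidates).
    counts = {row[0]: (row[1], row[2]) for row in cur.execute("""
        SELECT licensure_program,
               SUM(completion_term IS NOT NULL),
               SUM(completion_term IS NULL)
        FROM candidates GROUP BY licensure_program
    """)}

    # Average content test score by program.
    scores = dict(cur.execute("""
        SELECT c.licensure_program, AVG(t.score)
        FROM content_tests t JOIN candidates c ON c.candidate_id = t.candidate_id
        GROUP BY c.licensure_program
    """))

    # Combine the two into a single report.
    for program, (completers, current) in sorted(counts.items()):
        print(f"{program}: {completers} completers, {current} current candidates, "
              f"avg content score {scores.get(program)}")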

Assessment Plan:
1. We will continue discussions with faculty and the professional community regarding our overall assessment system and will revisit the examination of inputs and outputs that was conducted in the fall of 2016. The effectiveness of our efforts to improve will be judged on faculty and staff input as well as on how quickly and accurately information can be obtained.

2. We feel confident that our assessments are valid, as they align well with the InTASC standards and evaluations from the state. The reliability we have computed is also good, but our self-assessment has indicated that we have not done enough to train mentor teachers and university supervisors. We will work to ensure that assessments are reliable and will continue to explore the best way to show this consistency. Reliability consistently at 80% or better will be the basis for determining whether this goal has been met (one way this might be computed is sketched after this list).

3. We meet as a unit at least twice a year to focus on assessment. These retreats provide the mechanism for us to review data and discuss changes. We will monitor these reviews and set a timeline for each proposed change so that we are sure to follow up on it (a sketch of one possible change-tracking format appears after this list). We also need to meet as a unit to discuss how best to collect and organize data specific to licensure programs and to make these data easily accessible. Content test scores and KPTP scores are entered into EDMS, which makes those data easier to access, but we need to continue our efforts to get the other assessment data into that database. We will work with UTEC to monitor P-12 and Secondary programs, and we need to work on the best way to incorporate data on the licensure-specific assessments (i.e., assessments 2 and 6) into EDMS.

4. We have started work on using the CAEP Assessment Rubric to assess the quality of our assessment forms and items. There has been discussion about the assessments with UTEC, and efforts have been initiated to ‘clean up’ our assessment instruments. The review has focused on assessments used with all candidates as well as those specific to different licensure programs. This goal will be met when all assessments are consistent with the CAEP Assessment Rubric.
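As a rough illustration of how the 80% reliability criterion in item 2 might be checked, the sketch below calculates simple percent agreement between a mentor teacher’s and a university supervisor’s ratings on the same rubric items. The rating levels and values shown are invented for the example and are not actual candidate data; other indices (e.g., Cohen’s kappa) could be substituted.

    # Minimal sketch of a percent-agreement check between two raters.
    # The ratings below are invented for illustration, not actual candidate data.
    mentor_ratings     = ["target", "advanced", "target", "target", "developing",
                          "target", "advanced", "target", "target",   "target"]
    supervisor_ratings = ["target", "target",   "target", "target", "developing",
                          "target", "advanced", "target", "advanced", "target"]

    def percent_agreement(rater_a, rater_b):
        """Proportion of items on which two raters chose the same rubric level."""
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    agreement = percent_agreement(mentor_ratings, supervisor_ratings)
    print(f"Agreement: {agreement:.0%}")  # 80% for these example ratings
    print("Meets 80% criterion" if agreement >= 0.80 else "Below 80% criterion")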
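For item 3, one simple way to document that program changes are tied to specific data and followed up on schedule is a structured change log. The sketch below shows one possible format; the field names and sample entries are hypothetical and would need to be adapted to the unit’s actual records in the 5.3 Program changes file.

    from dataclasses import dataclass
    from datetime import date

    # Sketch of a change log for data-based program decisions. Field names and
    # sample entries are hypothetical, not the unit's actual records.
    @dataclass
    class ProgramChange:
        program: str         # licensure program, or "unit-wide"
        change: str          # what was changed
        data_source: str     # assessment or data the decision was based on ("" if undocumented)
        decided: date        # date the change was approved at an assessment retreat
        follow_up_due: date  # timeline for reviewing the effect of the change

    log = [
        ProgramChange("Elementary", "Adjusted field experience sequence",
                      "Student Teaching Summary Evaluation trends",
                      date(2017, 1, 20), date(2018, 1, 20)),
        ProgramChange("unit-wide", "Revised dispositions rubric wording",
                      "", date(2017, 1, 20), date(2017, 8, 20)),
    ]

    # The figure we need to be able to document: the share of changes
    # explicitly linked to specific data (target: 80% or more).
    linked = sum(1 for c in log if c.data_source) / len(log)
    print(f"Changes linked to specific data: {linked:.0%}")

    # Changes whose follow-up review is now due.
    for c in log:
        if c.follow_up_due <= date.today():
            print(f"Follow up on: {c.program} - {c.change}")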

Resources:
Pending the CAEP review and any other concerns that may be identified, we feel the proposed changes can be handled internally with current staff. The Director of Field Experiences has been working to develop and organize training sessions for mentor teachers and university supervisors, but we may seek additional funds to help provide stipends for mentor teachers. A professor in the Computer Information Sciences department has assisted us in the development of EDMS in the past, and we anticipate that some additional time and expertise will be needed as we develop this system further, especially in regard to combining data from different data tables into single reports. The Dean’s office has been supportive of our efforts in the past, and we anticipate that this support will continue. We have a good working relationship with the Strategic Analysis and Reporting (SAR) office and with Information and Technology Services on campus. We will also initiate discussions with the Dean’s office regarding a line-item position for assessment and accreditation.




