Checklist for Evaluating Key Assessments and SPA Program Reports in PRS

Choosing Appropriate Assessments

  • Each assessment (assignment or requirement) is required for all candidates in the program.
  • The assessments you have chosen are similar to or congruent with the types of assessments listed in Section IV of the SPA’s program report template as suggested assessments (e.g., course grades are now acceptable for use as one of the content assessments; all SPAs will accept grades for use as an assessment beginning in Fall 2008; see Guidelines for Using and Documenting Course Grades, NCATE, October 2007).
  • Assessments submitted for the five required assessments meet the specifications for those assessments and are submitted in the order specified in Section II of the program report form. (Note that Assessment #1 must be the state licensure exam [if there is a state licensure exam] for content assessment in the program area; thus basic skills tests required for entrance to the program, e.g., Praxis I, are not acceptable.)
  • Each SPA standard is covered by at least one assessment that provides solid and direct evidence of candidate mastery of that standard; in most cases, standards are addressed by more than one assessment.
  • Assessments are evaluated by parties other than the candidates themselves (e.g., faculty, clinical supervisors, employers). Course evaluations completed by candidates, candidate self-evaluations without additional validation, and graduate surveys are not appropriate assessments.

Assessment Instruments

  • The assessment instrument is a comprehensive document (e.g., the actual instructions given to the candidate, course descriptions as they appear in syllabi or catalogs, a sample test, or a rich, full description of the assessment) rather than a brief description or summary of the required expectation(s).
  • The assessment is specifically designed for the program area in which it is used, or
    • Includes a supplement that addresses the specific content, pedagogy, and/or professional knowledge applicable to candidates in the program, or
    • Is coded with specific SPA standard numbers along with enough information in the two-page narrative to convince reviewers that the assessment items are implemented in such a way to be SPA-specific, or
    • Uses SPA-specific rubrics.
  • The items or sub-scores in the assessment are either aligned to specific SPA standards, or the alignment is specified in Section IV of the report form (it is not sufficient to indicate alignment to state or other standards, even though the relationship to the SPA standards may be close or overlap).
  • The name of the assessment is used consistently throughout the program report.

Suggestion: Consider embedding cross-references to SPA standards in the assessment itself.

Scoring Guides

  • The scoring categories (e.g. letter grades, numeric values, rubric scales) are consistent with the assessment instrument they are designed to evaluate, and the data they provide. (OR, if data are converted to another scoring framework, an explanation of the conversion is provided.)
  • The scoring guide is aligned to the assessment, and precisely describes the difference between each scoring category. The expectation for each assessed item is clearly defined in the scoring guide and/or assessment instrument.
  • The elements in the scoring guide are clearly aligned with the standards.
  • The minimal level of competence – for both a scored item and the overall assessment – is specified somewhere in the assessment documentation (e.g. “C” in a course, the “acceptable” range on a rubric).
  • The minimal level of competence identified by the scoring guide is equivalent to the level of competence described by a standard.

Suggestion: Consider embedding cross-references to SPA standards in the scoring guide itself.

Data Tables

  • If the program report covers multiple programs, data are disaggregated by each program as much as possible.
  • Data are broken down as much as possible to show candidate performance on individual scored items (or sub-scores) within the assessment. For example, if a rubric is used to score an assessment, and the rubric has 10 elements that are rated, then data should be presented for all 10 elements.
  • Data are disaggregated by the semester/cohort/academic year represented.
  • Data are presented in terms of the scoring categories in the assessment. (Note: if data have been converted to a different scoring scale in order to aggregate data across programs, make sure to explain that in your report.)
  • All identifying information (names, IDs, Social Security numbers) has been removed from data documents.
  • The “n” in a data set is relatively consistent with candidate/completer information presented elsewhere in the report (if the “n” is substantially greater or smaller than the number of candidates in a program, provide an explanation for the discrepancy).
  • Data presentation is reader-friendly, accompanied by legends and explanatory notes as necessary; numerical values are consistently used and clearly defined.

Analysis of Data

  • Section IV of the program report reflects on the overall quality of the data, and briefly analyzes data across scored items within an assessment; semesters across which the assessment has been given; and (if applicable) across candidate groups taking the same assessment. It is clear from the description what the program has learned from the data.
  • Section V of the program report describes steps taken to address areas in which data are weak or significantly and consistently weaker than other areas (if data reveal the need for program changes). It is clear from the description that the program consistently applies data analysis in program evaluation and improvement.

Submitting the Report

Overall

  • Sections have been completed and files have been named as follows (file name – file contents):
    • “Section I narrative” – Narrative responses to the Context questions in Section I. Question 1 of the Context section has a 4,000-character limit and Question 2 has an 8,000-character limit; Question 3 also has a 4,000-character limit, but needs a response only if you are writing a report for NCTE, CEC, or NASP.
    • “Section I candidate & completers chart” – Candidate/Completers Chart at the end of Section I
    • “Section I faculty chart” – Faculty Chart at the end of Section I
    • “Section I program of study” – Program of study (added as an attachment) in Question 4
    • “Section II chart” – Chart for Section II (see example of a completed chart)
    • “Section III chart” – Chart for Section III (see example of a completed chart)
    • “Assessment 1” – Two-page narrative from Section IV for Assessment 1 and the three documents for Assessment 1 (assessment, scoring guide, and data chart)
    • “Assessment 2” – Two-page narrative from Section IV for Assessment 2 and the three documents for Assessment 2 (assessment, scoring guide, and data chart)
    • “Assessment 3” [and so on for other assessments] – Two-page narrative from Section IV for Assessment 3 and the three documents for Assessment 3 (assessment, scoring guide, and data chart)
    • “Section V” – Use of assessment results (12,000-character limit)
  • Files have been created as follows:
    • All files are created as Word, WordPerfect, Excel, or PDF files; no files have a “.docx” extension.
    • The report does not contain more than 20 attachments.
    • Each attachment is no larger than 2 MB.

Section I - Context

  • The context section addresses one item (Question 1) with a 4,000-character limit and one item (Question 2) with an 8,000-character limit. Question 3 also has a 4,000-character limit and needs a response only if a report is being written for NCTE, CEC, or NASP. The character limits include spaces.
  • The candidate/completer chart and the faculty information chart are completed within stated character limits.
  • The Program of Study submitted contains sufficient information to be a source document for the rest of the report (e.g., full course titles, number of credit hours, etc.). If expected by the SPA, you can use the five pages to include key course descriptions, program structure, required vs. elective courses, or other information that might provide a frame of reference.
  • The context section does not include extraneous information, or hyperlinks to other documents.

Section II – Assessment Chart

  • The name of each assessment (column 1) is used consistently throughout the report.
  • The administration point of each assessment (column 3) correlates to course names/numbers or program stages as they are outlined in the Program of Study.
  • The assessments are listed in the order specified in column 1 of the SPA program report form.

Section III – Standards Chart

  • At least one assessment is checked or selected for each SPA standard.

Section IV – Assessments and Findings

  • For each assessment you have included a two-page narrative description of the assessment, outlined according to the instructions for Section IV.
  • Each of the components of the assessment is clearly labeled.
  • The two-page description specifically describes the relationship between the assessment and the particular SPA standards that are cited in the chart in Section III.
  • For each of the 6-8 assessments, you have included (1) the assessment instrument, (2) the scoring guide, and (3) data or an explanation of the lack of data as one whole document. (Note: Licensure data presented as Assessment #1 does not require inclusion of the assessment instrument or scoring guide. In lieu of a description of the assessment, provide the test specifications and information on how the state test aligns with the SPA standards.)
  • As much as possible, combine all of the files for one assessment into a single file. That is, create one file for Assessment 1 that includes the two-page narrative from Section IV, the assessment itself, the scoring guide, and the data chart.

Section V – Use of Assessment Results to Improve Program

  • Section V of the program report is organized according to content knowledge; pedagogical and professional knowledge, skills, and dispositions; and effect on student learning. This section of the report does not exceed three text pages or 12,000 characters.