NCATE COLLABORATIVE
ASSESSMENTS: HOW THREE
UNIVERSITIES WORKED TOGETHER TO
PRODUCE NCATE ELCC STANDARD
ACCREDITATION ASSESSMENTS
Joseph Pacha
Illinois State University
The NCATE mission statement (2007) gives the vital reason for its
existence:
NATIONAL FORUM OF EDUCATIONAL ADMINISTRATION AND SUPERVISION JOURNAL
Although it was not agreed upon in advance that this had to be done, about an hour of the first meeting was spent on “storming and norming” – terms group facilitators often use to designate the level at which a group is functioning (which in this instance was at the lowest levels for groups). The facilitator made a tactical decision to allow some of the “venting” to occur so that most of the participants could “get it out of their system,” in the hope that better work and products would result. Fortuitously, one of the participants ended a long conversation among the group by looking directly at the facilitator and asking, “What can we do about that, and what are we going to get done tonight, since we have already used up over an hour just talking about things?” [referring to an hour-long conversation about NCATE assessments and the NCATE process]. The facilitator answered, “There is nothing that we can do about the issues that are being discussed because they are policies that are out of our control; what we are going to do is start right now on the real work of the evening, creating assessments; so, everyone needs to get a partner...” With that, the real work began.
The final thing to do for this first session was to plan for
Monday’s work and to get a good night’s rest. The start was
successful. One of the universities had brought along a graduate student whose job was to collect everything that was created and make an electronic copy of it, so that none of the participants’ efforts and hard work was lost. Assessments were created for standards 1 through 7. The work for Monday was to continue refining them and to create scoring rubrics for them.
Monday began with continued enthusiasm for the project. After a brief question-and-answer session about the logistics of some of the items, the format issue was discussed. A common framework was agreed upon, and the group was ready to start the next phase of its work: creating common scoring rubrics to determine whether candidates had truly met the standard or standard element being assessed. The facilitator continued to repeat what he had been saying on Sunday evening: “This is an alignment process; the assessments must align to the standard and standard elements, and the scoring guide must align to the assessment and to the standard and standard elements. It’s all about alignment!” The facilitator distributed another template to help with the creation of a scoring guide (see Appendix B). Using the template, each group began creating a scoring guide that aligned to the assessment and to the standard and standard elements being assessed.
Once again, the groups’ enthusiasm and the quality of the work they produced were pronounced. By mid-morning, each group was ready to report out its work on its scoring guide and its alignment to the standard and standard elements. In the report-out session that followed, various techniques demonstrated that there is not just one way to align the assessments to the scoring guides and to the standards, but rather many and varied ones. Enthusiasm grew among the participants as the morning came to a close because of the amount and quality of the work produced. The graduate student took copious notes and documented the work to make sure that nothing was lost.
What is Next?
This process did not stop at this meeting. Because of the pride of ownership and the quality of the assessments and work produced, those who worked with the process felt that it needed to be shared with others so that they, too, could benefit from it and learn from it.
The process was shared with the other college and university members at an ICPEA meeting. The meeting participants enthusiastically agreed that a standing agenda item for the group would be an NCATE assessment sharing session at each meeting to present, review, and discuss NCATE assessments, scoring guides, and data tables.
REFERENCES
Appendix A
NCATE Assessment #1
• Assesses content knowledge
• Usually addressed by a state licensure test or professional examination
• If there is no state test or exam, substitute another assessment
• Do not duplicate another assessment already listed
NCATE Assessment #2
• Assesses content knowledge
• ELCC standards and elements to be assessed should include but not be
limited to: 1.1, 1.4, 2.3, 3.2, 4.1, 4.2, 4.3, and 6.1
• Examples of assessments include comprehensive examinations, essays,
case studies, etc. (A word of caution about comprehensive examinations:
there must be a valid scoring guide and data table that aligns to this
assessment.)
NCATE Assessment #3
• This assessment should focus on the application of content knowledge in
educational leadership
• ELCC standards and elements to be addressed could include but are not
limited to: 1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3,
and 6.1
• Examples of assessments include: action research projects or portfolio
tasks
NCATE Assessment #4
Assesses professional skills in instructional leadership
NCATE Assessment #5
• Assesses professional skills in internship/clinical practice
• ELCC standards and elements addressed include but are not limited to:
1.3, 3.1, 2.2, 2.3, 2.4, 3.2, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, 6.3
• Example assessments include: faculty evaluations of candidate
performances, internship/clinical site supervisor’s evaluations of
candidates’ performances, and candidates’ formative and summative logs
and reflections
• Must show required internship/clinical activities aligned to specific ELCC
standard elements
• Must provide evaluation instrument for measuring candidate proficiency on
specific internship/clinical activities
NCATE Assessment #6
Assesses professional skills in organizational management and community relations
NCATE Assessment #7
• Assesses the effects on student learning
• Assessment demonstrates candidates’ ability to support student learning and
development
• ELCC standards and elements that could be addressed include but are not
limited to: 1.1, 1.2, 1.4, 2.1, 2.2, 2.3, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3,
and 6.3
• Examples of assessments include: post graduate surveys, employer
satisfaction surveys, or community feedback surveys of candidates and
graduates
NCATE Assessment #8
Appendix B
Scoring guides are instruments that measure candidate mastery of the ELCC
Standards, Standard Elements, and Standard Element Indicators.
ES = ELCC Standard
SE = ELCC Standard Element
SEI = ELCC Standard Element Indicator