
NATIONAL FORUM OF EDUCATIONAL ADMINISTRATION AND SUPERVISION JOURNAL

VOLUME 25, NUMBER 1, 2007-2008

NCATE COLLABORATIVE
ASSESSMENTS: HOW THREE
UNIVERSITIES WORKED TOGETHER TO
PRODUCE NCATE ELCC STANDARD
ACCREDITATION ASSESSMENTS

Joseph Pacha
Illinois State University

This is the story of how three universities, working in collaboration, created common NCATE ELCC Standard assessments for determining student attainment of the standards; it also serves as the introduction to the assessments that follow. The universities that worked on these assessments make no claim that the assessments are NCATE approved, but the assessments were produced in a spirit of collaboration and are shared in that spirit. Further, universities may use these assessments in different ways depending upon the format and framework of their own NCATE assessment process.

The NCATE Process

The NCATE mission statement (2007) gives the vital reason for its existence:

NCATE is the teaching profession’s organization to help establish high quality teacher, specialist, and administrator preparation. Through the process of professional accreditation of schools, colleges, and departments of education, NCATE works to make a difference in the quality of teaching, teachers, school specialists and administrators. NCATE believes every student deserves a caring, competent, and highly qualified teacher (p. 1).

NCATE uses an accreditation process to fulfill its mission. For schools or departments that prepare educational administrators, accreditation means the institution has passed a structured accreditation review. Accreditation is a method for assessing and enhancing academic and educational quality through voluntary peer review, and it informs the public that the accredited college or university operates at a high level of educational quality and integrity. NCATE accreditation is the process by which a professional education unit is recognized by the profession as meeting national standards for the content and operation of the unit. A nationally recognized program has been approved by its specialty professional association (SPA). In addition, if a program is approved by a state whose program review process has been approved by the relevant SPA, that program will be nationally recognized (NCATE Facts, 2007). The SPA to which most institutions seeking NCATE accreditation submit their administrator preparation programs is the Educational Leadership Constituent Council (ELCC), which provides the Standards for Advanced Programs in Educational Leadership, commonly referred to as the ELCC Standards. These standards can be accessed electronically at http://www.npbea.org/ELCC/ELCCStandards%20_5-02.pdf. They govern administrator preparation programs by requiring each program to demonstrate, through an assessment and data gathering process, how its students are meeting each specific standard and standard element.

Three Universities and How They Got Together

It is this assessment process that brought together three Illinois educational administration preparation programs: Eastern Illinois University in Charleston, Illinois; Illinois State University in Normal, Illinois; and Western Illinois University in Macomb, Illinois. All three institutions are members of the Illinois Council of Professors of Educational Administration (ICPEA), an affiliate of the National Council of Professors of Educational Administration (NCPEA). It was at a meeting of ICPEA that the idea for collaborative work on NCATE assessments was born.

All three university programs are NCATE member institutions and are subject to the ELCC and NCATE accreditation process. The author had previously worked with each institution independently, giving a three-hour faculty work session on NCATE assessment development. While discussing the issues that surrounded this process at an ICPEA meeting, members of the three institutions “wondered out loud” whether some “common assessments” could be put together to help everyone meet the accreditation requirement to gather valid and reliable data for NCATE review. Creating good assessments for the NCATE process was proving to be a more difficult endeavor than previously believed. Two of the department chairs and the assistant chair from the three institutions decided that they could work together to develop assessments, scoring rubrics for the assessments, and data tables, all aligned to the ELCC Standards and standard elements, in partial fulfillment of the accreditation process. Since the assistant department chairperson (and author of this introductory article) was an NCATE program reviewer, it was determined that he should facilitate a retreat that would bring the faculty from these three institutions together to develop common assessments that could be used by the participating universities.

The Retreat Framework and Timeline

The retreat facilitator was given the assignment of creating a process that would bring faculty members together to produce quality ELCC Standard assessments, scoring guides, and data collection tables for the accreditation process. It had been decided that a retreat format would be used for the meeting, which meant the faculties would come together for a dinner meeting on Sunday evening and continue their collaborative work on the following Monday. The three university programs were able to secure funding and support for the project from the Illinois State Action for Education Leadership Program (IL-SAELP).

The facilitator designed the following format for the retreat. Sunday evening: (1) meal with a conversation about the retreat’s goals; (2) breakout work session for assessment development with a partner; (3) report-out by partners of work accomplished; and (4) review of the work agenda for the next morning. Monday: (1) review the work of Sunday; (2) in pairs, use the newly created assessments to develop scoring rubrics that align with the assessments and with the standard and standard elements being assessed; (3) report out work on the scoring guides; (4) plan for future continued work and development; and (5) celebrate successes.

The facilitator prepared several handouts for the faculty members to use in their deliberations and creative processes. Copies of the ELCC Standards were made available for review. Templates, created by the facilitator to help faculty members organize their ideas, were readied for distribution (see examples in Appendix A and Appendix B). An agenda was created and sent to the chair of each department for review prior to the retreat. All was ready.

The Retreat Work: What Happened

The retreat was to start on Sunday afternoon, and as luck would have it, it was the warmest day of the year so far. Since the air conditioning was not yet on, the meeting room was quite warm, not a good sign for the start of a working retreat whose participants were, to begin with, not really enthusiastic. One good omen was that the evening meal was excellent and ready for the group as they arrived. The faculty members took time introducing themselves to each other as everyone settled in to eat. This time of fellowship and getting to know one another soon turned into the real conversation of the meeting: namely, NCATE assessments.

Although it had not been agreed that this had to be done, about an hour of the first meeting was spent on “storming and norming,” terms often used by group facilitators to designate the level at which a group is functioning (in this instance, the lowest levels for groups). The facilitator made a tactical decision to allow some of the “venting” to occur so that most of the participants could “get it out of their system,” in the hope that better work and products would be produced. Fortuitously, one of the participants ended a long conversation by looking directly at the facilitator and asking, “What can we do about that, and what are we going to get done tonight since we have already used up over an hour just talking about things?” [Referring to an hour-long conversation about NCATE assessments and the NCATE process.] The facilitator answered, “There is nothing that we can do about the issues that are being discussed because they are policies that are out of our control; what we are going to do is start right now on the real work of the evening, to create assessments; so, everyone needs to get a partner...” With that, the real work began.

The facilitator quickly reviewed a PowerPoint presentation that the faculties had seen in their prior training sessions with the facilitator. Next, the facilitator reviewed the first template created to guide the groups’ work. This template detailed how to design assessments for each of the seven required NCATE assessments. Then faculty members were paired up, preferably with a faculty member from another university, and were asked to reflect upon an assessment they were already using; review the ELCC standards and standard elements to consider whether their assessment could assess one or more of them; and then decide how. Next, each pair was asked to commit to a standard they would work on and to develop and refine their ideas for assessing that standard and its standard elements. Fortunately, nearly all of the standards were addressed by at least one pair of faculty members working together. The facilitator gave the groups about an hour to “write out or flesh out” their assessment and how it aligned to the standard and standard elements. While the teams worked, the facilitator moved among the groups, answering questions, listening to and commenting on ideas, and giving support as the groups went about their work.

Some interesting things started to happen as the groups worked. The initial, rather stoic and somber mood did not last long. After a period of quiet sharing of ideas, the tone shifted to the positive, with laughter and quick give-and-take between participants. As the groups worked, more and more enthusiasm became evident as they realized they could do this work; their ownership and pride in the quality of their products grew, and ideas continued to flow. Having a partner to bounce ideas off of, or to help add ideas, really helped the process. No special written format was used at this point; a free flow of ideas made the connection between the standard and standard elements and the assessments being created.

The groups came back together when the facilitator checked and found that nearly everyone had finished the assigned work. Each group reported out its assessment ideas and how they related to the standard and standard elements being addressed. Some questions and other ideas were offered to each group as they reported out. Kudos and applause were given to each group for its excellent work. The enthusiasm for the work was a complete turnaround from the beginning of the retreat. Participants followed up with discussion and celebration of the good work accomplished. One participant said, “I would never have guessed that we would have gotten so much accomplished, felt so good about it, and be ready to work tomorrow, after the way we started out.”

The final things to do for this first session were to plan for Monday’s work and to get a good night’s rest. The start was successful. One of the universities had brought along a graduate student whose job it was to take everything that was created and make an electronic copy of it so that nothing was lost from the participants’ efforts and hard work. Assessments were created for standards 1 through 7. The work for Monday was to continue refining them and to create scoring rubrics for them.

Monday began with continued enthusiasm for the project. After a brief question and answer session about the logistics of some of the items, the format issue was discussed. A common framework was agreed upon, and the group was ready to start the next phase of its work: creating common scoring rubrics to determine whether candidates had truly met the standard or standard element being assessed. The facilitator continued to repeat what he had been saying on Sunday evening: “This is an alignment process; the assessments must align to the standard and standard elements, and the scoring guide must align to the assessment and to the standard and standard elements. It’s all about alignment!” The facilitator distributed another template to help with the creation of a scoring guide (see Appendix B). Using the template, each group began creating a scoring guide that aligned to its assessment and to the standard and standard elements being assessed.

Once again, the enthusiasm of the groups and the work they were able to produce were pronounced. By mid-morning, each group was ready to report out its work on its scoring guide and its alignment to the standard and standard elements. The report-out session that followed demonstrated that there is not just one way to align assessments to scoring guides and to the standards, but rather many and varied ways. Enthusiasm grew among the participants as the morning came to a close because of the amount and quality of the work produced. The graduate student took copious notes and documented the work to make sure that nothing was lost.

The retreat ended with the group setting a date, at the next ICPEA meeting, to come together again and share completed assessments, scoring guides, and data tables. This assignment would keep people on task to finalize ideas, flesh out structures, and ensure that products were complete for sharing among the member institutions that participated.

What is Next?

The ICPEA meeting was held, and group representatives shared their assessments, scoring guides, and data tables. The quality of the work continues to improve as peer review provides ideas for improvement. The process is working. Quality assessments are being created, reviewed by fellow faculty members, and made available for use by those participating in the process.

But the process did not stop at this meeting. Because of the pride of ownership and the quality of the assessments and work produced, those who worked with the process felt that it needed to be shared with others so that they could benefit from it, learn from it, and contribute to it.

The process was shared with the other college and university members at an ICPEA meeting. The meeting participants enthusiastically agreed that a standing agenda item for the group would be an NCATE assessment sharing session at each meeting to present, review, and discuss NCATE assessments, scoring guides, and data tables.

Mike Schmoker (2006) writes in his book, Results Now, “Professional learning communities have emerged as arguably the best, most agreed-upon means by which to continuously improve instruction and student performance.” The spirit of collegial learning is what we as professors of educational administration should demonstrate and instill in our students. This process is allowing us to do what we say we want our students to be doing in the field. We are creating a learning community that shares ideas and builds upon each other’s strengths. We are collegially sharing ideas about common ELCC assessments and offering suggestions for improvement. We are discussing important ideas about teaching and learning. Let’s continue.

The assessments created in collaboration by the members of the three universities follow this introductory article. They represent at least one comprehensive assessment for each of the seven ELCC standards, covering many of the standard elements contained in each standard. These assessments are shared in the spirit of collaboration in which they were produced.

REFERENCES

National Council for Accreditation of Teacher Education (NCATE). (n.d.). About us: Mission. Retrieved June 1, 2007, from http://ncate.org/public/aboutNCATE.asp
National Council for Accreditation of Teacher Education (NCATE). (n.d.). Facts about NCATE. Retrieved June 1, 2007, from http://ncate.org/public/faqaboutNCATE.asp?ch=1
National Council for Accreditation of Teacher Education (NCATE). (n.d.). Quick facts. Retrieved June 1, 2007, from http://ncate.org/public/factsheet.asp?ch=1
National Policy Board for Educational Administration (NPBEA). (2002). Standards for advanced programs in educational leadership: For principals, superintendents, curriculum directors, and supervisors. Retrieved June 1, 2007, from http://www.npbea.org/ELCC/ELCCStandards%20_5-02.pdf
Schmoker, M. (2006). Results now: How we can achieve unprecedented improvements in teaching and learning. Alexandria, VA: Association for Supervision and Curriculum Development.

Appendix A

Collaborative Work Session Assessment Template

NCATE Assessment #1

• Assesses content knowledge
• Usually is addressed by a state licensure test or professional examination
• If there is no state test or exam, substitute another assessment
• Do not duplicate another assessment already listed

NCATE Assessment #2

• Assesses content knowledge
• ELCC standards and elements to be assessed should include but not be limited to: 1.1, 1.4, 2.3, 3.2, 4.1, 4.2, 4.3, and 6.1
• Examples of assessments include comprehensive examinations, essays, case studies, etc. (A word of caution about comprehensive examinations: there must be a valid scoring guide and data table that aligns to this assessment.)

NCATE Assessment #3

• Focuses on the application of content knowledge in educational leadership
• ELCC standards and elements to be addressed could include but are not limited to: 1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, and 6.1
• Examples of assessments include: action research projects or portfolio tasks

NCATE Assessment #4

Assesses professional skills in instructional leadership

School-Based Programs:
• ELCC standards and elements could include: 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.4, 5.1, 5.2, & 5.3
• Examples of assessments: school improvement plans, needs assessment projects, or faculty intervention plans

District-Based Programs:
• ELCC standards and elements could include: 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.4, 5.1, 5.2, & 5.3
• Examples of assessments: district improvement plans, needs assessment projects, or district curriculum redesign projects

NCATE Assessment #5

• Assesses professional skills in internship/clinical practice
• ELCC standards and elements include but are not limited to: 1.3, 2.2, 2.3, 2.4, 3.1, 3.2, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, 6.3
• Example assessments include: faculty evaluations of candidate performances, internship/clinical site supervisors’ evaluations of candidates’ performances, and candidates’ formative and summative logs and reflections
• Must show required internship/clinical activities aligned to specific ELCC standard elements
• Must provide an evaluation instrument for measuring candidate proficiency on specific internship/clinical activities

NCATE Assessment #6

Assesses professional skills in organizational management and community relations

School-Based Programs:
• ELCC standards and elements could include: 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, & 6.3
• Examples of assessments: school-based strategic plans, school simulations, or school intervention plans

District-Based Programs:
• ELCC standards and elements could include: 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, & 6.3
• Examples of assessments: district-based strategic plans, district simulations, or district intervention plans

NCATE Assessment #7

• Assesses the effects on student learning
• Demonstrates candidates’ ability to support student learning and development
• ELCC standards and elements that could be addressed include but are not limited to: 1.1, 1.2, 1.4, 2.1, 2.2, 2.3, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, and 6.3
• Examples of assessments include: post-graduate surveys, employer satisfaction surveys, or community feedback surveys of candidates and graduates

NCATE Assessment #8

This is an optional assessment.

Appendix B

Collaborative Work Session Scoring Guide Template

Scoring guides are instruments that measure candidate mastery of the ELCC Standards, Standard Elements, and Standard Element Indicators.

• Scoring guides must be specific to the assessment activity or activities described.
• Examples of scoring guides include rubrics or Likert-type scale measurements. They must be able to translate levels of performance into a numeric data system. Further, what counts as proficient, or as meeting the standard element, must be clearly defined.
• Scoring guides must specifically measure the performance expectation that is “proficient” or “meets” the specific ELCC standard element(s) to which the assessment is aligned.

Example of a scoring guide framework (column headings):

ES | SE | SE Indicator | Assessment Description | Does Not Meet the Standard | Meets the Standard

ES = ELCC Standard
SE = ELCC Standard Element
SEI = ELCC Standard Element Indicator
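
As an illustration of how a scoring guide of this type might feed a numeric data system, the sketch below (not part of the original templates) tallies candidate ratings against ELCC standard elements and reports the percentage of candidates meeting or exceeding each element. The rating levels, candidate names, standard elements, and threshold shown are hypothetical examples; each program would substitute its own scoring guide levels and data table format.

# Illustrative sketch only: the rating levels, candidates, and ELCC standard
# elements below are hypothetical. It assumes a three-level scoring guide
# whose levels are converted to numbers and summarized per standard element.

LEVELS = {"does not meet": 1, "meets": 2, "exceeds": 3}

# Each record: (candidate, ELCC standard element, rating on the scoring guide)
ratings = [
    ("Candidate A", "1.1", "meets"),
    ("Candidate A", "2.3", "exceeds"),
    ("Candidate B", "1.1", "does not meet"),
    ("Candidate B", "2.3", "meets"),
]

def data_table(records):
    """Aggregate ratings into per-element counts and percent meeting or exceeding."""
    table = {}
    for _, element, rating in records:
        score = LEVELS[rating]
        row = table.setdefault(element, {"n": 0, "met": 0})
        row["n"] += 1
        if score >= LEVELS["meets"]:
            row["met"] += 1
    return {
        element: {"n": row["n"], "percent_meeting": 100 * row["met"] / row["n"]}
        for element, row in table.items()
    }

if __name__ == "__main__":
    for element, summary in sorted(data_table(ratings).items()):
        print(f"ELCC {element}: {summary['n']} candidates rated, "
              f"{summary['percent_meeting']:.0f}% meeting or exceeding the element")

In practice, the numeric values assigned to each level, the threshold that counts as “meets,” and the reporting format would follow each program’s own scoring guide and data table requirements.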
