Table of Contents
Student Learning Outcomes at the Lesson Level............................................................................1
Student Learning Outcomes at the Course Level: From Course Objectives to SLOs.....................2
Primary Trait Analysis: Statements of Grading Criteria..................................................................4
Selecting the Assessment Method: Authentic Assessment and Deep Learning...............................6
Norming or Inter-Rater Reliability: Assuring Consistency of Grading Among Faculty.................8
The Assessment Report: Sharing the Results of Student Learning Outcomes................................8
Program Level Student Learning Outcomes....................................................................................9
Direct and Indirect Measures of Student Learning Outcomes.......................................................10
Identifying Program Competencies -- External and Internal Sources.............................................11
Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches..............11
General Education Student Learning Outcomes............................................................................13
Conclusion......................................................................................................................................14
Appendices
Appendix 1 Good Practices in Assessing Student Learning Outcomes......................................15
Appendix 2 Activity 3: Writing Student Learning Outcomes.....................................................22
Appendix 3 Developing and Applying Rubrics..........................................................................23
Appendix 4 Examples of Scoring Rubrics..................................................................................28
Appendix 5 Activities 4 & 5: Building and Using a Grading Rubric.........................................29
Appendix 6 The Case for Authentic Assessment by Grant Wiggins .........................................30
Appendix 7 -- State and National Standards, Academic & Vocational Competencies..................32
Appendix 8 Assessment Report Examples.................................................................................36
Appendix 9 Assessment Plan Examples Internet Sites...............................................................40
Appendix 10 Activity 5 Program SLOs from Competency Statements..................................41
Appendix 11 Examples of Program Assessment Reports...........................................................42
Appendix 12 General Education Student Learning Outcomes...................................................44
Appendix 13 Resources and References for Student Learning Outcomes Assessment..............52
Endnotes.........................................................................................................................................57
URL for this document: http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc
Bill Scroggins
Interim President
Modesto Junior College
scrogginsb@yosemite.cc.ca.us
Activity 1
In small groups by discipline or cluster of related disciplines, discuss how you develop grading criteria.
Do you write down your grading criteria for each assignment?
How consistent are you in applying your grading criteria?
Do you use the results of student assessment to improve your grading criteria?
Do you communicate your grading criteria to students? Before or after the assignment?
Do you encourage students to apply the grading criteria to their own work?
Do you involve students in developing or modifying your grading criteria?
Aggregating the feedback from grading student assignments can provide valuable insight into
areas in need of improvement. With all the demands on our time, we may not give adequate
attention to mining this valuable source of information for improvement of the teaching and
learning process. One of the challenges that the new accreditation standards present is creating
an assessment plan that outlines objectives, grading criteria, results of assessing student work,
and how we use those results to improve student learning.
Of course, part of improving student learning is improving the way we teach. This inevitable
outcome can potentially be threatening to faculty members. However, when these issues have
been raised in workshops with faculty, the result has generally been a serious engagement in
discussions of teaching methods to improve authentic, deep learning.iv It is extremely important
to build environments for discussing the improvement of student learning which are positive and
reinforcing. Several colleges have made explicit commitments to this principle.v (The endnote
references include approaches by Palomar College in California, College of DuPage in Illinois,
and the American Association of Higher Education.)
Activity 2
Read the following resource documents (see Appendix 1) and join in the group discussion on Good Practices for
Assessment of Student Learning Outcomes.
An Assessment Manifesto by College of DuPage (IL)
9 Principles of Good Practice for Assessing Student Learning by AAHE
Palomar College Statement of Principles on Assessment from Palomar College (CA)
Closing the Loop -- Seven Common (Mis)Perceptions About Outcomes Assessment by Tom Angelo
Five Myths of Assessment by David Clement, faculty member, Monterey Peninsula College
Student Learning Outcomes at the Course Level: From Course Objectives to SLOs
Beyond the lesson level, we must address results of student learning at the course level.
Moreover, we should do so for all sections of each course, meaning collaboration among the
full- and part-time faculty teaching the course. In stating the desired student learning outcomes,
we have the advantage of agreed-upon student objectives in the course outline.
A great deal of energy has been expended in discussing the difference between a course objective
and a student learning outcome. The difference may be clearer when viewed in the context of
producing assessment results that 1) provide useful feedback to improve the teaching and
learning process and 2) provide useful information to improve college practices. SLOs more
clearly connect with how the instructor will evaluate student work to determine if the objective
has been met. When we write an assignment, we provide a context in which the student will
respond and we evaluate the response based on criteria we use to judge if the student has met the
objective -- usually we have at least a mental construct of minimum acceptable performance
standards. These are the two additional pieces that transform an objective into an SLO. Here's
how it might work.
If course objectives have been written well, they will be complete, measurable, and rigorous. In
practice, as faculty look more closely at the criteria and methods to assess these objectives,
changes often result. To operationalize an objective for assessment purposes, that is, to
transform it into a statement of desired student learning outcomes, typically we must address:
1) the stated objectives in terms of acquired knowledge, skill or values (hopefully, the
existing course objectives),
2) the context or conditions under which the student will be expected to apply the
knowledge, skill or values, and
3) the primary traits which will be used in assessing student performance.
Below are some examples of "robust" course objectives or statements of desired student
learning outcomes. (Note that this difference is largely semantic. Some colleges have chosen to
put SLO statements in course outlines as an enhancement of the objectives, while others have
built statements of desired SLOs into a departmental assessment plan, typically related to
program review.) Whatever vehicle the college uses to operationalize course objectives to SLOs,
it must be done collaboratively among faculty who teach the course.
Examples of Course Objectives Transformed Into Student Learning Outcomes
(table excerpt; course objective column)

- Write well-organized, accurate and significant content. (English)
- Analyze behavior following the major accepted theories. (Psychology)
Activity 3
Perform the Writing Student Learning Outcomes exercise in Appendix 2. Review the first example. Then for the
second course objective, complete the Performance Context, Measurable Objective, and Primary Traits. Finally,
select an objective from a course in your discipline and construct the three-part SLO statement.
While primary traits are the categories into which we can sort competencies when we evaluate
student work, we look for specific levels of performance in each of these areas. For example, an
essay might be rated on development, organization, style, and mechanics. These primary traits
are then rated on some sort of a scale -- as simple as A/B/C/D/F or more descriptive, such as
excellent/superior/satisfactory/poor/unsatisfactory. Occasionally, points are given based on this
scale. The challenge presented by the Student Learning Outcomes process is to write down those
observable student performance characteristics in an explicit way for each of the primary traits
we have identified. This system, known as a grading rubric, can be used to grade student work
collected through all manner of assessment methods.vii
Template for a Grading Rubric:
Primary Traits and Observable Characteristics

Trait        | Excellent | Superior | Satisfactory | Poor | Unsatisfactory
-------------+-----------+----------+--------------+------+---------------
Development  |           |          |              |      |
Organization |           |          |              |      |
Style        |           |          |              |      |
Mechanics    |           |          |              |      |

Building a Rubric: start with expectations for satisfactory work for each trait, such as
Organization in the table above:
- Ideas generally related to one another and to the focus, but may have some unrelated material
- Adequate introduction and conclusion
- Some attempt at transitions
Then stretch up to excellent and down to unsatisfactory.
Rubrics can be applied analytically, by rating each primary trait separately (an analytic grading
rubric), or holistically (using the rubric as a guide to determine the overall rating of excellent,
satisfactory, or unsatisfactory -- or whatever performance levels have been agreed upon). An
example is given below.
Example: grading rubric for mathematics problem solving (excerpt showing the highest and
lowest performance levels)

Traits rated analytically: Understanding, Plan, Solution, Presentation.

Understanding, highest level: complete understanding of the problem in the problem statement
section as well as in the development of the plan and interpretation of the solution.
Understanding, lowest level (0 points): no understanding of the problem; the problem statement
section does not address the problem or may even be missing; the plan and discussion of the
solution have nothing to do with the problem.
Solution, lowest level (0 points): no solution is given.
Analyzed holistically, lowest level: all of the following characteristics must be present: answer
is incorrect; explanation, if any, uses irrelevant arguments; no plan for solution is attempted
beyond just copying data given in the problem statement.
Grading rubrics can be applied to a wide variety of subjects and used in association with a range
of assessment techniques. (See the endnote on rubrics for references to good practices for using
rubrics and for a range of examples of rubrics at a variety of colleges and across several
disciplines.)
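The analytic/holistic distinction can be made concrete with a small sketch. This is illustrative only: the trait names follow the template above, but the point scale and the averaging rule for the holistic rating are hypothetical choices, not drawn from any college's actual rubric.

```python
# Illustrative sketch: scoring one student essay with an analytic rubric.
# Traits mirror the template above; the point values (4..0) are hypothetical.

LEVELS = {"excellent": 4, "superior": 3, "satisfactory": 2,
          "poor": 1, "unsatisfactory": 0}
TRAITS = ["development", "organization", "style", "mechanics"]

def analytic_score(ratings):
    """Sum the level points across all primary traits."""
    return sum(LEVELS[ratings[t]] for t in TRAITS)

def holistic_rating(ratings):
    """Use the rubric as a guide to a single overall rating:
    here, simply the level closest to the average trait score."""
    avg = analytic_score(ratings) / len(TRAITS)
    return min(LEVELS, key=lambda name: abs(LEVELS[name] - avg))

essay = {"development": "satisfactory", "organization": "superior",
         "style": "satisfactory", "mechanics": "poor"}

print(analytic_score(essay))   # 2 + 3 + 2 + 1 = 8
print(holistic_rating(essay))  # average 2.0 -> "satisfactory"
```

An analytic application reports the per-trait detail, which is the more useful feedback for students; the holistic application collapses the same rubric to a single overall judgment.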
Before doing these two activities on rubrics, read Developing and Applying Rubrics by Mary
Allen in Appendix 3. If possible, review some of the sample rubrics listed in Appendix 4.
Activity 4: Building a Rubric
Using the grid in Appendix 5A, select or write an SLO, identify Primary Traits, and then decide
on observables for each assessment level.
Activity 5: Using a Grading Rubric and Norming the Results
Use the English rubric in Appendix 5B to grade the sample student essay in Appendix 5C.
Compare your results with colleagues who graded the same paper. Where were your assessments
different? Can you come to agreement on the overall rating of the paper?
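After a norming session, it can help to quantify how consistently the group is grading. The sketch below is illustrative: the raters and scores are made up, and simple percent agreement stands in for more formal inter-rater reliability statistics.

```python
# Illustrative sketch: percent agreement among faculty who graded the same
# set of papers with a shared rubric. Raters and scores are hypothetical.
from itertools import combinations

ratings = {                     # rater -> score per paper (same paper order)
    "rater_a": [4, 3, 2, 4, 1],
    "rater_b": [4, 3, 3, 4, 1],
    "rater_c": [3, 3, 2, 4, 2],
}

def percent_agreement(scores):
    """Fraction of (paper, rater-pair) comparisons with identical scores."""
    matches = total = 0
    for a, b in combinations(scores.values(), 2):
        for x, y in zip(a, b):
            total += 1
            matches += (x == y)
    return matches / total

print(round(percent_agreement(ratings), 2))  # 9 of 15 comparisons agree: 0.6
```

More rigorous measures (for example, Cohen's kappa) correct for chance agreement, but a simple percent-agreement figure is often enough to focus a norming discussion on the papers where raters diverge.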
To this point we have discussed stating the desired student learning outcome and developing a
grading rubric. These are the beginning steps that can lead us toward collecting and using the
results of measured student learning outcomes. A road map of a possible SLO Assessment Plan
is shown in the diagram below.
Faculty Collaboration (spanning every step):

Course Objective -> Context or Conditions -> Primary Traits -> Observables for Each
Performance Level -> Assessment Method Selected -> Norm Among Instructors -> Evaluate
Student Work -> Compile Results -> Use Feedback for Improvement

(The objective, conditions, traits, and observables together form the Grading Rubric; the
evaluated results feed an Assessment Report compiled for each desired SLO.)
Activity 6
Read the article The Case for Authentic Assessment by Grant Wiggins in Appendix 6. Discuss the
assessment methods you use in your classes. What methods do you use? How effective do you find them?
Activity 7
View the film A Private Universe.x Discuss the implications for producing and assessing deep learning.
As I have listened to faculty discuss assessment methods (at six statewide California Assessment
Institutes, eight regional RP/CAI workshops, and our own college's summer institute on SLOs), I
have come to several conclusions:
- Faculty are eager to talk about the challenges they experience in assessing students.
- Discussions often turn to "great stuff" such as authentic assessment and deep learning.
- Most faculty use a rather narrow range of methods -- but use them well.
- Faculty will more often try another assessment technique if recommended by a colleague.
- Many faculty use assessments that need just slight enhancement to yield SLO results.
One department has a clinical component in which students are evaluated using a rating
sheet on their hands-on competencies. The department has complained about needing
feedback from clinical to the theory courses, but has not consistently used the results of
the rating sheets for this purpose. The competencies taught in the theory course are fairly
well aligned with those assessed in clinical but could be improved.
Faculty in one of the social science departments have worked on departmental standards
for term papers to the point of a primary trait analysis and meet regularly to discuss
grading of term papers but have not filled in the observables to establish a rubric.
The English department has a grading rubric for written essays, and full- and part-time
faculty have regular norming sessions to improve consistency of grading, but the system
has only been used for two courses, freshman comp and its prerequisite.
Based on these observations, my recommendation is to start with these good things that faculty
are doing, get them engaged in talking about grading (Effective Grading: A Tool for Learning
and Assessment by Barbara Walvoord and Virginia Anderson has been great for this), get faculty
to share assessment strategies with one another -- especially across disciplines -- and provide the
support for moving these good existing assessment practices to the next level.
Results often contradict our assumptions of how and what students learn.
The key components of the Assessment Plan are the student learning outcomes statements and
the assessment methods used for each. The plan often includes benchmarks that indicate the
incremental gains expected in the assessment results. The essential features of the Assessment
Report are a summary of the results of student evaluations, an analysis of those findings, and a
summary of the actions taken to improve the student assessment performance. The diagram
below summarizes the elements of an effective Assessment Plan and the resulting Assessment
Report. Examples of Assessment Reports are shown in the Appendix.
Course Assessment Report
Department __________________________________
Term & Year __________________
Course Name and Number________________________________________________________
Student Learning Outcome Statements
1.
2.
3.
Assessment Method Description (attach rubric)
1.
2.
3.
Assessment Results
1.
2.
3.
Program Level Student Learning Outcomes
The term "program" here refers to core required courses for occupational programs and lower
division major preparation for transfer programs. Many professional societies have standards or
competencies that can be used as the basis for program level SLOs. (Some examples are
referenced in the endnotes and summarized in the Appendix.) Often, however, these
competencies are in the form of discrete skills rather than more global outcomes that would lend
themselves to summaries of student learning by those who have completed those programs. An
example of aggregating detailed standards into more comprehensive SLO statements is this
sample taken from the American Psychological Association.xii
Example of Aggregation of Specific Program Competencies into a Program Student Learning Outcome
Global Student Learning Outcome: Use critical thinking effectively.
Specific Competencies:
a. Evaluate the quality of information, including differentiating empirical evidence from speculation and the
probable from the improbable.
b. Identify and evaluate the source, context, and credibility of information.
c. Recognize and defend against common fallacies in thinking.
d. Avoid being swayed by appeals to emotion or authority.
e. Evaluate popular media reports of psychological research.
f. Demonstrate an attitude of critical thinking that includes persistence, open-mindedness, tolerance for
ambiguity and intellectual engagement.
g. Make linkages or connections between diverse facts, theories, and observations.
From Undergraduate Psychology Major Learning Goals And Outcomes: A Report, American Psychological Association, March 2002xii
Direct and Indirect Measures of Student Learning Outcomes

Indirect measures are often thought of as outputs: course completions, degrees, certificates, and
transfers for example. These are the institutional measures of accountability measured by the
California Community Colleges Partnership for Excellence initiative. These measures are often
key indicators of success for a program, as exemplified below.
Example of the Use of Direct and Indirect Measures of Student Learning
From Oklahoma State University: http://www.okstate.edu/assess
(table of three direct measures and one indirect measure)
(Program: a sequence of courses leading to an educational goal in accord with the mission of the
California Community Colleges: transfer, associate degree (both of which have major and
general education components), certificate, basic skills, or workforce skill upgrades.)
Identifying Program Competencies -- External and Internal Sources

- The American Welding Society publishes welding codes and standards on which an extensive
AWS curriculum is based. Many community colleges give students AWS certification tests based
on these competencies.
- The California Board of Registered Nursing uses standards of competent performance and tests
nursing applicants for licensure in many nursing fields.
- The American Psychological Association recently published Undergraduate Psychology
Learning Goals and Outcomes, which lists both global student learning outcomes and detailed
competencies for both the psych major and liberal studies students.
- The California State Board of Barbering and Cosmetology tests graduates for licensure based
on curriculum standards enacted in Title 16 of the California Code of Regulations.
Links to these and other competencies and standards are found in the Appendix. While an
individual program may not teach to all the outcomes that these groups specify, the lists are an
excellent starting point. Not all programs have industry associations or professional societies
who write standards. Such programs may need to consult local vocational advisory committees
or faculty colleagues at neighboring institutions.
Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches
The Mosaic Approach. Assessment of program-level student learning outcomes can be
approached by assessing either detailed competencies or more global program learning goals.
(Look again at the American Psychological Association example above for the distinction between a global
SLO statement and its detailed competencies.) Assessing detailed competencies views the
acquiring of knowledge, skills and attitudes as taking place rather like assembling a complex
mosaic from individual colored tiles. It is a more analytical model and provides more targeted
information about student learning. However, the extent of the effort to find authentic
assessments for a large number of mosaic competencies, get agreement among program faculty
on those assessments, construct rubrics, norm on samples of student work, and then collect and
analyze the data may stretch program resources to the breaking point. Furthermore, the
acquisition of small, discrete packets of knowledge may not lead the student to acquire a more
integrated understanding that provides needed applicability to the next step in that student's
career, be it transfer or directly entering the job market. Consequently, more holistic assessments
are often preferred, such as capstone courses or internships.
The Program Audit. Even if an integrated assessment is used at the end of the program, it is
useful to identify where in the curriculum each SLO (or even individual competency) is
acquired. Furthermore, learning most often occurs in cycles: the student will be exposed to a
topic, then later gain competency in that area, and finally master that skill. Doing a program
audit of exactly where SLOs and/or competencies are introduced, reinforced, and mastered in the
program course offerings is a useful exercise. A template for such a program audit is shown
below. Several colleges use such a model to connect individual course learning outcomes
statements with the more global program level learning outcomes statements.
Curriculum Audit Grid: Identifying Specific Competencies in the Program Mosaic
(Grid columns are course numbers 201, 202, 205, 207, 251, 260, 313, 314, 320, 425; for each
outcome, cells are marked I = Introduced, E = Emphasized, or R = Reinforced.)

Outcomes:
1. Recognize and articulate approaches to psychology. (marked I, E, R in successive courses)
2. Independently design valid experiments. (marked I, E, R in successive courses)
3. Articulate a philosophy of psych/Christian integration.

Example from A Program Guide for Outcomes Assessment by Geneva College, April
2000: http://www.geneva.edu/academics/assessment/oaguide.pdf
Example of a Program Assessment Report (table excerpt)
Intended Outcomes (Objectives):
1. Desktop Graphics program revised to include more experience in Web site graphics. Students
designed graphics for current MC home page and links.
3. U of I Coordinator of Transfer Articulation reported that out of 29 applicants from other
schools to Graphics, a Mass Com student was the only admit.
3. Continue to gather/monitor data. Investigate how many Parkland Graphics students applied.
- Assemble the outlines of record for the core courses in the discipline.
- List the course objectives for all of these courses, preferably after having revised them to
"robust" objectives/student learning outcomes as described previously.
- Identify which course objectives match with each Program SLO statement. Present the results
in a table format like that above. You may wish to categorize each course objective by the
extent to which it moves students toward mastery of the Program SLO.
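The matching step can also be recorded programmatically as a curriculum audit grid. A sketch under stated assumptions: the course numbers, outcome wording, and I/E/R placements below are hypothetical examples, not taken from any actual program.

```python
# Illustrative sketch: a curriculum audit grid recording where each program
# SLO is Introduced (I), Emphasized (E), or Reinforced (R).
# Courses, outcomes, and placements are hypothetical.

courses = ["201", "202", "251", "313", "425"]
audit = {
    "Recognize and articulate approaches to psychology":
        {"201": "I", "251": "E", "425": "R"},
    "Independently design valid experiments":
        {"202": "I", "313": "E", "425": "R"},
}

def grid_rows(audit, courses):
    """Yield one printable row per outcome, with I/E/R under each course."""
    for outcome, marks in audit.items():
        cells = [marks.get(c, "-").center(5) for c in courses]
        yield f"{outcome[:40]:40s} {''.join(cells)}"

header = " " * 40 + " " + "".join(c.center(5) for c in courses)
print(header)
for row in grid_rows(audit, courses):
    print(row)
```

A row with no I, E, or R marks would flag a program SLO that no course currently addresses, which is exactly the gap a program audit is meant to expose.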
Program Level TLC (diagram): a six-step cycle -- Identify Direct Measures; Identify Indirect
Measures; Implement Program; Collect Assessment Results; Disseminate & Reflect on Results;
Decide on Program Improvements.
General Education Student Learning Outcomes

California Community Colleges have three sets of general education patterns to offer to students:
the associate degree pattern set by Title 5, the CSU GE-Breadth pattern, and IGETC. While these
patterns are similar, they have significant differences. The competency statements found in the
source documents for CSU GE-Breadth and IGETC can be a useful starting point for colleges
beginning the process of constructing SLO statements for general education categories.
General Education Patterns Available to Students (Merced College Example)
(Patterns compared: Merced College AA; CSU GE-Breadth; IGETC)

Merced College AA areas:
A. Language & Rationality
B. Natural Sciences
C. Humanities
D. Social & Behavioral Sciences
E. Lifelong Understanding & Self-Development

For more information refer to CSU Executive Order 595 and IGETC Notes 1, 2 and 3.
Activity 10 Writing Global SLO Statements with Specific Competencies for Each GE Area
- Review the models of general education student learning outcomes in Appendix 12 (assumes
knowledge of CCC GE, CSU GE-Breadth and IGETC patterns).
- For each college GE area, write a global student learning outcome statement.
- For each college GE area, write specific competency SLO statements under each of the global
SLO statements.
Activity 11 Performing a General Education Program Audit
- Assemble the outlines of record for the courses approved in each GE area.
- List the course objectives for all of these courses, preferably after having revised them to
"robust" objectives/student learning outcomes as described previously.
- Identify which course objectives match with each GE SLO statement. Present the results in a
table format like that discussed previously. You may wish to categorize each course objective
by the extent to which it moves students toward mastery of the AA/AS GE SLO: I =
Introduced, E = Emphasized, or R = Reinforced.
Conclusion
In presenting preliminary findings to be published in an upcoming monograph, Jack
Friedlander, Executive Vice President of Santa Barbara City College, concluded that most
colleges around the country are still at the process level of developing SLOs. Nevertheless, there
are many examples of excellent work on SLOs at colleges around the country, summarized in
Appendix 13. These examples offer colleges that are new to the Student Learning Outcomes
process the shared experience of their colleagues, making the learning curve easier to climb.
The climate in education today simply will not allow us to expend valuable time and energy on
a process that will not yield useful results. Such results have the potential to allow faculty and
others to engage in reflection about the process of teaching and learning and then use the
insights they develop to adjust the teaching-learning-assessment process to optimize learning to
the full extent possible. By having a clear path to those results, we can move ahead with taking
the first few steps. But we need to keep our eye on the goal as we're walking. Remember,
"utility" can quickly become "futility" by adding a few f's!
1. Assessment should be based on an understanding of how students learn. Assessment should play a
positive role in the learning experiences of students.
2. Assessment should accommodate individual differences in students. A diverse range of assessment
instruments and processes should be employed, so as not to disadvantage any particular individual or
group of learners. Assessment processes and instruments should accommodate and encourage
creativity and originality shown by students.
3. The purposes of assessment need to be clearly explained. Staff, students, and the outside world need
to be able to see why assessment is being used, and the rationale for choosing each individual form of
assessment in its particular context.
4. Assessment needs to be valid. By this, we mean that assessment methods should be chosen which
directly measure that which it is intended to measure, and not just a reflection in a different medium
of the knowledge, skills or competences being assessed.
5. Assessment instruments and processes need to be reliable and consistent. As far as is possible,
subjectivity should be eliminated, and assessment should be carried out in ways where the grades or
scores that students are awarded are independent of the assessor who happens to mark their work.
External examiners and moderators should be active contributors to assessment, rather than observers.
6. All assessment forms should allow students to receive feedback on their learning and their
performance. Assessment should be a developmental activity. There should be no hidden agendas in
assessment, and we should be prepared to justify to students the grades or scores we award them, and
help students to work out how to improve. Even when summative forms of assessment are employed,
students should be provided with feedback on their performance, and information to help them
identify where their strengths and weaknesses are.
7. Assessment should provide staff and students with opportunities to reflect on their practice and their
learning. Assessment instruments and processes should be the subject of continuous evaluation and
adjustment. Monitoring and adjustment of the quality of assessment should be built into quality
control processes in universities and professional bodies.
8. Assessment should be an integral component of course design, and not something bolted on
afterwards. Teaching and learning elements of each course should be designed in the full knowledge
of the sorts of assessment students will encounter, and be designed to help them show the outcomes of
their learning under favorable conditions.
9. The amount of assessment should be appropriate. Students' learning should not be impeded by being
driven by an overload of assessment requirements, nor should the quality of the teaching conducted
by staff be impaired by excessive burdens of assessment tasks.
10. Assessment criteria need to be understandable, explicit and public. Students need to be able to tell
what is expected of them in each form of assessment they encounter. Assessment criteria also need to
be understandable to employers, and others in the outside world.
Appendix 1A
Appendix 1B
We will not use assessment in a way that will impinge upon the academic freedom or
professional rights of faculty. Individual faculty members must continue to exercise their
best professional judgment in matters of grading and discipline.
We will not assume that assessment can answer all questions about all students. We need
not directly assess all students in order to learn about the effectiveness of our programs
and policies.
We will not assume that assessment is quantitative. While numerical scales or rubrics
(such as the four-point grading scale) can be useful, their accuracy always depends on the
clear understanding of the concepts behind the numbers. Often the best indicator of
student learning can be expressed better as a narrative or a performance than as a number.
We will not use assessment only to evaluate the end of the student's experience or merely
to be accountable to outside parties. Assessment must be ongoing observation of what we
believe is important.
We will not assume that assessment is only grading.
Appendix 1C AAHE Nine Principles of Good Practice for Assessing Student Learning
5. Assessment works best when it is ongoing, not episodic. Assessment is a cumulative
process; the point is to monitor progress toward intended goals in a spirit of continuous
improvement. Along the way, the assessment process itself should be evaluated and refined in
light of emerging insights.
6. Assessment fosters wider improvement when representatives from across the
educational community are involved. Student learning is a campus-wide responsibility,
and assessment is a way of enacting that responsibility. Thus, while assessment efforts
may start small, the aim over time is to involve people from across the educational
community. Faculty play an especially important role, but assessment's questions can't be
fully addressed without participation by student-affairs educators, librarians,
administrators, and students. Assessment may also involve individuals from beyond the
campus (alumni/ae, trustees, employers) whose experience can enrich the sense of
appropriate aims and standards for learning. Thus understood, assessment is not a task for
small groups of experts but a collaborative activity; its aim is wider, better-informed
attention to student learning by all parties with a stake in its improvement.
7. Assessment makes a difference when it begins with issues of use and illuminates
questions that people really care about. Assessment recognizes the value of
information in the process of improvement. But to be useful, information must be
connected to issues or questions that people really care about. This implies assessment
approaches that produce evidence that relevant parties will find credible, suggestive, and
applicable to decisions that need to be made. It means thinking in advance about how the
information will be used, and by whom. The point of assessment is not to gather data and
return "results"; it is a process that starts with the questions of decision-makers, that
involves them in the gathering and interpreting of data, and that informs and helps guide
continuous improvement.
8. Assessment is most likely to lead to improvement when it is part of a larger set of
conditions that promote change. Assessment alone changes little. Its greatest
contribution comes on campuses where the quality of teaching and learning is visibly
valued and worked at. On such campuses, the push to improve educational performance
is a visible and primary goal of leadership; improving the quality of undergraduate
education is central to the institution's planning, budgeting, and personnel decisions. On
such campuses, information about learning outcomes is seen as an integral part of
decision making, and avidly sought.
9. Through assessment, educators meet responsibilities to students and to the public.
There is a compelling public stake in education. As educators, we have a responsibility to
the publics that support or depend on us to provide information about the ways in which
our students meet goals and expectations. But that responsibility goes beyond the
reporting of such information; our deeper obligation -- to ourselves, our students, and
society -- is to improve. Those to whom educators are accountable have a corresponding
obligation to support such attempts at improvement.
Authors: Alexander W. Astin; Trudy W. Banta; K. Patricia Cross; Elaine El-Khawas; Peter T. Ewell; Pat Hutchings;
Theodore J. Marchese; Kay M. McClenney; Marcia Mentkowski; Margaret A. Miller; E. Thomas Moran; Barbara D.
Wright
This document was developed under the auspices of the AAHE Assessment Forum with support from the Fund for
the Improvement of Postsecondary Education with additional support for publication and dissemination from the
Exxon Education Foundation. Copies may be made without restriction.
1. We're doing just fine without it. (Assessment is medicine only for the sick.)
2. We're already doing it. (Assessment is just old wine in new bottles.)
3. We're far too busy to do it. (Assessment is an administrivial burden.)
4. The most important things we do can't/shouldn't be measured. (Assessment is too reductive and
quantitative.)
5. We'd need more staff and lots more money to do assessment. (Assessment is too complex and
expensive.)
6. They'll use the results against us. (Assessment is a trick or a Trojan horse.)
7. No one will care about or use what we find out. (Assessment is a waste of time.)
Seven Reasonable Responses to Those (Mis)Perceptions
1. We're doing just fine without it.
Okay, then let's use assessment to find out what works, and to help us document and build on
our successes.
2. We're already doing it.
Okay, then let's audit all the assessments we already do to discover what we know and what we
don't.
3. We're far too busy to do it.
Okay, but since we're already doing it, let's use assessment to see where and how we can save
time and effort.
4. The most important things we do can't/shouldn't be measured.
And not everything measurable should be measured, but let's see if we can agree on how we can
tell when we're succeeding in these most important things.
5. We'd need more staff and lots more money to do assessment.
Since we're unlikely to get more resources, how, what, and where can we piggyback, embed,
and substitute?
6. They'll use the results against us.
They might. So, let's build in strong safeguards against misuse before we agree to assess.
7. No one will care about or use what we find out.
To avoid that, let's agree not to do any assessments without a firm commitment from
stakeholders to use the results.
Seven Transformative Guidelines for Using Assessment to Improve Teaching and Learning
1. Build shared trust. Begin by lowering social and interpersonal barriers to change.
2. Build shared motivation. Collectively determine goals worth working toward and problems
worth solving, and consider the likely costs and benefits.
3. Build a shared language. Develop a collective understanding of new concepts (mental models)
needed for transformation.
4. Design backward and work forward. Design backward from the shared vision and long-term
goals to develop coherent outcomes, strategies, and activities.
5. Think and act systematically. Understand the advantages and limitations of the larger system(s)
within which we operate and seek connections and applications to those larger worlds.
6. Practice what we preach. Use what we have learned about individual and organizational learning
to inform and explain our efforts and strategies.
7. Don't assume, ask. Make the implicit explicit. Use assessment to focus on what matters most.
Appendix 1D
Appendix 1E
Performance Context: Given specifications and materials requiring a weld,
Measurable Objective: evaluate the performance needs and match the welding method to the
required application.
Appendix 2
Appendix 3.1
Online Rubrics
For links to online rubrics, go to http://www.calstate.edu/acadaff/sloa/. Many rubrics have
been created for use in K-12 education, and they can be adapted for higher education. It's often
easier to adapt a rubric that has already been created than to start from scratch.
Rubrics have many strengths:
Complex products or behaviors can be examined efficiently.
Developing a rubric helps to precisely define faculty expectations.
Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, "Did the student
meet the criteria for level 5 of the scoring rubric?" rather than "How well did this student do
compared to other students?"
Ratings can be done by students to assess their own work, or they can be done by others, e.g.,
peers, fieldwork supervisors, or faculty.
Rubrics can be useful for grading, as well as assessment.
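Because rubrics are criterion-referenced, each criterion is rated independently against its own scale and the ratings are then totaled. A minimal sketch of that idea follows; the criterion names and maximum point values here are hypothetical, not taken from any rubric in this handbook.

```python
# Illustrative sketch: representing a simple analytic rubric as a mapping from
# criterion to its maximum points, and totaling one student's ratings.
# Criteria and point ranges are hypothetical.

rubric = {
    "Organization": {"max_points": 8},
    "Content": {"max_points": 13},
    "Style": {"max_points": 9},
}

def total_score(ratings):
    """Sum per-criterion ratings, checking each against the rubric's maximum."""
    for criterion, points in ratings.items():
        maximum = rubric[criterion]["max_points"]
        if not 0 <= points <= maximum:
            raise ValueError(f"{criterion}: {points} outside 0-{maximum}")
    return sum(ratings.values())

print(total_score({"Organization": 7, "Content": 11, "Style": 8}))  # → 26
```

The point of the structure is that a rater never compares students to each other, only each rating to its criterion's scale, which is what makes rubric scores criterion-referenced rather than norm-referenced.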
Organization
Content
Style
(0-2) The content is inaccurate or overly general. Listeners are unlikely to learn anything or may
be misled.
(3-5) The content is generally accurate, but incomplete. Listeners may learn some isolated facts,
but they are unlikely to gain new insights about the topic.
(5-7) The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners
are sometimes ignored or misunderstood.
(0-2) The speaker appears anxious and uncomfortable, and reads notes rather than speaks.
Listeners are largely ignored.
Total Score
Appendix 3.2
carefully organized and provides convincing evidence to support conclusions. (6-8)
The content is accurate and complete. Listeners are likely to gain new insights about the
topic. (10-13)
The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts
effectively with listeners. (7-9)
Score
Appendix 3.3
Exceptional: Evidence demonstrates that the student has mastered this objective at a high level.
Appendix 3.4
Appendix 3.5
Appendix 4
Excellent
Satisfactory
Unsatisfactory
Score
Total:
Appendix 5
Style/Voice
Organization
Development
Excellent - Markedly Exceptional
Superior - Clearly Above Average
A thorough grasp of the subject
matter is demonstrated
Focus is clear and thoughtful
Body is generally supported by
facts, examples, etc. though
support will not be as varied or
vivid as in an excellent paper
Demonstrates understanding of
audience and purpose, though
may occasionally stray from it
Appendix 6
Satisfactory - Fully Competent
A basic grasp of the subject
matter is demonstrated
Focus is generally adequate
but may not be immediately
clear to all readers
Response to the assignment
is generally adequate
Body supported by facts,
examples, and details, but these
are mainly surface-oriented and
generalized
Demonstrates only some
understanding of audience
and purpose
Ideas generally related to
one another and to the
focus, but may have some
unrelated material
Adequate introduction and
conclusion
Some attempt at transitions
Voice adequate to
audience/purpose, but often
is predictable
May be slight
inconsistencies of tone
Predictable word choice;
low range of synonyms
employed
Sentences mechanically
sound but lack variety
Poor - Marginally Acceptable
Failing - Unacceptable
A basic lack of
understanding of the
subject matter is
demonstrated
Focus is not evident
Body largely
unsupported by
relevant facts or
examples
Demonstrates no
understanding of
audience/purpose
Minimal organization;
inappropriate or no
paragraphing
Ineffective or missing
introduction and
conclusion
Minimal or no use of
transitions
Voice/style not
possible due to severe
mechanical problems
Mechanics
Simplistic or
incoherent sentences
outweigh intelligible
sentences
Diction often
inaccurate or severely
limited vocabulary
Mechanical errors
predominate
English 101
10-13-03
Essay #2: Taking a Stand: High School Exit Exam
High school is stressful considering issue such as: peer pressure, the struggle of passing classes,
and trying to maintain a high Grade Point Average. Most students are desperately trying to keep
themselves a float in this society of raging waters. They feel they cannot handle anything else. For many
of them can hardly carry what they have already. Now students have one more burden to carry and that is
the high school exit exam.
Learning contains many key principles, but the most basic of all is desire. The students have to
have a passion to learn. Many argue this exam hurts the underprivileged such as minorities and low-income families, but is this true (Burke, "Exit Exam" B5). The greatest hindrance that keeps most
students from learning is problems of drugs, alcohol, and domestic violence. These problems are found
both in the homes of the rich, the poor and almost of any ethnic background. There isn't any good reason
why a student, that doesn't have any disabilities or language barriers, should have problems learning.
Students that have such problems concerning language barriers and disabilities should be provided
programs that will steadily prepare them for the exit exam. High School students should be made to take
the exit exam to make sure they are progressing, and that they have basic skills to survive life, to get good
jobs or to pursue careers. There shouldn't be a student left behind, not knowing their basic skills of
reading, writing, and math.
During high school, the greatest amount of progression should be made above any other time in
grade school, and this can only be done with the help of our school system. What is done between grades
9th to 12th does matter, for whatever they learn between these grades they will probably carry with them
for the rest of their lives. The teachers should help the students progress by fully explaining what goals
they want the students to meet. They should push the student to think: always getting them involved in
every class project and discussion. There needs to be an interaction between teachers and students. The
class should never look bored and stagnant. There is a great need for open communication between
teachers and students. Students should be able to come to the teacher if they have any trouble with the
assignment or any other issues pertaining to any of their educational needs.
If they are planning on giving an exam; that test high school students' abilities; the schools should
fully prepare the teachers and the students. Teachers should be made to teach all the materials that will be
on the exams year around for the full four years of high school. Students should be tested every year, so
they can see where they need to progress for the upcoming year. This will be helpful to both the teacher
and the student. CNN Student News, center director Jack Jennings said, "You have to provide a system to
help kids succeed...These test are a good idea if they're done right" (Jennings qtd. in "States stick"). We
cannot just drop an exam on students' laps and expect that they take it if we don't fully prepare them. No
part of the exam should be a mystery to them; it should all be review. Students, on the other hand, should
be made accountable for what they learn. They should study often. This exam is supposed to test what
they have learned during these past four years of their lives. If we go about this the right way, this exam
should be like any other test for the student.
This exam should be taken so that the student will have the basic skills to survive life. Everyday,
if we realize it or not, we are surrounded by writing, reading, and mathematics. For example, anytime we
go to the store we use math, whether it is for calculating 30% off of item on sale or giving and receiving
money from the cahier. Another example is the ability to read or write, and its important usage for the
voter in an election. Its importance is beyond our reasoning, for we really have to know what we are
reading, when it has to do with drafting in different laws. Everyday we are surrounded by these
obscurities that call for basic skills, skills that may look non useful, but one-day students will need.
Once students graduate from high school, that's when life really begins. They will most likely use
all they learned in high school, in college and even after that in the work place. All students will need
these basic skills of reading, writing, and math in their jobs and also in whatever career they decide to
pursue. The whole point of the exam is to encourage students to progress, so they won't feel lost and
confused, when they graduate and try to find a job or seek a profession.
The high school exit exam shouldn't even be a debate, if it's just basic material that high school
students should already know. David Cooper, director of secondary education for Modesto City Schools,
said students may take the test up to eight times, and most will eventually pass (qtd. in Herendeen,
"Students Cheer" A1). Students shouldn't eventually understand the material; they should know the
material (qtd. in Herendeen, "Students Cheer" A1). The reason why taking the high school exit exam is
an issue is because they don't already know the basic material, which will be sooner or later in life, be put
before them. We need to go back to the basics, and make sure that math, reading, and writing are being
taught before any other materials. These basics need to be priority, and any other extra curricular subject,
strive together as a people and make sure students are learning. We want students to leave high school
knowing they have progressed, that they have learned something of great value. They should feel
confident when they get out of high school. They should have the ability and opportunity to survive in
life, get a good job, and pursue the career of their dreams. It is our responsibility to make sure they have
their feet planted on solid ground, ready to go out in this world and make a difference.
Works Cited
Burke, Frank. Letter. The Modesto Bee 28 June 2003: B5.
Herendeen, Susan. Letter. The Modesto Bee 10 July 2003: A1.
"States stick with high-school exit exam." CNN Student News 20 Aug. 2003. 12 Oct. 2003
<http://www.cnn.com/2003/EDUCATION/08/13/high.school.exams.ap
fact, imposed nuisances composed of contrived questions--irrelevant to their intent and success. Both parties are led
to believe that right answers matter more than habits of mind and the justification of one's approach and results.
A move toward more authentic tasks and outcomes thus improves teaching and learning: students have greater
clarity about their obligations (and are asked to master more engaging tasks), and teachers can come to believe that
assessment results are both meaningful and useful for improving instruction.
If our aim is merely to monitor performance, then conventional testing is probably adequate. If our aim is to
improve performance across the board, then the tests must be composed of exemplary tasks, criteria and standards.
WON'T AUTHENTIC ASSESSMENT BE TOO EXPENSIVE AND TIME-CONSUMING?
The costs are deceptive: while the scoring of judgment-based tasks seems expensive when compared to
multiple-choice tests (about $2 per student vs. 1 cent) the gains to teacher professional development, local
assessing, and student learning are many. As states like California and New York have found (with their writing and
hands-on science tests) significant improvements occur locally in the teaching and assessing of writing and science
when teachers become involved and invested in the scoring process.
If costs prove prohibitive, sampling may well be the appropriate response--the strategy employed in California,
Vermont and Connecticut in their new performance and portfolio assessment projects. Whether through a sampling
of many writing genres, where each student gets one prompt only; or through sampling a small number of all
student papers and school-wide portfolios; or through assessing only a small sample of students, valuable
information is gained at a minimum cost. And what have we gained by failing to adequately assess all the capacities
and outcomes we profess to value simply because it is time-consuming, expensive, or labor-intensive? Most other
countries routinely ask students to respond orally and in writing on their major tests--the same countries that
outperform us on international comparisons. Money, time and training are routinely set aside to insure that
assessment is of high quality. They also correctly assume that high standards depend on the quality of day-to-day
local assessment--further offsetting the apparent high cost of training teachers to score student work in regional or
national assessments.
WILL THE PUBLIC HAVE ANY FAITH IN THE OBJECTIVITY AND RELIABILITY OF JUDGMENT-BASED
SCORES?
We forget that numerous state and national testing programs with a high degree of credibility and integrity have
for many years operated using human judges:
the New York Regents exams, parts of which have included essay questions since their inception--and which
are scored locally (while audited by the state);
the Advanced Placement program which uses open-ended questions and tasks, including not only essays on
most tests but the performance-based tests in the Art Portfolio and Foreign Language exams;
state-wide writing assessments in two dozen states where model papers, training of readers, papers read "blind"
and procedures to prevent bias and drift gain adequate reliability;
the National Assessment of Educational Progress (NAEP), the Congressionally-mandated assessment, uses
numerous open-ended test questions and writing prompts (and successfully piloted a hands-on test of science
performance);
newly-mandated performance-based and portfolio-based state-wide testing in Arizona, California, Connecticut,
Kentucky, Maryland, and New York.
Though the scoring of standardized tests is not subject to significant error, the procedure by which items are
chosen, and the manner in which norms or cut-scores are established is often quite subjective--and typically
immune from public scrutiny and oversight.
Genuine accountability does not avoid human judgment. We monitor and improve judgment through training
sessions, model performances used as exemplars, audit and oversight policies as well as through such basic
procedures as having disinterested judges review student work "blind" to the name or experience of the student--as
occurs routinely throughout the professional, athletic and artistic worlds in the judging of performance.
Authentic assessment also has the advantage of providing parents and community members with directly
observable products and understandable evidence concerning their students' performance; the quality of student
work is more discernible to laypersons than when we must rely on translations of talk about stanines and
renorming.
Ultimately, as the researcher Lauren Resnick has put it, "What you assess is what you get; if you don't test it you
won't get it." To improve student performance we must recognize that essential intellectual abilities are falling
through the cracks of conventional testing.
Appendix 7
preparation and capability needed in the tasks to be delegated, and effectively supervises nursing care
being given by subordinates.
(5) Evaluates the effectiveness of the care plan through observation of the client's physical condition and
behavior, signs and symptoms of illness, and reactions to treatment and through communication with the
client and health team members, and modifies the plan as needed.
(6) Acts as the client's advocate, as circumstances require, by initiating action to improve health care or to
change decisions or activities which are against the interests or wishes of the client, and by giving the
client the opportunity to make informed decisions about health care before it is provided.
Trait scale for the film review assignment (4 points = high, 1 point = low):
Plot: 4 points: Accurate plot review. 3 points: Accurate plot review. 1 point: Glaring plot
inaccuracies.
Text Analysis: 3 points: Analysis of text beyond literal interpretation. 1 point: Literal analysis.
Supporting Statements: 3 points: Weak support with specific details from film. 1 point: No
specific details as support.
Personal Reactions: 3 points: Personal evaluation not based on analysis. 1 point: No personal
evaluation.
Number of Students Scoring at Each Point Level by Film Number Reviewed
(columns are the number of students scoring 4, 3, 2, and 1 points)
Film #1: 10, 7, 3, 1
Film #2: 11, 7, 3, 0
Film #3: 10, 8, 3, 1
Film #4: 12, 5, 5, 1
Film #5: 13, 5, 3, 0
Film #6: 6, 8, 6, 1
Film #7: 9, 7, 7, 5
Instructor Analysis: I handed out the trait scale to students on the first day of class, but I am not sure they consulted it; upon my inquiring
whether they had a copy near the end of the course, few students were able to locate it in their notebooks. This taught me that I should refer
to the scale more explicitly in class. I anticipated that it would be easy for students to give an analysis but difficult for them to identify
concrete support for their ideas. However, I discovered that students found it easier to point to specific places in the movies that gave them
ideas than to articulate those ideas. Therefore, I will revise the scale for the next course to reflect the relative challenges of these skills.
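The per-film counts above can be aggregated into summary statistics such as a weighted mean score per film. A minimal sketch follows; it assumes the four columns of the table correspond to the 4, 3, 2, and 1 point levels in that order, and the function name `mean_score` is illustrative, not from the handbook.

```python
# Illustrative sketch: aggregating the film-review trait-scale results.
# counts_by_film holds the "number of students scoring at each point level"
# (assumed order: 4, 3, 2, 1 points) for each film, as reported in the table.

POINT_LEVELS = [4, 3, 2, 1]

counts_by_film = {
    "Film #1": [10, 7, 3, 1],
    "Film #2": [11, 7, 3, 0],
    "Film #3": [10, 8, 3, 1],
    "Film #4": [12, 5, 5, 1],
    "Film #5": [13, 5, 3, 0],
    "Film #6": [6, 8, 6, 1],
    "Film #7": [9, 7, 7, 5],
}

def mean_score(counts):
    """Weighted mean rubric score for one film's distribution of point levels."""
    total_students = sum(counts)
    total_points = sum(level * n for level, n in zip(POINT_LEVELS, counts))
    return total_points / total_students

for film, counts in counts_by_film.items():
    print(f"{film}: n = {sum(counts)}, mean score = {mean_score(counts):.2f}")
```

A summary like this makes patterns such as the weaker distribution for Film #7 easy to spot at a glance, which supports the kind of instructor analysis shown above.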
Desired SLO: After viewing an assigned film based on a literary
text, write a review of the film. Include an appraisal
of the director's selection and effective translation of
content from the literary text and the dominant tone
the director seems to be trying to achieve, supporting
each statement with detail from the text and film and
your personal reaction to the cited scenes.
4 points: Analysis of text beyond literal interpretation; support with specific details from
text/film; personal evaluation based on analysis.
2 points: Minor inaccuracies of plot; analysis of text includes literal interpretation; few specific
details as support; little personal evaluation.
Intended Outcomes
(Objectives)
1. Students will demonstrate
proficiency in employable
Mass Communication skills.
Standardized Exams
Professional Certification
Performance Assessment
Grad Surveys/Interviews
Employer/Faculty Surveys
Actual Results:
3. U of I Coordinator of Transfer Articulation reported that out of 29 applicants from other
schools to Graphics, a Mass Com student was the only admit.
Analysis and Action:
3. Continue to gather/monitor data. Investigate how many Parkland Graphics students applied.
Appendix 8
Mesa Community College Results from Student Learning Outcomes Assessment Spring 2002 and 2003

Communication
Outcome Statements:
1. Write a clear, well-organized paper using documentation and quantitative tools when
appropriate.
2. Construct and deliver a clear, well-organized verbal presentation.
Results:
Written: The mean score for the post-group was significantly higher overall and on the scales for
content, organization and mechanics/style. When each skill is considered separately, students
showed relative strength in stating their own position, addressing the prompt, using appropriate
voice and style and sentence structure. Students have consistently rated below the overall
average on acknowledging the opposing position, developing each point with appropriate detail
and commentary, progressing logically and smoothly, and using transitions and orienting
statements.
Oral: Significant differences between beginning students and completing students were shown in
the total percentage correct for the assessment overall and for each of the subscales: knowledge
about effective interpersonal interchanges, small group interaction and conducting oral
presentations.
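Findings such as "the mean score for the post-group was significantly higher" come from comparing score distributions for beginning and completing students. The sketch below shows that kind of comparison with a Welch t statistic computed from the standard library; the scores are invented for illustration and are not Mesa's data.

```python
# Illustrative sketch (hypothetical data): comparing a beginning (pre) group
# with a completing (post) group, as in the Mesa results above.

from math import sqrt
from statistics import mean, variance

beginning = [62, 58, 71, 65, 60, 68, 55, 63, 59, 66]   # hypothetical pre-group scores
completing = [74, 69, 80, 72, 77, 70, 82, 75, 68, 79]  # hypothetical post-group scores

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    n_a, n_b = len(sample_a), len(sample_b)
    standard_error = sqrt(variance(sample_a) / n_a + variance(sample_b) / n_b)
    return (mean(sample_a) - mean(sample_b)) / standard_error

t_stat = welch_t(completing, beginning)
print(f"beginning mean = {mean(beginning):.1f}, completing mean = {mean(completing):.1f}")
print(f"Welch t = {t_stat:.2f} (compare against a t table to judge significance)")
```

In practice an assessment office would also report a p-value and effect size; the t statistic alone is shown here only to make the pre/post logic concrete.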
Numeracy
Outcome Statements:
1. Identify and extract relevant data from given mathematical situations.
2. Select known models or develop appropriate models that organize the data into tables or
spreadsheets, graphical representations, or symbolic/equation format.
3. Obtain correct mathematical results and state those results with the qualifiers.
4. Use the results.
Results:
The average percent correct was significantly higher for the post-group overall and for outcomes
related to identifying and extracting relevant data, using models to organize data, obtaining
results, and stating results with qualifiers. Patterns of performance have remained consistent over
several years. Use of models is the strongest area and use of results is the weakest area.
Scientific Inquiry
Outcome Statements: Demonstrate scientific inquiry skills related to:
1. Hypothesis: Distinguish between possible and improbable or impossible reasons for a
problem.
2. Prediction: Distinguish between predictions that are logical or not logical based upon a
problem presented.
3. Assumption: Recognize justifiable and necessary assumptions based on information
presented.
4. Interpretation: Weigh evidence and decide if generalizations or conclusions based upon given
data are warranted.
5. Evaluation: Distinguish between probable and improbable causes, possible and impossible
reasons, and effective and ineffective action based on information presented.
Results:
There was no significant difference in the average percent correct between groups in the 2002
administration; however, significant differences were noted, overall, in prior years. Students
have been most successful in recognizing possible reasons for a problem. Making a conclusion
based upon information presented has had the lowest percent correct for the past three years of
administration.
Outcome Statements:
1. Identify a problem or argument.
2. Isolate facts related to the problem.
3. Differentiate facts from opinions or emotional responses.
4. Ascertain the author's conclusion.
5. Generate multiple solutions to the problem.
6. Predict consequences.
7. Use evidence or sound reasoning to justify a position.
Results:
The average total score was significantly higher for the post-group (completing), overall and for
two sub-scales: Interpretation and Evaluation of Arguments. The post-group score was at the
45th percentile when compared to a national sample. Average student scores have been
consistently highest for the Interpretation and Evaluation of Arguments sections and lowest for
Inference.
Outcome Statements:
1. Demonstrate knowledge of human creations.
2. Demonstrate awareness that different contexts and/or worldviews produce different human
creations.
3. Demonstrate an understanding and awareness of the impact that a piece (artifact) has on the
relationship and perspective of the audience.
4. Demonstrate an ability to evaluate human creations.
Results:
Significant differences were observed overall and in three of four outcome areas: demonstrate an
awareness that different contexts and/or worldviews produce different human creations; an
understanding and awareness of the impact that a piece has on the relationship and perspective of
the audience; an ability to evaluate human creations.
Information Literacy
Outcome Statements:
1. Given a problem, define specific information needed to solve the problem or answer the
question.
2. Locate appropriate and relevant information to match informational needs.
3. Identify and use appropriate print and/or electronic information sources.
4. Evaluate information for currency, relevancy, and reliability.
5. Use information effectively.
Results:
The percent correct was significantly higher for the post-group overall and for three of five
outcome areas: evaluating currency and relevance of information, identifying sources, and
locating information. Students were most successful in evaluating information for currency and
relevance, followed by defining information needed to solve a problem and identifying
appropriate sources. Locating information was relatively more difficult. Students were least
successful in using information effectively.
Cultural Diversity
Outcome Statements:
1. Identify and explain diverse cultural customs, beliefs, traditions, and lifestyles.
2. Identify and explain major cultural, historical and geographical issues that shape our
perceptions.
3. Identify and explain social forces that can effect cultural change.
4. Identify biases, assumptions, and prejudices in multicultural interactions.
5. Identify ideologies, practices, and contributions that persons of diverse backgrounds bring to
our multicultural world.
Results:
Students in the completing (post) group had significantly higher scores on direct measures of
knowledge and on several diversity and democracy outcomes in both years. Completing students
agreed more often that they have an obligation to give back to the community. In the most recent
administration, completing students rated themselves more highly than beginning students on
having a pluralistic orientation, being able to see both sides of an issue, and their own knowledge
of cultures. Further, they agreed more strongly with statements that support the value of
diversity, reflect tolerance for differences related to gender, and indicate that they engage in
social action more often.
Pre/Post Tests
Course Embedded Test
Portfolios
Intended Outcome(s):
1. Graduates from this program will have
acquired knowledge and skills needed for
entry-level positions in a variety of
computer-related fields.
Results:
1.a.1. Fall 2000:
Two students fell under the 4.5 rating. 80% of
the interns received an average score of 4.5 or
higher. The weakest area was identified as
"Ability to Plan," which received an average
score of 4.29.
Results:
1.e.1. Fall 2000: Data was collected and
reviewed for CIS 101 and CIS 117. 143
students answered questionnaires in 101 with
an average score of 84%. 39 students
answered questionnaires in 117 with an
average score of 90%.
1.e.2. Spring 2001: Data was collected at the
end of the semester for CIS 101 and CIS 117.
105 students for CIS 101 had an average score
of 86%. 41 students for CIS 117 had an
average score of 96%.
1.e.3. Fall 2001: Data was collected from CIS
101 and CIS 117. 118 students for CIS 101
had an average score of 86%. 38 students for
CIS 117 had an average score of 98%.
Capstone exam/project
Performance Assessment
Grad Surveys/Interviews
Employer/Faculty Surveys
Standardized Exams
Professional Certification
Other
Assessment Criteria:
1.a. When surveyed, employers of our interns will rate 80% of the students
with an average of 4.5 on a scale of
1-5. The rating will be composed of 14 skill areas each rated on a scale of 1-5.
Analysis and Action:
1.a.1. Fall 2000 data analyzed in Spring 2001:
This indirect measure is not providing the results anticipated. The committee
proposes making changes to the survey to make it a more valuable assessment
tool. In addition, information will be given to the instructors in CIS 297-CIS
seminar and CIS 231- Systems Analysis, Design and Administration to
enhance course content to encourage students to strengthen their "ability to
plan." A direct measure to show "ability to plan" will be included in the
capstone tests given near the completion of the program. (See 1.c.)
1.a.2. Spring 2002:
Students did well overall in every area. The lowest marks came in the "ability
to plan" area, with one Excellent and four Good ratings. Suggestions have been
made for providing additional information in CIS 297: Seminar and CIS
231: Systems Analysis, Design and Administration.
Assessment Criteria:
1.d. 90% of students will score 80% or higher on a standard capstone test to
be administered near the completion of the program.
Analysis and Action:
1.d.1. Fall 1999 data analyzed in Spring 2000:
Faculty met and determined that the pilot instrument needed to be changed to
gather more accurate results. Students seemed confused by the questionnaire,
and the faculty judged the results insufficiently valid.
Assessment Criteria:
1.e. All students in the introductory level required courses for all CIS
programs (101 and 117) will be given a set of five questions to be graded with
the final exam. Students completing their final courses in CIS will be given 10
questions.
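The per-course figures reported for this criterion (number of students and average percent score) amount to a simple summary over the embedded questions. A small illustrative sketch with made-up scores:

```python
# Criterion 1.e summary: each student answers a short set of embedded
# questions graded with the final exam; the report lists the number of
# students and their average percent score per course.
# The scores below are hypothetical, not actual course data.

def course_summary(scores):
    """scores: per-student percent scores (0-100) on the embedded questions."""
    return len(scores), round(sum(scores) / len(scores))

cis101 = [80, 100, 60, 90, 100]   # hypothetical per-student percents
n, avg = course_summary(cis101)
print(f"{n} students, average score {avg}%")  # 5 students, average score 86%
```

Tracking which individual questions were missed most often, as in the "Save vs. Save As" analysis below, requires keeping per-question results rather than only the per-student totals.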
Analysis and Action:
1.e.2. Spring 2001:
Overall scores for CIS 101 improved by 2%. The weakest question in CIS 101
was identified. 25% of students missed the question about how to save files
using Save vs. Save As. Instructors were encouraged to spend more time on
this topic and the question was reworded to be easier to read for the next
semester's assessment test. Overall scores for CIS 117 improved by 3%.
1.e.3. Fall 2001:
Overall scores for CIS 101 stayed the same as the previous semester. The
rewording of the question about saving indicated that fewer instructors were
thoroughly teaching the concept of saving vs. the save as command. 29% of
the students answered the question about saving incorrectly. A memo was sent
out to all instructors outlining what students need to learn in CIS 101
pertaining to the save and save as command. Scores for CIS 117 improved by
2%.
Appendix 9
Category
From the National Standards for Business Education (2001), National Business Education Association, 1914 Association Dr., Reston, VA 20191.
Category #1 Title:
Category #2 Title:
Category #3 Title:
Appendix 10
Methods of Assessment: strategies, techniques, and instruments for collecting the feedback data that provide evidence of the extent to which objectives are reached (e.g., alumni survey).
Type: E = Enter, I = Intermediate, X = Exit, F = Follow-up.
SLOs Assessed: check the number of each SLO (A1-A8) assessed by the particular assessment method.
Findings/Evaluation/Conclusions: results of the analysis and interpretation of the measurement data.
Appendix 11 Example of Program Assessment Report
West Virginia State Community and Technical College
http://fozzy.wvsc.edu/ctc/program_assesment/General%20Education%20Audit%20Grid.doc
Recommendations for Improvement: recommended actions for improving the program.
Methods of Assessment: strategies, techniques, and instruments for collecting the feedback data that provide evidence of the extent to which objectives are reached.
Type: E = Enter, I = Intermediate, X = Exit, F = Follow-up.
SLOs Assessed: check the number of each SLO (A1-A8) assessed by the particular assessment method.
Findings/Evaluation/Conclusions: results of the analysis and interpretation of the measurement data.
COMMUNICATION SKILLS
Present ideas developed from diverse sources and points of view with consideration of target audience.
Demonstrate communication process through idea generation, organization, drafting, revision, editing, and presentation.
Participate in and contribute to collaborative groups.
Construct logical, coherent, well-supported arguments.
Employ syntax, usage, grammar, punctuation, terminology, and spelling appropriate to academic discipline and the professional world.
Demonstrate listening / interpretive skills in order to participate in communications and human exchange.
THINKING SKILLS
Use appropriate method of inquiry to identify, formulate, and analyze a current or historical problem/question (may include recognizing
significant components, collecting and synthesizing information, evaluating and selecting solution(s), applying and defending solution(s)).
Translate quantifiable problems into mathematical terms and solve these problems using mathematical or statistical operations.
Interpret graphical representations (such as charts, photos, artifacts) in order to draw appropriate conclusions
Recognize strengths and weaknesses in arguments
Demonstrate observational and experimental skills to use the scientific method to test hypotheses and formulate logical deductions
Understand the uses of theories and models as applied in the area of study
Develop creative thinking skills for application in problem solving
Demonstrate a working knowledge of a technological application in an area of study.
DIVERSITY AND GLOBAL PERSPECTIVE
Recognize the diversity of humanity at the local, regional and global levels
Synthesize information about needs, concerns and contributions of different cultures within society
Identify the influence of cultural and ethnic backgrounds on individual and group attitudes and values.
Link cultural perspectives, practices, and interactions with the societal and physical environment from which they arose.
Explain the importance of cross-cultural influences on physical, cultural and spiritual heritage.
Relate and explain the connections between past and present events and/or issues.
AESTHETIC PERSPECTIVE
Analyze and evaluate literary, visual, or performing arts using discipline-specific approaches and criteria.
Reflect on personal responses to aesthetic experiences.
Incorporate aesthetic reflection into discipline-specific activities.
ETHICAL AND CIVIL VALUES
Identify and assess community needs and the responsibility to balance individual and societal needs
Display responsibility and integrity in one's choices and actions
Integrate knowledge in order to establish an ethical position on an issue and defend it with logical arguments
Develop an appreciation of education and lifelong learning
Understand social values and analyze their implications for the individual, community, society, and world.
Recognize the individual's responsibility to continue the exploration of the changing world and one's role in it.
Appendix 12
http://www.jccc.net/home/depts/6111/site/assmnt/cogout
Mathematics Outcome
Outcome Statements: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to:
1. Identify relevant data (numerical information in mathematical or other contexts) by
a. extracting appropriate data from a problem containing extraneous data and/or
b. identifying appropriate data in a word problem.
2. Select or develop models (organized representations of numerical information, e.g., equation, table, graph) appropriate
to the problem which represent the data by
a. arranging the data into a table or spreadsheet and/or
b. creating pictorial representations (bar graphs, or pie charts, or rectangular coordinate graphs, etc.) with or
without technological assistance and/or
c. selecting or setting up an equation or formula.
3. Obtain and describe results by
a. obtaining correct mathematical results, with or without technological assistance and
b. ascribing correct units and measures to results.
4. Draw inferences from data by
a. describing a trend indicated in a chart or graph, and making predictions based on that trend and/or
b. describing the important features of data presented in a table or spreadsheet, and making predictions based on
that trend and/or
c. describing the important features of an equation or formula, and making predictions based on those features
and/or
d. making reasonable estimates when given problems involving quantities in any organized or disorganized
form and/or
e. drawing qualitative conclusions about the original situation based on the quantitative results that were
obtained.
The mathematics outcomes consist of four major outcomes, numbered 1 to 4. These major outcomes are each subdivided
into several subpoints labeled by letters. A major outcome is demonstrated when at least one subpoint has been
demonstrated, except for major outcome 3, where subpoint 3.a. must be demonstrated. A subpoint is demonstrated when at
least one instance of the subpoint has occurred, except for subpoints 3.a. (which requires at least 70 percent accuracy of the
items examined) and 3.b. (which requires at least 2 instances involving different measures).
Rubrics: The following rubric will measure the mathematics outcomes:
5 = All four major outcomes are demonstrated by the use of more than one subpoint per major outcome.
4 = All four major outcomes are demonstrated.
3 = Three major outcomes are demonstrated.
2 = Two major outcomes are demonstrated.
1 = Only one major outcome is demonstrated.
0 = No major outcomes are demonstrated.
Standards: At least 75 percent of all JCCC students earning associate degrees should obtain a score of 4 or more on the
mathematics outcomes rubric. At least 95 percent of all JCCC students earning associate degrees should obtain a score of 3
or more on the mathematics outcomes rubric.
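The demonstration rules and rubric above reduce to a small scoring procedure. The sketch below is one illustrative reading of those rules in Python (it does not model the 70-percent-accuracy requirement on subpoint 3.a or the two-instance requirement on 3.b):

```python
# Sketch of the JCCC mathematics-outcomes rubric scoring described above.
# `demonstrated` maps each major outcome (1-4) to the set of subpoint
# letters the student has demonstrated, e.g. {"a", "b"}.
# This is an illustrative reading of the rules, not an official implementation.

def outcome_met(outcome, subpoints):
    # Major outcome 3 requires subpoint 3.a specifically;
    # every other major outcome needs at least one subpoint.
    if outcome == 3:
        return "a" in subpoints
    return len(subpoints) >= 1

def rubric_score(demonstrated):
    met = [o for o in (1, 2, 3, 4) if outcome_met(o, demonstrated.get(o, set()))]
    if len(met) == 4:
        # A score of 5 requires every outcome to be shown through
        # more than one subpoint; otherwise all four earn a 4.
        if all(len(demonstrated[o]) > 1 for o in (1, 2, 3, 4)):
            return 5
        return 4
    return len(met)  # 3, 2, 1, or 0 major outcomes demonstrated

student = {1: {"a"}, 2: {"a", "c"}, 3: {"a", "b"}, 4: {"d"}}
print(rubric_score(student))  # 4: all four outcomes met, but not all via >1 subpoint
```

The standards then ask what fraction of degree recipients score 4 or more (target: 75 percent) and 3 or more (target: 95 percent).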
Writing Outcome
Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to write a clear, well-organized paper using documentation and quantitative tools when appropriate.
Outcome Rubrics:
6 = Essay demonstrates excellent composition skills including a clear and thought-provoking thesis, appropriate and
effective organization, lively and convincing supporting materials, effective diction and sentence skills, and perfect or
near perfect mechanics including spelling and punctuation. The writing perfectly accomplishes the objectives of the
assignment.
5 = Essay contains strong composition skills including a clear and thought-provoking thesis, although development,
diction, and sentence style may suffer minor flaws. Shows careful and acceptable use of mechanics. The writing
effectively accomplishes the goals of the assignment.
4 = Essay contains above average composition skills, including a clear, insightful thesis, although development may
be insufficient in one area and diction and style may not be consistently clear and effective. Shows competence in the
use of mechanics. Accomplishes the goals of the assignment with an overall effective approach.
3 = Essay demonstrates competent composition skills including adequate development and organization, although the
development of ideas may be trite, assumptions may be unsupported in more than one area, the thesis may not be
original, and the diction and syntax may not be clear and effective. Minimally accomplishes the goals of the
assignment.
2 = Composition skills may be flawed in either the clarity of the thesis, the development, or organization. Diction,
syntax, and mechanics may seriously affect clarity. Minimally accomplishes the majority of the goals of the
assignment.
1 = Composition skills may be flawed in two or more areas. Diction, syntax, and mechanics are excessively flawed.
Fails to accomplish the goals of the assignment.
Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on
each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6
(excellent). Eighty percent will earn scores of 4 (satisfactory) or higher and the top 98 percent will earn scores of 3 (minimal
accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn
the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of
all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the
communications rubrics.
Speaking Outcome
Outcome Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to make a clear, well-organized verbal presentation.
Rubrics:
Very good/excellent (5-6) = The communicator presents a message that is exceptionally appropriate for the purpose,
occasion, and audience with a purpose that is exceptionally clear and identifiable. The message is supported using
material that is exceptional in quality and variety. The communicator uses an exceptionally clear and coherent
organizational structure, provides a logical progression within and between ideas, and uses language that is
exceptionally clear, vivid, and appropriate. The communicator makes exceptional use of vocal variety in a
conversational mode; has exceptional articulation, pronunciation, and grammar; and demonstrates physical behaviors
that provide exceptional support for the verbal message.
Satisfactory (3-4) = The communicator presents a message that is appropriate for the purpose, occasion, and audience
with a purpose that is adequately clear and identifiable. The message is supported using material that is appropriate in
quality and variety. The communicator uses a reasonably clear and coherent organizational structure, provides a
logical progression within and between ideas, and uses language that is reasonably clear, vivid, and appropriate. The
communicator makes acceptable use of vocal variety in a conversational mode; has acceptable articulation,
pronunciation, and grammar; and demonstrates physical behaviors that provide adequate support for the verbal
message.
Unsatisfactory (1-2) = The communicator presents a message that is not appropriate for either the purpose, occasion,
or audience or is without a clear and identifiable purpose for the message. The message is supported with material that
is inappropriate in quality and variety. The communicator fails to use a clear and coherent organizational structure,
does not provide a logical progression within and between ideas, and uses unclear or inappropriate language. The
communicator fails to use vocal variety; fails to speak in a conversational mode; fails to use acceptable articulation,
pronunciation, and grammar; or fails to use physical behaviors that provide adequate support for the verbal message.
Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on
each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6
(excellent). Eighty percent will earn scores of 4 (satisfactory) or higher and the top 98 percent will earn scores of 3 (minimal
accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn
the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of
all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the
communications rubrics.
If it appears that the assignment did not present an opportunity for students to perform a rubric, the artifact will be given a zero
(0) score for that rubric. For example, this may be a result of instances where the instructor's assignment defined the
problem or method of gathering information. The subcommittee scorers should concur on those particular rubrics which
receive zeros.
Artifacts scored for Modes of Inquiry must allow the student to perform at least 2 of the 3 rubrics. Only rubrics with plus or
minus scores will be counted. A zero score is not counted and does not impact the outcome standard. It is not
necessary for the subcommittee scorers to concur on rubrics which receive plus or minus scores. The artifacts are
scored as follows:
3 = the student demonstrated the ability to perform all rubrics that the student had the opportunity to perform (3 or 2).
2 = the student was given the opportunity to perform all 3 rubrics and demonstrated the ability to perform 2 of them.
1 = the student demonstrated the ability to perform only one rubric.
0 = the student was unable to demonstrate the ability to perform any of the rubrics.
Standards: At least 80% of the Modes of Inquiry artifacts should receive a score of 3.
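The scoring rules above can be captured in a few lines. A sketch, assuming each artifact carries a "+" (demonstrated), "-" (not demonstrated), or 0 (no opportunity) mark per rubric:

```python
# Sketch of the Modes of Inquiry artifact scoring described above.
# Each of the 3 rubrics is marked "+" (demonstrated), "-" (not demonstrated),
# or 0 (the assignment gave no opportunity to perform that rubric).
# Zeros are not counted; artifacts must allow at least 2 of the 3 rubrics.

def artifact_score(marks):
    """marks: list of three entries, each '+', '-', or 0."""
    scored = [m for m in marks if m != 0]       # zeros do not impact the standard
    if len(scored) < 2:
        raise ValueError("artifact must allow at least 2 of the 3 rubrics")
    performed = scored.count("+")
    if performed == len(scored):   # performed every rubric there was a chance to
        return 3
    if performed == 2:             # had all 3 opportunities, performed 2 of them
        return 2
    if performed == 1:
        return 1
    return 0                       # performed none of the rubrics

print(artifact_score(["+", "+", 0]))   # 3: performed all rubrics offered
print(artifact_score(["+", "+", "-"])) # 2
print(artifact_score(["+", "-", 0]))   # 1
```

Under the stated standard, at least 80% of artifacts should come out of this scoring with a 3.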
[The original pages here contain the General Education Core-Audit Grid, whose matrix layout did not survive extraction. The grid maps eight general education skills (labeled B.1-B.8) against the courses in each general education category (COLL 101; 3. Mathematics; 4. Natural Science: BST 104, CHEM 101, CHEM 130, PHYS 103, PHYS 110, PHYS 120, PHYS 170, PHYS 191 & 203, PHYS 201 & 203, BIO 101, BIO 102, BIO 210; 6. Social Science: HUM 101; 7. Information Skills: ET 112, CS 106, BST 240, ITEC 101), coding each cell I, E, R, or A per the legend below.]

The eight general education skills:
Communicate articulately in speech and writing
Think critically about issues, theory, and application
Use effective human relationship skills to work in a diverse society
Function effectively and positively in a team environment
Use library print and electronic resources for literature research
Use computational skills to solve problems, manipulate and interpret numerical data, and communicate data in a logical manner
Employ fundamental principles of science, the scientific method of inquiry, and skills for applying scientific knowledge to practical situations
Use computer technology to organize, access, and communicate information

Legend: I = Introduces, E = Emphasizes, R = Reinforces, A = Applies
Introduces: Student is not familiar with the content or skill. Instruction concentrates on introducing students to the content area or skill.
Emphasizes: Student should have brought basic content or skill to the course. Instruction concentrates on enhancing content and strengthening the skill, adding new content material and building more complex skills on the entrance competency.
Reinforces: Student brings reasonable knowledge, content, or skill competency to the situation as a result of the content or skill having been taught and/or emphasized at some previous point in their educational career. Instructional activity continues to teach and build upon the previous competency and reinforces content or skill competency.
Applies: Student has knowledge, content, or skill competency as a result of the content or skill having been taught and/or emphasized at some previous point in their educational career. Instructional activity applies previously taught and/or emphasized content or skill.
Appendix 13. Resources and References for Student Learning Outcomes Assessment
Good Practices
An Assessment Manifesto by College of DuPage (IL) is an excellent values statement.
9 Principles of Good Practice for Assessing Student Learning by the American
Association of Higher Education are the foundational principles of assessment.
Palomar College Statement of Principles on Assessment is a succinct two-page summary of
assessment and how it is done at Palomar College (CA).
Closing the Loop -- seven misperceptions of SLOs and responses to each by Tom Angelo.
Five Myths of Assessment by David Clement, Monterey Peninsula College published in
Inside English (Spring 2003) the newsletter of the English Council for California Two-Year
Colleges (www.ecctyc.org). Expresses concern that SLOs will affect faculty evaluation,
intrude on the classroom, diminish academic freedom, and lead to standards that are watered
down and blandly uniform.
The Case for Authentic Assessment by Grant Wiggins, presented at the California Assessment
Institute. The paper addresses several questions: What is authentic assessment? Why do we
need to invest in these labor-intensive forms of assessment? Won't authentic assessment be
too expensive and time-consuming? Will the public have any faith in the objectivity and
reliability of judgment-based scores?
Is Accreditation Accountable? The Continuing Conversation Between Accreditation and the
Federal Government by the Council for Higher Education Accreditation (2003) provides a
thorough discussion of the tensions between the federal government's call for accountability
for student learning and traditional process-based peer-review accreditation methods.
Establishing the Student Learning Outcomes Process
Assessment Plan/Progress Report by Isothermal Community College (NC) explains the SLO
process well.
Developing an Assessment Plan to Learn about Student Learning by Peggy Maki of AAHE
gives a tabular Assessment Guide which covers general steps in setting up a student
learning outcome assessment process.
Methods of Assessment of Student Learning classifies SLO methods as Direct, Indirect and
Outputs.
Assessment: An Institution-Wide Process to Improve and Support Student Learning by
College of DuPage (IL) is a handbook which lays out the student learning outcomes process
and roles in general terms.
Defining and Assessing Learning: Exploring Competency-Based Initiatives, a report of the
National Postsecondary Education Cooperative Working Group on Competency-Based
Initiatives in Postsecondary Education published by the National Center for Educational
Statistics, September 2002. Section 4 on Principles of Strong Practice is particularly useful,
giving twelve principles clustered in four areas: planning for competency-based education
initiatives; selecting assessment methods; creating and ensuring that learning experiences
lead to competencies; and reviewing assessment results to identify changes needed to
strengthen student learning. The report concludes with eight case studies; of particular note
are those of Sinclair Community College (OH), which has a flourishing competency-based
initiative that guarantees competencies of graduates, and Hagerstown Community College
(MD), which uses a career transcript listing specific competencies.
Assessment at the Program Level by Trudy H. Bers. Notable features: 1) summarizes ten
approaches to program assessment, 2) discusses challenges to implementation, 3) describes
good practices at six community colleges.
General Education Core-Audit Grid from the University of West Virginia Community and
Technical College. Each of the college's eight general education skills is identified on a
matrix that lists all courses within the five categories of GE courses, coding the level of
mastery as I (Introduces), E (Emphasizes), R (Reinforces), or A (Applies).
Assessment of General Education Learning Outcomes: An Institutional Portfolio Approach to
Assessment of General Education Learning Outcomes, Johnson County Community
College. This document defines the institutional portfolio, gives the logistics of
implementation, and then lists six GE outcome statements, each with detailed competencies,
rubrics and standards.
Summary of Results from Student Outcomes Assessment - Spring 2002 and 2003, Mesa (AZ)
Community College Office of Research and Planning. Mesa CC uses a student test sampling
approach to SLO assessment. This document details their GE outcome statements in seven
areas and summarizes the testing results.
Writing Measurable Outcomes
The Geneva College (PA) Program Guide has good examples of writing measurable
outcomes.
The Assessment Primer by the FLAG Project stresses deep learning by connecting
Curriculum, Instruction and Assessment (CIA). Particularly strong on matching goal with
assessment tool.
Learning Outcomes: Learning Achieved by the End of a Course or Program: Knowledge,
Skills, Attitudes by Shirley Lesch, George Brown Toronto City College. The ABCs of
learning outcomes in nine easy-to-read pages.
Tools of Assessment
Overview
Advantages and Disadvantages of Assessment Techniques by Barbara Wright (8/15/02,
presented at a California Assessment Institute workshop). Covers plusses and minuses of
portfolios, capstone courses and projects, performance assessments, embedded assessment,
classroom research and assessment, locally developed tests and commercial standard tests.
Rubrics: How-To Guides
The Use of Scoring Rubrics for Assessment and Teaching by Mary Allen of CSU's Institute
for Teaching and Learning is a three-page summary of what they are, how to create them,
and how to use them. An example is included on assessment of oral presentations. She also
has a six-page version entitled Developing and Applying Rubrics which has considerably
more detail.
Primary Trait Analysis: Anchoring Assessment in the Classroom by Ruth Benander, Janice
Denton, Deborah Page and Charlotte Skinner, Raymond Walters College (OH), from JGE:
Journal of General Education, Vol. 49, No. 4, 2000.
Rubrics: Examples
Map Rubric is a scoring tool for the Online Map Creation web site
(www.aquarius.geomar.de/omc).
Grading Standards: Written Work for The Living Environment BIOL 111 is a rubric for
writing in Biology (A-F scales with definitions) at Southern Illinois University.
Student Participation Assessment and Evaluation is a rubric with 4-point scales: frequently,
occasionally, seldom, almost never, used at Southern Illinois University.
Assessing Modeling Projects in Calculus and Precalculus by C. E. Emenaker of the University
of Cincinnati gives a math project problem with two scoring rubrics: analytic and holistic.
Scientific Report Rubric and Collaboration Rubric developed for the Cabrillo Tidepool
Study.
Rubric For Evaluating Web Sites originally developed by John Pilgrim, Horace Mann
Academic Middle School, San Francisco.
Secondary Assessment Tools is a web site with links to several dozen simple Performance
Assessment rubrics (http://www.bcps.org/offices/lis/models/tips/assess_sec.html)
Student Learning Outcomes in the California State University is a web site that gives links to
about 50 scoring rubrics (http://www.calstate.edu/AcadAff/SLOA/links/rubrics.shtml).
Examples include the Scoring Guide for the CSU English Placement Test (EPT) and CSU
Fresno rubrics on Critical Thinking, Integrative Science, and Writing.
Portfolios
Individual Student Tracking Project gives a brief explanation of what portfolios are and how
to use them. From Palomar College (CA).
Classroom Assessment Techniques
Classroom Assessment: A Manual for Faculty Developers by the National Council for Staff,
Program and Organizational Development (NCSPOD) is a step-by-step manual for putting
on a CATs workshop.
Embedded Assessment
The Journal of Chemical Education produces a Chemical Concepts Inventory, a 22-question,
nationally normed multiple-choice test on basic chemistry concepts.
The Field-tested Learning Assessment Guide (FLAG Project) has good examples (problems,
tests, surveys) in science, math, engineering and technology (copy of this list is provided).
Course Embedded Assessment Process developed by Larry Kelley at University of Louisiana
Monroe summarizes course-embedded assessment, gives examples of ten program
assessment plans, lays out the basics of rubrics, and provides several rubric templates.
Local California Community College Training and Resource Materials
Modesto Junior College (CA) held a training institute in the summer of 2003 for 36 faculty and
staff entitled Measuring Student Learning Outcomes. The document includes activities for
writing measurable objectives, writing student learning outcomes starting with existing
course objectives, how to embed assessment in a course, the basics of rubric writing, and
how to construct a program assessment plan. MJC is also holding a summer training institute
in 2004 with an activity and resource guide entitled Student Learning Outcomes: A Focus
on Results.
Bakersfield College (CA) has assisted the majority of its faculty in writing
student learning outcomes for their courses. Faculty leaders Janet Fulks and Kate Pluta have
put together a resources manual entitled Assessing Student Learning that guides faculty
through the process of writing SLOs, including definitions, criteria, good and bad examples,
and SLOs from their own courses. The document also covers how to include all three
learning domains: cognitive, psychomotor, and affective. The document concludes with the
story of how the college ramped up the SLO process, a summary of achievements to date, a
philosophy statement on SLOs adopted by the Academic Senate, and a listing of web
resources.
State and National Standards on Academic & Vocational Competencies
Welding Codes & Standards (www.aws.org/cgi-bin/shop) AWS is recognized worldwide for
the development of consensus-based American National Standards. Over 170 standards -- as
codes, recommended practices, guides and specifications. Certification is offered in seven
different welding processes.
Endnotes
Is Accreditation Accountable by the Council for Higher Education Accreditation (2003) provides a thorough
discussion of the tensions between the federal government's call for accountability for student learning and
traditional process-based peer-review accreditation methods:
http://www.chea.org/pdf/CHEAmonograph_Oct03.pdf
ii
Lisa Brewster's approach is summarized in Course Level Assessment Currently Being Used: Why Turn
Towards Them? (October 2003) presented at the Student Learning Outcomes workshop sponsored by the
RP Group: http://cai.cc.ca.us/SLOworkshops/Strand2/Brewster%20on%20Speech%20SLOs.doc
iii
Janet Fulks's SLOs are on the web at http://www2.bc.cc.ca.us/bio16/Student%20Learning%20Outcomes.htm
and her grading rubrics are at http://www2.bc.cc.ca.us/bio16/projects_and_grading.htm
iv
For an excellent short article on this topic see The Case for Authentic Assessment by Grant Wiggins at
http://ericae.net/edo/ED328611.htm
v
SLO Good Practice Statements:
Palomar College: http://www.palomar.edu/alp/principles.html
College of DuPage: http://www.lgu.ac.uk/deliberations/assessment/manifest.html
American Association of Higher Education: http://www.aahe.org/assessment/principl.htm
vi
Primary Trait Analysis definition is from Integrating the Assessment of General Education into the
Classroom: A Two-Year College Model by Ruth Benander and Janice Denton of Raymond Walters
College and Barbara Walvoord of University of Notre Dame presented at the Annual Meeting of the North
Central Accrediting Association in April of 1997:
http://www.rwc.uc.edu/phillips/Assessment/NCApaper.html
See also Effective Grading: A Tool for Learning and Assessment. Walvoord, Barbara E. and Virginia J. Anderson.
San Francisco: Jossey-Bass Publishing, Inc. 1998; and Primary Trait Analysis: Anchoring Assessment in
the Classroom by Benander, Denton, Page and Skinner; Journal of General Education, Vol. 49, No. 4, 2000.
vii
Mary Allen at CSU Fresno has written a succinct two pages of advice on the Use of Rubrics that is well
worth reading: http://www.calstate.edu/acadaff/sloa/links/using_rubrics.shtml
For a more detailed commentary on rubrics, see Developing and Applying Rubrics by Ethelynda Harding,
also of CSU Fresno, presented at a chemistry conference in March of 2004:
http://www.csufresno.edu/cetl/Events/Events%2003-04/ChemConf/Rubrics.pdf
viii
Assessing Modeling Projects In Calculus and Precalculus: Two Approaches by Charles E. Emenaker,
University of Cincinnati, Raymond Walters College: http://www.maa.org/saum/maanotes49/116.html
ix
Summary of direct assessment methods taken from A Glossary of Measurement Terms ERIC Digest.
http://ericae.net/edo/ed315430.htm and the Temple University Teachers Connection.
www.temple.edu/CETP/temple_teach/ and the NCIIA Assessment Workshop.
www.nciia.org/CD/public/htmldocs/papers/p_and_j.pdf
x
A Private Universe Schneps, M. H. and P. M. Sadler (1987) Harvard-Smithsonian Center for
Astrophysics, Science Education Department, Science Media Group, A Private
Universe. Video. Washington, DC: Annenberg/CPB: Pyramid Film and Video, 18 minutes.
xi
For commentary on informal norming sessions on an English writing rubric see Using Rubrics by
Michelle Christopherson of Modesto Junior College: http://cai.cc.ca.us/SLOworkshops/Strand2/Using
Rubrics.doc
xii
Undergraduate Psychology Major Learning Goals And Outcomes: A Report, American Psychological
Association (March 2002): http://www.apa.org/ed/pcue/taskforcereport2.pdf
xiii
Taken from A Handbook on Assessment for Two Year Colleges by Ed Morante of College of the Desert:
http://cai.cc.ca.us/Fall2002Institute/2002/assessmenthandbookfinal.doc
xiv
Examples of Course Level Assessment Plans:
Raymond Walters College: http://www.rwc.uc.edu/phillips/Assessment/AcadAssess.html
California State University, Fresno, Anthropology:
http://www.csufresno.edu/cetl/assessment/Programs/Anthropology/AnthroPlan.pdf