
Quantitative Measurement for Evaluating Study Abroad Programs

Indicators for defining a program profile

Jacques Bessieres

February 2009

jacques.bessieres@connectica.fr www.eiesp.org
Participating Study Abroad Experts
• Brunhilde BIEBUYCK, Director of Studies, Columbia-Penn Programs in Paris (Paris)
• Bill CLABBY, Director of Research, ISA (Austin, TX)
• Monique FECTEAU, Resident Director, Tufts U. (Paris)
• Mary-Ann LETELLIER, Paris Director, CUPA (Paris)
• Nancy MERRITT, Exchange Director, MICEFA (Paris)
• Alain MICHEL, President, EIESP (Paris)
Some Available Study Abroad Tools

Existing tools fall into four quadrants, by perspective (student vs. program oriented) and by type (qualitative vs. quantitative):

I. Personal development tools (student perspective, qualitative)
• ICC
• IDI
• MAXSA
• SOPI (GU)
• U and W curves
• Etc.

II. "Qualitative" methodologies (program perspective, qualitative)
• Standards of Good Practice
• QUIP
• Engle & Engle classification
• Etc.

III. Global study abroad statistics (student perspective, quantitative)
• Open Doors
• -----

IV. Program indicators (program perspective, quantitative)
• ?
Historic Comparison
• The study abroad profession should mobilize itself to establish a uniform standard system for evaluating programs
• Comparison with the hospitality industry
– During the Prohibition period, New York City defined a uniform standard system of indicators to track and control alcohol consumption
– It proved very effective for managing hotels and tracking customer behaviour
– It was adopted worldwide, and its main concepts are still in operation today
– How? Hotels subscribe to an independent organization
• Application to study abroad
– Avoid measurement tools being imposed ex cathedra
– Standardize recognized indicators

Key Objectives of the Research
• To facilitate comparisons between programs
• To help students and home institutions select the program that best suits students' needs and goals
• To develop a "grid" that all study abroad actors/participants can use to assess programs at a glance
Scope and Limitations
• Scope
– Semester and Full-year programs
• excluding short-term & internship programs
– In countries with a higher education system similar in
structure to the U.S.

• Limitations
– Exclude financial data
– Degree of disclosure of information

• Constraint
– Data should be easy to collect
Program Profile Framework

The Program Profile comprises three pillars resting on a common base:
• (A) Academic Data
• (H) Housing Data
• (C) Cultural Data
• (G) General Data (the common base)
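The four-part framework above could be represented as a simple record type. This is an illustrative sketch only: the class and field names are assumptions for the example, not part of the PMP standard.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramProfile:
    """One program's profile: three data pillars plus the common base."""
    academic: dict = field(default_factory=dict)  # (A) Academic Data
    housing: dict = field(default_factory=dict)   # (H) Housing Data
    cultural: dict = field(default_factory=dict)  # (C) Cultural Data
    general: dict = field(default_factory=dict)   # (G) General Data

# A profile can be filled in pillar by pillar as data is collected
profile = ProgramProfile(general={"n_students": 120})
```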
Program Metric Profile Set of Standards (PMP)
(See Exhibit 1)

• General Indicators (G)
– Potential capacity of the program
– No. of students
– No. of semester students
– Year student ratio
– Fall/Spring index
– No. of sending/home universities
– Home university concentration ratio 50%
– Home university concentration ratio 75%
– Average No. of students per full-time support staff

• Academic Indicators (A)
– No. of courses (student course load)
– Direct enrollment ratio
– Variety of courses index
– No. of courses by type of academic institution
– No. of signed agreements with host universities
– Exchange student ratio
– Courses taught in host language ratio

• Housing Indicators (H)
– Homestay ratio
– Foyer ratio
– Studio ratio
– Apartment ratio
– Outsourced housing ratio

• Cultural Indicators (C)
– No. of co-curricular events per semester
– No. of extra-curricular events per semester
– No. of sponsored performances per student
– Variety of performance activities
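Most PMP indicators are simple counts and ratios, so they can be computed directly from enrollment records. The sketch below uses hypothetical example values, and the formulas are plausible readings of the indicator names (e.g., reading "concentration ratio 50%" as the smallest number of home universities accounting for at least half the students), not official PMP definitions.

```python
# Hypothetical program data for illustration
students_total = 120
homestay_students = 78
home_universities = {"U1": 40, "U2": 30, "U3": 20, "U4": 15, "U5": 15}

# Housing indicator: homestay ratio = students in homestays / all students
homestay_ratio = homestay_students / students_total

def concentration_ratio(counts, threshold):
    """Smallest number of home universities that together send
    at least `threshold` (e.g. 0.50) of all students."""
    total = sum(counts.values())
    cumulative, n = 0, 0
    for c in sorted(counts.values(), reverse=True):
        cumulative += c
        n += 1
        if cumulative / total >= threshold:
            return n
    return n

print(round(homestay_ratio, 2))                       # 0.65
print(concentration_ratio(home_universities, 0.50))   # 2
print(concentration_ratio(home_universities, 0.75))   # 3
```

A low concentration number (here, two universities supply half the students) flags a program heavily dependent on a few sending institutions, which is exactly the kind of at-a-glance comparison the PMP grid is meant to support.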
Main PMP Expected Benefits
• For Students prior to departure
– Help in choosing the most appropriate program
– Ensure that students will succeed academically
– Know the level of on-site support provided

• For Home Universities
– Program approval: better understanding of the key program components
– Campus advisors: better guidance of students

• For Organisations like the Forum
– Enhance objectivity by providing quantitative information (e.g., ratios of different types of housing)
– Standardize the amount & type of information available about programs
Discussion - Questions

• Example:
– PMP data collection form (see Exhibit 2)
– Comparison of three programs (see Exhibit 3). What can we derive from these examples?
• How to implement the research results?
• Need for independent organizations to:
– refine indicators
– gather information
– validate data
– publish results?
• How to deal with confidential information and ensure accuracy?
