
2011; 33: 364–369

Technology-enabled assessment of health professions education: Consensus statement and recommendations from the Ottawa 2010 conference
ZUBAIR AMIN1, JOHN R. BOULET2, DAVID A. COOK3, RACHEL ELLAWAY4, AHMAD FAHAL5, ROGER KNEEBONE6, MOIRA MALEY7, DORIS OSTERGAARD8, GOMINDA PONNAMPERUMA9, ANDY WEARN10 & AMITAI ZIV11

1National University of Singapore, Singapore; 2Educational Commission for Foreign Medical Graduates, USA; 3Mayo Clinic College of Medicine, USA; 4Northern Ontario School of Medicine, Canada; 5University of Khartoum, Sudan; 6Imperial College, UK; 7The University of Western Australia, Australia; 8Copenhagen University, Denmark; 9University of Colombo, Sri Lanka; 10University of Auckland, New Zealand; 11Israel Centre for Medical Simulation, Israel
Med Teach Downloaded from informahealthcare.com by University of Dundee on 03/27/13

Abstract
The uptake of information and communication technologies (ICTs) in health professions education can have far-reaching
consequences for assessment. The medical education community still needs to develop a deeper understanding of how
technology can underpin and extend assessment practices. This article was developed by the 2010 Ottawa Conference Consensus
Group on technology-enabled assessment to guide practitioners and researchers working in this area. This article highlights the
changing nature of ICTs in assessment, the importance of aligning technology-enabled assessment with local context and needs,
the need for better evidence to support use of technologies in health profession education assessment, and a number of
challenges, particularly validity threats, that need to be addressed while incorporating technology in assessment. Our
recommendations are intended for all practitioners across health professional education. Recommendations include adhering to
principles of good assessment, the need for developing coherent institutional policy, using technologies to broaden the
competencies to be assessed, linking patient-outcome data to assessment of practitioner performance, and capitalizing on
technologies for the management of the entire life-cycle of assessment.

Correspondence: Z. Amin, Yong Loo Lin School of Medicine, National University of Singapore, NUHS Tower Block, Level 12, 1E Kent Ridge Road, Singapore 119228, Singapore. Tel: 00 65 6772 4572; fax: 00 65 6872 1454; email: paeza@nus.edu.sg

ISSN 0142–159X print/ISSN 1466–187X online © 2011 Informa UK Ltd. DOI: 10.3109/0142159X.2011.565832

Introduction

We live in a world suffused with information and communication technologies (ICTs). It is increasingly difficult now to remember a time without high-quality, synthesized electronic information at our fingertips. Such technologies did not deliver on their promise immediately: systems were slow and the trustworthiness of data was uncertain (Hersh 1994). Yet the last decade has seen such technologies become an integral part of the instruction, assessment, and clinical practice of health professionals. However, fundamental challenges remain as we seek to make appropriate and well-informed use of technologies in support of health professional assessment. This article presents a series of consensus recommendations to educators, administrators, and organizations regarding the use of different technologies in support of assessment practices.

This article is divided into three parts: the first part describes the role of ICT in assessment; the second part highlights challenges in using technology in assessment, with specific consideration of threats to the valid interpretation of assessment scores and associated outcomes; and the third part considers issues for research in this area.

Technology in context

New assessment technologies should be considered in the larger context of healthcare and health professions education. This section will explore change as a defining attribute of technology, the increasingly ubiquitous presence of technology in health professions education, and current applications of ICTs in assessment.

Change and technology

Technologies tend not only to subvert their predecessors but, in most cases, also immediately suggest further advances (Graham 1999). Nevertheless, not everything changes when a new technology is introduced into a given situation. We can consider three essential aspects of change associated with the use of technology in assessment: transmediation, innovation, and prosthesis.

• Transmediation involves moving existing information, practices, and tools to new media while retaining their essential qualities. For example, traditional end-of-course paper-based examinations that involve whole classes sitting
down together are often transmediated by replicating the same exam on a computer. Although the medium has changed (computer has replaced paper), the fundamental content and process remain the same.
• Innovation refers to forms and processes that could not exist without technology. Recent innovations in assessment include technology-enhanced simulation (Margolis et al. 2004; Gesundheit et al. 2009) and dynamic media such as interactive images and video. The relative proportions of transmediation and innovation indicate how much of the original has been translated through using ICT. The extent of real innovation tends to be low, following evolutionary rather than revolutionary trajectories. While this is not necessarily problematic, we may inadvertently introduce artifacts and enforce inappropriate orthodoxies in the design and use of the technology in question (Scarborough & Corbett 1992) if we only transmediate current practice. For instance, many online exams still unnecessarily follow the limitations of paper-based examination materials.
• Prosthesis occurs when ICT extends action beyond human limits, allowing us to do things faster, more accurately, and in more places simultaneously than would be possible without the technology. For instance, ICTs can extend and enhance assessment workflows and logistics by facilitating the development of questions and exams, managing security, and marking and providing feedback. Indeed, in many cases efficiency and quality control are the primary reasons for institutions to adopt e-assessment (Bennett 2005).

Technology in health professional education

Both medicine and education are intrinsically technology-enabled phenomena (Reiser 2009; Economist 2010). Modern medicine involves a large systematized knowledge base and a vast armamentarium of diagnostic tools, treatments, and other technologies. Becoming a doctor is in many ways synonymous with becoming a technocrat with humanistic goals of care and altruism; someone whose authority comes in large part from their appropriate use and control of technology (e.g., tools, medicines, devices, and knowledge repositories). Similarly, education is founded on the systemization of learning, again significantly structured around technologies such as computers, smart classrooms, and simulation (Kress 2010).

ICT is clearly fundamental to any healthcare system. It is currently manifested predominantly in the form of electronic medical records, electronic ordering systems, picture archival and communication systems (PACS), and billing systems, and more generally through PubMed, online journals, databases, and handheld and mobile technologies. The uptake of new technologies remains rapid, with two-thirds of physicians and 42% of the public using smartphones as of late 2009 (California HealthCare Foundation, CHCF 2010). The creation of ICT applications related to health and healthcare is also moving quickly; as of February 2010, there were nearly 6000 medical applications available from Apple's iTunes App Store, and the number is still growing (CHCF 2010).

Higher education is also suffused with technologies, including PowerPoint and learning management systems such as Blackboard® or Moodle®. Whether we approve or not, contemporary students' learning is as much about using Google and Wikipedia as it is about using an institutional library (Ellaway & Martin 2008; Ellaway & Masters 2008). While much of the ICT used in health profession education is fairly generic, there are a number of healthcare-specific technologies that can be employed for both formative and summative assessment, in particular applications of simulation ranging from low-fidelity task trainers to high-fidelity mannequins (Issenberg et al. 2005; Boulet 2008).

Clearly, the applications and significance of technologies in healthcare and education are broad-based, inherent, and pervasive (Greenhalgh 2001; Ellaway & Masters 2008). It is in this context that technology in the assessment of health professions education must be considered, in part because it defines what is normative, acceptable, and sustainable, and in part because good assessment practice should faithfully reflect the clinical environment (Ellaway et al. 2009; Kneebone 2009a).

Technology in health professions education assessment

Although the use of ICT in assessment is not new (Tekian et al. 1999; Bradley 2006), major developments of technology in health profession education assessment are largely centered on computer-based assessment (Margolis et al. 2004), the use of simulation (Norcini & McKinley 2007; Boulet & Murray 2010), and the management of assessment processes. Not surprisingly, some of the strongest evidence supporting the use of technology comes from these areas (Norcini et al. 2011).

Computer-based assessments

Over the past two decades, improvements in ICTs have led to many enhancements in item and test construction, test delivery, and scoring. The use of paper-and-pencil multiple-choice examinations has gradually been transmediated into computer-based delivery of test content, often over the Internet via secure, encrypted connections. Computers also offer innovation and prosthesis: provided the item pool is sufficiently large and a detailed blueprint exists, automated test construction software can be used to generate multiple test forms (Epstein & Hundert 2002; Norcini & McKinley 2007). Moreover, the use of computers enables rapid scoring, including the generation of adaptive testing and the provision of tailored feedback. The use of technology also allows for the construction of computerized virtual patient cases where those being assessed are tasked with managing a patient (or patients) in simulated real time on the computer (Schuwirth et al. 1996; Dillon et al. 2004).

Simulation and simulators

There have also been many innovations in simulator technology, including part-task trainers and various electromechanical
mannequins (Issenberg et al. 1999; Gordon et al. 2001; Ziv et al. 2003; Norcini & McKinley 2007; Kneebone 2009b; Boulet & Murray 2010). Simulation and simulators create a safe, learner-centered environment where mistakes do not result in harm to the patient (Gordon et al. 2001; Boulet et al. 2003; Fried et al. 2004; Issenberg et al. 2005). Appropriate use of technologies allows easier sampling of a much broader domain of physician competencies (Dev et al. 2007). In addition to the focused assessment of individual skills, innovative procedures have been developed, such as the use of part-task trainers together with standardized patients (e.g., suturing using a skin pad attached to a real human being) to allow concurrent assessment of both procedural and communication skills (Kneebone et al. 2002). Crisis events can also be modeled, allowing healthcare teams to be evaluated in realistic environments, including rare but important clinical events essential for teaching patient safety (Sica et al. 1999; Wong et al. 2002). Distributed simulation, using portable, low-cost, and highly immersive environments, offers a new avenue for testing clinical skills in authentic settings (Kneebone et al. 2010). Onscreen simulations, such as virtual patients, are another growing form and one that has been found to have utility in assessment as well as learning (Fischer et al. 2005; Round et al. 2009). Overall, advancements in simulator technology have opened the door for more authentic assessments that can be used to assess a much wider range of skills that previously were difficult to measure (Issenberg et al. 2005).

Management of assessment processes

Prosthesis by virtue of improved logistics (i.e., enhanced efficiency, tracking, and quality assurance) is a vitally important yet oft-overlooked advantage of ICTs in health professional assessment, perhaps because students are not directly involved. It is often in this area that the greatest benefits are to be found (Ellaway & Masters 2008). Examples of technology in assessment resource management include item banking, plagiarism detection, data monitoring and reporting, result analysis, remote tracking, and telemetry. In addition, planned integration of assessment processes with the clinical environment allows linking of patient-outcome data to the performance of clinicians (Scalese et al. 2007; Shavit et al. 2007).

Challenges in the use of technology for assessment

Critical evaluation of the evidence supporting the use of ICTs in assessment is essential, but it is not the only requisite for informed decision making. We must also appreciate the many challenges that can result from using ICTs in assessment. These challenges include factors such as resource limitations (e.g., expensive simulators), a lack of trained faculty, ethical challenges (e.g., balancing commercial interests with educational needs), and organizational inertia. While these factors are important, this section will focus on potential validity threats associated with using technologies in assessment, not least because validity is central to any good assessment (Cook & Beckman 2006).

Assessing the wrong construct

Assessment fundamentally involves making inferences about the learner – inferences about their knowledge, attitudes, general competence, communication skills, and so forth. This intended inference is called a construct. Unfortunately, ICTs can sometimes interfere in unintended ways, altering the construct that is actually measured, and this of course adversely affects the meaning of the assessment results. For instance, a poorly designed assessment tool might measure the candidate's ability to use the technology rather than (or in addition to) measuring the intended clinical performance (i.e., the intended construct). Of course, if the purpose of the assessment is to assess the candidate's ability to use technology (such as working with an EMR), then the construct is, in part, the use of technology (Shachak et al. 2009).

Deviation from real-life experiences

Many technology-based assessments attempt to emulate real-life experiences. Since clinical practice is complex and influenced by multiple variables (Epstein & Hundert 2002), the scripts for such assessments must be carefully developed (Kneebone 2009b). Using simple assessment activities that minimize the interactions among key variables may make the process easier to handle, but it will also widen the gap between the assessment activity and the clinical reality it represents. Conversely, an instructor might design an assessment that contains details or requires actions that unnecessarily increase the complexity of the exercise beyond the learner's current level of training. This complexity, particularly if it is not germane or intrinsic to the construct being assessed, can increase cognitive load, which in turn can cause measured performance to suffer. Finally, there is a risk that the designers of technology-enhanced assessment activities might select topics that do not reflect situations encountered in typical practice (Downing & Haladyna 2004). One should ensure that the measures of assessment are well linked to the practical context rather than to what the "simulator" can effectively model and measure.

Tension between learner assessment and course/technology evaluation

The educators implementing a new assessment technology will often be interested in evaluating the performance of the technology itself. To the degree that this evaluation interferes with the optimal assessment process, it could invalidate any inferences concerning the ability of the learners. For example, the act of measurement could directly affect trainees' performance (the Hawthorne effect), or could paradoxically cause them to pay more attention to the simulation than they otherwise would. Identifying and balancing these confounders is critically important when applying study findings to real-life applications.
Inappropriate levels of fidelity

Validity and reliability are linked to the fidelity of representation of clinical contexts and candidates' actions within technology-enhanced assessment. While greater fidelity enhances the perceived realism of the assessment activity, it can also increase the complexity and cognitive load associated with the assessment exercise. High-fidelity assessments (e.g., simulation) may be poorly suited for assessing some learning objectives (e.g., knowledge) or may not be well suited for certain specialties. For example, full-immersion simulation with human patient simulators works well for anesthetic teams, but may be less appropriate when the focus is on surgeons and others performing procedural interventions (Kneebone 2009b). Also, higher fidelity usually comes at a price – both the monetary cost of the technology itself and the cost in instructor time to develop and conduct the assessment (Reznick & MacRae 2006). Thus, assessors should target appropriate levels of fidelity for the given assessment task so as to ensure that the assessment is both meaningful and sustainable.

Future research

While it is beyond the scope of this article to outline a specific research agenda related to technology-enabled assessment, there are some general issues that must be addressed to ensure that any new technology-based developments, whether used for formative or summative purposes, yield valid and defensible scores and/or decisions.

Validity

As technology improves, offering more ways to assess candidates, we must continue to be concerned with the validity of the inferences we make based on assessment scores. Although technological improvements can provide more efficient delivery of assessments, yield higher-fidelity test content, and enable rapid scoring and tailored feedback, we still need to ensure the validity of the assessment results. As outlined earlier, one can focus research efforts on investigating potential threats to validity (Wiggins 1993). These types of studies could include looking at the impact of candidate familiarity with the assessment method on ability estimation, timing and pacing issues, and, more broadly, the relationship between the fidelity of the assessment and performance. However, the most salient validity issue is establishing the relationship between performance on the assessment and, for healthcare practitioners, performance with "real" patients. Longitudinally, we need to determine whether advances in assessment technologies lead to better patient outcomes or other benefits such as cost savings. Also, given the prevailing literature on the impact of assessment on learning (Galvagno & Segal 2009; Larsen et al. 2009), and the enhancements in the fidelity of various assessments, comparative research aimed at quantifying the educational impact of new assessment formats is certainly called for.

Reliability

Technological advances, including the development of computer-based delivery of test content and the evolution of part-task trainers, objective structured clinical examinations (OSCEs), and electromechanical mannequins, have allowed for the construction of many new and different types of assessment processes. Like all assessments, however, the sources of measurement error need to be investigated and quantified. For OSCEs, especially those involving standardized patients, computer-based training of patient actors can enhance the fidelity of their portrayal and minimize scoring errors, thus yielding more reliable estimates (Errichetti & Boulet 2006). Additional research concentrating on the application of technology for the training of those involved in healthcare-related performance assessments, including raters, is needed. Finally, with the introduction of physical and onscreen simulators into the assessment domain, test developers have been challenged with the construction of new scoring rubrics (Margolis et al. 2004). For some assessments, such as those involving procedural skills, the evaluation tools tend to be case-specific, potentially limiting the generalizability of the scores. For others, including simulation scenarios keyed to measuring more generic skills such as teamwork and ethical behavior, the evaluation tools and scoring criteria can be difficult to interpret, even for experts, resulting in evaluations that can be subject to rater effects. As advances in technology broaden the assessment domain, it is important that research be conducted to determine the specific sources of measurement error in the scores.

Other research areas

While research concerning the validity and reliability of assessment scores is paramount, technological advancements in test delivery, test construction, and simulation for assessment also provide other opportunities for targeted studies. One of the most important research areas rests in ascertaining the comparative efficiency of competing testing approaches. While higher-fidelity assessment models may be perceived to be more effective in measuring educational outcomes, they are costly, can be logistically complex, and may not yield appreciably better measures of ability (Reznick & MacRae 2006). Likewise, if ICTs are used to deliver formative assessments, research is needed to best align educational models with the learning needs of the participants. These outcome measures can eventually be used to validate prior assessment scores.

Conclusions

The deliberations of this consensus group have highlighted the changing nature of ICTs, their near-ubiquitous presence in healthcare and education, the importance of integrating technology-enabled assessment within health professional education as a whole, the evidence supporting the use of technologies in health profession education assessment, the educational challenges related to the use of technologies in assessment, and directions for future research.
Because of the rapidity of change associated with ICTs, institutions and assessment planners should remain vigilant and develop the necessary expertise in technology-enabled education and assessment, along with a coherent and responsible institutional use of technology. Assessment planners should ensure that the basic tenets of quality education and assessment are adequately met and are not compromised through the use of ICTs.

It is our consensus view that judicious use of ICTs can greatly improve assessment practices across the spectrum of health profession education. We hope that this article will help raise awareness of the scope and capability of using ICTs in support of assessment, stimulate collaboration around their development, and encourage the incorporation of ICTs in assessment in a planned, supported, and sustainable manner. The following set of consensus recommendations is intended to support these goals.

Consensus statement and recommendations

General recommendations

(1) Institutional leaders, teachers, and other stakeholders should understand and follow general principles of quality assessment when using technology-enabled assessment.
(2) Educators and leaders should employ technologies that serve a demonstrable purpose or otherwise enable or extend current capabilities. Contextual considerations such as educational needs, resource efficiency, and relevance to local healthcare should be the primary deciding factors in choosing the appropriate technologies for enabling or enhancing assessment.
(3) Technology-enabled assessment needs to be integrated within the larger ecosystem of health professional education through a coherent and comprehensive approach to its planning, implementation, and management.

Recommendations for institutions and policy makers

(1) Curriculum and Assessment Committees should include member(s) with expertise in technology-enabled assessment to facilitate the appropriate planning, integration, and implementation of technology-enabled assessment.
(2) Institutional leaders should facilitate appropriate faculty and student development in using technology-enabled assessment for their current and future needs. Conversely, individual teachers and developers should take a proactive approach towards personal and professional development in the use of rapidly changing technology in assessment.
(3) Institutions should capitalize on available technologies for the entire life-cycle of management of assessment processes, including examination development, administration, data acquisition, analysis, reporting, storage, and quality assurance.

Recommendations for assessors and test developers

(1) Assessors and test developers should ensure the validity of technology-enabled assessment through careful attention to the constructs being measured, and through selecting appropriate, realistic scenarios and activities.
(2) Assessors and test developers should take into account local technological contexts and should make appropriate use of available technologies in designing assessments.
(3) Assessors and test developers should actively devise assessment strategies to include broader competencies such as teamwork, the monitoring of practitioners' performance, and patient safety through the appropriate use of technologies.

Future research

(1) Researchers should study the relationships between assessment of performance in simulated environments and performance in real-life practice settings.
(2) Researchers should study the application of different technologies to specific contexts, so as to better inform the selection of the technologies they use.
(3) Researchers should develop scoring methods that automate the collection, integration, and analysis of the vast and often novel information available through technology-enabled assessment.
(4) Researchers should establish better and more robust links between workplace systems and patient-outcome data and their use for assessment, including, but not limited to, data from electronic medical records and clinical charts.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Notes on contributors

All group members contributed to the conceptualization and development of consensus statements and recommendations.

ZUBAIR AMIN, MD MHPE (lead), Yong Loo Lin School of Medicine, National University of Singapore, Singapore
JOHN R. BOULET, PhD, Educational Commission for Foreign Medical Graduates, Philadelphia, USA
DAVID A. COOK, MD MHPE, Mayo Clinic College of Medicine, Rochester, Minnesota, USA
RACHEL ELLAWAY, PhD, Northern Ontario School of Medicine, Ontario, Canada
AHMAD FAHAL, MBBS, MD, MS, FRCS, University of Khartoum, Sudan
ROGER KNEEBONE, PhD, FRCS, FRCGP, Imperial College, London, UK
MOIRA MALEY, PhD, CertMedEd, The Rural Clinical School of Western Australia; The University of Western Australia, Australia
DORIS OSTERGAARD, MD, PhD, Danish Institute for Medical Simulation, Copenhagen University, Denmark
GOMINDA PONNAMPERUMA, MBBS, MMEd, PhD, University of Colombo, Sri Lanka
ANDY WEARN, MBChB, MMedSc, MRCGP, University of Auckland, New Zealand
AMITAI ZIV, MD, Israel Centre for Medical Simulation, Israel
References

Bennett RE. 2005. Using new technology to improve assessment. Educ Meas 18(3):5–12. Available from: http://www3.interscience.wiley.com/journal/119079459/abstract?CRETRY=1&SRETRY=0 (accessed 4 June 2010).
Boulet JR. 2008. Summative assessment in medicine: The promise of simulation for high-stakes evaluation. Acad Emerg Med 15(11):1017–1024.
Boulet JR, Murray D. 2010. Simulation-based assessment in anesthesiology: Requirements for practical implementation. Anesthesiol 112(4):1041–1052.
Boulet JR, Murray D, Kras J, Woodhouse J, Mcallister J, Ziv A. 2003. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiol 99:1270–1280.
Bradley P. 2006. The history of simulation in medical education and possible future directions. Med Educ 40(3):254–262.
Bradley P, Bligh J. 2005. Clinical skills centres: Where are we going? Med Educ 39(7):649–650.
California HealthCare Foundation. 2010. How smartphones are changing health care for consumers and providers. Available from: http://www.chcf.org/publications/2010/04/how-smartphones-are-changing-health-care-for-consumers-and-providers (accessed 23 June 2010).
Cook DA, Beckman TJ. 2006. Current concepts in validity and reliability for psychometric instruments: Theory and application. Am J Med 119:166.e7–e16.
Dev P, Youngblood P, Heinrichs WL, Kusumoto L. 2007. Virtual worlds and team training. Anesthesiol Clin 25:321–336.
Dillon GF, Boulet JR, Hawkins RE, Swanson DB. 2004. Simulation in the United States Medical Licensing Examination™ (USMLE™). Qual Saf Health Care 13:41–45.
Downing SM, Haladyna TM. 2004. Validity threats: Overcoming interference with proposed interpretations of assessment data. Med Educ 38:327–333.
Economist. 2010. Wireless health care: When your carpet calls your doctor. 8 April 2010. New York. Available from: http://www.economist.com/business-finance/displaystory.cfm?story_id=15868133&source=hptextfeature (accessed 10 April 2010).
Ellaway RH, Kneebone R, Lachapelle K, Topps D. 2009. Practica continua: Connecting and combining simulation modalities for integrated teaching, learning and assessment. Med Teach 31(8):725–731.
Ellaway R, Martin R. 2008. What's mine is yours – open source as a new paradigm for sustainable healthcare education. Med Teach 30(2):175–179.
Ellaway R, Masters K. 2008. AMEE Guide 32: e-Learning in medical education Part 1: Learning, teaching and assessment. Med Teach 30(5):455–473.
Epstein RM, Hundert EM. 2002. Defining and assessing clinical competence. JAMA 287:226–235.
Errichetti A, Boulet JR. 2006. Comparing traditional and computer-based training methods for standardized patients. Acad Med 81(10):
Hersh W. 1994. Relevance and retrieval evaluation: Perspectives from medicine. J Am Soc Inform Sci 45(3):201–206.
Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JM, Petrusa ER, Waugh RA, Brown DD, Safford RR, Gessner IH, et al. 1999. Simulation technology for health care professional skills training and assessment. JAMA 282:861–866.
Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. 2005. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 27(1):10–28.
Kneebone RL. 2009a. Practice, rehearsal, and performance: An approach for simulation-based surgical and procedure training. JAMA 302:1336–1388.
Kneebone R. 2009b. Perspective: Simulation and transformational change: The paradox of expertise. Acad Med 84(7):954–957.
Kneebone R, Arora S, King D, Bello F, Sevdalis N, Kassab E, Aggarwal R, Darzi A, Nestel D. 2010. Distributed simulation – accessible immersive training. Med Teach 32:65–70.
Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. 2002. An innovative model for teaching and learning clinical procedures. Med Educ 36:628–634.
Kress G. 2010. Multimodality: A social semiotic approach to contemporary communication. London, UK: Routledge.
Larsen PD, Butler A, Roediger III HL. 2009. Repeated testing improves long-term retention relative to repeated study: A randomised controlled trial. Med Educ 43:1174–1181.
Margolis MB, Clauser BE, Harik P. 2004. Scoring the computer-based case simulation component of USMLE Step 3: A comparison of preoperational and operational data. Acad Med 79:S62–S64.
Norcini JJ, McKinley DW. 2007. Assessment methods in medical education. Teach Teach Educ 23:239–250.
Norcini JJ, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, Galbraith R, Hays R, Kent A, Perrott V. 2011. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 33(3):206–214.
Reiser SJ. 2009. Technological medicine: The changing world of doctors and patients. New York: Cambridge University Press.
Reznick RK, MacRae H. 2006. Teaching surgical skills – changes in the wind. N Engl J Med 355:2664–2669.
Round J, Conradi E, Poulton T. 2009. Improving assessment with virtual patients. Med Teach 31(8):759–763.
Scalese RJ, Obeso VT, Issenberg SB. 2007. Simulation technology for skills training and competency assessment in medical education. J Gen Intern Med 23(Suppl 1):46–49.
Scarborough H, Corbett JM. 1992. Technology and organization: Power, meaning and design. London, UK: Routledge.
Schuwirth LWT, van der Vleuten CPM, De Kock CA, Peperkamp AGW, Donkers HHLM. 1996. Computerized case-based testing: A modern method to assess clinical decision making. Med Teach 18(4):294–299.
Shachak A, Hadas-Dayagi M, Ziv A, Reis S. 2009. Primary care physicians'
S91–S94. use of an electronic medical record system: A cognitive task analysis.
Fischer M, Kopp V, Holzer M, Ruderich F, Jünger J. 2005. A modified J Gen Intern Med 24(3):341–348.
electronic key feature examination for undergraduate medical students: Shavit I, Keidan I, Hoffmann Y, Mishuk L, Rubin O, Ziv A, Steiner I. 2007.
Validation threats and opportunities. Med Teach 27(5):450–455. Enhancing patient safety during pediatric sedation: The impact of
Fried MP, Satava R, Weghorst S, Gallagher A, Sasaki C, Ross D, Sinanan M, simulation-based training of non-anesthesiologists. Arch Pediatr
Uribe J, Zeltsan M, Arora H, et al. 2004. Identifying and reducing errors Adolesc Med 161(8):740–743.
with surgical simulation. Qual Saf Health Care 13:19–26. Sica GT, Barron DM, Blum R, Frenna TH, Raemer DB. 1999. Computerized
Galvagno SM, Segal BS. 2009. Critical action procedures testing: A novel realistic simulation: A teaching module for crisis management in
method for test enhanced learning. Med Educ 43:1182–1187. radiology. AJR Am J Roentqenol 172:301–304.
Gesundheit N, Brutlag P, Youngblood P, Gunning WT, Zary N, Fors U. Tekian A, McGuire CH, McGaghie WC. 1999. Innovative simulations for
2009. The use of virtual patients to assess the clinical skills and assessing professional competence: From paper-and-pencil to virtual
reasoning of medical students: Initial insights on student acceptance. reality. Chicago. Dept of Medical Education, University of Illinois at
Med Teach 31(8):739–742. Chicago. Chicago, USA.
Gordon JA, Wilkerson WM, Shaffer DW, Armstrong EG. 2001. Practicing Wiggins G. 1993. Assessing student performance exploring the purpose
medicine without risk: Students’ and educators’ responses to high- and limits of testing. San Francisco, USA: Jossey-Bass.
fidelity patient simulation. Acad Med 76(5):469–472. Wong SH, Ng KF, Chen PP. 2002. The application of clinical
Graham G. 1999. The internet: A philosophical inquiry. London, UK: simulation in crisis management Training. Hong Kong Med J
Routledge. 8:131–135.
Greenhalgh T. 2001. Computer assisted learning in undergraduate medical Ziv A, Wolpe PR, Small SD, Glick S. 2003. Simulation-based medical
education. BMJ 322:40–44. education: An ethical imperative. Acad Med 78:783–788.

