Journal for Research in Mathematics Education, 2003, Vol. 34, No. 5, 433-463

A Meta-Analysis of the Effects of Calculators on Students' Achievement and Attitude Levels in Precollege Mathematics Classes

Aimee J. Ellington, Virginia Commonwealth University

The findings of 54 research studies were integrated through meta-analysis to determine the effects of calculators on student achievement and attitude levels. Effect sizes were generated through Glassian techniques of meta-analysis, and Hedges and Olkin's (1985) inferential statistical methods were used to test the significance of effect size data. Results revealed that students' operational skills and problem-solving skills improved when calculators were an integral part of testing and instruction. The results for both skill types were mixed when calculators were not part of assessment, but in all cases, calculator use did not hinder the development of mathematical skills. Students using calculators had better attitudes toward mathematics than their noncalculator counterparts. Further research is needed in the retention of mathematics skills after instruction and transfer of skills to other mathematics-related subjects.

Key words: Achievement; Attitudes; Calculators; Meta-analysis; Statistical power/effect size

Over the last century, pedagogical methods in mathematics have been in a gradual yet constant state of change. One instigator of change in mathematics classrooms has been technology. Kaput (1992) described the role of technology in mathematics education as "a newly active volcano - the mathematical mountain ... changing before our eyes, with myriad forces operating on it and within it simultaneously" (p. 515). Technology and the pedagogical changes resulting from it have a decisive impact on what is included in the mathematics curriculum. In particular, what students are taught and how they learn are significantly influenced by the technological forces at work on and within "the mathematical mountain." The situation is compounded by the fact that technology is evolving at a rapid pace. Mathematics educators have the arduous task of keeping up with the advances and incorporating them in lessons and activities. Although this is not easy to do, most educators today cannot imagine a classroom without technology.

The calculator is a technological force that has been a catalyst for lively debate within the mathematics education community during the last 30 years. In the 1970s, the educational relevance of the calculator was a controversial topic.

This article is based on the author's doctoral dissertation completed at the University of Tennessee under the direction of Donald J. Dessart.

More recently, calculators have become commonplace, and discussion has focused around ways to help students achieve maximum benefits from the use of this technology. The highlights of the debate have been outlined in a series of reviews of calculator use research, which I summarize next.

The Calculator Information Center (CIC) at Ohio State University and several independent reviewers (Neubauer, 1982; Parkhurst, 1979; Rabe, 1981; Roberts, 1980; Sigg, 1982) reported on concerns raised by the calculator's introduction into the classroom. The reviews reported on the successes and pitfalls in the implementation of calculator use in American schools (Suydam, 1978, 1979, 1980, 1981, 1982), responded to criticism that the calculator negatively affected results of standardized mathematics achievement tests (Suydam, 1979, 1980), and addressed the possibility that negative calculator effects outweighed the benefits of calculator use (Sigg, 1982). The two most significant findings were that the calculator did not negatively affect student achievement in mathematics and that students' attitudes toward mathematics were not influenced in a positive or negative way by calculator use. The lack of availability of calculator research for educators in the field (Sigg, 1982) and the inadequate use of calculators in the assessment process (Roberts, 1980; Sigg, 1982) were two issues of concern raised by the researchers.

The tone of the debate shifted in the late 1980s with the introduction of the graphing calculator. When the discussion focused on how to incorporate calculators in the most effective manner, the National Council of Teachers of Mathematics (NCTM, 1989) gave the graphing calculator credit for "the emergence of a new classroom dynamic in which teachers and students become natural partners in developing mathematical ideas and solving mathematical problems" (p. 128). Reviews published during this time reported mixed results for calculator use, with positive results becoming more prevalent as time passed, particularly for the development of problem-solving skills (Gilchrist, 1993). Graphing technology was determined to be the central reason for student improvement in three areas: understanding of graphical concepts, the ability to make meaningful connections between functions and their graphs, and enhanced spatial visualization skills (Penglase & Arnold, 1996). The findings relating to students' achievement in mathematics were inconclusive due to the prevalent use of skill-based testing procedures (Gilchrist, 1993; Penglase & Arnold, 1996).

In roughly the same time frame as covered by the calculator reviews summarized above, Hembree and Dessart (1986, 1992) statistically integrated a set of quantitative calculator studies in a comprehensive review through meta-analysis. The results were most significant for calculator use in Grades 3 through 9. Each study included in the meta-analysis involved statistical comparisons of students who used calculators with students who studied the same mathematical material but without the use of calculators. Two important findings were (a) the calculator had no significant effect on students' conceptual knowledge of mathematics and (b) the calculator had a positive influence on students' attitudes toward mathematics.

For computational and problem-solving skills, Hembree and Dessart (1986) separated the studies based on mode of testing and analyzed each group separately.

When calculators were part of the assessment process, the computational and problem-solving skills of students of low or average ability improved. When students in experimental groups were not allowed access to calculators during testing, average students who used calculators during instruction improved in both their computational and problem-solving skills. The only exception was the fourth grade, where calculators had a negative effect on computational skills. Overall, the results were encouraging for the role of calculators in mathematics classrooms. The negative result in Grade 4 was a reminder to educators that "calculators, though generally beneficial, may not be appropriate for use at all times, in all places, and for all subject matters" (Hembree & Dessart, 1992, p. 25).

The number of classrooms not incorporating calculators within the mathematics curriculum has diminished significantly in the last few years, and yet the concerns over calculators are still prevalent. On succeeding pages of the May/June 1999 issue of Mathematics Education Dialogues, Ralston (1999) encouraged the complete abolishment of paper-and-pencil computations (p. 2), whereas Mackey (1999) recommended the use of calculators be extremely limited (p. 3). Other evidence suggested that educators were most comfortable with the middle ground. For example, in the same edition of Mathematics Education Dialogues, results from a survey revealed that most educators believed "calculators should be used only after students had learned how to do the relevant mathematics without them" (Ballheim, 1999, p. 6).

To investigate further the effects of calculators, I designed and conducted a meta-analysis for three reasons. First, the literature currently contains over 120 studies featuring a single aspect of this technological force: the effects of calculator use on students in mathematics classrooms. Recent calculator reviews featuring some of these studies (Gilchrist, 1993; Penglase & Arnold, 1996) did not employ inferential methods of evaluation. Thus, a statistical analysis of studies conducted during the last 15 years was warranted. Second, the calculator controversy has not been resolved in the years since the Hembree and Dessart meta-analysis appeared in print, suggesting that research in this area must continue. Third, the mathematics classroom has experienced a variety of changes since the mid-1980s, including significant advances in technology, such as the introduction of the graphing calculator, an increase in the level of technological sophistication of the mathematics education population, and documented encouragement by organizations like the NCTM for exploring the pedagogical uses of calculators in classrooms.

Statistical integration of results from the body of studies conducted during this time period is an appropriate way to assess the calculator's impact on students in the modern classroom. This article addresses the concerns expressed by educators during the last 3 decades through an examination and synthesis of results provided by a set of calculator-based research studies featuring precollege mathematics students. In particular, the analysis covers the calculator's influence on students' performance in the following areas: (a) operational, computational, and conceptual skills; and (b) general problem-solving skills, including two aspects: the number of problems attempted as the result of having access to a calculator during

instruction and the ability to select the appropriate problem-solving strategy. This meta-analysis also considers the calculator's role in the development of student attitudes toward mathematics.

METHOD

This study followed the procedures outlined by Lipsey and Wilson (2001). Other meta-analytical techniques established by experts in the field (e.g., Cooper & Hedges, 1994; Hedges & Olkin, 1985; Hedges, Shymansky, & Woodworth, 1989) were also incorporated as necessary. In the next sections of this article, I present information on various aspects of the meta-analysis.

Constructs and Designs in Calculator-Use Research Studies


Reviews of calculator-based studies over the last 30 years revealed that most research involving use of calculators focused on changes to student achievement levels and attitudes toward mathematics. These two constructs were featured in Hembree and Dessart's (1986) meta-analysis. Because there has been no change in the focus of recent research reports, the current study featured the same categories and subcategories of achievement and attitude as those outlined in the first meta-analysis. The definitions below originally appeared in the writings of Hembree and Dessart (1986, 1992).

For the achievement construct I sorted the data into three categories: acquisition, retention, and transfer of mathematical skills. Skills acquisition was measured immediately after treatment; skills retention was measured after a predetermined time lapse following treatment; and skills transfer was measured by evaluating the ways that students used the skills in other mathematics-related areas. These skills were further sorted into one of two subcategories, which I call Category I and Category II and explain below.

Category I included skills that I identified as operational, computational, or conceptual. Operational skills were those I identified as the specific skills necessary to solve the mathematical problems on tests of student achievement. If the skill was clearly computational, then I included data from that study in a separate analysis of computational skills, and likewise for studies that involved understanding of mathematical concepts. If an author did not provide information that allowed a skill to be identified as strictly computational or conceptual, then I included the data in the operational skills category.

Category II involved a subcategory of problem-solving skills that were not explicitly stated with the mathematical problems used for assessment. Instead, these were skills that students selected from their mathematical repertoire to solve the verbal problems listed on the achievement tests. Although the number of problems correct (and the number of problems partially correct when partial credit was counted in assessment) was covered by the general category of problem-solving skills, several studies looked at two other aspects of problem solving: productivity

(the number of problems attempted by students) and selectivity (the number of appropriate strategies they used). The selectivity category was somewhat subjective because it was based on the researcher's opinion and expertise as to whether the strategy was appropriate to a particular situation.
The attitude construct included the six attitudinal factors of the Mathematics Attitude Inventory developed through the Minnesota Research and Evaluation Project (Sandman, 1980). The factors are these: (a) attitude toward mathematics, (b) anxiety toward mathematics, (c) self-concept in mathematics, (d) motivation to increase mathematical knowledge, (e) perception of mathematics teachers, and (f) value of mathematics in society. Most attitude-related results involved only the first factor. Studies that either explicitly cited the Mathematics Attitude Inventory or used other available attitude measures like the scales developed by Aiken (1974) and Fennema and Sherman (1976) provided results related to the other five factors. One other factor that was included in this meta-analysis but that Hembree and Dessart (1986, 1992) did not include was students' attitudes toward the use of calculators in mathematics.
In most studies, two groups of students were taught by equivalent methods of mathematical instruction, with the treatment group using calculators and the control group having no access to calculators. Several studies compounded the situation by including special curriculum materials designed for calculator use. In both cases, the effects of calculator use were measured by comparing the groups' responses to posttreatment evaluations. The role of the calculator in posttreatment assessment was a significant factor in the meta-analysis reported here. When treatment groups were not allowed access to calculators during testing, the studies were used to analyze student development of mathematical skills during the calculator treatment. When treatment groups had access to calculators during posttreatment evaluations, the studies were used to evaluate the calculator's role in the extension of student mathematical skill abilities after treatment was concluded.

Identification of Studies for the Meta-Analysis


The initial search for studies involved a perusal of the Educational Resources Information Center (ERIC) and the Dissertation Abstracts International (DAI) databases. A manual search of the Journal for Research in Mathematics Education (JRME), School Science and Mathematics, and Educational Studies in Mathematics from the beginning of 1983 to March 2002 was used to locate citations and abstracts. I paid particular attention to the annual bibliographies compiled by Suydam and published in JRME from 1983 to 1993. When evaluating a study for inclusion in the meta-analysis, I scanned the accompanying bibliography for other inclusion possibilities. The final criteria for inclusion in the meta-analysis were that the study was published between January 1983 and March 2002; it featured the use of a basic, scientific, or graphing calculator; it involved students in a mainstream K-12 classroom; and the report of findings provided data necessary for the calculation of effect sizes. In the case of missing data, I attempted to gather the

information from the authors of the original studies. For example, one report omitted class sample sizes and four reports failed to specify whether or not the treatment group had access to calculators during posttreatment evaluations. In all cases, the missing information was successfully obtained before data analysis continued.

The relationship between the characteristics of a study and its results is crucial to meta-analysis. Therefore, quantifying the findings and study characteristics is a significant part of data organization. Once all of the studies are coded, the technique of meta-analysis attempts to determine statistical similarities between research results for the various study characteristics (Glass, McGaw, & Smith, 1981). The characteristics of the studies featured in this meta-analysis appear in Table 1 and were considered as independent variables. Although some characteristics in the table are self-explanatory, others need elaboration. For treatment length, test only refers to cases where the calculators were a factor only on the test (i.e., available or not available for students to use) without an instructional component beforehand. With regard to curriculum, special materials are those designed for instruction with calculators as opposed to traditional materials used by both treatment and control groups. Pedagogical use refers to using the calculator as an essential element in the teaching and learning of mathematics; functional use means that it was used only in activities such as computation, drill and practice, and checking paper-and-pencil work.

Table 1
Characteristics of Studies Featured in the Meta-Analysis

Characteristic          Categories
Publication status      Journal; Dissertation; Other unpublished source
Test instrument         Standardized; Nonstandardized (teacher made)
Educational division    Elementary; Middle school; High school
Ability of students     Mixed; Low; High
Treatment length        Test only; 0-3 weeks; 4-8 weeks; 9 or more weeks
Curriculum              Traditional; Special
Calculator use          Functional; Pedagogical
Calculator type         All types allowed; Basic; Scientific; Graphing
Study design            Random; Nonrandom
Sample size             1-100; 101-200; 201-1000; over 1000

Effect sizes calculated from the numerical outcomes from achievement and attitude assessments were dependent variables. Because the authors of the studies provided means and standard deviations or information from statistical tests based on means and standard deviations, the effect size measure for the meta-analysis was the standardized mean difference: the difference in the experimental and control group means divided by a pooled standard deviation (Lipsey & Wilson, 2001).

A positive effect size indicated that the experimental group had a higher mean than the control group for a particular study, whereas a negative effect size implied that the control group performed better than the treatment group; an effect size of zero indicated that there was no difference between the treatment and control groups. Given that Hedges and Olkin (1985) proved that the raw effect size has distribution bias, each raw value was corrected for this problem and the resulting value was used in further analysis. The magnitude of the effect sizes varies according to many different factors, including the researcher's methods and the subject of the research. Mindful of these considerations, Cohen (1988) established basic guidelines for evaluating the magnitude of effect sizes, with values near 0.2, 0.5, and 0.8 considered to be small, medium, and large, respectively.
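To make the effect size measure concrete, the following sketch (in Python; the function name and the illustrative numbers are mine and are not drawn from any study in the meta-analysis) computes a standardized mean difference and applies the usual Hedges and Olkin (1985) small-sample correction for distribution bias.

```python
import math

def hedges_g(mean_e, mean_c, sd_e, sd_c, n_e, n_c):
    """Bias-corrected standardized mean difference (a sketch of Hedges' g)."""
    # Pooled standard deviation, weighting each group by its degrees of freedom
    pooled_sd = math.sqrt(((n_e - 1) * sd_e ** 2 + (n_c - 1) * sd_c ** 2) / (n_e + n_c - 2))
    d = (mean_e - mean_c) / pooled_sd            # raw standardized mean difference
    correction = 1 - 3 / (4 * (n_e + n_c) - 9)   # approximate correction for distribution bias
    return correction * d

# Hypothetical study: the calculator group scores slightly higher than the control group
print(round(hedges_g(76.0, 72.0, 12.0, 13.0, 50, 48), 2))  # roughly 0.32, a small-to-medium effect
```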
With two exceptions, the skill achievement or attitude data gathered from one article was used to generate one effect size for the meta-analysis. Liu (1993) and Pennington (1998) studied groups of students involved in two different treatments, with each treatment group being compared to a control group. One group was taught with calculators by traditional instruction methods and the other treatment group was taught with special calculator-related instruction materials. Because each article presented data on two treatment groups that differed significantly in the method of treatment, the resulting effect sizes were not averaged into one value. For these two articles, the two independent groups were considered separate primary studies for the purpose of analysis.

Data Analysis Procedures


Hedges's Q statistic was used to test the homogeneity of a group of effect sizes. This statistic has a chi-square distribution with k - 1 degrees of freedom, where k is the number of effect sizes. A set of effect sizes is called homogeneous if each element in the set is an estimate of the population's effect size (Hedges & Olkin, 1985) and the variance in effect sizes is the result of sampling error. Because the potential existed for other sources of variability, a random effects model outlined by Lipsey and Wilson (2001) was used to address the variation among effect sizes.
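A minimal sketch of the homogeneity test described above, using standard inverse-variance weighting (the function name and the sample values are illustrative assumptions, not data from the reviewed studies):

```python
def q_statistic(effects, variances):
    """Hedges' Q homogeneity statistic for a set of effect sizes.

    Q is referred to a chi-square distribution with k - 1 degrees of freedom,
    where k is the number of effect sizes.
    """
    weights = [1.0 / v for v in variances]                # inverse-variance weights
    mean_es = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    return sum(w * (g - mean_es) ** 2 for g, w in zip(effects, weights))

# Three hypothetical effect sizes with their sampling variances
print(round(q_statistic([0.31, -0.17, 0.24], [0.02, 0.03, 0.025]), 1))  # about 5.0
```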
For a homogeneous set of effect sizes, the population effect size is best estimated by a weighted mean of unbiased effect sizes; therefore, a weighted mean and corresponding 95% confidence interval were generated for each homogeneous set of effect sizes. The mean effect size was determined to be statistically significant (representing a significant difference between the treatment group's and the control group's achievement or attitude scores) when the corresponding confidence interval did not contain zero. If the test for homogeneity revealed significant heterogeneity (i.e., p < .01), outliers were removed one at a time until a nonsignificant Q was obtained. Although the traditional definition of an outlier is a value that is significantly larger or smaller than the others in a data set, outliers in meta-analysis can also result from sample size. For example, if an effect size was generated for a study with a sample size significantly smaller than that in the other studies, then that effect size could be an outlier. In this study, I used a method for

identifying outliers developed by Huffcutt and Arthur (1995) that considers effect size value and sample size.
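Continuing the sketch above, a weighted mean effect size and its 95% confidence interval can be formed as follows. This is the simple inverse-variance version; the random effects model cited from Lipsey and Wilson (2001) would add an estimate of between-study variance to each study's variance before weighting. Names and numbers are again illustrative.

```python
import math

def weighted_mean_ci(effects, variances, z=1.96):
    """Weighted mean effect size with a 95% confidence interval."""
    weights = [1.0 / v for v in variances]
    mean_es = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))        # standard error of the weighted mean
    return mean_es, (mean_es - z * se, mean_es + z * se)

g_bar, (lower, upper) = weighted_mean_ci([0.31, -0.17, 0.24], [0.02, 0.03, 0.025])
# The mean effect is treated as statistically significant only if the interval excludes zero.
print(round(g_bar, 2), (round(lower, 2), round(upper, 2)))  # about 0.16, (-0.02, 0.33)
```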
Finally, an analysis of independent variables was conducted to determine the effect of moderator variables on the heterogeneity of the effect size data sets and the magnitude of the weighted mean effect sizes. These analyses were conducted with the entire set of effect sizes, including those deemed outliers at an earlier stage of analysis.

Description of the Studies


The initial search of the broadly defined category of calculator-based research in the K-12 classroom uncovered 86 studies for the meta-analysis. After evaluating the studies according to the criteria essential for meta-analysis (e.g., presence of a treatment and control group; data necessary for calculating an effect size), 32 studies were eliminated. The final set of 54 studies (see Appendix for the citations) published between January 1983 and March 2002 provided data for 127 effect sizes. Each study was classified according to the characteristics shown in Table 1, and the distribution of the studies according to them appears in Table 2. Looking at the breakdown of studies within a particular characteristic, the numbers do not sum to 54 because several studies provided separate data for more than one classification (e.g., one study provided data on both middle and high school students). Also, the curriculum and calculator use variables were not coded for studies that featured only a test and no instruction with calculators.

Table 2
Distribution of Studies Featured in the Meta-Analysis According to the Set of Characteristics

Characteristic                      Number of studies
Publication status
  Journal                                   9
  Dissertation                             37
  Other unpublished source                  8
Test instrument
  Standardized                             24
  Nonstandardized                          33
Educational division
  Elementary school                         9
  Middle school                            20
  High school                              26
Ability of students
  Mixed                                    46
  Low                                       2
  High                                      7
Treatment length
  Test only                                 7
  0-3 weeks                                17
  4-8 weeks                                 9
  9 or more weeks                          21
Curriculum
  Traditional                              41
  Special                                   6
Calculator use
  Functional                               11
  Pedagogical                              36
Calculator type
  All types allowed                         4
  Basic                                    25
  Scientific                                3
  Graphing                                 22
Study design
  Random                                   44
  Nonrandom                                10
Sample size
  1-100                                    28
  101-200                                  18
  201-1000                                  4
  Over 1000                                 4

Overall, 85% of the studies appeared either as journal articles, dissertations, or master's theses. Of the remaining studies, one was an unpublished report and the other eight were ERIC documents. A variety of standardized tests were used to assess achievement (e.g., Scholastic Aptitude Test, Iowa Test of Basic Skills), and the nonstandard methods of assessment used by some researchers were teacher- or researcher-designed tests.
With regard to grade level, roughly two thirds of the reports featured more than one grade level, and as a consequence it was not possible to analyze the data by individual grades. Instead, I sorted the studies into educational divisions: elementary, middle, and high school. The elementary grades were represented by the fewest number of studies. Nearly 70% of the studies involved at least one of grades 8 through 12. Based on this distribution, inferences drawn from this meta-analysis are best applied to mathematics students in higher grades.
The length of calculator treatment ranged from test only (i.e., the studies involved a test with no instructional use of calculators before testing) to 650 days (i.e., 3 1/2 school years). The duration of the treatment phase exceeded 30 days for nearly 60% of the studies. Only three studies evaluated students after a predetermined retention period ranging from 2 to 12 weeks.
Random assignment of classes of students to the calculator treatment was used in 81% of the studies. In the remaining studies, it was either clear that assignment to treatment was not random or the study design was not obvious from reading the article. Although a randomized, strictly controlled study is the ideal, this type of study is not always possible in educational research. Studies using random and nonrandom assignment were included in the meta-analysis. The study design variable was coded to determine whether or not including nonrandomized studies influenced the meta-analytical findings (Lipsey & Wilson, 2001).

Combined sample sizes of treatment and control groups ranged from 14 to 48,081. Eighty-five percent of the studies were conducted with samples of 200 participants or less. Four calculator studies were conducted with over 4,000 students participating.

RESULTS

The sections that follow present results from the meta-analysis and interpretations of the findings. For comparison purposes, analysis of heterogeneous sets of effect sizes was conducted twice: (1) with all of the effects and (2) with outliers removed, yielding a homogeneous set of effects. The tables provide the 95% confidence intervals that were used to determine the statistical significance of g, the corresponding weighted mean effect size. Hedges's Q statistics, used to determine the homogeneity of each set of effect sizes, are also included in these tables. For the skill data that was heterogeneous in the first round of the analysis, I included an independent variable analysis to gain information about the heterogeneity of the data. The results for student attitudes and the corresponding independent variable analysis conclude the results section.
442 Effectsof Calculators:
A Meta-Analysis

Effect Sizes

Table 3 contains the number of effect sizes gathered for each achievement construct (acquisition, retention, transfer) and category of skills analyzed in this study. Each number in the table represents a comparison of achievement data from treatment and control groups. The results are organized according to method of testing: with or without calculators.

Table 3
Number of Effect Sizes for the Achievement and Skill Constructs, Their Categories, and Calculator Use in Testing

                      Testing without calculators        Testing with calculators
                      Acquisition Retention Transfer     Acquisition Retention Transfer    Total
Category I skills
  Operational             15          0        0             25          2        1         43
  Computation             15          1        0             12          2        0         30
  Concepts                 8          0        0             11          0        0         19
Category II skills
  Problem solving          7          0        0             14          2        1         24
  Productivity             0          0        0              1          0        0          1
  Selectivity              3          0        0              6          1        0         10
Total                     48          1        0             69          8        2        127

With regard to the numbers within Table 3, there were 15 effect sizes used to analyze student acquisition of operational skills. These values were gathered from 15 studies that provided data on student acquisition of operational skills in which the skills were not strictly computational or conceptual but instead were a composite of the two. Each report contained quantitative data on the comparison of a treatment and control group in which the treatment group had access to calculators during instruction but not during testing. Similarly, the results outlined below on student acquisition of problem-solving skills when calculators were part of testing and instruction are based on the analysis of 14 effect sizes.

The general category of operational skills contains the most information for analysis, with 43 effect sizes across both testing conditions. Results on productivity are not provided since only one study provided data on this problem-solving skill. Sixty-nine effect sizes were available to analyze skills acquisition when calculators were part of testing as well as instruction. The results of the acquisition of operational and problem-solving skills when calculators were not part of the testing process are based on 48 effect sizes.

Although technically an analysis can be conducted with as few as two effect sizes, the results from such a small number of studies are not a strong reflection of the population under consideration. Therefore, I chose to analyze only those categories with three or more effect sizes. The 54 studies included in this meta-analysis did not provide much information on skills retention and transfer, so this meta-analysis does not provide information on these two aspects of achievement.

Effect Size Findings for Acquisition of Skills: Testing Without Calculators


Table 4 includes the results from an analysis of the achievement construct of acquisition of operational and problem-solving skills in testing situations that did not permit the use of calculators. The results showed that there were two significant findings in which the 95% confidence interval did not contain zero: operational skills (.03, .31) and selectivity skills (.15, .44). Because the Q statistic was significant for operational skills (indicating that the data set was not homogeneous), I conducted further analysis, which resulted in the removal of an outlier and in Q no longer being statistically significant. The operational skills weighted mean effect size (g = .17) was slightly smaller (g = .14) after this was done, resulting in a new confidence interval of (.01, .38) that also did not contain zero. These findings suggest that for assessments of operational skills and problem-solving selectivity skills in which calculators were not allowed during testing, students using calculators during instruction performed better than the control group. For problem-solving selectivity skills, the mean effect (g = .30) was based on three studies that assessed the ability of students using calculators to select the appropriate problem-solving strategies, and consequently this result should be interpreted with caution.

Table 4
Results from the Analysis of Acquisition of Skills: Testing Without Calculators

Skill type              k     g      CI           U3     Q      Ne     Nc
Operational skills
  All studies          15    .17   (.03, .31)     57    29.5*   1069   1065
  Outliers removed     14    .14   (.01, .38)           25.5    1044   1047
Computational skills
  All studies          15    .03   (-.14, .20)          32.1*    843    886
  Outliers removed     14   -.02   (-.16, .11)          17.7     763    786
Conceptual skills
  All studies           8    .05   (-.20, .29)          28.1*    650    715
  Outliers removed      7   -.05   (-.19, .09)           7.3     590    655
Problem-solving skills
  All studies           7    .16   (-.01, .32)           5.2     287    273
Selectivity skills
  All studies           3    .30   (.15, .44)     62     1.0     346    420

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; U3 = percentage of area below g on the standard normal curve (reported only for CIs that do not contain zero); Q = homogeneity statistic; Ne = combined experimental group sample size; Nc = combined control group sample size.
* p < .01

Both of these values (g = .17 and g = .30) are considered to be small according to the guidelines for evaluating the magnitudes of effect sizes. The U3 statistic was calculated for each weighted mean effect size in order to present a clearer interpretation of each value. The U3 statistic converts an effect size to the percentage

of area falling below the given effect size value on the standardnormal curve
(Cohen, 1988). U3is the percentageof studentsin the treatmentgroupwho scored
higher than the median score of the control group while, based on the definition
of median,50%of studentsin the controlgroupscoredhigherthanthe medianscore
of the control group. For the weighted mean effect sizes for these constructs,the
U3 statisticsare 57 for operationalskills and 62 for selectivity skills. The value 57
meansthatwhile 50%of studentsin the controlgroupscoredhigherthanthe median
on achievementtests of operationalskills, 57%of studentsusingcalculatorsduring
instructionscored higherthan the median score of the controlgroup on achieve-
ment tests of operationalskills. Based on the writings of Cohen (1988), another
interpretationis the averagestudentwho had access to a calculatorduringinstruc-
tion hada mathematicsachievementscorethatwas greaterthan57%of the students
who did not have access to calculators during instruction.The U3 statistic for
problem-solvingselectivityskills was slightlyhigherat 62, butwas basedon a small
numberof studies.
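As a check on these figures, U3 is simply the standard normal cumulative distribution evaluated at the effect size and expressed as a percentage; a short sketch (the function name is mine):

```python
from statistics import NormalDist

def u3(effect_size):
    """Cohen's U3: percentage of the standard normal curve lying below the effect size."""
    return 100 * NormalDist().cdf(effect_size)

print(round(u3(0.17)))  # about 57, the value reported for operational skills
print(round(u3(0.30)))  # about 62, the value reported for selectivity skills
```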
The computational, conceptual, and problem-solving skills categories did not yield statistically significant results because their confidence intervals contained zero. Because the data sets for the computational and conceptual skills constructs were not homogeneous in the initial stage of analysis, the outlier analysis was conducted. However, even after removing outliers, the confidence intervals for the weighted mean effect sizes still contained zero. Therefore, students who used calculators during instruction did not perform significantly higher on tests of mathematical achievement without calculators than their noncalculator-use counterparts. Whereas students did not benefit from the use of calculators when developing computational and conceptual skills, their abilities were also not hindered by calculator use.

The problem-solving data set was homogeneous after the first stage of analysis, so it was not necessary to run the outlier analysis. Although the lower value of the confidence interval is negative, the value is small enough to be considered zero. Therefore, the students in the treatment and control groups were not significantly different on assessment measures of problem-solving skills.

Effect Size Findings for Acquisition of Skills: Testing With Calculators


As shown in Table 5, statistically significant weighted mean effect sizes were generated for four of the five construct categories in which calculators were allowed during testing. Selectivity was the only one that did not have a significant effect size. Because the Q statistic was significant for these four categories, outlier analysis was conducted. Three constructs, operational skills (g = .38), computational skills (g = .43), and problem-solving skills (g = .33), were slightly affected by the removal of outliers, resulting in g values of .32, .41, and .22, respectively. However, these changes in effect size magnitude were minimal, and the resulting effect size values for all four constructs can be considered as small to medium.

Table 5
Results from the Analysis of Acquisition of Skills: Testing With Calculators

Skill type              k     g      CI           U3     Q       Ne      Nc
Operational skills
  All studies          25    .38   (.28, .48)     65    243.1*  32892   31397
  Outliers removed     19    .32   (.21, .42)            30.6    3589    3534
Computational skills
  All studies          13    .43   (.18, .67)     67     63.2*   3277    2123
  Outliers removed     11    .41   (.23, .59)            24.7    3213    2069
Conceptual skills
  All studies          11    .44   (.20, .68)     67     60.4*   3100    2444
  Outliers removed      8    .44   (.19, .69)            17.1    2653    2090
Problem-solving skills
  All studies          14    .33   (.12, .54)     63     41.6*   3226    2089
  Outliers removed     12    .22   (.01, .43)            19.9     400     404
Selectivity skills
  All studies           6    .20   (-.01, .42)            2.4     153     189

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; U3 = percentage of area below g on the standard normal curve (reported only for CIs that do not contain zero); Q = homogeneity statistic; Ne = combined experimental group sample size; Nc = combined control group sample size.
* p < .01

The U3 statistic was calculated to interpret the mean effect sizes for each construct category. Studies of computational and conceptual skills generated the highest value (67); the values for the other two constructs were in the same middle 60s range. With respect to the skills necessary for understanding mathematics concepts and computation, 67% of students using calculators during instruction scored higher than the median score of the control group on mathematics achievement tests. Similar statements can be made comparing more than 60% of students using calculators with their control group counterparts in terms of operational skills and problem-solving skills.

Six studies of problem-solving selectivity skills in which calculators were allowed during testing yielded a weighted mean effect size (g = .20) that was not statistically significant. Therefore, development of the skills necessary to select appropriate problem-solving strategies was neither helped nor hindered by calculator use.

All of the weighted mean effect sizes generated for the constructs under both testing conditions were relatively small. However, Cohen (1988) states that due to the circumstances under which these studies were conducted, this is to be expected: "When phenomena are studied which cannot be brought into the laboratory, the influence of uncontrollable extraneous variables ('noise') makes the size of the effect small relative to these (makes the 'signal' difficult to detect)" (p. 25). It should also be noted that when comparing the two methods of assessment, the studies in which calculators were allowed during testing yielded more statistically significant

results than the studies in which calculators were not part of the assessment
process.
For the results outlined above that were based on homogeneous data sets in the first stage of analysis, it can be assumed that the weighted mean effect size was the best estimate of the population represented by the data. For studies that did not allow calculators during testing, the problem-solving skills category and the selectivity skills category were homogeneous in the first stage of analysis. This was also true for selectivity skills when calculators were part of instruction and testing. Therefore, the weighted mean effect sizes and corresponding confidence intervals for these constructs adequately represented the population from which the data came.

The weighted mean effect size was not the best estimate of the population (Lipsey & Wilson, 2001) for the sets of effect sizes that were heterogeneous in the first stage of analysis (i.e., operational skills, computational skills, and conceptual skills when calculators were not included in testing; operational skills, computational skills, conceptual skills, and problem-solving skills when testing included calculators), and the difference was likely based on a study's characteristics (i.e., independent variables). In order to determine the influence of independent variables on the heterogeneity of effect sizes in each achievement construct, I conducted an analysis of moderator variables. This was done to gain insight into the reasons a set of effects was heterogeneous and to help explain the influence of the coded independent variables on the student achievement constructs under consideration.

Analysis Using Moderator Variables


In the moderator variable analysis, all characteristics listed in Table 1 except for sample size were included, with some merged in order to produce meaningful comparisons. The sections that follow present results for the independent variables that yielded significant results; for the variables that did not, such results are not reported. The significance level, p < .01, was used to determine whether there was a significant difference in effect size magnitudes for each independent variable. Ninety-five percent confidence intervals were used to determine the statistical significance of g, the corresponding weighted mean effect size.
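In this kind of moderator analysis, the overall heterogeneity is partitioned into a between-category component (QB) and within-category components (QW). The sketch below shows the between-category piece under the usual fixed-effect partition; the grouping, names, and numbers are illustrative assumptions rather than data from the meta-analysis.

```python
def q_between(groups):
    """QB: heterogeneity in effect sizes explained by a moderator variable.

    groups maps a category label to (effects, variances) for the studies coded
    into that category; QB is referred to chi-square with (categories - 1) df.
    """
    stats = []
    for effects, variances in groups.values():
        weights = [1.0 / v for v in variances]
        total_w = sum(weights)
        mean_es = sum(w * g for w, g in zip(weights, effects)) / total_w
        stats.append((mean_es, total_w))
    grand_mean = sum(m * w for m, w in stats) / sum(w for _, w in stats)
    return sum(w * (m - grand_mean) ** 2 for m, w in stats)

# Hypothetical effect sizes grouped by treatment length
groups = {
    "0-3 weeks": ([0.31, 0.28, 0.35], [0.02, 0.03, 0.02]),
    "4-8 weeks": ([-0.17, -0.10], [0.03, 0.04]),
}
print(round(q_between(groups), 1))  # about 8.5 in this made-up example
```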
Testing without calculators. Because the initial test for homogeneity for problem-solving skills and selectivity revealed a homogeneous set of effect sizes (see Table 4), an independent variable analysis was not conducted for this set of data. Instead, the focus for this part of the study was on the skill type variables: operational, computational, and conceptual. The results from this analysis appear in Table 6.

The data in the top portion of Table 6 show that for operational skills, only one independent variable, treatment length, produced significant differences in effect size magnitudes across three treatment categories: 0-3 weeks, 4-8 weeks, and 9 or more weeks. Treatment length (QB = 14.5, p < .01) resulted in a negative weighted mean effect size (g = -.17) for studies conducted over a 4-8 week treatment period.

Table 6
Moderator Variable Analysis of Skill Effects: Testing Without Calculators

Operational skills
Variable               k     g      CI            QW      QB
Treatment length
  0-3 weeks            4    .31   (.14, .48)      1.8    14.5*
  4-8 weeks            3   -.17   (-.36, .02)     1.5
  9 or more weeks      8    .24   (.05, .42)     11.7

Computational skills
Variable               k     g      CI            QW      QB
Treatment length
  0-3 weeks            3    .14   (-.51, .78)     7.2    13.4*
  4-8 weeks            3   -.25   (-.49, -.01)    1.9
  9 or more weeks      9    .06   (-.08, .20)     9.6

Conceptual skills
Variable               k     g      CI            QW      QB
Educational division
  Elementary school    4   -.06   (-.29, .18)     5.5    16.5*
  Middle school        2    .52   (-.26, 1.29)    5.8
  High school          2   -.15   (-.38, .09)     0.2
Treatment length
  0-3 weeks            3    .26   (-.48, 1.00)   17.4*   10.3*
  4-8 weeks            2   -.29   (-.55, -.04)    0.3
  9 or more weeks      3    .08   (-.06, .22)     0.2
Calculator use
  Functional           4   -.21   (-.42, .01)     2.3     7.1*
  Pedagogical          4    .21   (-.15, .57)    18.7*

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; QW = homogeneity statistic; QB = difference between contrasted categories.
* p < .01

However, the value was not significantly different from zero based on the confidence interval values. Positive values were generated for calculator treatments of operational skills lasting 0-3 weeks (g = .31) and 9 or more weeks (g = .24); in both cases, the weighted mean effect sizes were significantly larger than zero. Therefore, the operational skills of students using calculators for 3 weeks or less, or for 9 or more weeks, improved. Because the result for 9 or more weeks was based on eight studies, this result was the most credible of the results presented for the analysis by treatment length.

As shown in the middle portion of Table 6, treatment length (QB = 13.4, p < .01) also produced a significant result for effect sizes resulting from computational skills assessments in which calculators were not part of testing. The 0-3 weeks and 9 or more weeks categories yielded positive weighted mean effect sizes, but the values were not significantly different from zero. Therefore, for these treatment lengths, students using calculators during instruction but not during testing were neither helped nor hindered by calculator use. The negative weighted mean effect size for

studies conducted over a 4-8 week treatment period indicated that students not using calculators during instruction outperformed their calculator counterparts on tests of computational skills.

The conceptual skills construct (see the lower portion of Table 6) resulted in significant differences for three independent variables: educational division, treatment length, and calculator use. With respect to educational division (QB = 16.5, p < .01), the weighted mean effect sizes generated for elementary school (g = -.06), middle school (g = .52), and high school studies (g = -.15) did not correspond to a significant difference in the conceptual skills assessment outcomes for calculator and noncalculator students. With respect to treatment length, the results were similar to those reported for the computational skills construct. The 0-3 weeks and 9 or more weeks time frames yielded positive weighted mean effect sizes that were not significantly different from zero. Therefore, for these two treatment lengths, students using calculators during instruction on conceptual skills were neither helped nor hindered by calculator use. The 4-8 week time frame resulted in a negative weighted mean effect size, suggesting that students who did not have access to calculators outperformed students who used calculators during lessons on conceptual skills. The magnitude of effect sizes for the conceptual skills construct also differed significantly with respect to the calculator use variable (QB = 7.1, p < .01). The weighted mean effect sizes for functional use and pedagogical use were small, and neither value was significantly different from zero.
Testing with calculators. This section presents results of an analysis of operational skills, computational skills, and conceptual skills by independent variable for studies in which the calculators were part of testing. There was no single independent variable for which a significant difference existed across all three constructs. The operational skills and conceptual skills constructs were the two areas most affected by the variables that were featured in this analysis.

The top portion of Table 7 contains the results for the operational skills analysis of the independent variables. The analysis revealed that the magnitude of effect sizes differed significantly with respect to publication status (QB = 140.6, p < .01). The weighted mean effect size was smallest for studies presented as other unpublished documents such as those from ERIC (g = .27). The value generated from dissertations (g = .31) was slightly larger. The weighted mean effect size for studies that appeared as journal articles (g = .50) was moderate in size. All three values were statistically significant in favor of students who had access to calculators during instruction.

Based on the significant difference in the test instrument variable (QB = 18.3, p < .01), nonstandardized tests yielded a slightly larger (g = .44) weighted mean effect size when compared with standardized tests (g = .32). Both values were moderate in size and statistically significant. Therefore, students taking standardized and nonstandardized teacher-made tests of operational skills benefited from calculator use during instruction.

Table 7
Moderator Variable Analysis of Operational and Computational Skill Effects: Testing With Calculators

Operational skills
Variable               k     g      CI            QW       QB
Publication status
  Journal              7    .50   (.36, .65)     29.0*   140.6*
  Dissertation        12    .31   (.08, .54)     31.9*
  Other                6    .27   (.13, .41)     41.6*
Test instrument
  Standardized         9    .32   (.18, .46)    177.2*    18.3*
  Nonstandardized     16    .44   (.24, .63)     47.7*
Educational division
  Elementary school    1    .48   (.17, .78)      0.0     17.1*
  Middle school        7    .57   (.15, .98)     34.1*
  High school         17    .32   (.21, .44)    192.0*
Ability level
  Mixed               22    .35   (.24, .45)    224.9*    14.1*
  High                 3    .69   (.29, 1.10)     4.2
Treatment length
  Test only            6    .29   (.15, .43)    148.8*    28.3*
  0-3 weeks            8    .47   (.11, .82)     33.8*
  4-8 weeks            3    .34   (-.11, .79)     6.0
  9 or more weeks      8    .49   (.18, .81)     26.3*
Calculator type
  All                  4    .25   (.10, .41)    144.0*    28.5*
  Basic/scientific     8    .55   (.20, .90)     31.1*
  Graphing            13    .40   (.19, .60)     39.5*
Study design
  Random              21    .33   (.23, .44)    217.6*    19.9*
  Nonrandom            4    .68   (.35, 1.01)     6.2

Computational skills
Variable               k     g      CI            QW       QB
Publication status
  Journal              3    .82   (.26, 1.73)    23.3*    19.6*
  Dissertation         9    .18   (-.12, .48)    20.3*
  Other                1    .96   (.63, 1.28)     0.0
Study design
  Random              10    .24   (.02, .47)     28.7*    27.6*
  Nonrandom            3   1.18   (.57, 1.78)     6.9*

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; QW = homogeneity statistic; QB = difference between contrasted categories.
* p < .01

The educational division variable also had a significant influence on the magnitude of effect sizes (QB = 17.1, p < .01), but only in the middle school and high school divisions. The weighted mean effect size for studies at the middle school level (g = .57) was larger than the effect size for studies at the high school level (g = .32). For both divisions, the results were statistically significant in favor of students using calculators during instruction.

The significant difference in effect size magnitude with respect to ability level (QB = 14.1, p < .01) resulted from 3 studies of high ability students that were separated from the remaining 22 studies conducted in mixed ability classrooms. There were no studies of operational skills that were conducted solely with low ability students. The weighted mean effect size for the high ability studies (g = .69) was in the high range, while the corresponding statistic for the mixed ability studies (g = .35) was moderate in size. Both values were statistically significant.

The analysis of independent variables revealed that the magnitude of the effect sizes differed significantly with respect to treatment length (QB = 28.3, p < .01). Unlike the similar analysis described above for studies in which calculators were not allowed during testing, there were no negative weighted mean effect sizes for any of the four treatment length categories. The effect size value for studies featuring only a test (g = .29) was small, whereas the weighted mean effect sizes for the 0-3 weeks and 9 or more weeks categories fell in the moderate range. The results for the test only, 0-3 weeks, and 9 or more weeks categories were statistically significant in favor of students using calculators. For studies conducted over a 4-8 week time frame, students were neither helped nor hindered by the inclusion of calculators in testing and instruction.

Calculator type (QB = 28.5, p < .01) also revealed significant differences in effect size magnitudes, and all three results were significantly different from zero. Four studies allowed students to use any type of calculator (basic, scientific, or graphing) and did not feature any form of mathematics instruction. Each study was a comparison of students taking a test with access to calculators and students taking the same test without access to calculators. The studies with basic or scientific calculators yielded a higher weighted mean effect size (g = .55) when compared to the effect size for studies featuring the graphing calculator (g = .40). The results for this independent variable reveal that for all types of calculators, students using calculators during testing and instruction performed better than their noncalculator counterparts on tests of operational skills.

Lastly, the magnitude of effect sizes for the operational skills construct differed significantly with respect to study design (QB = 19.9, p < .01). The four studies in which the treatment group was not selected by random assignment generated a relatively large weighted mean effect size (g = .68). The studies in which random assignment was used resulted in a moderate effect size (g = .33). Both values represent a statistically significant difference between assessment results of students using calculators during instruction and students with no access to calculators during instruction. Due to selection bias in the nonrandomized design (Lipsey & Wilson, 2001), it appears that the nonrandomized studies overestimated the magnitude of the effect of calculators on operational skills.

The lower portion of Table 7 shows that significant differences in effect size magnitudes for two independent variables resulted from the analysis of the computational skills construct. For publication status (QB = 19.6, p < .01), the result was similar to the one reported above for operational skills. The weighted mean effect size for studies appearing in journals (g = .82) was fairly large. The effect size for

dissertations (g = .18) was not significantly different from zero, whereas the journal result was statistically significant in favor of the calculator group. Study design also revealed significant differences (QB = 27.6, p < .01) in the magnitude of effect sizes. The three studies conducted without random assignment to the treatment group yielded a large weighted mean effect size (g = 1.18). Based on the size of this value, the nonrandomized studies appear to overestimate the overall effect of calculators on computational skills. The effect size for studies conducted with random assignment (g = .24) was smaller, but both values were significantly different from zero.

The upper portion of Table 8 contains the results of the independent variable analysis for conceptual skills and shows significant differences in effect size magnitudes for five variables. For test instrument (QB = 7.5, p < .01), the studies using nonstandardized tests (g = .60) yielded a weighted mean effect size that was statistically significant. Therefore, students who took teacher-made tests of conceptual skills benefited from calculator use. The result for standardized tests (g = .16) was not statistically significant. With respect to educational division (QB = 13.0, p < .01), the middle school division generated the largest weighted mean effect size (g = .70), followed by the high school division (g = .43). The effect size for the elementary division (g = -.14) was based on only two studies and was not statistically significant. The middle and high school values were significantly different from zero.

A significant difference in effect size magnitude was found for ability level (QB = 11.5, p < .01). The studies that featured high ability students were separated from the studies conducted in mixed ability classrooms. The result was a higher weighted mean effect size for the high ability classes (g = .84), but the value was not significantly different from zero. For calculator use (QB = 17.5, p < .01), the studies in which the calculator had a functional role yielded a smaller effect size value (g = .12) when compared with the studies in which the calculator had a pedagogical role (g = .69). The effect size for the functional studies was not statistically significant in favor of the students using calculators, but the value for pedagogical studies revealed that students who used calculators outperformed their noncalculator counterparts on assessments of conceptual skills. A significant difference in effect size magnitude was generated by calculator type (QB = 9.7, p < .01). The weighted mean effect size (g = .69) for the graphing calculator studies was in the high range. The effect size value for the studies featuring basic and scientific calculators (g = .13) was not statistically significant.

The lower portion of Table 8 contains the independent variable analysis for problem-solving effect sizes from studies in which calculators were allowed during testing. Significant differences in effect size magnitudes resulted from analysis of ability level (QB = 12.0, p < .01) and calculator type (QB = 12.9, p < .01). The high ability category contained only one effect size. The low ability category, consisting of two effect sizes, yielded a negative weighted mean effect size (g = -.18), but the result was not statistically significant for either the calculator or the noncalculator group.

Table 8
Moderator Variable Analysis of Conceptual and Problem-Solving Skill Effects: Testing With Calculators

Conceptual skills
Variable               k     g      CI            QW      QB
Test instrument
  Standardized         3    .16   (-.12, .44)    18.2*    7.5*
  Nonstandardized      8    .60   (.16, 1.05)    34.7*
Educational division
  Elementary school    2   -.14   (-.42, .15)     0.1    13.0*
  Middle school        5    .70   (.13, 1.27)    34.9*
  High school          4    .43   (.03, .82)     12.6*
Ability level
  Mixed                8    .29   (.06, .52)     32.6*   11.5*
  High                 3    .84   (-.04, 1.71)   16.2*
Calculator use
  Functional           2    .12   (-.05, .29)     0.5    17.5*
  Pedagogical          7    .69   (.23, 1.16)    31.0*
Calculator type
  Basic/scientific     4    .13   (-.14, .40)    19.7*    9.7*
  Graphing             7    .69   (.23, 1.15)    31.0*

Problem-solving skills
Variable               k     g      CI            QW      QB
Ability level
  Mixed               11    .43   (.20, .65)     29.6*   12.0*
  Low                  2   -.18   (-.59, .23)     0.0
  High                 1    .15   (-.31, .62)     0.0
Calculator type
  Basic/scientific    11    .23   (-.01, .47)    19.9    12.9*
  Graphing             3    .61   (.12, 1.10)     8.9

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; QW = homogeneity statistic; QB = difference between contrasted categories.
* p < .01

The mixed ability studies generated a moderate effect size value (g = .43) that was significantly different from zero. Therefore, the problem-solving skills of students in mixed ability classrooms improved from calculator use during testing and instruction. With regard to the calculator type variable, the studies that featured the graphing calculator yielded a weighted mean effect size (g = .61) in the moderate to high range that was statistically significant in favor of the students using calculators. The studies that involved a basic or scientific calculator generated a relatively small effect size value (g = .23) that did not significantly favor the calculator group.

Findings Regarding Student Attitudes


Table 9 summarizes the meta-analytical findings regarding the attitude constructs. The data set for the attitude toward mathematics construct was heterogeneous at the initial stage of analysis. Therefore, just as with the skills constructs, analysis was conducted with all studies and then after the removal of outliers. Due to insufficient data, inferential statistics could not be generated for four of the categories: anxiety toward mathematics, motivation to learn mathematics, attitude toward mathematics teachers, and students' perceptions of the value of mathematics in society.

Table 9
Results from the Analysis of Attitude Constructs

Construct                        k      g      CI            U3      Q       Ne     Nc
Attitude toward mathematics
  All studies                   18     .32    (.07, .58)     63    134.5*   1366   1286
  Outliers removed              12     .20    (.01, .40)            22.8     491    457
Self-concept in mathematics
  All studies                    4     .05    (-.06, .16)            2.6     706    631
Attitude toward use of calculators in mathematics
  All studies                    3     .09    (-.19, .36)            3.7     645    556

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; U3 = percentage of area below g on the standard normal curve (reported only for CIs that do not contain zero); Q = homogeneity statistic; Ne = combined experimental group sample size; Nc = combined control group sample size.
*p < .01

The data for the students' attitudes toward mathematics construct yielded a statistically significant weighted mean effect size (g = .32). The value was slightly smaller after the removal of outliers. This weighted mean effect size means that on attitude survey instruments, the students using calculators during instruction reported a better attitude toward mathematics than the students who did not use calculators. This weighted mean effect size is in the small to moderate range. The U3 statistic for this value was 63. One interpretation of this statistic is that the average student who had access to a calculator during instruction reported an attitude toward mathematics that was better than that of 63% of the students who did not have access to calculators during instruction.
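As a quick check on the interpretation above, U3 can be read as the proportion of the standard normal curve that lies below the weighted mean effect size. The fragment below is a sketch of that reading, following the note to Table 9; it is an assumption about how the statistic is tabulated here, not the author's code.

```python
# U3 as the percentage of the comparison (noncalculator) distribution that falls
# below the average calculator-using student: 100 * Phi(g), with g = .32 from Table 9.
from scipy.stats import norm

u3 = 100 * norm.cdf(0.32)
print(round(u3))  # 63, matching the value reported for attitude toward mathematics
```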
Small weighted mean effect sizes were generated for students' self-concept in mathematics (g = .05) and attitudes toward use of calculators in mathematics (g = .09). Both of these effect size values were based on a small number of studies. Neither value was significantly different from zero. Therefore, students who used calculators during instruction and students who did not use calculators during instruction reported similar opinions on questions regarding these attitude constructs.
Due to the heterogeneity of effect sizes for the attitude toward mathematics construct, an analysis of independent variables was conducted, and the results are presented in Table 10. Significant differences in the magnitudes of weighted mean effect sizes are reported for seven variables. Regarding publication status (QB = 11.7, p < .01), the results for one journal article were combined with the results for dissertations, and the weighted mean effect size (g = .35) was small to moderate. The value was significantly different from zero. The effect size value for the ERIC documents or other unpublished documents was negative (g = -.01) but close to zero and not statistically significant in favor of students in the control group. The test instrument variable (QB = 22.5, p < .01) yielded a weighted mean effect size for studies using standardized tests (g = .32) slightly larger than the value for studies using nonstandardized tests (g = .28). The result for standardized tests was statistically significant in favor of students who had access to calculators during instruction.

Table 10
Moderator Variable Analysis - Attitude Construct

Variable                 k      g      CI              Qw       QB
Publication status
  Journal/dissertation  16     .35    (.09, .62)      106.5*   11.7*
  Other                  2    -.01    (-1.48, 1.46)    16.3*
Test instrument
  Standardized          10     .32    (.05, .59)       49.8*   22.5*
  Nonstandardized        8     .28    (-.23, .78)      62.2*
Educational division
  Elementary school      4     .15    (-.19, .49)       4.2    10.7*
  Middle school          7     .28    (-.12, .69)      64.1*
  High school            7     .38    (-.12, .89)      55.4*
Ability level
  Mixed                 16     .23    (-.01, .46)      87.2*   28.7*
  High                   2    1.06    (-.24, 2.37)     18.5*
Treatment length
  0-3 weeks              5     .21    (-.26, .67)      18.7*   14.2*
  4-8 weeks              4     .40    (-.43, 1.22)     79.9*
  9 or more weeks        9     .32    (.06, .58)       21.8*
Calculator use
  Functional             5     .36    (-.18, .90)      64.6*   27.9*
  Pedagogical           13     .32    (.07, .58)       42.0*
Calculator type
  Basic/scientific      10     .17    (-.10, .44)      39.9*   47.8*
  Graphing               8     .49    (.11, .87)       46.7*

Note. k = number of studies; g = weighted mean effect size; CI = 95% confidence interval for g; Qw = homogeneity statistic; QB = difference between contrasted categories.
*p < .01

Increasing weighted mean effect sizes according to increasing division (g = .15, g = .28, g = .38, respectively) were the result of the analysis of educational division (QB = 10.7, p < .01). However, none of the values significantly favored the students using calculators during instruction. Based on the analysis of ability level (QB = 28.7, p < .01), the weighted mean effect size for studies featuring high ability students (g = 1.06) was large, but it was based on data from only two studies. The effect size value for mixed ability classes (g = .23) was small and not statistically significant.
The 9 or more weeks category produced a small to moderate weighted mean effect size (g = .32) during the analysis of treatment length (QB = 14.2, p < .01). This result significantly favored the students who had access to calculators during instruction. The effect size values for the 0-3 weeks category and 4-8 weeks category (g = .21 and g = .40, respectively) were relatively similar in size, but neither value was statistically significant.
The weighted mean effect sizes for the functional (g = .36) and pedagogical (g = .32) categories were close in size after the analysis of calculator use (QB = 27.9, p < .01). The pedagogical result was statistically significant in favor of students who had access to calculators during instruction. Lastly, significant differences in the magnitudes of effect sizes were found with respect to calculator type (QB = 47.8, p < .01). The studies that featured the graphing calculator generated a moderate weighted mean effect size (g = .49) that was statistically significant for students who had access to calculators. The effect size value for studies using basic or scientific calculators was small (g = .17) and not statistically significant.

Summary of Major Findings

When calculators were included in instruction but not testing, the operational skills and the ability to select the appropriate problem-solving strategies improved for the participating students. Under these conditions, there were no changes in students' computational skills and skills used to understand mathematical concepts. When calculators were part of both testing and instruction, the operational skills, computational skills, skills necessary to understand mathematical concepts, and problem-solving skills improved for participating students. Under these conditions, there were no changes in students' ability to select the appropriate problem-solving strategies. Students who used calculators while learning mathematics reported more positive attitudes toward mathematics than their noncalculator counterparts on surveys taken at the end of the calculator treatment.

DISCUSSION

The purpose of this meta-analysis was to determine the effects of calculators on students' acquisition of operational and problem-solving skills as well as student attitudes toward mathematics. The studies on which these results were based were conducted primarily in classrooms in which students were engaged in a traditional mathematics curriculum. The reader should keep in mind that in most cases, the participating classrooms were not using curriculum materials specifically designed for calculator use, but at the same time, it should be noted that in two thirds of the studies the calculator had an active role in the teaching and learning process.

Overview of Findings

When calculators were available during instruction but not during testing, students in grades K-12 maintained the paper-and-pencil skills and the skills necessary to understand mathematical concepts. The operational skills of these students improved as a result of calculator use during instruction. Students received the most benefit when calculators had a pedagogical role in the classroom and were not just available for drill and practice or checking work. The results for operational skills favored mixed ability classes, with high ability students neither helped nor hindered by calculator use during instruction. The meta-analysis reported here does not include results for low ability students. The findings suggest that, in order to have a positive influence on students' operational skills, calculator use during instruction should be long term (i.e., 9 or more weeks). With respect to problem solving, the skills of precollege students were not hindered by the inclusion of calculators in mathematics instruction. Based on a limited number of studies, the skills necessary to select the appropriate problem-solving strategies may improve as a result of calculator use.
When calculators were included in testing and instruction, students in grades K-12 experienced improvement in operational skills as well as in paper-and-pencil skills and the skills necessary for understanding mathematical concepts. With regard to operational skills and conceptual skills, the results of calculator use were most significant for classes in which the calculator's role was pedagogical. The calculator benefited students in mixed ability classes and classes consisting of high ability students. The meta-analysis does not report results sufficient for generalizations to be made for classes of low ability students. When the calculator was included in testing and instruction of conceptual skills, students benefited from short-term (0-3 weeks) use of calculators. Benefits to operational skills can be seen with short-term or long-term (9 or more weeks) calculator use.
Under the same testing and instruction circumstances, improvement in problem-solving skills for students in mixed ability classes appeared in the results. This meta-analysis does not provide sufficient data for generalizations for classes consisting of low or high ability students. Students' abilities to select the appropriate problem-solving strategies were not hindered by the calculator's role in testing and instruction. The increase in problem-solving skills may be most pronounced under two conditions: (1) when special curriculum materials have been designed to integrate the calculator in the mathematics classroom and (2) when the technological tool in use was the graphing calculator. These results should be interpreted with caution because the data are based on a small number of studies.
Allowing students to use calculators in mathematics may result in better attitudes toward mathematics. In this study, attitudes showed the most improvement after 9 or more weeks of calculator use. Students' self-concept in mathematics and attitude toward the use of calculators in mathematics were not hindered by calculator use. In this meta-analysis, across all constructs, the results for studies lasting 4-8 weeks either favored students who did not have access to calculators during instruction or did not show significant differences between the two groups. For many constructs, the results based on studies lasting less than 4 weeks or more than 8 weeks were favorable for calculator use. This discrepancy may be related to students' abilities to retain what they learn. In short-term studies, retention was not assessed, but in long-term studies, retention was somewhat significant, especially with concepts learned early in the treatment phase.
Based on the nature of the data gathered from the 54 studies, the effect of calculator use in individual grades could not be determined. Hembree and Dessart (1986, 1992) reported in their meta-analysis that when calculators were not allowed during testing, the use of calculators in instruction had a negative effect on the computational skills of students in fourth grade. Unfortunately, this particular result could not be supported or disproved by the current meta-analysis. Based on studies conducted within the elementary division, the development of computational skills was not hindered by calculator use during instruction, both with and without calculator use in testing.
When calculators were not allowed during testing, results were not significantly different for one type of calculator as compared to the others. When calculators were an integral part of the testing process, the results based on graphing calculator use were significantly better than the results for basic or scientific calculators in two areas: conceptual skills and problem-solving skills. Operational skills benefited from all three types of calculators. Lastly, graphing calculators had a more significant influence on students' attitudes when compared with other types of calculators.

Recommendations for Classroom Usage

The results from this meta-analysis support the use of calculators in all precollege mathematics classrooms. Considering the grade distribution of the studies across the educational divisions (elementary, middle, high school), the length of calculator availability during instruction should increase with each grade level. Because limited research has been conducted featuring the early grades, calculator use in those grades should be restricted to experimentation and concept development activities. Calculators should be carefully integrated into K-2 classrooms to strengthen the operational goals of these grades, as well as to foster students' problem-solving abilities.
Calculators should especially be emphasized during the instruction of problem-solving skills in middle and high school (i.e., Grade 6 through Grade 12) mathematics courses. This emphasis may result in increased success in problem solving as well as more positive attitudes toward mathematics. Teachers should design lessons that integrate calculator-based explorations of mathematical problems and mathematical concepts with regular instruction, especially in these grades. Calculators should be available during evaluations of middle and high school students' problem-solving skills and their understanding of mathematical concepts. This recommendation is based on the results reported in this meta-analysis and the inconsistencies noted by other reviewers (Gilchrist, 1993; Penglase & Arnold, 1996; Roberts, 1980) that occur when tests are given without calculators after instruction has taken place with calculators.

Recommendations for Future Research

Considering the search conducted to gather relevant studies and recognizing the fact that this paper does not fully address many questions that have been raised by mathematics educators, I propose several areas in which further calculator-based research is needed. Only a few studies involved the calculator's role in the retention and transfer of operational skills and students' abilities to select the appropriate problem-solving strategies. Researchers need to consider students' abilities to select appropriate problem-solving strategies in light of available technology and to retain their operational and problem-solving skills after instruction with calculators. Also, further research is needed regarding the transfer of skills to other mathematical subjects and to areas outside of mathematics.
Based on the definition used to identify a mathematical skill as a problem-solving skill in preparing for this meta-analysis, little information was available on the relationship between the graphing calculator and student achievement in problem-solving skills. The studies featuring the graphing calculator primarily focused on the acquisition of operational skills; consequently, the problem-solving results were primarily based on basic and scientific calculators. Therefore, future research should include studies of graphing calculator use in the development of problem-solving skills.
In spite of the fact that the NCTM (1989, 2000) has been advocating changes to the mathematics curriculum with computer and calculator technology as an integral component, the search for studies for this meta-analysis yielded only six studies in which special curriculum materials were designed for calculator use. Because this number reflects only 11% of the studies analyzed, this is an area in which more research needs to be conducted.

REFERENCES

Aiken, L. (1974). Two scales of attitude toward mathematics. Journal for Research in Mathematics Education, 5, 67-71.
Ballheim, C. (1999). How our readers feel about calculators. In Z. Usiskin (Ed.), Mathematics education dialogues (p. 4). Reston, VA: National Council of Teachers of Mathematics.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Erlbaum.
Cooper, H., & Hedges, L. (Eds.). (1994). The handbook of research synthesis. New York, NY: Russell Sage Foundation.
Fennema, E., & Sherman, J. (1976). Mathematics attitude scales: Instruments designed to measure attitudes toward the learning of mathematics by males and females. Journal for Research in Mathematics Education, 7, 324-326.
Gilchrist, M. E. (1993). Calculator use in mathematics instruction and standardized testing: An adult education inquiry. Dayton, VA: Virginia Adult Educators Research Network. (ERIC Document Reproduction Service No. ED372919)
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. San Diego, CA: Academic Press.
Hedges, L. V., Shymansky, J. A., & Woodworth, G. (1989). A practical guide to modern methods of meta-analysis. Washington, DC: National Science Teachers Association. (ERIC Document Reproduction Service No. ED309952)
Hembree, R., & Dessart, D. J. (1986). Effects of hand-held calculators in precollege mathematics education: A meta-analysis. Journal for Research in Mathematics Education, 17, 83-99.
Hembree, R., & Dessart, D. J. (1992). Research on calculators in mathematics education. In J. T. Fey (Ed.), Calculators in mathematics education (pp. 23-32). Reston, VA: National Council of Teachers of Mathematics.
Huffcutt, A., & Arthur, W. (1995). Development of a new outlier statistic for meta-analytic data. Journal of Applied Psychology, 80, 327-334.
Kaput, J. J. (1992). Technology and mathematics education. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 515-556). New York: Macmillan.
Lipsey, M., & Wilson, D. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage.
Liu, S. T. (1993). Effects of teaching calculator use and problem solving strategies on mathematics performance and attitude of fifth grade Taiwanese male and female students (Doctoral dissertation, Memphis State University, 1993). Dissertation Abstracts International, 54, 4354A.
Mackey, K. (1999). Do we need calculators? In Z. Usiskin (Ed.), Mathematics education dialogues (p. 3). Reston, VA: National Council of Teachers of Mathematics.
National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.
Neubauer, S. G. (1982). The use of hand-held calculators in schools: A review. South Bend, IN: Indiana University. (ERIC Document Reproduction Service No. ED220272)
Parkhurst, S. (1979). Hand-held calculators in the classroom: A review of the research. (ERIC Document Reproduction Service No. ED200416)
Penglase, M., & Arnold, S. (1996). The graphics calculator in mathematics education: A critical review of recent research. Mathematics Education Research Journal, 8, 58-90.
Pennington, R. (1998). A study to determine the effect of instruction in effective use of a calculator on test scores of middle school students (Master's thesis, Salem-Teikyo University, 1998). (ERIC Document Reproduction Service No. ED434030)
Rabe, R. M. (1981). Calculators in the mathematics curriculum: Effects and changes. South Bend, IN: Indiana University. (ERIC Document Reproduction Service No. ED204178)
Ralston, A. (1999). Let's abolish pencil-and-paper arithmetic. In Z. Usiskin (Ed.), Mathematics education dialogues (p. 2). Reston, VA: National Council of Teachers of Mathematics.
Roberts, D. M. (1980). The impact of electronic calculators on educational performance. Review of Educational Research, 50, 71-88.
Sandman, R. S. (1980). The mathematics attitude inventory: Instrument and user's manual. Journal for Research in Mathematics Education, 11, 148-149.
Sigg, P. O. (1982). The hand-held calculator: Effects on mathematical abilities and implications for curriculum change. South Bend, IN: Indiana University School of Education. (ERIC Document Reproduction Service No. ED218147)
Suydam, M. N. (1978). State-of-the-art review on calculators: Their use in education. Columbus, OH: Calculator Information Center. (ERIC Document Reproduction Service No. ED167426)
Suydam, M. N. (1979). The use of calculators in pre-college education: A state-of-the-art review. Columbus, OH: Calculator Information Center. (ERIC Document Reproduction Service No. ED171573)
Suydam, M. N. (1980). Using calculators in pre-college education: Third annual state-of-the-art review. Columbus, OH: Calculator Information Center. (ERIC Document Reproduction Service No. ED206454)
Suydam, M. N. (1981). The use of calculators in pre-college education: Fourth annual state-of-the-art review. Columbus, OH: Calculator Information Center. (ERIC Document Reproduction Service No. ED206454)
Suydam, M. N. (1982). The use of calculators in pre-college education: Fifth annual state-of-the-art review. Columbus, OH: Calculator Information Center. (ERIC Document Reproduction Service No. ED220273)

Author

Aimee J. Ellington, Department of Mathematics and Applied Mathematics, Virginia Commonwealth University, 1001 West Main Street, P.O. Box 842014, Richmond, VA 23284-2014; ajellington@vcu.edu

APPENDIX

Bibliography of Studies Included in the Meta-Analysis


Abuloum, K. (1996). Graphing calculators: Teachers perceptions, training, and attitude (Doctoral dissertation, University of Nebraska, 1996). Dissertation Abstracts International, 57, 1063A.
Aldridge, W. B. (1991). The effects of utilizing calculators and a mathematics curriculum stressing problem solving techniques. Dissertation Abstracts International, 54, 0404A. (UMI No. AAC91-18846)
Ansley, T., Spratt, K., & Forsyth, R. (1989). The effects of using calculators to reduce the computational burden on a standardized test of mathematics problem solving. Educational and Psychological Measurement, 49, 277-286.
Autin, N. P. (2001). The effects of graphing calculators on secondary students' understanding of the inverse trigonometric functions. Dissertation Abstracts International, 62, 0890A. (UMI No. AAT30-09261)
Bartos, J. J. (1986). Mathematics achievement and the use of calculators for middle elementary grade children. Dissertation Abstracts International, 47, 1227A. (UMI No. AAC86-14912)
Boudreau, J. (1999). A treatment on middle school students to increase student achievement on geometry skills (Master's thesis, California State University, 1999). Masters Abstracts International, 37, 1297A.
Bridgeman, B., Harvey, A., & Braswell, J. (1995). Effects of calculator use on scores on a test of mathematical reasoning. Journal of Educational Measurement, 32, 323-340.
Burnett, C. M. (1985). The effectiveness of using a hand-held calculator as an instructional aid in the teaching of fraction to decimal conversion to sixth-grade students (Doctoral dissertation, Boston University, 1985). Dissertation Abstracts International, 46, 2174A.
Chandler, P. A. (1992). The effect of the graphing calculator on high school students' mathematical achievement (Doctoral dissertation, University of Houston, 1992). Dissertation Abstracts International, 53, 3832A.
Colefield, R. P. (1985). The effect of the use of electronic calculators versus hand computation on achievement in computational skills and achievement in problem-solving abilities of remedial middle school students in selected business mathematics topics (Doctoral dissertation, New York University, 1985). Dissertation Abstracts International, 46, 2168A.
Devantier, A. (1992). The impact of graphing calculators on the understanding of functions and their graphs (Master's thesis, Central Michigan University, 1992). Masters Abstracts International, 31, 535A.
Drottar, J. (1999). An analysis of the effect of the graphing calculator on student performance in algebra II (Doctoral dissertation, Boston College, 1999). Dissertation Abstracts International, 60, 56A.
Edens, H. S. (1983). Effects of the use of calculators on mathematics achievement of first grade students (Doctoral dissertation, University of Virginia, 1983). Dissertation Abstracts International, 43, 3248A.
Ellerman, T. (1998). A study of calculator usage on the mathematics achievement of seventh and eighth grade students and on attitudes of students and teachers (Doctoral dissertation, Louisiana Tech University, 1998). Dissertation Abstracts International, 59, 1101A.
Fleener, M. (1988). Using the computer as an instructional tool to aid in mathematical problem solving (Doctoral dissertation, University of North Carolina, 1988). Dissertation Abstracts International, 50, 0611A.
Frick, F. A. (1988). A study of the effect of the utilization of calculators and a mathematics curriculum stressing problem-solving techniques on student learning (Doctoral dissertation, University of Connecticut, 1988). Dissertation Abstracts International, 49, 2104A.
Giamati, C. M. (1991). The effect of graphing calculator use on students' understanding of variations on a family of equations and the transformations of their graphs. Dissertation Abstracts International, 52, 0103A. (UMI No. AAC91-16100)
Glover, M. (1992). The effect of the hand-held calculator on the computation and problem solving achievement of students with learning disabilities (Doctoral dissertation, State University of New York at Buffalo, 1991). Dissertation Abstracts International, 52, 3888A.
Graham, A., & Thomas, M. (2000). Building a versatile understanding of algebraic variables with a graphic calculator. Educational Studies in Mathematics, 41, 265-282.
Graham, A., & Thomas, M. (2000). A graphic calculator approach to understanding algebraic variables. Paper presented at the TIME 2000 International Conference, Auckland, New Zealand.
Hall, M. K. (1992). Impact of the graphing calculator on the instruction of trigonometric functions in precalculus classes (Doctoral dissertation, Baylor University, 1992). Dissertation Abstracts International, 54, 0451A.
Harskamp, E., Suhre, C., & Van Streun, A. (1998). The graphics calculator in mathematics education: An experiment in the Netherlands. Hiroshima Journal of Mathematics Education, 6, 13-31.
Heath, R. D. (1987). The effects of calculators and computers on problem solving ability, computational ability, and attitude towards mathematics (Doctoral dissertation, Northern Arizona University, 1987). Dissertation Abstracts International, 48, 1102A.
Hersberger, J. (1983). The effects of a problem-solving oriented mathematics program on gifted fifth-grade students. Dissertation Abstracts International, 44, 1715A. (UMI No. AAC83-23998)
Humphries, J. H. (1986). The effectiveness of the Little Professor calculator as a supplement to kindergarten mathematics program (Doctoral dissertation, East Texas State University, 1986). Dissertation Abstracts International, 47, 2014A.
Kelly, M. G. (1984). The effect of the use of the hand-held calculator on the development of problem-solving strategies (Doctoral dissertation, Utah State University, 1984). Dissertation Abstracts International, 46, 2169A.
Langbort, C. R. (1983). An investigation of the ability of fourth grade children to solve word problems using hand-held calculators (Doctoral dissertation, University of California, 1983). Dissertation Abstracts International, 43, 2914A.
Lawrence, I., & Dorans, N. (1994). Optional use of calculators on a mathematical test: Effect on item difficulty and score equating (Report No. ETS-RR-94-40). Princeton, NJ: Educational Testing Service. (ERIC Document Reproduction Service No. ED382656)
Lesmeister, L. (1997). The effect of graphing calculators on secondary mathematics achievement (Master's thesis, University of Houston Clear Lake, 1996). Masters Abstracts International, 35, 0039A.
Lewis, J., & Hoover, H. D. (1983). The effect on pupil performance of using hand-held calculators on standardized mathematics achievement tests. Paper presented at the annual meeting of the National Council on Measurement in Education, Los Angeles, CA. (ERIC Document Reproduction Service No. ED204152)
Lim, C. (1992). Effects of using calculators on computational ability on non-college bound students, their attitudes toward mathematics and achievements on unit tests in probability and statistics (Doctoral dissertation, Temple University, 1991). Dissertation Abstracts International, 52, 2450A.
Liu, S. T. (1993). Effects of teaching calculator use and problem solving strategies on mathematics performance and attitude of fifth grade Taiwanese male and female students (Doctoral dissertation, Memphis State University, 1993). Dissertation Abstracts International, 54, 4354A.
Lloyd, B. (1991). Mathematics test performance: The effects of item type and calculator use. Applied Measurement in Education, 4, 11-22.
Long, V. M., Reys, B., & Osterlind, S. J. (1989). Using calculators on achievement tests. Mathematics Teacher, 82, 318-325.
Magee, E. F. (1985). The use of the minicalculator as an instructional tool in applying consumer mathematics concepts (Doctoral dissertation, University of Virginia, 1985). Dissertation Abstracts International, 47, 2104A.
Merckling, W. (1999). Relationship(s) between the perceptual preferences of secondary school students and their achievement in functions using a graphing calculator (Doctoral dissertation, St. John's University, 1999). Dissertation Abstracts International, 60, 371A.
Morgan, R., & Stevens, J. (1991). Experimental study of the effects of calculator use on the advanced placement calculus examinations (Report No. ETS-RR-91-5). Princeton, NJ: Educational Testing Service. (ERIC Document Reproduction Service No. ED392816)
Ottinger, T. P. (1994). Conceptual and procedural learning in first-year algebra using graphing calculators and computers (Doctoral dissertation, Georgia State University, 1993). Dissertation Abstracts International, 54, 2934A.
Pennington, R. (1998). A study to determine the effect of instruction in effective use of a calculator on test scores of middle school students (Master's thesis, Salem-Teikyo University, 1998). (ERIC Document Reproduction Service No. ED434030)
Rich, S. B. (1991). The effect of the use of graphing calculators on the learning of function concepts in precalculus mathematics (Doctoral dissertation, University of Iowa, 1990). Dissertation Abstracts International, 52, 0835A.
Riley, A. G. (1992). Effects of integrating TI Math Explorer calculators into the grades 3 through 6 mathematics curriculum on achievement and attitude (Doctoral dissertation, Texas A&M University, 1992). Dissertation Abstracts International, 54, 1661A.
Rodgers, K. (1996). The effects on achievement, retention of mathematical knowledge, and attitudes toward mathematics as a result of supplementing the traditional algebra II curriculum with graphing calculator activities (Doctoral dissertation, Southern Illinois University, 1995). Dissertation Abstracts International, 57, 0091A.
Ruthven, K. (1990). The influence of graphic calculator use on translation from graphic to symbolic forms. Educational Studies in Mathematics, 21, 431-450.
Ryan, W. (1999). The effects of using the TI-92 calculator to enhance junior high students' performance in and attitude toward geometry. (ERIC Document Reproduction Service No. ED436414)
Scott, B. A. (1994). The effect of graphing calculators in algebra II classrooms: A study comparing achievement, attitude, and confidence (Doctoral dissertation, University of North Texas, 1994). Dissertation Abstracts International, 55, 2755A.
Siskind, T. (1994). The effect of calculator use on mathematics achievement for rural high school students. Rural Educator, 16(2), 1-4.
Stacey, K., & Groves, S. (1994). Calculators in primary mathematics. Paper presented at the 72nd meeting of the National Council of Teachers of Mathematics, Indianapolis, IN. (ERIC Document Reproduction Service No. ED373963)
Starr, S. (1989). The effects of including calculators on the problem-solving instruction for low-income sixth-grade students (Master's thesis, Mercer University, 1989). (ERIC Document Reproduction Service No. ED309070)
Szetela, W., & Super, D. (1987). Calculators and instruction in problem solving in grade 7. Journal for Research in Mathematics Education, 18, 215-229.
Upshaw, J. T. (1994). The effect of the calculator-based, graph-exploration method of instruction on advanced placement calculus achievement (Doctoral dissertation, University of South Carolina, 1993). Dissertation Abstracts International, 54, 4023A.
Vazquez, J. L. (1991). The effect of the calculator on student achievement in graphing linear functions. Dissertation Abstracts International, 51, 3660A. (UMI No. AAC91-06488)
Weber, T. (1999). Graphing technology and its effect on solving inequalities. Dissertation Abstracts International, 60, 88A. (UMI No. AAC99-15749)
Whisenant, M. A. (1989). The effects of the use of calculators in algebra I classes on basic skills maintenance and algebra achievement (Doctoral dissertation, University of North Texas, 1989). Dissertation Abstracts International, 54, 0450A.
Wilkins, C. (1995). The effect of the graphing calculator on student achievement in factoring quadratic equations (Doctoral dissertation, Mississippi State University, 1995). Dissertation Abstracts International, 56, 2159A.
