


Publisher: Taylor & Francis

International Journal of Mathematical Education in Science and Technology
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/tmes20

University students’ perspectives on diagnostic testing in mathematics
Eabhnat Ní Fhloinn (a), Ciarán Mac an Bhaird (b) & Brien Nolan (a)
(a) School of Mathematical Sciences, Dublin City University, Dublin, Ireland
(b) Department of Mathematics and Statistics, National University of Ireland Maynooth, Maynooth, Ireland
Published online: 01 May 2013.


To cite this article: Eabhnat Ní Fhloinn, Ciarán Mac an Bhaird & Brien Nolan (2014) University
students’ perspectives on diagnostic testing in mathematics, International Journal of Mathematical
Education in Science and Technology, 45:1, 58-74, DOI: 10.1080/0020739X.2013.790508

To link to this article: http://dx.doi.org/10.1080/0020739X.2013.790508



University students’ perspectives on diagnostic testing in mathematics

Eabhnat Ní Fhloinn (a)*, Ciarán Mac an Bhaird (b) and Brien Nolan (a)

(a) School of Mathematical Sciences, Dublin City University, Dublin, Ireland; (b) Department of
Mathematics and Statistics, National University of Ireland Maynooth, Maynooth, Ireland

(Received 8 October 2012)

*Corresponding author. Email: eabhnat.nifhloinn@dcu.ie
© 2013 Taylor & Francis

Many universities issue mathematical diagnostic tests to incoming first-year students,
covering a range of the basic concepts with which they should be comfortable from
secondary school. As far as many lecturers are concerned, the purpose of this test is
to determine the students’ mathematical knowledge on entry. It should also provide
an early indication of which students are likely to need additional help, and hopefully
encourage such students to avail of extra support mechanisms at an early stage. However,
it is not clear that students recognize these intentions and there is a fear that students
who score poorly in the test will have their confidence further damaged in relation
to mathematics and will be reluctant to seek help. To this end, a questionnaire was
developed to explore students’ perspectives on diagnostic testing. Analysis of responses
received to the questionnaire provided an interesting insight into students’ perspectives
including the optimum time to conduct such a test, their views on the aims of diagnostic
testing, whether they feel that testing is a good idea, and their attitudes to the support
systems put in place to help those who scored poorly in the test.
Keywords: mathematics; diagnostic testing; perspectives; opinions; feedback

1. Introduction
The poor core mathematical skills of a large number of students entering higher education
continue to be a cause of concern for mathematics educators. This concern has been high-
lighted in numerous studies dating back over 20 years in Ireland alone.[1–4] Similarly, major
studies have been undertaken in the United Kingdom (UK) on the mathematical prepared-
ness of new undergraduates.[5,6] They have shown strong evidence of a ‘steady decline’
in basic mathematical skills and ‘increasing inhomogeneity in mathematical attainment
and knowledge’, and have recommended that ‘students embarking on mathematics-based
degree courses should have a diagnostic test on entry’.[6] However, they were also at pains
to point out that diagnostic testing is a means to an end:

Diagnostic testing should be seen as part of a two-stage process. Prompt and effective follow-up
is essential to deal with both individual weaknesses and those of the whole cohort.[6,p.iii]

This is echoed in the analysis of Hassler et al. [7,p.25] who ‘suggest that diagnostic tests
are evaluated in as much detail as possible, so one can advise upon remedial measures’.
Without some form of support for students identified as being at-risk of struggling with
their mathematics module, there is a distinct possibility that diagnostic testing may not be
of any significant benefit to these students:

In situations where students are simply told their test result and advised to revise certain topics
on their own, there is little evidence that this happens.[8,p.8]

On the other hand, Heck and van Gastel [9] reported the positive effects that can accrue
from interventions that are informed by the results of diagnostic testing.
Considerable time and effort are invested by academics in developing and administering
diagnostic tests. They are designed to determine students’ mathematical knowledge on
entry, to provide an early indication of which students are likely to need additional help,
and to encourage such students to avail of extra support mechanisms at an early stage.
However, it is not clear that students fully recognize these intentions. Therefore, a student
questionnaire was developed to further investigate students’ perspectives of diagnostic
testing.
The questionnaire was issued in two Irish universities, Dublin City University (DCU)
and National University of Ireland Maynooth (NUIM), which conduct their diagnostic testing
of first-year students differently: in DCU, it runs during the orientation week, where it is
directly linked to the Mathematics Learning Centre (MLC); in NUIM, it runs during
mathematics lectures in the first week of the semester. Both universities offer targeted
support systems based on the results of the diagnostic tests. These results, when taken with
the students’ Leaving Certificate mathematics results, are used to determine if a student
is at-risk or not. The Leaving Certificate (LC) is a high-stakes examination at the end of
secondary school in Ireland which is used to determine entry to higher education. The
students in question in both universities are first-year service-mathematics students, who
have a diverse profile in terms of prior achievement in mathematics. Certain programmes,
such as science, have a minimum entry requirement for mathematics (identical in the
two universities); others simply require the student to have passed LC mathematics. The
questionnaire covered a broad range of topics relating to student opinion on the diagnostic
tests and the related feedback and follow-up – see Section 3 for more details. We report here on
our analysis of the responses to the questionnaire, concentrating on the items that dealt with
the timing of the test, the aims of the test, the value of diagnostic testing and the adequacy
of post-test feedback.
The diagnostic testing regime in both institutions is briefly described in Section 2. In
Section 3, we discuss the project methodology including both the design and implementation
of the questionnaire, and how the data were analyzed. Section 4 is the results section of
the paper which contains the analysis of the student responses. In Section 5, we discuss
the main themes that emerged in answers to each question and we expand on the overlap
between answers. This section also includes our conclusions.

2. Implementation of diagnostic testing


In 2003, the UK Learning and Teaching Support Network (LTSN) MathsTEAM project
produced a detailed collection of case studies of diagnostic testing throughout the UK.[10]
This report highlighted the range of testing being undertaken, as well as the results obtained.
It identified possible barriers to test execution and gave general recommendations. The
report concluded that:

Diagnostic testing provides a positive approach to a situation. For the student it provides a
constructive method, which leads to ongoing support, and for the academic it is an indication
of ‘what is needed’ in terms of teaching and curriculum changes. As the number of institutions
implementing these tests increases it is becoming an integral part of mathematical education
for first year students.[10,p.7]

Diagnostic tests fall into two main delivery types: paper-based and computer-based.
The optimal choice is generally dependent upon internal resources within each university.
A lack of sufficiently reliable computing facilities can mean that a paper-based test is the
only sensible choice, particularly at the start of the academic year when many students
may not yet be set up on the computing system. Although a large number of diagnostic
tests are multiple-choice, enabling speedy return of marks to students, some universi-
ties, such as the University of Limerick (UL), have opted instead for open-ended ques-
tions. This provides them with greater information about the mathematical deficiencies in
question:
The test was designed for marking by hand so that one could investigate the specific errors that
students make and identify where the gaps in student knowledge lie ... As the students were
provided with rough work areas, it was possible to determine why students were making the
type of mistakes they were.[11,p.228]

The majority of diagnostic tests are given during the orientation week or in the first
weeks of the academic year [10,p.4]; in some cases, such as the Institute of Technology
Tralee [12] and the University of Amsterdam [9], the tests are repeated several weeks later
to assess students’ improvements.

2.1. Diagnostic testing in Dublin City University


The MLC in Dublin City University (DCU) opened in February 2004 and developed a
diagnostic test for incoming first-year service mathematics students from 2004–2005. The
test consists of 15 multiple-choice questions on a range of basic mathematical skills: per-
centages, fractions, numerical and algebraic manipulation, and solving linear and quadratic
equations. It is paper-based and was initially conducted during the first mathematics class
of the year. In the past few years, it has been carried out during the orientation week
(see [13] for further details). Students who receive below a certain grade in the diagnostic
test are deemed to be at-risk of failing their mathematics module and are advised to attend
refresher sessions on basic mathematics during the first two weeks of the semester and to
make frequent use of the MLC during the year.

2.2. Diagnostic testing in the National University of Ireland Maynooth


For more than 20 years, all students who register for a first-year service mathematics mod-
ule in the National University of Ireland Maynooth (NUIM) have taken a diagnostic test to
determine their ability level in mathematics. The 20-question multiple-choice test is issued
by the Department of Mathematics and Statistics and covers basic topics in mathematics
such as functions, algebraic manipulations and indices. Students who fail the diagnostic
test are deemed at-risk and are assigned extra supports. These supports have taken a variety
of different forms; since 2008, students have been assigned to an online Mathematics Proficiency
Course (MPC), delivered by the department and the Mathematics Support Centre (MSC)
using a combination of free online materials available from www.mathcentre.ac.uk,
including text, diagnostic tests and videos. A quiz was also added to the MPC so students could
test whether or not they had mastered the material. There is one follow-up workshop per
week which is run by an experienced MSC tutor.

3. Methodology
Although considerable importance is attached to diagnostic testing by academic staff, it
is unclear whether students also experience it as a valuable exercise, or indeed, whether
it might adversely affect their mathematical confidence and engagement with support.
Therefore, in 2009, the first two authors and Dr Olivia Fitzmaurice from UL developed an
anonymous student questionnaire to investigate students’ perspectives on this issue. This
consisted of 20 questions, seven of which related to the profile of the respondent, with
the remainder addressing different aspects of student opinion on diagnostic testing. The
questionnaire was piloted with a small group of students in each university, after which
responses were reviewed with a statistician and the questions adjusted accordingly in order
to ensure greater clarity.
The questionnaire was issued to first-year service mathematics students in DCU and
NUIM midway through the first semester of 2009–2010. In DCU, the paper-based ques-
tionnaire was issued during mathematics lectures, resulting in 663 responses from students
enrolled in a range of different courses, including Science, Engineering, Business, Account-
ing and Finance, Computing, and Science Education. In NUIM, the questionnaire was made
available online via Moodle to 337 science and 174 mathematical studies students. There
were 131 responses from science and 74 from mathematical studies. It was not possible
to get class time in NUIM to issue the questionnaire due to a number of previous surveys
having already taken place. The authors are aware of the limitations of issuing an online
questionnaire, and that the respondents may not form a representative sample of the students
who sat the diagnostic test. However, the results from the online questionnaire were very
similar to the paper-based feedback, so we feel that their inclusion is valid. Any significant
differences are reported upon when they arise.
The responses received to the open questions contained rich data, so in order to gain
a proper insight into the answers given, we applied General Inductive Analysis (GIA).[16]
GIA provides an approach to Grounded Theory analysis of data, and can be summarized by
the following stages of analysis: (i) preparation of raw data; (ii) close reading (identification
of ‘meaning units’ in text); (iii) creation of categories (upper level/parent categories, often
determined by research objectives, or lower level/child categories, often determined in vivo
during analysis of raw data); (iv) checking of text extracts that should appear in more than
one category; (v) revision/refinement of the category system; and (vi) reliability testing.
This summary is based on that given in [16], which may be consulted for more details.
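For readers who prefer a procedural view, the stages above can be sketched as a small script. This is purely illustrative: the authors coded responses by hand, and the category names and keyword cues below are hypothetical stand-ins for the interpretive human judgements that GIA actually involves.

```python
from collections import defaultdict

# Stage (i): preparation of raw data -- normalize free-text survey responses.
def prepare(responses):
    return [r.strip().lower() for r in responses if r.strip()]

# Stages (ii)-(iii): close reading and category creation. A keyword lookup
# stands in here for the human identification of 'meaning units'; real GIA
# coding is interpretive, not mechanical.
CATEGORIES = {
    "timing suitable": ["good time", "straight away", "yes"],
    "timing unsuitable": ["rusty", "summer", "no"],
}

# Stage (iv): a single response may be coded into more than one category.
def code_response(text):
    hits = [cat for cat, cues in CATEGORIES.items()
            if any(cue in text for cue in cues)]
    return hits or ["mixed/neutral"]

# Stages (v)-(vi), revision and reliability testing, would operate on these
# counts and on comparisons between independent coders.
def tabulate(responses):
    counts = defaultdict(int)
    for response in prepare(responses):
        for category in code_response(response):
            counts[category] += 1
    return dict(counts)

print(tabulate(["Yes, a good time to sit it",
                "No, I was rusty after the summer"]))
# {'timing suitable': 1, 'timing unsuitable': 1}
```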
GIA was applied to each question separately with an emphasis on the themes which
emerged from the responses rather than on the responses themselves. The analysis was
carried out independently by the three authors, and the results compared to verify their
authenticity. As mentioned previously, the themes and responses from both institutions
were typically consistent, and any anomalies are discussed below.
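The comparison of independently coded results lends itself to a simple consistency check. The paper does not name the statistic the authors used, so simple percent agreement is shown below as an assumed, minimal example rather than the authors' actual procedure.

```python
# Illustrative inter-coder consistency check (percent agreement); the
# measure and the sample codings are assumptions for illustration.
def percent_agreement(coder_a, coder_b):
    """Fraction of responses assigned the same category by two coders."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same set of responses")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

a = ["suitable", "unsuitable", "suitable", "mixed/neutral"]
b = ["suitable", "unsuitable", "mixed/neutral", "mixed/neutral"]
print(f"{percent_agreement(a, b):.0%}")  # 75%
```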

4. Analysis of results
In this paper, we present the results of the analysis of the five key open questions, namely
Questions 9, 12, 13, 17 and 20 (see Appendix 1). These five questions focused on students’
attitudes towards the diagnostic test and, as such, the responses were intimately connected
to the key objectives of this project. The main themes which emerged from this process
are presented and discussed below along with sample quotations of students’ responses.
Further discussion of these main themes and the overlap of categories between questions
are included in Section 5. A minority of responses, labelled as Mixed/Neutral, could not be
accurately placed within a single specific parent category, but were consistent with other
coded responses.

Table 1. Suitability of test timing: summary of the responses to the question ‘Do you think that this
is the most suitable time to sit the test? Please give a reason for your answer’.

Response            DCU   NUIM   Total
Timing suitable     378    127     505
Timing unsuitable   152     28     180
Mixed/neutral        20      8      28
Total               550    163     713

4.1. Suitability of test timing


This issue was dealt with in Question 9:

Do you think that this is the most suitable time to sit the test? Please give a reason for your
answer.

Our research objective was to find out how students felt about the timing of the test.
Note that students in DCU take the test during the orientation week, students in NUIM
take the test during the first week of the term and students in both institutions receive no
advance notification about the test from the staff. The students’ last mathematical activity
(apart from a small number of mature students and direct-entry students) would have been
the LC examination at the beginning of June. Following the method of GIA, the principal
split in the parent categories identified was between those respondents who answered ‘yes’
(Timing suitable) to the direct question and those who answered ‘no’ (Timing unsuitable).
A summary of the response categories is given in Table 1.
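The percentages quoted in Sections 4.1.1–4.1.3 follow directly from the Table 1 counts (DCU and NUIM combined); a minimal snippet reproducing the arithmetic:

```python
# Reproducing the proportions quoted below from the Table 1 totals.
table1 = {"Timing suitable": 505, "Timing unsuitable": 180, "Mixed/neutral": 28}
total = sum(table1.values())  # 713 responses

for category, count in table1.items():
    print(f"{category}: {count}/{total} = {count / total:.0%}")
# Timing suitable: 505/713 = 71%
# Timing unsuitable: 180/713 = 25%
# Mixed/neutral: 28/713 = 4%
```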

4.1.1. Timing suitable


The majority (505 or 71%) of responses were positive with 109 of these saying ‘yes’ or a
variation. Of the remaining 396 responses, we present the four main subcategories which
emerged from the analysis of the data.
The diagnostic test helps to diagnose knowledge [151 students]. The responses in this
subcategory indicated that students believe that the timing of the test makes both students
and staff aware of the students’ knowledge of some basic mathematical skills. Comments
included: ‘It lets you know how you are at maths from the beginning’, ‘It gives lecturers an
early indication of what level students are at’ and ‘Yes to let you know how much you know
or have forgotten’.
Spur to taking action/good reminder [86 students]. This theme consists of responses
which reported that the test indicated to the students that they needed to seek additional
support: ‘Yes, as people who need help can learn where to receive the help straight away’,
that the test highlighted areas for revision: ‘Yes because it means you get a good idea of
how much work you need to catch up on before you start’, that the test helped students to
remember mathematical skills: ‘Yes because it refreshes our memories of what we know’
and that the test restarted mathematical activity on the students’ part: ‘Yes. It gets you into
the mind frame of starting work and it’s on what you should already know so shouldn’t be
too challenging’.
Better to do the test as early as possible [85 students]. This subcategory contained sim-
ilar responses with slightly different reasoning. Some students noted that it was convenient
to have the test out of the way before the semester started, while others linked this to the
importance of being able to act early on the feedback from the test: ‘I think this is a good
time because if you are struggling with maths its better to deal with the problem as soon
as possible’. Included here are the ideas that (in DCU) the diagnostic test fitted in well
with the orientation week: ‘Yes there is good attendance during orientation week’, and, in
both institutions, that the test was held at a time that increased the likelihood of students
attending.
Out of practice; away from mathematics over the summer [50 students]. The main
idea present in these responses is that the unannounced test, given when students had not
studied mathematics for at least three months, gives a fairer assessment of students’ core
knowledge – including that which they had retained from the LC cycle, for example, ‘No
one expected it and gives good feedback of level of maths without revision/study etc’ and
‘Everyone was at the same stage and a bit rusty so it was fair’.

4.1.2. Timing unsuitable


A total of 180 (25%) responses were negative, with 17 of these writing only ‘no’ or a
variation. Of the remaining 163 responses, we present the three main subcategories which
emerged.
Out of practice; away from mathematics over the summer [89 students]. The diagnostic
test takes place at the beginning of the academic year. Students reported having forgotten
a lot of mathematics, being ‘rusty’, that they were in ‘summer brain mode’ and ‘needed
practice’ before doing a mathematics test. Comments included: ‘No, it was straight after
our summer holidays! I had forgotten how to do Maths!!’ and ‘No because many students
haven’t been practicing Maths over the summer and may just have let the basics slip their
mind’.
Specific issues with test timing [71 students]. This subcategory contains responses
where students commented on a range of different specific practical issues with the timing
of the test. Some DCU students indicated that the test did not fit in with the other activities
of the orientation week, they had a full week, it was ‘too serious’ an activity to be held
during the week, and that ‘attendance was low due to social events’. Many students also
commented on the unexpected nature of the test, and some of these expanded on more
practical problems: ‘No, we weren’t prepared at all! Had no calculator or anything to help
me along’. Finally, students also suggested that an alternative date for the test would be
better. Most suggested a later date when they would be settled into university life and in
working mode: ‘Not really because your head isn’t really thinking in maths, I think the 2nd
week or so might be better because you would be in a better frame of mind perhaps’. Some
students put forward arguments for an earlier test: ‘It might be better to do it before the
course begins because if you find it difficult you then have time to revise some maths before
starting course’.
The diagnostic test was unsettling [12 students]. These responses represent the kinds
of student opinions that are at the heart of criticisms of diagnostic testing. The phrases that
emerged involve words like ‘unsettling’, ‘off-putting’ and ‘uncomfortable’, for example,
‘you’re not really expecting a test and sort of panic when you get it’ and ‘without previous
knowledge of the test ... I found it annoying and struggled where I normally wouldn’t’.

4.1.3. Discussion of timing suitability


As noted above, the majority of students (505 of 713; 71%) who replied to this question
indicated that they considered the timing of the diagnostic test to be suitable. This pro-
vides solid support in the form of student opinion for our current practice. However, it is
concerning that a significant minority (180 of 713 respondents; 25%) do not consider the
timing suitable. Most of these students relate the unsuitability of the timing either to
the fact that they have been away from mathematics for a considerable period of time or to
the fact that the test is unexpected – although, interestingly, both of these were also reasons
given by students who considered the timing suitable. However, as administrators of the
diagnostic test, we need to better communicate to all students the aims of the test. With a
better understanding of these aims, we anticipate that students will be less inclined to be
concerned about the timing of the test.
Notably, only a small number of comments (1.6%) indicated the kind of negative
emotional impact that diagnostic testing is sometimes feared to have on students.

4.2. Aim of diagnostic testing in mathematics


This section discusses Question 12 from the survey:

‘The aim of diagnostic testing is to provide both staff and students with an immediate picture
of which important mathematical concepts are well-known and/or unknown to the student.’ Do
you think that your diagnostic test achieved this, and why?

The research objective was to gauge if students recognized the underlying purpose of
diagnostic testing. When the tests are issued in both institutions, students are informed of
the purpose of the test. When GIA was applied to the 662 responses, there was a clear
divide between the positive (Achieved aim) and negative (Did not achieve aim) answers
which is highlighted in Table 2. While some of the responses may have been influenced by
the phrasing of the question, many cannot be seen as having been directly prompted by the
wording due to the additional themes introduced by students in their responses.

Table 2. Aim of diagnostic testing in mathematics: summary of the responses to the question: ‘The
aim of diagnostic testing is to provide both staff and students with an immediate picture of which
important mathematical concepts are well-known and/or unknown to the student.’ Do you think that
your diagnostic test achieved this, and why?

Response              DCU   NUIM   Total
Achieved aim          354    119     473
Did not achieve aim   122     30     152
Mixed/neutral          25     12      37
Total                 501    161     662

4.2.1. The diagnostic test achieved its aim


The majority of responses (473 or 71.5%) were coded as positive in the GIA
analysis. The fact that the statement of the aim of diagnostic testing was included with the
question may explain why 126 responses only stated ‘yes’ or a variation. In the remaining
347 responses several themes emerged.
A good indicator of current knowledge for staff and/or students [151 students]. This
subcategory was the most common response from students. All responses coded here
mention ‘level of understanding’, ‘standards’ and ‘skills’ without referring to specific
areas of mathematics, for example, ‘I think it did. It gives the tutors and lecturers an idea
of the classes overall maths skills’, ‘Yes I think the test achieved this as I got a clear, honest
understanding of my knowledge of the maths I know and don’t know’ and ‘Yes it exposed
displayed weaknesses and strengths’.
Highlights areas for improvement [78 students]. Although similar to the previous theme,
the responses here included specific comments on how both staff and students can use the
results to focus on certain areas for improvement in teaching and learning:

Yes, because everyone is at different levels of mathematical knowledge then others and because
of this test, the lecturer will take it into account of the varied groups of people within the lectures
and teach to try and suit all.

Yes as it informs the staff of the overall weak areas of the students and what area they need to
focus on in their lectures and it helps the students realise that they may need to do some work
and refresh their memories if they want to know the basic information if they want to do well
in their maths course!

Good indicator because of the test structure [75 students]. These students all com-
mented that the test achieved its aims either because of the timing of the test: ‘Yes, as no
one has any time for preparation so it acts as a sort of equalizer’, ‘Yes, as it was a surprise
we had not prepared for it, so it was clear to see problem areas’ or because of the type of
questions that were asked:

I think it did as it questioned us on the basic concepts and not hard complicated equations. I
think it’s more important that we understand the basics when starting mathematics in college.

4.2.2. The diagnostic test did not achieve its aim


A significant minority of respondents (152 or 23%) gave a negative response and 26 of these
stated ‘no’ or a variation but did not elaborate on their answer. Several themes emerged
from the analysis of the remaining 126 comments.
Problems with the test structure [55 students]. Students commented on logistical prob-
lems such as the test layout: ‘I don’t because some question were laid out differently to the
way I had learned in secondary school so I didn’t understand a lot of the questions asked’
and perceived problems with the follow-up test support: ‘No I don’t because I signed up
for the proficiency course as a result and it doesn’t seem to have anything to do with the
test’. Students also commented on the fact that the material covered in the diagnostic test
was different from that in lectures: ‘No, as the maths done in this test was not the same as
the maths I’m studying which is much harder’.
Out of practice; away from mathematics over the summer [39 students]. These students
all responded that the test did not give a fair reflection of their abilities as they were rusty
after the summer/unprepared, etc.: ‘No because I didn’t get answers to questions I know I
knew how to do before but because of the summer break had forgotten how to do them’ and
‘No because with it just being sprung on you, you were under pressure and unprepared’.
4.2.3. Discussion of the aim of diagnostic testing


Again, the majority of students felt that the diagnostic test achieved its aim; however, a
significant minority (23%) of respondents felt that it did not, believing the test unfair
because they were unprepared and that it did not give a fair reflection of their abilities.
There is overlap between the justifications given for both the positive and the
abilities. There is overlap between the justifications given for both the positive and the
negative responses, in particular the unannounced nature of the test. Students are made
aware of the importance of mastering the basic material tested upon, and how this material
is intimately connected to lecture material. However, the fact that many students still do not
seem to recognize this is an interesting area which requires further investigation.

4.3. The idea of diagnostic testing in mathematics


This issue was dealt with in Question 13 of the survey:

Do you think that diagnostic testing in mathematics is a good or bad idea? Please give reasons
for your answer.

The research objectives are transparent: we wish to find out what students think about
diagnostic testing from the perspective of whether or not they consider it to be a good idea.
When GIA was applied to the 706 responses, again there was a very clear split between the
positive (Good idea) and negative (Bad idea) responses which is evident from Table 3.

Table 3. Idea of diagnostic testing in mathematics: summary of responses to the question: ‘Do you
think that diagnostic testing in mathematics is a good or bad idea? Please give reasons for your
answer.’

Response        DCU   NUIM   Total
Good idea       490    153     643
Bad idea         28      2      30
Mixed/neutral    23     10      33
Total           541    165     706

4.3.1. Good idea


A sizeable majority of the students reported that they considered diagnostic testing in
mathematics to be a good idea (643 students; 91%). 157 students simply replied ‘yes’ or a
variation. A number of clearly distinct themes emerged from the analysis of the remaining
486 responses.
A useful gauge of current knowledge [255 students]. This category comprises the
commonest reason that students gave in support of diagnostic testing. It includes responses
that saw the gauging of ‘level of ability’ or ‘where you are at’ as a positive end in itself,
without elaborating on why it is useful to have this information: ‘Good – gives you an
idea of your maths level’. It also covers responses which identify the test’s role in gauging
current knowledge for students and for academic staff: ‘Yes, because this way, you can get
insight on the students’ mathematical background without being too invasive’.
Identifies problem areas [101 students]. In this category students elaborate on their
answers and give reasons why it was useful to have a way of gauging their level of mathe-
matics. The main themes that emerge are that the test indicated weaknesses, problem areas
and areas where revision is needed: ‘Yes, as it lets you know what concepts are hardest for
students’ and ‘Yes to provide the student with an accurate assessment of what areas need
more focusing before tackling University lecture material’.
Positive structural springboard [59 students]. These responses addressed a key aspect
of diagnostic testing and almost unanimously state that the test can indicate the need to get
additional help in mathematics: ‘Good – it can push you to try get help’ and ‘Good idea as it
proves to the person that they need help’. Several students commented on diagnostic testing
as being positive in terms of potentially providing a platform for ongoing improvement, for
example, ‘Good [idea], people should be tested constantly to keep them working’.
The diagnostic test as a form of help [21 students]. This category includes responses
from a small but not insignificant number of students that indicated that the test itself helped
them to recall mathematical knowledge and skills, or helped them in some other unspecified
way: ‘Good, it reminds you of the basics that you had forgotten’.

4.3.2. Bad idea


Only a small number of students (N = 30; 4% of respondents) reported that they considered
diagnostic testing in mathematics to be a bad idea. Of these, five students said just ‘no’ or a
variation, and while the number of the remaining responses was low (N = 25), we applied
GIA as the reasons given were frequently strongly expressed and reinforced concerns about
diagnostic testing.
Negative structural problems [13 students]. This category includes responses criticizing
the timing of the test: ‘Bad because you were not prepared’ and the various test rules
including negative marking: ‘Negative marking put me off attempting questions I might
have known’ and the prohibition of calculators: ‘Bad, no calculator, students are used to
using them’.
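Since negative marking recurs in these complaints, a brief sketch of how such a scheme scores a test may help. The penalty of a quarter mark per wrong answer is an assumption for illustration only; the paper does not state either institution's actual marking scheme.

```python
# Sketch of a negative-marking scheme (illustrative; the penalty value is
# an assumption -- the paper does not give either institution's scheme).
def negative_marking_score(correct, wrong, penalty=0.25):
    """Unanswered questions score zero; wrong answers attract a penalty."""
    return correct - penalty * wrong

# A student who attempts all 20 questions and gets 12 right scores less
# than 12, which is what deters some students from guessing:
print(negative_marking_score(correct=12, wrong=8))  # 10.0
```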
Negative feelings [nine students]. This category refers to responses that report negative
feelings (worry, pressure, reinforcing negative beliefs about one’s mathematical ability,
affecting confidence) as the result of diagnostic testing: ‘It would have just showed me
what I already know I didn’t know’ and ‘I got a bad result and my confidence in maths has
dropped’. At the extreme end, one student stated that ‘I wanted to leave the college’ (Note 1).

4.3.3. Discussion on the idea of diagnostic testing


As noted above, a large majority of students indicated that they consider diagnostic testing
to be a good idea. We note that some of the reasons for this may have been prompted
by the previous question on the survey, which gave an explicit articulation of the aims of
diagnostic testing. Nonetheless, we take the high rate of positive responses to Question
13 as providing evidence of a broad level of support for the use of diagnostic testing.
Furthermore, reasons given for considering it to be a good idea went beyond those that
may have been directly prompted by the wording of Question 12. Most significant here –
given our perspective that diagnostic testing is an important part of mathematical support
provided by both institutions – is the idea that the test allows students to identify a need
for accessing support and/or revising particular areas of mathematics. Interestingly, some
students also found that the test itself provides the means for such revision.
While the number of negative responses is low, they deserve particular attention. These
responses frequently come from students expressing explicitly negative feelings (in terms
of ability, negative pressure, confidence) in relation to mathematics. The responses here
are similar to those which characterize ‘Math Anxiety’.[17] These responses highlight the
need for administrators of diagnostic tests to identify students in whom such feelings are
engendered by the test, and to take steps to support them in overcoming their anxiety.

4.4. Feedback on test performance


This issue was dealt with in Question 17 of the survey:

Were you given sufficient feedback on your performance in the test?

For this question, our research objective was to find out whether students felt that the
feedback mechanisms currently in place for diagnostic testing were adequate. Test results
and solutions are communicated to students in both institutions within a short space of time,
so that students are aware of which questions they answered incorrectly and the correct
answer to that question (although they do not receive feedback as to the specific errors they
made within each question). They are also sent information about specific interventions
such as revision classes or online material with which they are advised to engage. However,
we were interested to see whether students were satisfied with this level of feedback. Again,
there was an obvious split between the positive and negative responses but, although the
majority (55.3%) of the 609 responses to this question were positive, this is by far the
lowest positive response of any question asked in the survey. A summary of the response
categories is given in Table 4.

Table 4. Feedback on performance: summary of the responses to the question: ‘Were you given
sufficient feedback on your performance in the test?’

Response              DCU   NUIM   Total
Enough feedback       249     91     340
Not enough feedback   168     53     221
Mixed/neutral          39      3      42
Total                 456    148     604

4.4.1. Enough feedback [340 students]


As might be expected, given the phrasing of the question, 83.5% (284) of those who regarded
the feedback as sufficient gave no further comment. Two main themes emerged from the
remaining positive answers.
Results alone sufficient [31 students]. Although students were being asked specifically
about the feedback they received, a number of students reported that their results were all
the feedback they needed: ‘I just got the percentage but that was enough’ and ‘I received
my results which is really all I wanted to know. I was able to make my own decision after
that on whether or not I needed extra help’.
Results and advice/solutions given [16 students]. This category included responses that
specifically mentioned that they were given solutions to the test along with their results:
‘Yes, we got our results and were given the answers to see where we went wrong’. A small
number of students mentioned receiving their results and advice on what to do next:

I was given my mark and given information about the maths support that is available in the
college. We were advised if we got less than 20 marks to go to the maths support centre.

4.4.2. Not enough feedback [221 students]


221 students (36.6% of the respondents) felt that they did not receive sufficient feedback
on their diagnostic test, and of these, 61.9% (137) said ‘no’ or a variation. Analysis of the
remaining responses yielded two main categories.
Only results given [35 students]. Some students reported that they were only given a
grade – ‘No actual feedback was given, just results’ – which they did not consider to be
feedback.
No idea where errors were made [32 students]. There was a strong feeling among some
students that they did not know where they had gone wrong in the test, so that even though
they received a result, they did not know where they had made any errors: ‘No I just got a
grade. I did not find out which answers were right and I would have liked to have known
this’ and ‘No only a score, we were not able to analyse the paper after’.

4.4.3. Discussion on performance feedback


The majority of responses to this question were coded as positive, although it gave the lowest
proportion of positive responses among the questions considered in this paper. Students
stated that they got their results, results and advice, or results and solutions and that this was
all they required to make their own decision. In the case of students who achieved higher
than the threshold pass mark for the diagnostic test, it would be expected that a result would
suffice for them, and this may explain the number of students commenting in this way.
There was a high percentage (36.6%) of negative responses, though the reasons given
were quite similar to those for the positive responses. Students commented that they
received only their results and no feedback; in other words, they did not consider results to
be feedback, whereas positive respondents explicitly stated that the result alone was all the
feedback they required. Students also commented that they received no indication of where they had
made errors. In both institutions, the results and solutions are returned to the students very
soon after they take the test, and advice is given on the supports that they should avail of. The
fact that such a high percentage reported negatively on the feedback provided indicates that
this message was not getting through. Some steps taken to address this issue are discussed
in Section 5.

4.5. Final comments on diagnostic testing


Finally, students were invited to give any further comments they had on diagnostic testing.
The research objective of this question was to find out if there were any outstanding issues
that students wanted to raise. We expected that students would also use this opportunity to
emphasize the most important issues raised for them by the survey, and of the 67 responses,
a large number simply reiterated sentiments expressed earlier. Using GIA the responses fell
into the three broad categories outlined in Table 5.
Table 5. Final comments: summary of responses to the question ‘Any other comments on diagnostic
testing?’

Response                             DCU   NUIM   Total
Positive about diagnostic testing     11     14      25
Negative about diagnostic testing     13     11      24
Suggestions                           11      5      16
Total                                 35     32      67

4.5.1. Positive about diagnostic testing [25 students]


Thirty-seven point three per cent (25) of the comments made were positive about diagnostic
testing. These included general positive comments such as ‘Worthwhile’ and ‘It is a good
idea’. Other responses made specific reference to the test structure: ‘Good not to tell anyone
until they are in the lecture’, while more mentioned the impact of the test: ‘Makes students
aware of supports’.

4.5.2. Negative about diagnostic testing [24 students]


Thirty-five point eight per cent (24) of the comments made were negative about diagnostic
testing. A small number were general negative comments such as ‘It was a waste of time’,
but most were more specific in nature. Students disagreed with the use of negative marking:
‘Negative marking is pointless’, objected to the lack of warning about the test: ‘I didn’t
think that I reflected all the maths that I did know because I had forgotten how to do some
of the maths’ and were unhappy with the test structure which did not allow calculators: ‘I
didn’t have a calculator and I needed one’. Only two respondents made reference to the
lack of feedback or follow-up support: ‘It is a good idea but just needs more effort put in
by the staff afterwards’.

4.5.3. Suggested improvements [16 students]


A number of students took the opportunity to suggest improvements to the diagnostic
testing process currently undertaken, with several suggesting additional tests: ‘Good idea,
perhaps could be done at various stages throughout the year, or optional tests for those
who wish to gauge progress’ and others suggesting a more difficult test: ‘Should be more
difficult as the modules are much more difficult than the exam’.

5. Discussion and conclusions


In this paper, we have reported on students’ perspectives on diagnostic testing by applying
GIA to five open questions which investigated their views on the best time to take the test,
the aims of diagnostic testing, if diagnostic testing is a good idea, and if they feel that they
got sufficient feedback and support after the test. In general, the vast majority of the themes
which emerged were very positive and this is consistent with the feedback received on the
remainder of the questionnaire – see [18] for further details.
It is clear from Section 4 that the separate application of GIA to each individual
question gave rise to a considerable degree of overlap in the themes which emerged from
the responses to the different questions. This overlap was also evident from the reasons
given for positive and negative responses to the questions. For example, it is interesting to
note that similar reasons (being away from mathematics for a considerable period of time or
the test was unexpected) are given by different students when indicating both the suitability
and the unsuitability of the timing of the test. These overlaps indicate the suitability of the
GIA approach to such data: one can consider the responses to the survey as a whole as well
as to individual questions.[16]
Overall, the authors have been surprised by the high level of positive responses received
in this questionnaire. It is encouraging that the majority of students thought that diagnostic
testing was a good idea, that they recognized the academic staff ’s motivation for carrying
out diagnostic testing and that they considered the timing of the diagnostic test to be suitable.
This provides solid support in the form of student opinion for our current practice. Indeed,
some respondents reported that the diagnostic test acted as a spur to seeking support, while
others felt that the diagnostic test was an end in itself, helping them to recall mathematical
knowledge and skills.
Prior to the analysis of the data, we were expecting a significant number of comments
reporting that diagnostic testing induces negative feelings in students, an outcome which lies
at the heart of criticisms of diagnostic testing.[10] However, we unexpectedly found very low
levels of reporting of such feelings. This provides evidence that such negative outcomes may
not be as widespread as previously thought. As noted above, the corresponding responses
are indicative of Math Anxiety [17] and highlight the need for focused intervention with
relevant students. We note that the relevant students are few enough in number such that,
when identified, the provision of targeted support should not present a significant drain on
support resources.
The authors were also surprised by the high percentage of responses which indicated
that students were not happy with the level of feedback and subsequent support that they
received after the test. In both institutions, students are issued with results, solutions and
advice on appropriate supports and who should access them shortly after the test. They are
also made aware of the impact that regular engagement with these supports can have on
their grades and progression.[15] One possible reason for this high negative response rate
may be that, as raised in a separate answer, students do not see the connection between
the material covered by both the test and subsequent supports, and their lecture material.
One step taken to try to address this issue of engagement with subsequent supports has
been the introduction of a student monitor in NUIM. This monitor is assigned to ensure
that students who fail the test engage appropriately with the follow-up support (MPC) and
research has shown that engagement levels have increased significantly.[14]
We conclude by highlighting the principal overarching themes that we have identified
in the students’ responses.

5.1. Positive reception of diagnostic testing


We have commented on this in different ways above, but feel that it is worth highlighting
as one of the principal conclusions of this research: a significant majority of students agree
that diagnostic testing is a good idea, are aware of its aims and are supportive of crucial
aspects of the implementation of the test in DCU and in NUIM. Academic staff who pursue
diagnostic testing are clearly supportive of the activity as evidenced by their continued use
of such testing: this is the case for the present authors. The results above provide evidence
that, at least in our institutions, support for diagnostic testing is also present to a significant
degree among the relevant students. We feel that this is an important point that has a role to
play in the debate about the merits and demerits of diagnostic testing in mathematics.[10]

5.2. The need for improved communication


The survey has indicated that there is a need to improve our communication with the
students in relation to the diagnostic test and follow-up support in a few different areas. At
the ‘input’ end, improved communication of the aims of the test would counteract negative
opinions on the timing of the test and some students’ perceptions of their own under-
preparedness. At the ‘output’ end, improved communication has a role to play in increasing
the students’ awareness of and participation in follow-up support that is prompted by the
test. The model in NUIM, discussed above, provides one way to approach this. Outside this,
it is worth mentioning that a significant amount of effort has already gone into the provision
of appropriate feedback and advice. We suggest that the difficulties encountered are part of
a wider picture in relation to providing feedback that will influence student behaviour: as
noted by Mutch, the provision of feedback is a complex task:

[it] is a social practice that demands attention ... to the conditions of production, distribution
and reception.[19,p.25]

That said, in our future work on diagnostic testing, we intend to consider the area of
communication and feedback in greater detail.

Note
1. This was a male student on the ‘Open Opportunities’ route to an engineering degree. He had
not studied higher level mathematics and was advised to seek support. (This information was
gleaned from the profiling questions on the survey.)

References
[1] Cork Regional Technical College, Department of Mathematics and Computing. Report on
the basic mathematical skills test of 1st year students in Cork R.T.C. in 1984. IMS Newslett.
1985;14:33–43.
[2] Hurley D, Stynes M. Basic mathematical skills of U.C.C. students. IMTA Bull. 1986;17:
68–79.
[3] Brennan M. The 2nd–3rd level mathematics interface. In: Cawley S, editor. A mathematics
review. Dublin: Blackhall Publishing; 1997. p. 3–15.
[4] O’Donoghue J. An intervention to assist at risk students in service mathematics courses at the
University of Limerick. Limerick: University of Limerick; 1999.
[5] London Mathematical Society (LMS). Tackling the mathematics problem. Southend-on-Sea:
LMS; 1995.
[6] Savage M, Kitchen A, Sutherland R, Porkess R. In: Hawkes T, editor. Measuring the mathematics
problem. London: Engineering Council; 2000.
[7] Hassler B, Atkinson R, Quinney D, Barry M. The experience of fresher students in mathematics
diagnostic testing. MSOR Connect. 2004;4:17–23.
[8] Lawson D, Croft T, Halpin M. Good practice in the provision of mathematics support centres.
2003. Available from: http://www.sigma-cetl.ac.uk/index.php?section=22 [Accessed 17 September 2012].
[9] Heck A, Van Gastel L. Mathematics on the threshold. Int J Math Educ Sci Technol.
2006;37:925–945.
[10] LTSN MathsTEAM Project. Diagnostic testing for mathematics. 2003. Available from:
http://www.mathstore.ac.uk/mathsteam/packs/diagnostic_test.pdf [Accessed 17 September 2012].
[11] Gill O, O’Donoghue J. The mathematical deficiencies of students entering third level: An
item by item analysis of student diagnostic tests. In: Close S, Corcoran D, Dooley T, editors.
Proceedings of Second National Conference on Research in Mathematics Education (MEI2).
Dublin: St. Patrick’s College; 2007.
[12] Cleary J. Diagnostic testing – An evaluation 1998–2007. In: Close S, Corcoran D, Dooley T,
editors. Proceedings of Second National Conference on Research in Mathematics Education
(MEI2). Dublin: St. Patrick’s College; 2007.
[13] Ní Fhloinn E. Diagnostic testing in DCU – a five-year review. In: Proceedings of the Third
National Conference on Research in Mathematics Education (MEI3). Dublin: St. Patrick’s
College; 2009. p. 367–378.
[14] Burke G, Mac an Bhaird C, O’Shea A. The impact of a monitoring scheme on engagement
in an online course. Teaching Mathematics and Its Applications. 2012 [Accessed 2012 August 21].
Available from: http://teamat.oxfordjournals.org/content/early/2012/05/25/teamat.hrs010.short?rss=1
[15] Mac an Bhaird C, Morgan T, O’Shea A. The impact of the mathematics support centre on
the grades of first year students at the National University of Ireland Maynooth. Teaching
Mathematics and Its Applications. 2009;28(3):117–122.
[16] Thomas D. A general inductive approach for analyzing qualitative evaluation data. Am J Eval.
2006;27(2):237–246.
[17] Wigfield A, Meece JL. Math anxiety in elementary and secondary school students. J Educ
Psychol. 1988;80(2):210–216.
[18] Ní Fhloinn E, Mac an Bhaird C, Nolan B. Appropriate settings and supports for third-level
diagnostic testing in mathematics. In: Proceedings of the SMEC Conference. Dublin: DCU;
2012. p. 10–15.
[19] Mutch A. Exploring the practice of feedback to students. Active Learn Higher Educ.
2003;4(1):24–38.

Appendix 1

The Diagnostic Test Questionnaire


(1) Course Name/Code:
(2) Mathematics Module Name/Code:
(3) Gender: Male Female
(4) Leaving Certificate Mathematics Level: (Please circle)
Higher Ordinary Other
(5) Leaving Certificate Mathematics Grade (if applicable): (Please circle)
A1 A2 B1 B2 B3 C1 C2 C3 D1 D2 D3 Other
(6) If you started off doing Leaving Certificate Higher Level Mathematics, but changed to
Ordinary Level, roughly when did that happen? (Please circle)
Before Christmas in 5th year Before the end of 5th year
Before Christmas in 6th year After the Mocks in 6th year
(7) Are you registered as a mature student? Yes No
(8) When did you complete the diagnostic test? (Please circle)
During Orientation Week During the first week of university
Other Don’t remember
(9) Do you think that this is the most suitable time to sit the test? Please give a reason for your
answer.
(10) Were you told about the diagnostic test beforehand?
Yes, by a staff member Yes, by a student No
(11) Was the room where you took the test suitable?
(12) “The aim of diagnostic testing is to provide both staff and students with an immediate picture
of which important mathematical concepts are well-known and/or unknown to the student.”
Do you think that your diagnostic test achieved this, and why?
(13) Do you think that diagnostic testing in mathematics is a good or bad idea? Please give reasons
for your answer.
(14) Were you advised to avail of additional supports because of your results in the diagnostic
test? Yes No
(15) If so, did you avail of these supports and which ones?
(16) Please comment on the support available to students after the diagnostic test.
(17) Were you given sufficient feedback on your performance in the test?
(18) Were you given sufficient time to complete the test? Yes No
(19) How easy/difficult did you find your test?
Very difficult Difficult Neutral Easy Very easy
(20) Any other comments on diagnostic testing.