To cite this article: Eabhnat Ní Fhloinn, Ciarán Mac an Bhaird & Brien Nolan (2014) University
students’ perspectives on diagnostic testing in mathematics, International Journal of Mathematical
Education in Science and Technology, 45:1, 58-74, DOI: 10.1080/0020739X.2013.790508
Downloaded by [Chulalongkorn University] at 06:30 25 December 2014
International Journal of Mathematical Education in Science and Technology, 2014
Vol. 45, No. 1, 58–74, http://dx.doi.org/10.1080/0020739X.2013.790508
1. Introduction
The poor core mathematical skills of a large number of students entering higher education
continue to be a cause of concern for mathematics educators. This concern has been high-
lighted in numerous studies dating back over 20 years in Ireland alone.[1–4] Similarly, major
studies have been undertaken in the United Kingdom (UK) on the mathematical prepared-
ness of new undergraduates.[5,6] They have shown strong evidence of a ‘steady decline’
in basic mathematical skills and ‘increasing inhomogeneity in mathematical attainment
and knowledge’, and have recommended that ‘students embarking on mathematics-based
degree courses should have a diagnostic test on entry’.[6] However, they were also at pains
to point out that diagnostic testing is a means to an end:
Diagnostic testing should be seen as part of a two-stage process. Prompt and effective follow-up
is essential to deal with both individual weaknesses and those of the whole cohort.[6,p.iii]
This is echoed in the analysis of Hassler et al. [7,p.25] who ‘suggest that diagnostic tests
are evaluated in as much detail as possible, so one can advise upon remedial measures’.
Without some form of support for students identified as being at-risk of struggling with
their mathematics module, there is a distinct possibility that diagnostic testing may not be
effective:
∗Corresponding author. Email: eabhnat.nifhloinn@dcu.ie
© 2013 Taylor & Francis
In situations where students are simply told their test result and advised to revise certain topics
on their own, there is little evidence that this happens.[8,p.8]
On the other hand, Heck and van Gastel [9] reported the positive effects that can accrue
from interventions that are informed by the results of diagnostic testing.
Considerable time and effort are invested by academics in developing and administering
diagnostic tests. They are designed to determine students’ mathematical knowledge on
entry, to provide an early indication of which students are likely to need additional help,
and to encourage such students to avail of extra support mechanisms at an early stage.
However, it is not clear that students fully recognize these intentions. Therefore, a student
questionnaire was developed to further investigate students’ perspectives on diagnostic
testing.
The questionnaire was issued in two Irish universities, Dublin City University (DCU)
and National University of Ireland Maynooth (NUIM), which conduct their diagnostic testing
of first-year students differently: in DCU, it runs during the orientation week, where it is
directly linked to the Mathematics Learning Centre (MLC); in NUIM, it runs during
mathematics lectures in the first week of the semester. Both universities offer targeted
support systems based on the results of the diagnostic tests. These results, when taken with
the students’ Leaving Certificate mathematics results, are used to determine if a student
is at-risk or not. The Leaving Certificate (LC) is a high-stakes examination at the end of
secondary school in Ireland which is used to determine entry to higher education. The
students in question in both universities are first-year service-mathematics students, who
have a diverse profile in terms of prior achievement in mathematics. Certain programmes,
such as science, have a minimum entry requirement for mathematics (identical in the
two universities); others simply require the student to have passed LC mathematics. The
questionnaire covered a broad range of topics relating to student opinion on the diagnostic
tests and the related feedback follow-up – see Section 3 for more details. We report here on
our analysis of the responses to the questionnaire, concentrating on the items that dealt with
the timing of the test, the aims of the test, the value of diagnostic testing and the adequacy
of post-test feedback.
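The at-risk classification described above can be sketched as a simple decision rule. The sketch below is illustrative only: the cut-off values, parameter names, and the use of a simple disjunction are all assumptions, since the paper does not specify how the diagnostic score and LC result are actually combined in either university.

```python
def is_at_risk(diagnostic_score, lc_points,
               diagnostic_cutoff=20, lc_cutoff=45):
    """Flag a student as at-risk by combining the diagnostic-test
    score with Leaving Certificate (LC) mathematics points.

    All thresholds are hypothetical illustrations; the universities'
    actual cut-offs and combination rule are not given in the text.
    """
    weak_diagnostic = diagnostic_score < diagnostic_cutoff
    weak_lc = lc_points < lc_cutoff
    # Flag the student if either indicator suggests weak core skills.
    return weak_diagnostic or weak_lc

# A student scoring 15 on the diagnostic test is flagged regardless
# of a strong LC result; a student above both cut-offs is not.
print(is_at_risk(15, 60))  # True
print(is_at_risk(30, 60))  # False
```

A rule of this shape would deliberately err on the side of flagging students, since the purpose of the test is to direct students towards support early rather than to exclude anyone from it.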
The diagnostic testing regime in both institutions is briefly described in Section 2. In
Section 3, we discuss the project methodology including both the design and implementation
of the questionnaire, and how the data were analyzed. Section 4 presents our analysis of
the student responses. In Section 5, we discuss
the main themes that emerged in answers to each question and we expand on the overlap
between answers. This section also includes our conclusions.
Diagnostic testing provides a positive approach to a situation. For the student it provides a
constructive method, which leads to ongoing support, and for the academic it is an indication
of ‘what is needed’ in terms of teaching and curriculum changes. As the number of institutions
implementing these tests increases it is becoming an integral part of mathematical education
for first year students.[10,p.7]
Diagnostic tests fall into two main delivery types: paper-based and computer-based.
The optimal choice is generally dependent upon internal resources within each university.
A lack of sufficiently reliable computing facilities can mean that a paper-based test is the
only sensible choice, particularly at the start of the academic year when many students
may not yet be set up on the computing system. Although a large number of diagnostic
tests are multiple-choice, enabling speedy return of marks to students, some universi-
ties, such as the University of Limerick (UL), have opted instead for open-ended ques-
tions. This provides them with greater information about the mathematical deficiencies in
question:
The test was designed for marking by hand so that one could investigate the specific errors that
students make and identify where the gaps in student knowledge lie . . . As the students were
provided with rough work areas, it was possible to determine why students were making the
type of mistakes they were.[11,p.228]
The majority of diagnostic tests are given during the orientation week or in the first
weeks of the academic year [10,p.4]; in some cases, such as the Institute of Technology,
Tralee [12] and the University of Amsterdam [9], the tests are repeated several weeks later
to assess students’ improvements.
test whether or not they have mastered the material. There is one follow-up workshop per
week which is run by an experienced MSC tutor.
3. Methodology
Although considerable importance is attached to diagnostic testing by academic staff, it
is unclear whether students also experience it as a valuable exercise, or indeed, whether
it might adversely affect their mathematical confidence and engagement with support.
Therefore, in 2009, the first two authors and Dr Olivia Fitzmaurice from UL developed an
anonymous student questionnaire to investigate students’ perspectives on this issue. This
consisted of 20 questions, seven of which related to the profile of the respondent, with
the remainder addressing different aspects of student opinion on diagnostic testing. The
questionnaire was piloted with a small group of students in each university, after which
responses were reviewed with a statistician and the questions adjusted accordingly in order
to ensure greater clarity.
The questionnaire was issued to first-year service mathematics students in DCU and
NUIM midway through the first semester of 2009–2010. In DCU, the paper-based ques-
tionnaire was issued during mathematics lectures, resulting in 663 responses from students
enrolled in a range of different courses, including Science, Engineering, Business, Account-
ing and Finance, Computing, and Science Education. In NUIM, the questionnaire was made
available online via Moodle to 337 science and 174 mathematical studies students. There
were 131 responses from science and 74 from mathematical studies. It was not possible
to get class time in NUIM to issue the questionnaire because a number of other surveys
had already taken place. The authors are aware of the limitations of issuing an online
questionnaire, and that the respondents may not form a representative sample of the students
who sat the diagnostic test. However, the results from the online questionnaire were very
similar to the paper-based feedback, so we feel that their inclusion is valid. Any significant
differences are reported upon when they arise.
The responses received to the open questions contained rich data, so in order to gain
a proper insight into the answers given, we applied General Inductive Analysis (GIA).[16]
GIA provides an approach to Grounded Theory analysis of data, and can be summarized by
the following stages of analysis: (i) preparation of raw data; (ii) close reading (identification
of ‘meaning units’ in text); (iii) creation of categories (upper level/parent categories, often
determined by research objectives, or lower level/child categories, often determined in vivo
during analysis of raw data); (iv) checking of text extracts that should appear in more than
one category; (v) revision/refinement of the category system; and (vi) reliability testing.
This summary is based on that given in [16], which may be consulted for more details.
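As an illustration of stage (vi), agreement between independently produced codings can be checked with a simple percent-agreement computation. This is a minimal sketch under assumptions: the paper does not state which reliability measure was used, and the category labels below are hypothetical examples drawn from the parent categories discussed later.

```python
def percent_agreement(coder_a, coder_b):
    """Fraction of responses assigned the same category by two coders.

    coder_a, coder_b: lists of category labels, one per response,
    in the same response order.
    """
    if len(coder_a) != len(coder_b):
        raise ValueError("Coders must label the same set of responses")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical labels for five responses from two independent coders
a = ["Timing suitable", "Timing suitable", "Timing unsuitable",
     "Mixed/Neutral", "Timing suitable"]
b = ["Timing suitable", "Timing unsuitable", "Timing unsuitable",
     "Mixed/Neutral", "Timing suitable"]
print(percent_agreement(a, b))  # 0.8
```

In practice a chance-corrected statistic (such as Cohen's kappa) is often preferred over raw agreement, since two coders using a small category system will agree by chance some of the time.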
GIA was applied to each question separately with an emphasis on the themes which
emerged from the responses rather than on the responses themselves. The analysis was
carried out independently by the three authors, and the results compared to verify their
authenticity. As mentioned previously, the themes and responses from both institutions
were typically consistent, and any anomalies are discussed below.
4. Analysis of results
In this paper, we present the results of the analysis of the five key open questions, namely
Questions 9, 12, 13, 17 and 20 (see Appendix 1). These five questions focused on students’
attitudes towards the diagnostic test and, as such, the responses were intimately connected
to the key objectives of this project. The main themes which emerged from this process
Table 1. Suitability of test timing: summary of the responses to the question ‘Do you think that this
is the most suitable time to sit the test? Please give a reason for your answer’.
are presented and discussed below along with sample quotations of students’ responses.
Further discussion of these main themes and the overlap of categories between questions
are included in Section 5. A minority of responses, labelled as Mixed/Neutral, could not be
accurately placed within a single specific parent category, but were consistent with other
coded responses.
Do you think that this is the most suitable time to sit the test? Please give a reason for your
answer.
Our research objective was to find out how students felt about the timing of the test.
Note that students in DCU take the test during the orientation week, students in NUIM
take the test during the first week of the term and students in both institutions receive no
advance notification about the test from the staff. The students’ last mathematical activity
(apart from a small number of mature students and direct-entry students) would have been
the LC examination at the beginning of June. Following the method of GIA, the principal
split in the parent categories identified was between those respondents who answered ‘yes’
(Timing suitable) to the direct question and those who answered ‘no’ (Timing unsuitable).
A summary of the response categories is given in Table 1.
remember mathematical skills: ‘Yes because it refreshes our memories of what we know’
and that the test restarted mathematical activity on the students’ part: ‘Yes. It gets you into
the mind frame of starting work and it’s on what you should already know so shouldn’t be
too challenging’.
Better to do the test as early as possible [85 students]. This subcategory contained sim-
ilar responses with slightly different reasoning. Some students noted that it was convenient
to have the test out of the way before the semester started, while others linked this to the
importance of being able to act early on the feedback from the test: ‘I think this is a good
time because if you are struggling with maths its better to deal with the problem as soon
as possible’. Included here are the ideas that (in DCU) the diagnostic test fitted in well
with the orientation week: ‘Yes there is good attendance during orientation week’, and, in
both institutions, that the test was held at a time that increased the likelihood of students
attending.
Out of practice; away from mathematics over the summer [50 students]. The main
idea present in these responses is that the unannounced test, given when students had not
studied mathematics for at least three months, gives a fairer assessment of students’ core
knowledge – including that which they had retained from the LC cycle, for example, ‘No
one expected it and gives good feedback of level of maths without revision/study etc’ and
‘Everyone was at the same stage and a bit rusty so it was fair’.
Table 2. Aim of diagnostic testing in mathematics: summary of the responses to the question: ‘The
aim of diagnostic testing is to provide both staff and students with an immediate picture of which
important mathematical concepts are well-known and/or unknown to the student’. Do you think that
your diagnostic test achieved this, and why?
‘you’re not really expecting a test and sort of panic when you get it’ and ‘without previous
knowledge of the test . . . I found it annoying and struggled where I normally wouldn’t’.
‘The aim of diagnostic testing is to provide both staff and students with an immediate picture
of which important mathematical concepts are well-known and/or unknown to the student.’ Do
you think that your diagnostic test achieved this, and why?
The research objective was to gauge if students recognized the underlying purpose of
diagnostic testing. When the tests are issued in both institutions, students are informed of
the purpose of the test. When GIA was applied to the 662 responses, there was a clear
divide between the positive (Achieved aim) and negative (Did not achieve aim) answers
which is highlighted in Table 2. While some of the responses may have been influenced by
the phrasing of the question, many cannot be seen as having been directly prompted by the
wording due to the additional themes introduced by students in their responses.
Highlights areas for improvement [78 students]. Although similar to the previous theme,
the responses here included specific comments on how both staff and students can use the
results to focus on certain areas for improvement in teaching and learning:
Yes, because everyone is at different levels of mathematical knowledge then others and because
of this test, the lecturer will take it into account of the varied groups of people within the lectures
and teach to try and suit all.
Yes as it informs the staff of the overall weak areas of the students and what area they need to
focus on in their lectures and it helps the students realise that they may need to do some work
and refresh their memories if they want to know the basic information if they want to do well
in their maths course!
Good indicator because of the test structure [75 students]. These students all com-
mented that the test achieved its aims either because of the timing of the test: ‘Yes, as no
one has any time for preparation so it acts as a sort of equalizer’, ‘Yes, as it was a surprise
we had not prepared for it, so it was clear to see problem areas’ or because of the type of
questions that were asked:
I think it did as it questioned us on the basic concepts and not hard complicated equations. I
think it’s more important that we understand the basics when starting mathematics in college.
Table 3. Idea of diagnostic testing in mathematics: summary of the responses to the question ‘Do
you think that diagnostic testing in mathematics is a good or bad idea? Please give reasons for your
answer’.
after the summer/unprepared, etc.: ‘No because I didn’t get answers to questions I know I
knew how to do before but because of the summer break had forgotten how to do them’ and
‘No because with it just being sprung on you, you were under pressure and unprepared’.
Do you think that diagnostic testing in mathematics is a good or bad idea? Please give reasons
for your answer.
The research objectives are transparent: we wish to find out what students think about
diagnostic testing from the perspective of whether or not they consider it to be a good idea.
When GIA was applied to the 706 responses, again there was a very clear split between the
positive (Good idea) and negative (Bad idea) responses which is evident from Table 3.
current knowledge for students and for academic staff: ‘Yes, because this way, you can get
insight on the students’ mathematical background without being too invasive’.
Identifies problem areas [101 students]. In this category students elaborate on their
answers and give reasons why it was useful to have a way of gauging their level of mathe-
matics. The main themes that emerge are that the test indicated weaknesses, problem areas
and areas where revision is needed: ‘Yes, as it lets you know what concepts are hardest for
students’ and ‘Yes to provide the student with an accurate assessment of what areas need
more focusing before tackling University lecture material’.
Positive structural springboard [59 students]. These responses addressed a key aspect
of diagnostic testing and almost unanimously stated that the test can indicate the need to get
additional help in mathematics: ‘Good – it can push you to try get help’ and ‘Good idea as it
proves to the person that they need help’. Several students commented on diagnostic testing
as being positive in terms of potentially providing a platform for ongoing improvement, for
example, ‘Good [idea], people should be tested constantly to keep them working’.
The diagnostic test as a form of help [21 students]. This category includes responses
from a small but not insignificant number of students that indicated that the test itself helped
them to recall mathematical knowledge and skills, or helped them in some other unspecified
way: ‘Good, it reminds you of the basics that you had forgotten’.
Table 4. Feedback on performance: summary of the responses to the question: Were you given
sufficient feedback on your performance in the test?
for accessing support and/or revising particular areas of mathematics. Interestingly, some
students also found that the test itself provides the means for such revision.
While the number of negative responses is low, they deserve particular attention. These
responses frequently come from students expressing explicitly negative feelings (in terms
of ability, negative pressure, confidence) in relation to mathematics. The responses here
are similar to those which characterize ‘Math Anxiety’.[17] These responses highlight the
need for administrators of diagnostic tests to identify students in whom such feelings are
engendered by the test, and to take steps to support them in overcoming their anxiety.
For this question, our research objective was to find out whether students felt that the
feedback mechanisms currently in place for diagnostic testing were adequate. Test results
and solutions are communicated to students in both institutions within a short space of time,
so that students are aware of which questions they answered incorrectly and the correct
answer to that question (although they do not receive feedback as to the specific errors they
made within each question). They are also sent information about specific interventions
such as revision classes or online material with which they are advised to engage. However,
we were interested to see whether students were satisfied with this level of feedback. Again,
there was an obvious split between the positive and negative responses in this question, but
although the majority (55.3%) of the 609 responses to this question were positive, this is
by far the lowest positive response of any question asked in the survey. A summary of the
response categories is given in Table 4.
Results and advice/solutions given [16 students]. This category included responses that
specifically mentioned that they were given solutions to the test along with their results:
‘Yes, we got our results and were given the answers to see where we went wrong’. A small
number of students mentioned receiving their results and advice on what to do next:
I was given my mark and given information about the maths support that is available in the
college. We were advised if we got less than 20 marks to go to the maths support centre.
Only results given [35 students]. Some students reported that they were only given a
grade – ‘No actual feedback was given, just results’ – which they did not consider to be
feedback.
No idea where errors were made [32 students]. There was a strong feeling among some
students that they did not know where they had gone wrong in the test, so that even though
they received a result, they did not know where they had made any errors: ‘No I just got a
grade. I did not find out which answers were right and I would have liked to have known
this’ and ‘No only a score, we were not able to analyse the paper after’.
Table 5. Final comments: summary of the responses to the question ‘Any other comments on
diagnostic testing?’
Thirty-seven point three per cent (25) of the comments made were positive about diagnostic
testing. These included general positive comments such as ‘Worthwhile’ and ‘It is a good
idea’. Other responses made specific reference to the test structure: ‘Good not to tell anyone
until they are in the lecture’, while more mentioned the impact of the test: ‘Makes students
aware of supports’.
the responses to the different questions. This overlap was also evident from the reasons
given for positive and negative responses to the questions. For example, it is interesting to
note that similar reasons (being away from mathematics for a considerable period of time or
the test was unexpected) are given by different students when indicating both the suitability
and the unsuitability of the timing of the test. These overlaps indicate the suitability of the
GIA approach to such data: one can consider the responses to the survey as a whole as well
as to individual questions.[17]
Overall, the authors have been surprised by the high level of positive responses received
in this questionnaire. It is encouraging that the majority of students thought that diagnostic
testing was a good idea, that they recognized the academic staff’s motivation for carrying
out diagnostic testing and that they considered the timing of the diagnostic test to be suitable.
This provides solid support in the form of student opinion for our current practice. Indeed,
some respondents reported that the diagnostic test acted as a spur to seeking support, while
others felt that the diagnostic test was an end in itself, helping them to recall mathematical
knowledge and skills.
Prior to the analysis of the data, we were expecting a significant number of comments
reporting that diagnostic testing induces negative feelings in students, an outcome which lies
at the heart of criticisms of diagnostic testing.[10] However, we unexpectedly found very low
levels of reporting of such feelings. This provides evidence that such negative outcomes may
not be as widespread as previously thought. As noted above, the corresponding responses
are indicative of Math Anxiety [17] and highlight the need for focused intervention with
relevant students. We note that the relevant students are few enough in number that,
once identified, the provision of targeted support should not present a significant drain on
support resources.
The authors were also surprised by the high percentage of responses which indicated
that students were not happy with the level of feedback and subsequent support that they
received after the test. In both institutions, students are issued with results, solutions and
advice on appropriate supports and who should access them shortly after the test. They are
also made aware of the impact that regular engagement with these supports can have on
their grades and progression.[15] One possible reason for this high negative response rate
may be that, as raised in a separate answer, students do not see the connection between
the material covered by the test and subsequent supports, and their lecture material.
One step taken to address this issue of engagement with subsequent supports has
been the introduction of a student monitor in NUIM. This monitor is assigned to ensure
that students who fail the test engage appropriately with the follow-up support (MSC), and
research has shown that engagement levels have increased significantly.[14]
We conclude by highlighting the principal overarching themes that we have identified
in the students’ responses.
degree among the relevant students. We feel that this is an important point that has a role to
play in the debate about the merits and demerits of diagnostic testing in mathematics.[10]
it is worth mentioning that a significant amount of effort has already gone into the provision
of appropriate feedback and advice. We suggest that the difficulties encountered are part of
a wider picture in relation to providing feedback that will influence student behaviour: as
noted by Mutch, the provision of feedback is a complex task:
[it] is a social practice that demands attention . . . to the conditions of production, distribution
and reception.[19,p.25]
That said, in our future work on diagnostic testing, we intend to consider the area of
communication and feedback in greater detail.
Note
1. This was a male student on the ‘Open Opportunities’ route to an engineering degree. He had
not studied higher level mathematics and was advised to seek support. (This information was
gleaned from the profiling questions on the survey.)
References
[1] Cork Regional Technical College, Department of Mathematics and Computing. Report on
the basic mathematical skills test of 1st year students in Cork R.T.C. in 1984. IMS Newslett.
1985;14:33–43.
[2] Hurley D, Stynes M. Basic mathematical skills of U.C.C. students. IMTA Bull. 1986;17:
68–79.
[3] Brennan M. The 2nd–3rd level mathematics interface. In: Cawley S, editor. A mathematics
review. Dublin: Blackhall Publishing; 1997. p. 3–15.
[4] O’Donoghue J. An intervention to assist at risk students in service mathematics courses at the
University of Limerick. Limerick: University of Limerick; 1999.
[5] London Mathematical Society (LMS). Tackling the mathematics problem. Southend-on-Sea:
LMS; 1995.
[6] Savage M, Kitchen A, Sutherland R, Porkess R. In: Hawkes T, editor. Measuring the mathematics
problem. London: Engineering Council; 2000.
[7] Hassler B, Atkinson R, Quinney D, Barry M. The experience of fresher students in mathematics
diagnostic testing. MSOR Connect. 2004;4:17–23.
[8] Lawson D, Croft T, Halpin M. Good practice in the provision of mathematics support centres.
Available from: http://www.sigma-cetl.ac.uk/index.php?section=22 (17 September 2012); 2003.
[9] Heck A, van Gastel L. Mathematics on the threshold. Int J Math Educ Sci Technol.
2006;37:925–945.
[10] LTSN MathsTEAM Project. Diagnostic testing for mathematics. Available from: http://www
.mathstore.ac.uk/mathsteam/packs/diagnostic_test.pdf. (17 September 2012); 2003.
[11] Gill O, O’Donoghue J. The mathematical deficiencies of students entering third level: An
item by item analysis of student diagnostic tests. In: Close S, Corcoran D, Dooley T, editors.
Proceedings of Second National Conference on Research in Mathematics Education (MEI2).
Dublin: St. Patrick’s College; 2007.
[12] Cleary J. Diagnostic testing – An evaluation 1998–2007. In: Close S, Corcoran D, Dooley T,
editors. Proceedings of Second National Conference on Research in Mathematics Education
(MEI2). Dublin: St. Patrick’s College; 2007.
[13] Ní Fhloinn E. Diagnostic testing in DCU – a five-year review. In: Proceedings of the Third
National Conference on Research in Mathematics Education (MEI3). Dublin: St. Patrick’s
College; 2009. p. 367–378.
[14] Burke G, Mac an Bhaird C, O’Shea A. The impact of a monitoring scheme on engage-
ment in an online course. Teaching Mathematics and Its Applications. 2012 [Accessed 2012
August 21]. Available from: http://teamat.oxfordjournals.org/content/early/2012/05/25/teamat.
hrs010.short?rss=1
[15] Mac an Bhaird C, Morgan T, O’Shea A. The impact of the mathematics support centre on
the grades of first year students at the National University of Ireland Maynooth. Teaching
Mathematics and Its Applications 2009;28(3):117–122.
[16] Thomas D. A general inductive approach for analyzing qualitative evaluation data. Am J Eval.
2006;27(2):237–246.
[17] Wigfield A, Meece JL. Math anxiety in elementary and secondary school students. J Educ
Psychol. 1988;80(2):210–216.
[18] Ní Fhloinn E, Mac an Bhaird C, Nolan B. Appropriate settings and supports for third-level
diagnostic testing in mathematics. In: Proceedings of the SMEC Conference. Dublin: DCU;
2012. p. 10–15.
[19] Mutch A. Exploring the practice of feedback to students. Active Learn Higher Educ.
2003;4(1):24–38.
Appendix 1
(12) “The aim of diagnostic testing is to provide both staff and students with an immediate picture
of which important mathematical concepts are well-known and/or unknown to the student.”
Do you think that your diagnostic test achieved this, and why?
(13) Do you think that diagnostic testing in mathematics is a good or bad idea? Please give reasons
for your answer.
(14) Were you advised to avail of additional supports because of your results in the diagnostic
test? Yes No
(15) If so, did you avail of these supports and which ones?
(16) Please comment on the support available to students after the diagnostic test.
(17) Were you given sufficient feedback on your performance in the test?
(18) Were you given sufficient time to complete the test? Yes No
(19) How easy/difficult did you find your test?
Very difficult Difficult Neutral Easy Very easy
(20) Any other comments on diagnostic testing.