doi:10.1111/bjet.12678
Abstract
Numerous research publications have mentioned the need to prepare teachers to design
and enact technologically enhanced learning experiences for students, thus increasing
their academic performance and their engagement when learning. This paper addresses
the crucial issue of how to train pre-service and novice teachers effectively to become
good designers of Moodle-based units of learning (UoLs), and to improve their critical
thinking skills when engaging in peer-review sessions. A professional development
approach is proposed that requires trainees to elicit, depict, reflect on, and share
design ideas, and to co-create small UoLs in a Moodle learning management system.
The training process follows the phases of the Think–Pair–Share learning
strategy, and makes use of the features of the CADMOS graphic learning design editor
that allows designers to see a preview of their unit of learning in Moodle. The findings of
an evaluation study with 28 trainees showed that the proposed approach was easy to
follow, and led to the creation of good-quality, re-usable designs for Moodle-based UoLs
and enhanced trainees’ design thinking skills. Transformations in the way in which the
newly minted learning designers thought, acted, and reflected were observed.
Introduction
Due to the rapidly expanding and constantly changing amount of learning technology now
available, students are being offered various technology-enhanced learning (TEL) experiences.
Given the additional expectations for teachers to apply more TEL methods in their practice,
teachers should be able to make meaningful choices regarding which learning activities to pro-
vide and how they should orchestrate them with the support of the proper learning resources
and tools in order to meet the new, higher expectations of the students (Kalantzis & Cope, 2010).
Emphasis is now placed on transforming teachers into creators of effective learning designs for
the needs of a learner-centered TEL environment, rather than on their past roles as curricu-
lum implementers and lecturers who follow a specific syllabus and textbook (Kirschner, 2015;
Laurillard, 2012). According to Donald et al. (2009), a learning design “documents and describes
a learning activity in such a way that other teachers can understand it and use it in their own
© 2018 British Educational Research Association
British Journal of Educational Technology Vol 49 No 6 2018
Practitioner notes
What is already known about the topic
• Numerous research publications have mentioned the need to prepare teachers to design
and enact technology-enhanced learning experiences for students, thus increasing
their academic performance and their attention, concentration and motivation when
learning.
• An open research question is how to select and orchestrate training activities effectively
during teachers’ professional development (TPD) with regard to learning design.
What this paper adds
• The proposed teachers’ professional development approach aims to guide pre-service
and novice teachers during the process of creating a prototype for the design of a short
technology-enhanced learning unit (three to four didactic hours) that could be enacted
via Moodle.
• The evaluation study of the proposed TPD approach has novel characteristics because it
was focused on investigating not only whether the trainees had progressively created
better designs, but also whether their critical skills in evaluating the quality of their own
learning designs as well as those of their peers had improved.
Implications for practice and/or policy
• A promising “techno-pedagogical” approach for training pre-service and novice teach-
ers in learning design was presented. The proposed approach required trainees to elicit,
depict, reflect on, share their ideas, and co-create high-quality designs for UoLs for
Moodle.
• Directions for future research on teachers’ professional development in learning design
are presented.
context. Typically a learning design includes descriptions of learning tasks, resources and sup-
ports provided by the teacher.”
A teacher/learning designer is responsible for working out the specifics of a unit of learning (UoL);
in other words, the coordination of the roles of the students, groups, and/or teachers who offer
guidance and scaffolding, activities, and associated environments (learning resources, tools, and
services) that allow learners to meet learning objectives while taking certain prerequisites into con-
sideration (Koper & Tattersall, 2005). A unit of learning can be seen as a general name for a course,
a workshop, a few hours of lessons, and so on, which can be instantiated and reused many times for
different people and in various settings, including in an on-line environment (Koper, 2005).
It is acknowledged that pre-service and novice teachers should be trained in such activity-
centered learning and planning approaches in order to be better able to create new TEL environments
that engage learners and address learner diversity (Persico & Pozzi, 2013). The crucial question is
how to select and orchestrate training activities effectively during teachers’ professional develop-
ment (TPD) with regard to learning design. However, follow-up questions might also arise, such
as whether teachers should use specific authoring tools for learning designs (Bennet, Agostinho, &
Lockyer, 2015; Bower, Craft, Laurillard, & Masterman, 2011), such as LAMS, Learning Designer,
WebCollage and the like during this LD creation process. If so, the next question is whether such
tools add value to the TPD process and empower trainees during the design process, which is seen
simultaneously as a creative practice for (co)producing blueprints and sketches that will be an
imagined end product and a process of inquiry (Dillenbourg, 2013; Krippendorff, 2006).
Training Novice teachers to design
There have been various interesting TPD approaches to learning design that have had positive
outcomes, such as the Learning Design Studio, the OLDS MOOC by the UK Open University, the
Carpe Diem workshops by the University of Leicester, the 7Cs of Learning Design by University
of Leicester, the JISC-funded SPEED project, and the METIS EU-funded project. The proposed
approaches highlight key elements that an effective TPD process should include (Asensio-
Pérez, Dimitriadis, Hernández-Leo & Pozzi, 2015; Cross, Galley, Brasher & Weller, 2012; Kali &
McKenney 2012; Mor & Mogilevsky, 2013; Brasher & Mor, 2013; Fink, 2013; Garreta-Domingo,
Hernández-Leo, Mor, & Sloep, 2015):
• it should encourage teachers to collaborate and share their LD ideas, as well as to obtain feed-
back from peers to reinforce learning,
• it should offer a combination of theory and hands-on practice in order to allow teachers to
address learning design challenges,
• it should use graphic LD tools to improve the representations of scenarios, and
• it should foster a culture of professional collaboration for the co-construction of knowledge.
This paper presents a new TPD approach that requires pre-service and novice teachers to be en-
gaged in a fast-track learning design program that lasts for 5 weeks, and which includes all the
aforementioned principles. The flow of the proposed TPD follows the phases of the Think–Pair–
Share learning strategy: it starts with individual work, and then engages the trainees in
peer reviewing, sharing, re-using, and co-constructing LDs.
This paper is novel in two aspects. First, the proposed TPD aims to guide pre-service and novice
teachers during the process of creating a prototype for the design of a short UoL (three to four
didactic hours) that could be enacted via Moodle. For this goal, the trainees used the CADMOS
authoring tool, which not only allowed them to make graphic representations of their LDs easily,
but also enabled them to have a preview of the appearance of their design in the Moodle learning
management environment. As a result, trainees could become better Moodle course designers
following an iterative design process since they could work on the graphic design representa-
tion of their UoL using the CADMOS tool elements, while simultaneously seeing how their UoL
would appear in a Moodle environment. In addition, having used the CADMOS tool, the trainees
followed a consistent and concise representation style for their LD (Katsamani & Retalis, 2013;
Katsamani et al., 2012). Second, the evaluation study of the proposed TPD approach has novel
characteristics because it was focused on investigating not only whether the trainees had created
better designs progressively, but also on whether their design thinking skills in self-evaluating and
peer-reviewing the quality of submitted LDs had improved during the training cycle. Findings
from research studies reveal that peer review benefits trainees by helping them to develop critical
thinking skills (Brill & Hodges, 2011). Thus, the proposed TPD approach was validated by 28
pre-service and novice teachers in order (i) to provide a better understanding of how the tool’s
features best supported the design process, and (ii) to investigate whether the teachers appreciated
the various steps in the design process.
The structure of this paper is as follows. The proposed TPD approach is described in the follow-
ing section. Subsequently, the evaluation case study and the findings thereof are presented.
Concluding remarks and directions for future research are provided in the last section of the paper.
work, emphasizing the co-construction, sharing, and reusing of learning designs, thus promot-
ing interactions among teacher-designers.
The trainees had access to selected learning resources pertaining to learning strategies and to
exemplar learning designs. They also had the support of experienced tutors via an asynchro-
nous online learning environment based on the Moodle platform, which allowed them to become
accustomed to its features.
The LD process was also scaffolded by the CADMOS graphic learning design tool (Katsamani &
Retalis, 2013). CADMOS guides teachers through the process of creating a technology-enhanced
learning design that can be previewed and exported as a Moodle-based UoL for enactment. The
use of CADMOS enabled trainees to make iterations of and refinements to their UoL designs in
order for them to be well structured on a Moodle platform. Various studies have shown that the
CADMOS tool adds value to the LD process (Katsamani et al., 2012), since it
1. is appealing to novice designers because it offers guidance regarding the design process,
2. allows the practitioner to design learning activities from different perspectives and in different
layers, and
3. enables teachers to see a preview of their designs as Moodle-based UoLs before they are ex-
ported as Moodle course files.
The flow of the training process required trainees to work both as individuals and as members of
a group, following an adapted Think–Pair–Share (TPS) collaborative learning strategy.
elegance, innovation and aesthetic impact” (Kleiman, 2009). More specifically, according to
Kleiman (2009), a good design for learning should be
• Elegant and esthetic: “Aesthetics involve the psychological and physiological effects of line,
shape, colour, texture, tone, composition, context etc. of an artefact, constructed within a frame.
It has form and structure. An important aspect of aesthetics is the notion of integrity: does it
‘hang together’ and work as a whole?”
• Consistent right down to the details: “Every element and procedure needs to be thought through
and implemented in such a way that ‘it works’—both in itself and in relation to the whole.”
• Creative and innovative: “[The] best designers are able to provide creative solutions whilst
sometimes working within very restrictive frameworks.”
Agostinho et al. (2009) also advocated that the “completeness of the LD description” with easily
understood notation/formalism and the “expressiveness of the learning design elements” were
essential because “it can be easily, yet comprehensively understood in terms of its original con-
text and thus potentially reused by a teacher in their particular educational context.” Following
the aforementioned work, an evaluation rubric with a rating scale from 1 (poor performance)
to 3 (excellent performance) was created, and was used to assess the quality of learning designs
according to the following eight criteria:
• Completeness of the description of the learning design (presence of well-written goals and
prerequisites, defined roles, well-described metadata in learning activities/resources, and full
correspondence between activities and resources).
• Clarity and consistency of the learning design elements as described via the conceptual model.
• Clarity and consistency of the orchestration of the learning design as described via the flow
model.
• Expressiveness of the learning design elements (names of the learning activities, the learning
resources, and phases and rules).
• Creativity of the learning design in relation to activities, resources, strategies, and rules, thus
facilitating high-quality learning experiences.
• Suitable and clear alignment of the goals, activities, and resources for each role of the learning
design.
• Elegant presentation and esthetics for visualizing the learning scenario.
• Innovation in the learning design in relation to the way it can promote collaboration, active
learning, and the quality of interaction.
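The eight criteria above can be pictured as a simple scoring structure. The sketch below is illustrative only: the short criterion labels are abbreviations of the rubric items, and the aggregation into a percentage score is an assumption for illustration, not the authors' published scoring rule.

```python
# Illustrative sketch of the 8-criterion LD rubric (scale 1-3 per criterion).
# The short labels and the percentage aggregation are assumptions for
# illustration, not the authors' published scoring rule.

CRITERIA = [
    "completeness", "clarity_conceptual", "clarity_flow", "expressiveness",
    "creativity", "alignment", "elegance", "innovation",
]

def rubric_score(ratings: dict) -> float:
    """Aggregate per-criterion ratings (1=poor .. 3=excellent) into a % score."""
    if set(ratings) != set(CRITERIA):
        raise ValueError("a rating is required for every criterion")
    if any(not 1 <= r <= 3 for r in ratings.values()):
        raise ValueError("ratings must be on the 1-3 scale")
    total = sum(ratings.values())                                 # between 8 and 24
    return 100.0 * (total - len(CRITERIA)) / (2 * len(CRITERIA))  # rescaled to 0-100

# Example: a design rated 2 ("good") on every criterion scores 50%.
print(rubric_score({c: 2 for c in CRITERIA}))
```

A structure like this also makes it straightforward to compare, criterion by criterion, the scores given by the expert, the trainees themselves, and their peers.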
To further examine the reliability of the rubric and improve the wording of the criteria, two
learning design exemplars were evaluated by three experienced designers with both academic
and practical knowledge of learning design, a deep understanding of how learning tools/services
are evolving, and proven experience in the development of Moodle-based UoLs, to determine
whether their evaluations produced similar results with the rubric.
LD skills and performance (Asensio-Pérez et al., 2015; Hernández-Leo et al., 2006; Ronfeldt et al.,
2015). Again, the co-creation of the learning designs was performed using the CADMOS tool.
CADMOS LD tool
The CADMOS graphic learning design tool allows users to create a learning design for a UoL that
can be exported to the Moodle learning environment for enactment. The CADMOS tool adopts the
“separation of concerns” logic from the modern computer-aided web-engineering field (Rossi
et al., 2008). Not only does it provide a simple and intuitive user interface that allows the designer to
drag and drop learning activities and resources onto the design canvas, but it also lets the user organize
them into a learning flow and preview how this UoL design will look when deployed on the Moodle
learning management system. The CADMOS iterative design process involves three interrelated steps:
1. The construction of the Conceptual Learning Activity Model (CLAM), in which the designer
describes the learning activities that he/she thinks the learning actors (individual students,
groups, or the tutor) should perform in order to accomplish the desired learning objectives.
He/she also specifies the learning resources and tools that support these activities; for exam-
ple, if a student will be asked to read a theory, the designer will define the related learning
resources and format, which could be an online document, a video, or similar. Figure 1 shows
the CLAM for a learning scenario. For each learning activity, we defined metadata (title,
description, learning goals, prerequisites, responsible actor, and so on) that are not shown in the
figure. Specifying the types of the activities gives designers the opportunity to form an overall
idea of the nature of the learning design (for example, whether more emphasis is placed on theory,
whether students have to complete many assessments, whether there is no collaboration, and so
on) by clicking on the “Statistics” button of the tool. Each learning activity is linked to a learning
resource for which metadata need to be given (title, author, description, type, copyright, and
resource file).
Figure 2: Screenshot of a Learning Flow Model for a UoL design in CADMOS [Colour figure can be viewed at wileyonlinelibrary.com]
2. The construction of the Learning Flow Model (LFM), which stems from the CLAM. This model
shows the order in which the students should perform the activities and whether there are any
specific rules; for example, a student must read the theory before moving on to a self-assessment
test in which s/he has to attain a score higher than 70% before proceeding with the following
activity. Figure 2 shows the LFM for a UoL in which there are three different swim lanes, one
for each learning actor (individual students, teachers, and groups of students). The activities are
positioned along the vertical axis according to chronological order and are arranged in phases.
In the students’ lane, the first two activities are grouped inside a rectangle in order to show that
they belong to a composite activity, as specified in the conceptual model. Moreover, the self-as-
sessment has a time limit rule that specifies that this activity has a specific duration.
3. The construction of a Moodle preview for the designed UoL. CADMOS converts the CLAM and
the LFM into a Moodle course that is ready to be deployed by binding the learning tasks and
rules to Moodle elements (resources/activities, topics, rules, and so on) according to a specific
mapping schema that is explained in detail in Boloudakis, Katsamani, Retalis, & Georgiakakis
(2012). The learning script is now ready to be exported as a Moodle course and can be deployed
on a Moodle platform for enactment. An exported file in the format of a Moodle course backup
file (.mbz) can be created, which can then be imported into Moodle through the course
restoration process for the needs of enactment (Figure 3).
Figure 3: Screenshot of the Moodle preview for the UoL design in CADMOS [Colour figure can be viewed at wileyonlinelibrary.com]
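The binding of design elements to Moodle elements can be pictured as a lookup table. The sketch below is hypothetical: the element names and the mappings shown are assumptions for illustration, while the actual schema is the one specified in Boloudakis et al. (2012) and differs in detail.

```python
# Hypothetical sketch of a CADMOS-to-Moodle binding table; the real mapping
# schema is defined in Boloudakis et al. (2012) and differs in detail.

CADMOS_TO_MOODLE = {
    "reading_activity":    "resource",    # eg an online document or video
    "assessment_activity": "quiz",
    "discussion_activity": "forum",
    "assignment_activity": "assignment",
    "phase":               "topic",       # flow-model phases become course topics
}

def to_moodle_element(cadmos_type: str) -> str:
    """Resolve a CADMOS element type to its Moodle counterpart."""
    try:
        return CADMOS_TO_MOODLE[cadmos_type]
    except KeyError:
        raise ValueError(f"no Moodle mapping for CADMOS type {cadmos_type!r}")

print(to_moodle_element("assessment_activity"))  # quiz
```

Once every element and rule has been resolved this way, the course can be serialized into the .mbz backup format for restoration into Moodle.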
CADMOS is a user-friendly tool that enables novice teachers/designers to portray their UoL ideas
as meaningful visual representations to share, reuse, and/or to be adapted by peers (Katsamani
& Retalis, 2013). In the proposed study, trainees’ acceptance of the CADMOS tool and its added
value will be examined.
Research Questions
To achieve the above-mentioned goals, we formulated the following research questions:
RQ1. Do we have any indication that the trainees had improved their practical skills in learning design?
An expert evaluator assessed the trainees’ deliverables using the LD rubric. We hypothesized
that, if the trainees managed to create better LDs as a result of their collaborative interaction
during the TPD process than they did during the initial individual phase, this would lead to an
improvement in LD skills (eg, Johnson & Johnson, 2014).
RQ2. Do we have any indication that the proposed TPD activities promoted trainees’ design thinking
skills in terms of their ability to generate creative, innovative, and effective LDs? The trainees were
asked to assess the quality of their own deliverables, both in the initial and in the final phase of
the TPD process, using the LD rubric. The scores given by trainees as evaluators of their own
deliverables were compared with the scores given by the expert. We hypothesized that, if the
trainees’ scores in the final phase of the TPD process had better convergence with the scores
of the expert than they did in the initial phase, this would be an indication of improvement in
terms of their design thinking skills.
RQ3. Was the flow of the TPD process accepted well by the trainees? The trainees were asked to an-
swer an online questionnaire at the end of the TPD process. The first set of closed-ended ques-
tions investigated whether the participants believed that the proposed flow of the training
process enabled them to create pedagogically flexible and reusable designs for UoLs on Moodle.
RQ4. Did CADMOS add value to the TPD process and empower trainees during the process? The sec-
ond part of the online questionnaire was related to the measurement of the usability of the
CADMOS philosophy and tool.
Evaluation
Research question | Evaluation method and toolkit | Evaluators
RQ1 | Pre/post-assessment of the trainees’ deliverables (initial and final LDs) using the LD Evaluation Rubric | Expert
RQ2 | Pre/post-comparison of the convergence of the scores given by both the trainees and the expert as evaluators of the same deliverables (initial and final LDs), using the LD Evaluation Rubric | Expert and trainees
RQ3 | Post-evaluation of the trainees’ perceptions via an online questionnaire | Trainees
RQ4 | Post-evaluation of the trainees’ perceptions via an online questionnaire | Trainees
In order to answer the last two research questions (RQ3 and RQ4), we used a post-evaluation
method at the end of the TPD process. The instrument was an online questionnaire that trainees
were asked to complete anonymously. The questionnaire consisted of 10 questions: eight
closed-ended questions rated using a 5-point Likert scale (Disagree, Slightly Agree, Strongly
Agree, Very Strongly Agree) and two open-ended questions. The questions were designed to
investigate the trainees’ perceptions regarding the usefulness and usability of the TPD process
and CADMOS, as well as to obtain ideas for their enrichment.
Evaluation Findings
Research Question 1: Do we have any indications that the trainees improved their practical skills in
learning design?
As discussed, all the trainees’ deliverables were assessed by the expert evaluator using the LD
rubric. Table 1 shows the trainees’ performances (% scores) in the initial (pre-test) and final
(post-test) phases of the TPD process. It was evident that this process helped the trainees improve
their LDs. The last column in Table 1 shows the number of criteria on which we identified
improvement based on the LD rubric. As can be seen, there are cases in which the improvement
concerned more than one criterion.
By applying a paired samples t-test in SPSS to the initial (pre-test) and final (post-test) LD
scores of the groups, we found a strong, positive correlation (r = 0.825, p = .022), as well as
a significant average difference (t(6) = −4.092, p = .006) between them (Table 2). Moreover, the
value of Cohen’s d indicated a “large” effect size (d = 1.019) (Cohen, 1988).
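The pre/post comparison above can be reproduced with a standard paired-samples computation. The sketch below uses illustrative placeholder scores, not the study's data; it shows the mechanics of the paired t statistic and of Cohen's d for paired designs, using only the Python standard library rather than SPSS.

```python
# Paired-samples t statistic and Cohen's d for pre/post scores, stdlib only.
# The score lists are illustrative placeholders, not the study's data.
from statistics import mean, stdev

def paired_t_and_d(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sd = stdev(diffs)                  # sample std deviation of the differences
    t = mean(diffs) / (sd / n ** 0.5)  # paired t statistic, df = n - 1
    d = mean(diffs) / sd               # Cohen's d for paired designs
    return t, n - 1, d

pre  = [55, 60, 62, 58, 65, 59, 61]   # illustrative pre-test % scores, 7 groups
post = [70, 72, 75, 69, 78, 66, 74]   # illustrative post-test % scores
t, df, d = paired_t_and_d(pre, post)
print(round(t, 2), df, round(d, 2))
```

The resulting t is then compared against the t distribution with n − 1 degrees of freedom (here 6, matching the seven groups in the study) to obtain the two-tailed significance.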
RQ2: Do we have any indication that the proposed TPD activities promoted trainees’ design thinking skills
in terms of their ability to generate creative, innovative, and effective LDs?
Figure 4 shows the assessment scores given by trainees regarding the quality of their own
deliverables using the LD rubric, as well as the assessment scores given by the expert. In this
phase, the trainees were also asked to assess the deliverables of their peers in the same group,
as part of the activity of selecting the best LD to form the basis of the group’s deliverable.
Table 1: Evaluation of the submitted LDs
Table 2: Paired samples t-test between the initial and final LD scores of the groups (N = 7; r = .825, Sig. = .022; t = −4.092, df = 6, two-tailed Sig. = .006)
Figure 4: Assessment of the Initial Deliverables (pre-test measurement) [Colour figure can be viewed at wileyonlinelibrary.com]
Each asterisk (*) shown in the figure characterizes the trainee whose deliverable was selected by
the underlying group. The average scores given by the other members of each group for each
individual LD deliverable are shown in Figure 4 as “Peers.” We can see that, in most cases, the
group selection for the best LD was not the LD that was rated best by the expert or by the peers.
Comparing the three sets of scores, we can notice that trainees showed greater confidence in their
own deliverables and gave themselves much higher scores than the expert did. They also gave
their own deliverables a higher score than they gave to their peers’ deliverables.
Table 3: The level of agreement among trainees and the expert (pre-test measurement)
Table 4: The level of agreement for each of the criteria C-1 to C-8 (pre-test measurement)
Furthermore, a more focused analysis was performed in order to determine the design think-
ing skills of the trainees as evaluators of LDs, and to examine the level of agreement between
the scores given by them and by the expert. Our method was based on Kendall’s coefficient of
concordance W, which measures the agreement among raters (Kendall, 1938). We computed
Kendall’s W using SPSS.
Table 3 summarizes the Kendall’s W values for each group. In order to simplify the findings,
the different values of Kendall’s W were categorized into levels of agreement (significance level
5%), and Table 3 presents the level of agreement in this simplified form. The table shows that
there was no agreement between the overall scores of the trainees in any of the groups and those
of the expert. However, a further analysis (see Table 4) revealed some agreement between the
trainees and the expert, but only for criteria 2, 3, 4, and 7; some of these findings were expected,
such as those for the expressiveness of the learning design elements, as well as for the elegant
presentation and esthetics in visualizing the learning scenarios.
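Kendall's W can be computed directly from the raters' rank matrix. The sketch below is a standard-library implementation for the no-ties case, with illustrative ratings; it is not the SPSS procedure used in the study, although it computes the same statistic.

```python
# Kendall's coefficient of concordance W for m raters ranking n items,
# stdlib only, no tie correction. Illustrative data, not the study's.

def ranks(scores):
    """Convert one rater's scores to ranks (1 = lowest), assuming no ties."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    r = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def kendalls_w(ratings):
    """ratings: one list of scores per rater, all over the same n items."""
    m, n = len(ratings), len(ratings[0])
    rank_matrix = [ranks(row) for row in ratings]
    totals = [sum(col) for col in zip(*rank_matrix)]  # rank sum per item
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))           # 0 = no agreement, 1 = perfect

# Two raters in perfect agreement over four designs yield W = 1.0.
print(kendalls_w([[10, 30, 20, 40], [1, 3, 2, 4]]))   # 1.0
```

W close to 1 indicates that the trainees and the expert ranked the deliverables similarly; W close to 0 indicates no concordance, which is how the pre-test and post-test levels of agreement were compared.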
In the final phase, the trainees were asked to complete a self-evaluation of their group’s delivera-
bles (post-test measurement) using the LD rubric prior to the expert’s assessment. At first glance,
it is clear that the gap between the scores of the trainees and the expert had decreased dramati-
cally (see Figure 5).
The findings showed an improvement in the level of agreement between the trainees and the
expert, which strengthens the argument that the proposed TPD promotes trainees’ design
thinking skills. Table 5 summarizes the results of the Kendall’s W values for each group. The
level of agreement, although not perfect, has improved significantly.
Figure 5: Peer Assessment of Group Deliverables (post-test measurement) [Colour figure can be viewed at wileyonlinelibrary.com]
Further analysis (Table 6) revealed
that the main agreement points were for criteria 5, 6, and 8, which were closely correlated with
the trainees’ design thinking skills and their ability to generate creative, innovative, and effective
LDs.
RQ3: Was the flow of the TPD process accepted well by the trainees?
Twenty-six (26) of the 28 trainees answered the post-questionnaire. Their comments were ex-
tremely positive. With regard to the question “How satisfied are you with the approach followed
in the on-line course?”, the vast majority of the participants (92%, or 24 of 26) claimed that they
were highly satisfied with the approach. For the question “How satisfied are you with the design
approach that uses diagrams rather than text?”, all the participants answered that they were
highly satisfied. Similarly, high satisfaction rates were given by the participants for the questions
regarding the degree of satisfaction with:
All trainees said that the use of CADMOS was simple, and that they were able to complete the
learning design easily and quickly, while 85% of them (22 of 26) stated that they were highly
satisfied with the guidance that was provided during the training process.
The trainees were asked to state the most important element of the TPD process that helped them
to improve their practical skills in learning design. Figure 6 shows that, according to the trainees,
the top four most helpful elements were:
Table 5 (fragment), trainees’ level of agreement with the expert per group (post-test measurement): Strong, Very Strong, Strong, Strong, Strong, Weak, Very Strong
Table 6: The level of agreement for the criteria (post-test measurement)
Figure 6: Trainees’ statements regarding the most important elements in the TPD process that helped them
improve their practical skills in learning design [Colour figure can be viewed at wileyonlinelibrary.com]
Figure 7: Trainees’ feedback on the CADMOS tool [Colour figure can be viewed at wileyonlinelibrary.com]
Factors such as “Peer exchange of learning designs in groups,” “Self-assessment of the learning
design,” and “Peer evaluation of learning designs” were not rated highly. We can justify these scores
from the answers to the open-ended questions about the overall TPD process. The aforementioned
activities were characterized as the most demanding tasks in the TPD process. By contrast, the activ-
ities concerning the use of the CADMOS tool were characterized as easy, undemanding activities.
RQ4: Does CADMOS add value to the TPD process and empower trainees during the process?
The trainees’ answers showed that, with the aid of the CADMOS learning design tool, they could
complete their design tasks easily. Overall, they were highly satisfied (see Figure 7). The tool
guided them when creating the LD via three main steps:
1. the creation of a conceptual model, in which they were required to specify the learning activ-
ities to meet the identified learning goals of the UoL, as well as the learning objectives, tools,
and services related to these activities,
2. the orchestration of the learning activities with rules and conditions in order to accomplish
the desired learning objectives, following the principles of a desired learning strategy, and
3. the preview of the UoL design as it would appear on Moodle.
The trainees mentioned that the tool provided flexibility during the learning design process. They
were able to revise the design based on feedback and comments from peers, as well as to modify
the conceptual model or the flow model when the Moodle preview was not as expected.
Concluding remarks
As Laurillard (2012) argued, teaching is now becoming a design science and an aspect of teach-
ers’ practice. In this paper, a three-step collaborative approach for training pre-service and nov-
ice teachers in learning design was presented. The proposed approach required trainees to elicit,
depict, reflect on, share their ideas, and co-create high-quality designs for UoLs for Moodle. The
technological cornerstone of the approach was the CADMOS graphic design tool that offers scaf-
folding during the complex process of (co)creating and sharing, as well as a preview of the de-
signs as Moodle-based UoLs.
The findings of the case study revealed that the proposed TPD seems promising for enriching
trainees’ learning design thinking skills. Further studies with more teachers are necessary in
order to provide a solid framework for teacher training in LD. Moreover, directions for future
research should include:
• Offering this TPD as two full successive cycles with the same participants. The main idea would
be to measure whether the design thinking and critical reasoning skills had been maintained.
Such an experiment would provide adequate experience to evaluate and optimize the TPD pro-
cess. Somewhat similar suggestions for the successive cycles have been made by Warburton
and Mor (2015), and by Papanikolaou et al. (2017).
• Enactment and validation of the LDs in authentic environments, such as the approach adopted
by colleagues who participated in the METIS EU-funded project (2012).
References
Agostinho, S., Bennett, S., Lockyer, L., Jones, J., & Harper, B. (2013). Learning designs as a stimulus and
support for teachers’ design practices. In H. Beetham, & R. Sharpe (Eds.), Rethinking pedagogy for a digital
age: Designing for 21st century learning (pp. 119–132). New York, NY: Routledge.
Agostinho, S., Bennett, S. J., Lockyer, L., Kosta, L., Jones, J., & Harper, B. (2009). An examination of learn-
ing design descriptions in a repository. In R. Atkinson, & C. McBeath (Eds.), Same places, different spaces.
Proceedings of the 26th Annual Ascilite International Conference (pp. 11–19). Auckland, New Zealand:
University of Auckland, Auckland University of Technology, and Australasian Society for Computers in
Learning in Tertiary Education.
Agostinho, S., Oliver, R., Harper, B., Hedberg, J., & Wills, S. (2002). A tool to evaluate the potential for
an ICT-based learning design to foster “high-quality learning”. In A. Williamson, A. Young, C. Gunn,
& S. Clear (Eds.), Winds of change in the sea of learning. Proceedings of the 19th Annual Conference of
the Australasian Society for Computers in Learning in Tertiary Education, 8–11 December 2002 (pp.
29–38). Auckland, New Zealand: UNITEC Institute of Technology.
Asensio-Pérez, J. I., Dimitriadis, Y., Hernández-Leo, D., & Pozzi, F. (2015). Teacher Continuous Professional
Development and full lifecycle Learning Design: First reflections. Proceedings of the workshop “Design for
Learning in Practice”, EC-TEL, Toledo, September 18, 2015. Heerlen, The Netherlands.
Benade, L. (2015). Teachers’ critical reflective practice in the context of twenty-first century learning. Open
Review of Educational Research, 2(1), 42–54.
Bennett, S., Agostinho, S., & Lockyer, L. (2015). Technology tools to support learning design: Implications derived from an investigation of university teachers’ design practices. Computers & Education, 81, 211–220.
Bennett, S., Agostinho, S., & Lockyer, L. (2005). Reusable learning designs in university education. In
T. C. Montgomerie, & J. R. Parker (Eds.), Proceedings of the IASTED International Conference on Education
and Technology (pp. 102–106). Anaheim, CA: ACTA Press.
Boloudakis, M., Katsamani, M., Retalis, S., & Georgiakakis, P. (2012). Orchestrating learning activities
with Cadmos: From the design to the enactment. International Conference of the Learning Sciences
(ICLS), The Future of Learning, 2–6 July 2012, Sydney, Australia.
Bower, M., Craft, B., Laurillard, D., & Masterman, L. (2011). Using the Learning Designer to develop a conceptual framework for linking learning design tools and systems. In L. Cameron, & J. Dalziel (Eds.), Proceedings of the 6th International LAMS & Learning Design Conference 2011: Learning design for a changing world (pp. 61–71). 8–9 December 2011, Sydney: LAMS Foundation.
Brasher, A., & Mor, Y. (2013). Report 2 on meetings with user groups: Early feedback on candidate best
practices for teacher training on learning design (D3.1). METIS project deliverable.
Brill, J. M., & Hodges, C. B. (2011). Investigating peer review as an intentional learning strategy to foster
collaborative knowledge-building in students of instructional design. International Journal of Teaching
and Learning in Higher Education, 23(1), 114–118.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Cross, S., Galley, R., Brasher, A., & Weller, M. (2012). OULDI-JISC Project Evaluation Report: The impact
of new curriculum design tools and approaches on institutional process and design cultures. OULDI
Project (Open University). https://oro.open.ac.uk/34140/
Dillenbourg, P. (2013). Design for classroom orchestration. Computers & Education, 69, 485–492.
Donald, C., Blake, A., Girault, I., Datt, A., & Ramsay, E. (2009). Approaches to learning design: Past the
head and the hands to the HEART of the matter. Distance Education, 30(2), 179–199.
Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass/John Wiley & Sons.
Garreta-Domingo, M., Hernández-Leo, D., Mor, Y., & Sloep, P. (2015). Teachers’ perceptions about the
HANDSON MOOC: A learning design studio case. In G. Conole, T. Klobučar, C. Rensing, J. Konert, &
E. Lavoué (Eds.), Design for teaching and learning in a networked world. Lecture notes in computer science, Vol.
9307. Cham: Springer.
Hernández-Leo, D., Villasclaras-Fernández, E. D., Asensio-Pérez, J. I., Dimitriadis, Y., Jorrín-Abellán,
I. M., Ruiz-Requies, I., & Rubia-Avi, B. (2006). COLLAGE: A collaborative Learning Design editor based
on patterns. Educational Technology & Society, 9(1), 58–71.
Johnson, D. W., & Johnson, R. T. (2014). Using technology to revolutionize cooperative learning: An opin-
ion. Frontiers in Psychology, 5, 1156. https://doi.org/10.3389/fpsyg.2014.01156
Kalantzis, M., & Cope, B. (2010). The teacher as designer: Pedagogy in the new media age. E-Learning and
Digital Media, 7(3), 200–222.
Kali, Y., & McKenney, S. (2012). Teachers as designers of technology enhanced learning. In J. van Aalst,
K. Thompson, M. J. Jacobson, & P. Reimann (Eds.), The future of learning: Proceedings of the 10th international
conference of the learning sciences, Vol. 2 (pp. 582–583). Sydney: International Society of the Learning
Sciences.
Katsamani, M., Retalis, S., & Boloudakis, M. (2012). Designing a Moodle course with the CADMOS learning design tool. Educational Media International, 49(4), 317–331.
Katsamani, M., & Retalis, S. (2013). Orchestrating learning activities using the CADMOS learning de-
sign tool. Research in Learning Technology, 21. Retrieved May 14, 2018, from https://www.learntechlib.
org/p/133265/
Kendall, M. G. (1938). A new measure of rank correlation. Biometrika, 30(1), 81–93.
Kirschner, P. A. (2015). Do we need teachers as designers of technology enhanced learning? Instructional
Science, 43(2), 309–322.
Kleiman, P. (2009). Design for learning (2nd ed.). Lancaster: PALATINE.
Koper, R. (2005). An introduction to learning design. In R. Koper, & C. Tattersall (Eds.), Learning design: A handbook on modelling and delivering networked education and training. The Netherlands: Springer.
Koper, R., & Tattersall, C. (2005). Preface to learning design: A handbook on modelling and delivering
networked education and training. Journal of Interactive Media in Education, 1, 3–20.
Krippendorff, K. (2006). The semantic turn: A new foundation for design. Boca Raton, FL: CRC Press.
Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology.
London: Routledge.
McKenney, S., & Reeves, T. C. (2015). Educational design and construction: Processes and technologies. In B. Gros, Kinshuk, & M. Maina (Eds.), The architecture of ubiquitous learning: Designs for emerging pedagogies (pp. 131–151). Heidelberg, Germany: Springer Verlag.
Mor, Y., & Mogilevsky, O. (2013). The learning design studio: Collaborative design inquiry as teachers’ profes-
sional development. Research in Learning Technology, 21. https://doi.org/10.3402/rlt.v21i0.22054
Papanikolaou, K., Makri, K., & Roussos, P. (2017). Learning design as a vehicle for developing TPACK
in blended teacher training on technology enhanced learning. International Journal of Educational
Technology in Higher Education, 14, 34.
Persico, D., & Pozzi, F. (2013). The role of representations for the development of a participatory culture of
Learning Design among educators. Proceedings of the ATEE Winter Conference, Learning & Teaching
with Media & Technology (pp. 365–372). Genoa, Italy: ATEE AISBL.
Ronfeldt, M., Farmer, S. O., McQueen, K., & Grissom, J. (2015). Teacher collaboration in instructional teams
and student achievement. American Educational Research Journal, 52(3), 475–514.
Rossi, G., Pastor, O., Schwabe, D., & Olsina, L. (2008). Web engineering: Modelling and implementing web ap-
plications. London: Springer-Verlag.
Sampson, D. G., Zervas, P., & Sotiriou, S. (2011). From learning objects repositories to learning design
repositories: The COSMOS learning design repository. Proceedings of the IEEE 11th International
Conference on Advanced Learning Technologies (pp. 285–289).
Warburton, S., & Mor, Y. (2015). Double loop design: Configuring narratives, patterns and scenarios in
the design of technology enhanced learning. In Y. Mor, M. Maina, & B. Craft (Eds.), The art and science of
learning design. Rotterdam/Boston/Taipei: Sense Publishers.