
Journal of Marketing Education
http://jmd.sagepub.com/

Enhancing Learning Outcomes: The Effects of Instructional Technology, Learning Styles, Instructional Methods, and Student Behavior
Mark R. Young, Bruce R. Klemz and J. William Murphy Journal of Marketing Education 2003 25: 130 DOI: 10.1177/0273475303254004 The online version of this article can be found at: http://jmd.sagepub.com/content/25/2/130

Published by:
http://www.sagepublications.com

Additional services and information for Journal of Marketing Education can be found at: Email Alerts: http://jmd.sagepub.com/cgi/alerts Subscriptions: http://jmd.sagepub.com/subscriptions Reprints: http://www.sagepub.com/journalsReprints.nav Permissions: http://www.sagepub.com/journalsPermissions.nav Citations: http://jmd.sagepub.com/content/25/2/130.refs.html

>> Version of Record - Aug 1, 2003

Downloaded from jmd.sagepub.com at UNIV DE LOS ANDES on April 30, 2013


Enhancing Learning Outcomes: The Effects of Instructional Technology, Learning Styles, Instructional Methods, and Student Behavior
Mark R. Young, Bruce R. Klemz, and J. William Murphy

The delivery of marketing education seems to be rapidly shifting toward pedagogy rich in experiential learning and strongly supported with educational technology. This study integrates and extends previous research efforts and investigates the simultaneous effects of multiple technology and nontechnology influences on learning outcomes. Responses were obtained across a marketing curriculum with technology-accustomed students. The findings suggest that the use of preferred instructional methods will enhance each of the three different measures of learning outcomes, while encouraging supportive class behaviors can increase self-report performance and course grade. Regardless of the dependent outcome measure, only one of the five instructional technology variables proved significant, suggesting that, in contrast to previous studies that examined technology in isolation, technology's influence is secondary when analyzed relative to other learning factors. Implications are discussed with practical suggestions for the classroom and directions for further investigation.

Keywords: pedagogy; instructional technology; learning styles; student behavior; learning outcomes

Online media-rich e-books, Internet-enhanced cases, chat rooms, electronic bulletin boards, CD-ROMs, electronic libraries, laptop computers, and an ever-expanding array of instructional technologies promise to engage and motivate students, accelerate learning, and increase the economic worth of students. It all sounds enticing, but does it work? Certainly the practice of marketing companies and entire industries has been transformed in effectiveness and efficiency by the deployment of information technology; will the same be true in marketing education? The answer to both these questions rests on the scholarly investigation of the impact that various educational tools, pedagogies, and other learning-related factors have on learning outcomes.

Initial scholarly investigation has produced some informative guidance on factors that influence the selection of instructional technology resources (Strauss and Frost 1999), recommendations on technology tools to achieve specific student outcomes (Clarke, Flaherty, and Mottner 2001), types of student behaviors that affect performance (Brokaw and Merz 2000), preferred learning styles of marketing majors (Stewart and Felicetti 1992), and how pedagogical preference affects attitudes toward the major (Davis, Misra, and Van Auken 2000). Many of these current studies attempt to identify how specific types of instructional technology or pedagogical factors affect learning. However, the reality of most classroom environments is that a multitude of instructional factors produce a joint effect on learning, thereby limiting the usefulness of the reported effects of a specific instructional technology examined in isolation. Further limitations of previous research include single-item measures, lack of comparisons to nontechnology pedagogies, measuring only attitudes and not performance, sampling from a single technology-based course, and examining a narrow set of predictors of performance. The purpose of this study is to provide an exploratory next step in this evolving research by extending and integrating these previous research efforts on the investigation of the impact of instructional technologies, learning styles, instructional methods, and student behaviors on learning outcomes as presented in the conceptual framework in Figure 1. Specifically,
Mark R. Young is a professor of marketing, Bruce R. Klemz is an associate professor of marketing, and J. William Murphy is a professor of business education, all in the Department of Marketing at Winona State University, Winona, Minnesota. Journal of Marketing Education, Vol. 25 No. 2, August 2003 130-142. DOI: 10.1177/0273475303254004. © 2003 Sage Publications


FIGURE 1: Conceptual Framework of Factors Affecting Learning Outcomes
[Figure: four antecedents -- Instructional Technology, Learning Styles, Instructional Methods, and Student Behaviors -- lead to Learning Outcomes: Learning Performance, Pedagogical Affect, and Course Grade]

these impacts are examined across the marketing curriculum rather than a single class, multi-item measures of both affect and learning performance are employed, and both technology and nontechnology pedagogies are included, as well as student behavior and preferred learning styles.

DEFINITIONS OF LEARNING OUTCOMES

The use of multiple outcome variables in an educational setting is recommended to help ensure that the multiple goals and the multiple dimensions of outcomes in the classroom environment are represented (Marks 2000; Williams 1992). Many measures of learning outcomes have been used in educational research, including course grade (Brokaw and Merz 2000; Devadoss and Foltz 1996; Romer 1993), student perceptions of overall learning, ability to get a job and expected performance on the job (Clarke, Flaherty, and Mottner 2001), task performance and goal achievement (Deeter-Schmelz, Kennedy, and Ramsey 2002), overall course value perceptions (Marks 2000), and exam scores (Hamer 2000; Ritchie and Volkl 2000). However, from a theoretical standpoint, learning may be viewed as knowledge acquisition through cognitive processing of information acquired both from being part of society and from individual thought processes (Bandura 1986). In addition, performance can be defined as a multidimensional construct involving the behaviors or actions that are relevant to the goals of the course, with three primary determinants of relative variance: (1) declarative knowledge and procedures that are prerequisites for successful task performance, (2) procedural knowledge and skills, and (3) volitional choice or effort expended (McCloy, Campbell, and Cudeck 1994). Therefore, combining the two conceptual definitions of learning and performance provides an outcome variable called learning performance, which will be defined as students' self-assessment of their overall knowledge gained, their skills and abilities developed, and the effort they expended in a particular class relative to other classes. In addition, favorable attitudes or affect have been shown to result from using instructional methods that are congruent with preferred learning styles (Goodwin 1996; Davis, Misra, and Van Auken 2000) and have been correlated to other measures of course achievement (Dunn et al. 1990). Therefore, learning outcomes in our study were represented with the two self-report outcome variables, learning performance and pedagogical affect, in addition to the commonly used course grade outcome variable. Each of the two self-report variables has appeared in prior marketing education literature and has exhibited sound psychometric properties involving multi-item scales (Davis, Misra, and Van Auken 2000; Young 2001).

HYPOTHESES DEVELOPMENT: ANTECEDENTS TO LEARNING OUTCOMES
Learning Styles

The manner and process in which knowledge is acquired, skills developed, and abilities refined distinctly vary among individuals, producing a typology of learning styles. Kolb's (1984) experiential learning theory describes a four-stage sequential process for creating knowledge through the transformation of experience. A person's preference for which stage of the learning cycle he or she prefers and which stages he or she tends to avoid creates a four-category learning style typology (Convergers, Assimilators, Accommodators, and


Divergers). Petkus (2000) and Young (2002) provide recent overviews of the experiential learning cycle, along with examples of its application in marketing education. Kolb (1988) suggests that students with similar learning styles prefer academic disciplines and teachers with methods of teaching that are most congruent with their learning style. In addition, there is empirical evidence that learning styles are also highly related to work preference (Lashinger and Boss 1984), educational involvement, motivation and learning (Honey and Mumford 1992), and student performance (Brokaw and Merz 2000; Holley and Jenkins 1993; Okebukola 1986; Roach, Johnston, and Hair 1993). Hence, we propose the following hypotheses; however, given the exploratory nature of this study, we do not hypothesize about particular learning styles and particular preferred methods or technologies. Instead, post hoc analyses will be undertaken if support for the general hypotheses is provided.
Hypothesis 1a: Learning style will account for variation in preferred instructional technologies.
Hypothesis 1b: Learning style will account for variation in preferred instructional methods.
Hypothesis 1c: Learning style will account for variation in learning outcomes.

Instructional Technology

As technological capabilities expand, academics and businesses are rapidly integrating technology into their classrooms and operations to provide a competitive edge. The study of individual reactions to computer technology and Internet usage in business has been researched from a variety of theoretical perspectives, including rate of adoption (Hill, Smith, and Mann 1987), diffusion of innovations (Compeau and Meister 1997), theory of reasoned action (Webster and Martocchio 1992), and social cognitive theory (Compeau and Higgins 1995). Reactions to integrating technology into the classroom have been primarily anecdotal or at an aggregate level of performance. John Schacter (1999) provides a comprehensive review of research regarding the impact of technology on student learning. Evidence from Schacter's review indicates that both positive and negative outcomes can be realized when technology is integrated into the learning environment. More recently in the marketing education literature, a positive relationship was found between self-reported overall learning and 9 of 14 educational tools (instructor home page, Internet project, online homework assignments, online lecture outlines, online syllabus, online student roster page, online student grade page, Web project page, and technology lectures) (Clarke, Flaherty, and Mottner 2001). In addition, Stöttinger and Schlegelmilch (2002) reported that students perceive instructional technologies to be advantageous based on their perceptions of the course and career-related benefits of using the technology and the amount of exposure the students have had to the technology. The use of instructional technologies has the potential to more actively involve and motivate students, thereby enhancing student learning outcomes. Consequently, we hypothesize the following:

Hypothesis 2: When student-preferred instructional technologies are used, student learning outcomes will increase.

Instructional Methods

A preponderance of marketing education literature suggests a shift from passive, knowledge-transfer instructional methods to interactive, experiential learning (Frontczak 1998). Empirical evidence supports that business students prefer pedagogies that are active and concrete (Nulty and Barrett 1996), prefer learning with other students (Matthews 1994), and prefer instructional pedagogies that are stimulating and real-world oriented (Karns 1993). Numerous specific instructional methods have been investigated, such as the use of in-class exercises, cases, and lectures that produced a favorable global attitude toward the marketing major (Davis, Misra, and Van Auken 2000); combining writing and electronic media (McNeilly and Ranney 1998); group research projects (Bridges 1999); group projects and teamwork (McCorkle et al. 1999); and the effect of class activities on student learning (Hamer 2000). Evidence also suggests that favorable attitude toward teaching style leads to higher achievement (Johnson 1996) and that matching instructional methods with learning styles results in greater learning (Dunn et al. 1990). Therefore, we hypothesize the following:

Hypothesis 3: When student-preferred instructional methods are used, student learning outcomes will increase.

Student Behavior

Learning outcome, typically measured by course grade, has been directly related to supportive-type class behaviors such as class attendance (Devadoss and Foltz 1996; Romer 1993), in addition to the number of hours spent studying per week, lectures attended, reading the textbook, and taking optional exams (Brokaw and Merz 2000). On the other hand, competing time activities such as the number of hours worked, the hours spent socializing or in sports, and total credit hours taken during the term were found to be negatively related to learning outcomes (Brokaw and Merz 2000; Erekson 1992). Each of these studies found that classroom-related student behaviors can be empirically related to learning outcomes, suggesting the following hypotheses:

Hypothesis 4a: Student behaviors that are course supportive will be positively related to student learning outcomes.
Hypothesis 4b: Student behaviors that are competing time activities will be negatively related to student learning outcomes.


METHOD

The data were collected at the end of fall semester 2001 by administering an in-class survey to each section of Principles of Marketing, Market Analysis, Marketing Planning, and Marketing Management, the required core courses in the marketing curriculum at a midwestern 4-year public university. The university requires all students to lease laptop computers and provides complementary computer projection and communication technology for most classrooms. The primary use of instructional technology in the Principles of Marketing course is to assist the instructor in lecture presentations and communication of assignments and grades to the students. Students are not required to bring their laptops to class; however, many of the students do use their laptops to complete homework assignments outside of class. In contrast, the other three marketing classes require laptops in the classroom, and class activities typically are based on computer usage. Examples of computer applications include statistical analysis, Internet searches, presentation creation, and computer simulations. The sequence of courses is also designed to systematically expose students to a variety of instructional methods. Market Analysis is structured around group research projects, Marketing Planning uses Internet research to analyze cases, and Marketing Management is structured around decision making based on computer simulations; in addition, all classes require written communications and oral presentations. In summary, the curriculum does expose students to each of the instructional methods and instructional technologies being investigated in this study. A typical absenteeism rate on the day of the survey produced a response rate of approximately 78%, yielding an effective sample of 207.

The distribution of the completed sample across classes was Principles of Marketing (three sections), n = 122 (59%); Market Analysis (two sections), n = 39 (19%); Marketing Planning, n = 29 (14%); and Marketing Management, n = 17 (8%). Demographically, the sample can be described as traditional undergraduates, 42% female, 31% marketing majors, and 16% marketing minors; in addition, the Principles of Marketing students closely mirrored the College of Business's distribution of majors (accounting 19%, business administration 42%, marketing 18%, and other business 21%).

VARIABLES
Learning Outcomes: Dependent Variables

Learning performance. Learning performance was operationalized using six items (knowledge you gained, skills you developed, effort you expended, your ability to apply the material, your desire to learn more about this subject, your understanding of this subject) measured with 6-point scales verbally anchored with extremely high (a level rarely attained in other classes) to very low (much below that of other classes), which is a modification of a performance scale reported by Young (2001).

Pedagogical affect. Affect represents the positive thoughts or feelings toward the instructional methods deployed in a particular class. The statement "Overall, in this class, the methods of instruction were . . ." was responded to with four semantic differential-type items measured on a 7-point scale. The four scales (effective/ineffective, useful/useless, satisfactory/unsatisfactory, good/bad) for evaluating this overall affect were created from a scale developed by Mitchell and Olsen (1981) and then adopted by Davis, Misra, and Van Auken (2000) to measure the overall affect of marketing majors toward instructional effectiveness and program quality.

Course grade. The instructor-assigned grade in the course is also used as a measure of learning outcome. Following the definition and scaling of course grades used by Brokaw and Merz (2000), grades had a range of 0 (an F) to 4 (an A) and are treated as a metric variable.

Independent Variables

Learning styles. The Kolb Learning Style Inventory (Kolb 1984) measures students' learning style preference by having the students rank four statements for each of the 12 items making up the inventory. Two primary dimensions are created from the four stages: AC−CE is created by subtracting the scores for the Concrete Experience (CE) scale from the Abstract Conceptualization (AC) scores, while the AE−RO dimension represents the difference between the Active Experimentation (AE) scores and the Reflective Observation (RO) scores. The four quadrants created by the two dimensions represent the four types of learning styles: Convergers (high AC−CE and high AE−RO scores), Assimilators (high AC−CE and low AE−RO scores), Accommodators (low AC−CE and high AE−RO scores), and Divergers (low AC−CE and low AE−RO scores).

Instructional technology. Instructional technology covers a broad spectrum of options ranging from videotapes to sophisticated computer-based instructional programs. Five instructional technologies (e-mail, Internet access, PowerPoint presentation, Blackboard course management software, and laptop computers) listed in Grasha and Yangarber-Hicks (2000) and deployed across the courses sampled in this study were rated on a 7-point effective/ineffective semantic differential scale. The five instructional technologies were evaluated based on the question "In general, for any class, which technologies do you find most effective in helping you learn?"

Instructional methods. Nine commonly used teaching methods (Davis, Misra, and Van Auken 2000) were rated on a

FIGURE 2: Distribution of Kolb's Learning Styles (N = 207)
[Figure: scatterplot of students on the Abstract Conceptualization − Concrete Experience scale (vertical axis, −30 to 30) against the Active Experimentation − Reflective Observation scale (horizontal axis, −30 to 30), with quadrants labeled Assimilating, Converging, Diverging, and Accommodating]

7-point effective/ineffective semantic differential scale as to the statement "In general, for any class, which methods of instruction do you find most effective in helping you learn?" The nine instructional methods are instructor lectures, cases, computer simulations, group projects, individual projects, exams, class discussions, in-class exercises, and written assignments.

Student behavior. Two supportive student behaviors (average class attendance, hours studied for this class) and two competing time behaviors (hours involved with social or sports organizations, hours worked) were measured with fill-in-the-blank responses as used by Brokaw and Merz (2000).

RESULTS AND DISCUSSION

Before the overall model displayed in Figure 1 was estimated, we investigated the potential preference for instructional methods and instructional technologies based on the four underlying learning styles. Each student was classified into one of the four learning styles based on Kolb's Learning Style Inventory method, with a graphic overview of the sample presented in Figure 2. Each of the four learning styles is adequately represented, ranging from 19% to 36% of the sample. One-way analysis of variance was used to test Hypotheses 1a and 1b, and the results are displayed in Table 1. Differences between learning styles did not significantly (.05 level of significance) account for variation in preferences for instructional technology. Therefore, we cannot accept Hypothesis 1a and conclude that preference for different instructional technologies is not dependent on a student's preferred learning style. The lack of a significant relationship between learning style and instructional technologies may suggest that students view the technology simply as a tool involved in implementing the instructional method. In addition, a particular instructional technology can be employed with great variation within different instructional methods. For example, PowerPoint may be used to assist an instructor with the traditional lecture, or it may be used by student groups to present findings from their experiential learning activities. Whereas the literature seems to be lacking in the investigation of learning style and instructional technology preference, this study suggests that a student's preference for instructional technology is not inherently based on fundamental learning style.

Three instructional methods (lectures, exams, and written assignments) had significant differences based on learning styles, supporting Hypothesis 1b. In particular, Accommodators (who prefer concrete experiences and active experimentation) rated lectures and exams lower than students with other preferred learning styles. Interestingly, Brokaw and Merz (2000) suggest that Accommodators tend to prefer marketing careers and, with the trend in marketing education toward experiential learning, these findings may be interpreted as support for the direction marketing education has taken. In addition, writing assignments were evaluated highest by Assimilators (high abstract conceptualization and reflective observation). Writing assignments can encourage students to explore and incorporate abstract concepts into their learning and are typically the basis for reflection-type activities. These findings are congruent with the literature and, given the results (three of the nine instructional methods), we find support for Hypothesis 1b that different learning styles can account for different preferences for instructional methods.
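The quadrant logic used to classify each student (the sign pattern of the AC − CE and AE − RO differences) can be sketched in a few lines. This is a minimal illustration assuming a simple zero cutoff on each dimension; the published inventory uses normed cut points:

```python
def kolb_style(ac, ce, ae, ro):
    """Classify a respondent into one of Kolb's four learning styles.

    ac, ce, ae, ro are the four Learning Style Inventory stage scores:
    Abstract Conceptualization, Concrete Experience, Active
    Experimentation, and Reflective Observation.
    """
    ac_ce = ac - ce  # abstract-vs.-concrete dimension
    ae_ro = ae - ro  # active-vs.-reflective dimension
    if ac_ce >= 0 and ae_ro >= 0:
        return "Converger"      # high AC-CE, high AE-RO
    if ac_ce >= 0:
        return "Assimilator"    # high AC-CE, low AE-RO
    if ae_ro >= 0:
        return "Accommodator"   # low AC-CE, high AE-RO
    return "Diverger"           # low AC-CE, low AE-RO
```

For example, `kolb_style(30, 10, 25, 5)` scores high on both difference dimensions and therefore falls in the Converger quadrant.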


TABLE 1
LEARNING STYLE DIFFERENCES: ONE-WAY ANOVA RESULTS (N = 207)

                                 Learning Styles(a)
Variable                   Accommodator  Diverger  Assimilator  Converger  F-Value
Instructional technology
  E-mail                       5.08        4.51       4.64        4.65       1.10
  Internet access              5.97        5.83       5.88        5.88       0.81
  PowerPoint presentation      5.62        5.71       5.57        5.31       0.78
  Blackboard software          4.51        4.54       4.31        3.69       2.17
  Laptop computer              5.62        5.27       4.96        4.92       1.59
Instructional methods
  Instructor lectures          4.26        5.41       5.21        4.98       6.50*
  Cases                        4.82        5.10       4.77        5.12       0.89
  Computer simulations         4.72        5.05       4.64        4.50       0.98
  Group projects               5.21        5.29       4.73        4.96       1.47
  Individual projects          4.92        4.98       5.00        5.10       0.17
  Exams                        4.10        4.59       5.04        4.73       3.31*
  Class discussions            5.72        6.02       5.59        5.67       1.89
  In-class exercises           5.72        5.98       5.61        5.79       0.87
  Written assignments          4.41        4.68       5.01        4.48       2.47*
Learning outcomes
  Learning performance         4.52        4.26       4.17        4.29       2.39
  Pedagogical affect           5.54        5.75       5.47        5.58       0.79
  Course grade                 3.35        2.93       3.40        3.17       2.34

a. Means. Degrees of freedom: between groups 3, within groups 203, except for course grade (n = 93).
*p < .05.
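The F-values in Table 1 come from the standard one-way ANOVA decomposition of variance into between-groups and within-groups components (here with 3 and 203 degrees of freedom). A minimal sketch of that computation, using toy ratings rather than the study's data:

```python
def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA.

    groups is a list of lists, one list of ratings per learning style.
    """
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-groups sum of squares: group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: observations around their group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within
```

For example, `one_way_anova_f([[1, 2, 3], [2, 3, 4], [5, 6, 7]])` returns an F of 13.0 on 2 and 6 degrees of freedom.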

Knowing a student's learning style may assist the instructor in selecting appropriate teaching methods (Brokaw and Merz 2000) or in recognizing that multiple instructional methods must be incorporated into classes with wide distributions of learning styles. This may be particularly relevant in courses that involve students with different cultural backgrounds, in that Jaju, Kwak, and Zinkhan (2002) reported significant differences in learning styles between cultures.

The conceptual model displayed in Figure 1 was operationalized with three metric criterion variables (learning performance, pedagogical affect, and course grade) that were analyzed with a set of predictor variables composed of three metric covariates (instructional technology, instructional methods, and student behavior) and one nonmetric factor (learning styles) having four levels. Regression analysis was used to reveal which instructional technologies, instructional methods, student behaviors, and learning styles covary with learning outcomes and which explanatory variables are most determinant of learning outcomes. Given that the two self-report criterion variables are correlated (r = .45, p = .000), performing separate regression analyses would not incorporate the information provided by the interrelationship among these criterion variables and would defeat the purpose of having multiple criterion measures. Therefore, multivariate multiple regression analysis was performed using the general linear model multivariate procedure in the Statistical Package for the Social Sciences software.

First, the two self-report outcome variables, pedagogical affect and learning performance, were assessed for internal consistency and reliability. The results, presented in Table 2, indicate Cronbach's alphas of .89 and .80, suggesting robust scales as compared with Nunnally's (1978) recommendation of at least a .70 level. The factor loadings present evidence of the dimensionality of the two constructs. Two factors were extracted using principal components analysis and varimax rotation. The total variance explained was 62%, and each item did load on the expected dimension, with all but two loadings above Fornell's (1982) recommendation of .70 or higher for retaining items, since such items explain almost 50% of the variance in a particular construct. In summary, the reliability of the two outcome scales seems satisfactory. These two dependent variables were also examined for departures from multivariate normality by performing the Kolmogorov-Smirnov test of normality (Lilliefors significance correction) and by examining normal Q-Q plots. The results (learning performance, p = .001; pedagogical affect, p = .000) of these tests suggest no departures from normality. In addition, Box's test of equality of covariance matrices of the dependent variables across groups (p = .258) and Levene's test of equality of error variances across groups for each of the dependent variables (performance, p = .242; affect, p = .445) could not be rejected; therefore, it seems reasonable to proceed with the multivariate analysis. The following analyses were performed both with the two outcome vari-


TABLE 2
PEDAGOGICAL AFFECT AND LEARNING PERFORMANCE SCALE DESCRIPTIONS (N = 207)

                                                                      Factor Loadings
Item                                             Mean   SD    Pedagogical    Learning
                                                              Affect(a)      Performance(a)
Overall, in this class the methods of instruction were . . .
  effective/ineffective                          5.44   1.08     .86
  useful/useless                                 5.49   0.99     .77
  satisfactory/unsatisfactory                    5.61   1.06     .88
  good/bad                                       5.72   1.05     .87
Evaluate this class on . . .
  the knowledge you gained                       4.39   0.79                  .71
  the skills you developed                       4.11   0.84                  .71
  the effort you expended                        4.06   1.20                  .70
  your ability to apply the material             4.35   0.91                  .70
  your desire to learn more about this subject   4.30   1.10                  .72
  your understanding of this subject             4.52   0.83                  .61
% of variance explained (eigenvalue)                          45.31 (4.53)   16.92 (1.69)
Cronbach's alpha                                                 .89            .80

a. Principal components analysis, varimax rotation with Kaiser normalization.

ables' factor scores and with the variables represented as an average of the items for each scale. Virtually identical results were obtained; therefore, the simpler and more intuitive averaging of items to represent the outcome variables is presented.

Next, an examination of the predictor variables' correlation matrix revealed several moderately high correlations (e-mail and Web access, r = .62; lectures and exams, r = .40; simulations and cases, r = .48; individual projects and writing assignments, r = .41; class exercises and class discussions, r = .44), suggesting the potential of multicollinearity and requiring caution in the interpretation of the regression results. It seems intuitive that the correlated pairs of technologies and pedagogies are most likely available and used in combination in the classroom and, therefore, would be expected to be correlated. These variables could be reduced through factor analysis to solve the statistical problem of multicollinearity; however, the explicit relationships among the variables would be lost. The robustness of regression analysis to multicollinearity for variables with correlations below .50 is typically accepted (Tull and Hawkins 1990). However, a resulting consequence of simultaneously examining many variables that exhibit multicollinearity is that the standard errors of the regression coefficients will tend to be large, thereby artificially lowering their t-values (Dillon and Goldstein 1984). Whereas specific significance levels are reported in the tables, we provide the following interpretations based on a .10 level of significance to compensate for the inflated standard errors. Given the exploratory nature of this study, examining these multiple factors simultaneously may produce results that provide valuable insights despite the more lenient interpretation of statistical assumptions and significance.
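The screening step described above, scanning the predictor correlation matrix for pairs at or above the .50 robustness threshold, can be sketched as follows. The predictor names and ratings here are illustrative assumptions, not the study's data:

```python
from itertools import combinations
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def collinearity_flags(predictors, threshold=0.50):
    """Return (name, name, r) for predictor pairs with |r| >= threshold."""
    return [(a, b, round(pearson_r(predictors[a], predictors[b]), 2))
            for a, b in combinations(predictors, 2)
            if abs(pearson_r(predictors[a], predictors[b])) >= threshold]

# Toy predictor ratings: EMAIL and WEBACCESS move in lockstep.
preds = {"EMAIL": [1, 2, 3, 4], "WEBACCESS": [2, 4, 6, 8], "LECTURE": [4, 1, 3, 2]}
flags = collinearity_flags(preds)  # only the EMAIL/WEBACCESS pair is flagged
```

A pair flagged here would warrant the same caution in interpreting its coefficients that the authors apply to the e-mail/Web access pair.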

The results of the multivariate regression analysis are presented in Tables 3 and 4, and according to the results, learning performance is driven (R² = .18) by project-oriented instructional methods (both group and individual project coefficients are significant, supporting Hypothesis 3), the use of Blackboard course management software (supporting Hypothesis 2), and the amount of time students spend studying (positive relationship, supporting Hypothesis 4a) and working (negative relationship, supporting Hypothesis 4b). Recall that performance was defined to have three primary dimensions: volitional choice or effort (e.g., hours spent studying and working), ability to apply knowledge (e.g., projects for pedagogy), and knowledge gained (e.g., feedback on tests and assignments using Blackboard), which seems consistent with these results. Selecting student-preferred, project-based pedagogies enhances involvement and motivation for learning (Stöttinger and Schlegelmilch 2002), which supports learning performance. The significant coefficients for group and individual project-type instructional methods suggest that the trend in marketing education toward the application of marketing knowledge (Karns 1993) and experiential learning (Frontczak 1998) is appropriate. The Blackboard course management software offers the ability to provide online syllabi, readings, outlines, assignments, grade information, and student rosters, all of which have been shown to be related to students' perceptions of overall learning (Clarke, Flaherty, and Mottner 2001), and it can be very effective in providing timely feedback on performance, which has also been shown to enhance learning (Bransford, Brown, and Cocking 1999). Thus, course management software seems to be effective in enhancing self-report learning performance by communicating direction, expectations, and status of performance.

TABLE 3
MULTIVARIATE MULTIPLE REGRESSION ANALYSIS PARAMETER ESTIMATES (N = 206)

[Table columns not recoverable from the extraction: parameter estimates (B, SE, t, significance) for two dependent variables, learning performance (R2 = .18) and pedagogical affect (R2 = .28), across the predictors LECTURE, CASES, SIMULATION, GROUPPROJECT, INDIVPROJECT, EXAMS, CLASSDISCUSSION, CLASSEXERCISE, WRITINGASSIG, EMAIL, WEBACCESS, POWERPOINT, BLACKBOARD, LAPTOP, WORKHOURS, STUDYHOURS, PARTYHOURS, CLASSATTEND, and the learning styles ACCOMMODATING, DIVERGING, ASSIMILATING, CONVERGING.]

NOTE: Estimation method: general linear model multivariate procedure (Statistical Package for the Social Sciences). a. This parameter is set to zero because it is redundant. *Significant at .10.

Consistent with the literature, these results also indicate the importance of student behavioral determinants of performance even when preferred instructional methods and technology are provided. It is clear that students must have the ability (time available for studying) and the willingness (time

spent studying) to raise their learning performance. The ability to devote time to studying may be influenced through advising on what constitutes a reasonable course load, work schedule, and extracurricular commitments, in addition to clearly specifying expectations on the time commitment

TABLE 4
COURSE GRADE REGRESSION ANALYSIS PARAMETER ESTIMATES (N = 93)

[Table columns not recoverable from the extraction: parameter estimates (B, SE, t, significance) for course grade (R2 = .14) across the same predictors listed for Table 3.]

NOTE: Estimation method: general linear model multivariate procedure (Statistical Package for the Social Sciences). a. This parameter is set to zero because it is redundant. *Significant at .10.

required for a particular course. The time spent studying was only correlated with instructional methods that involved computer simulations and exercises. Apparently, certain instructional methods either require more study time or provide more motivation to study; however, whether because of or in spite of instructional pedagogies, student behaviors must be accounted for in explaining learning performance.

Pedagogical affect is primarily explained (R2 = .28) by preferences for different types of instructional methods (lecture, cases, group projects, and class exercises all having significant coefficients). It should be noted that the nonsignificant instructional methodology variables had high pairwise correlations with the significant variables, and therefore their coefficients may be the result of multicollinearity, suggesting that the whole mix of instructional methods drives pedagogical affect. The only instructional technology variable that provided a significant coefficient was PowerPoint, which was used to support instructor lecturing and aid in student presentations. It must be noted that these results are based on responses from students who have been acclimated to various instructional technologies throughout their college education, which reduces potential Hawthorne effects of one course or one-time exposure to new technology or pedagogy. These results seem in contrast to the findings of Stöttinger

and Schlegelmilch (2002) that positive attitudes toward instructional technology are strongly correlated with technology exposure. This suggests that the relationship may not be simple and linear but instead an inverted U-shape, meaning that at some point, with very high exposure, students' perceptions of the benefits of technology diminish. A corollary explanation may be that instructional methods are the most important factor and that instructional technology is simply a tool to carry out instructional methods, thereby reducing its influence when examined relative to instructional methods. In summary, pedagogical affect seems to be primarily driven by instructional methods (supporting Hypothesis 3) with secondary effects of technology (supporting Hypothesis 2) and is not significantly influenced by other nonpedagogical aspects (student behaviors or learning styles) of the class.

Parameter estimates obtained from the multivariate regression analysis for course grade are presented in Table 4. Student respondents were given the option of including their technical identification number for reasons of anonymity, which resulted in a subsample of 93 students whose grades could be matched to the rest of the variables. The distribution of grades for this subsample was 1% Ds, 7% Cs, 58% Bs, and 34% As. Correlations between grades and


learning performance and pedagogical affect were insignificant. This lack of correlation may be due to the sampled courses being upper-level and, as can be seen in the grade distribution, the variable is highly skewed toward As and Bs, whereas the other two dependent variables have normal distributions. In addition, the upper-level courses, where grades were reported, were team taught by two or three faculty members, which probably further reduced students' ability to accurately estimate course grades. If one were to assume self-report learning performance represents a student's judgment of his or her expected grade, we could expect poorer students to overestimate their performance and better students to underestimate their performance (Kennedy, Lawton, and Plumlee 2002), which, when range restricted to As and Bs, would produce insignificant or nonmeaningful correlations. To remain consistent in reporting the results, a general linear model that incorporated the three dependent variables was formulated. The analysis produced the same parameter estimates for course grade as a univariate regression, given the lack of significant correlations among course grade, learning performance, and pedagogical affect. Similar to the learning performance results, the instructional methods that were significant are group and individual projects. In addition, the number of hours spent studying was significant, while none of the instructional technologies or learning styles produced significant coefficients. The results make intuitive sense given this sample, the heavy use of project-based pedagogies, and the substantial out-of-class effort required to complete the projects. Interestingly, course grade seems independent of students' preferences for different instructional technologies when examined relative to other antecedents of learning outcomes. Overall, these results add further support for Hypotheses 3 (instructional methods) and 4a (supportive behaviors).
Noteworthy, the learning styles factor was not significant in explaining learning performance, pedagogical affect, or course grade. Even when analyzing (ANOVA) learning performance, pedagogical affect, and course grade without the covariates and using only the learning styles factor, no significant variation was accounted for in the outcomes. On the basis of both the ANOVA and multivariate regression results, we do not find support for Hypothesis 1c that learning styles will account for variation in learning outcomes; given the previous conclusions for Hypotheses 1a and 1b, learning styles seem to lack the ability to predict learning outcomes. A possible explanation, for this particular sample, may be that much effort has been made by the faculty to incorporate aspects from each of the four stages of the experiential learning cycle, specifically trying to provide opportunities for each learning style. With ample opportunity for students to learn in their own preferred style and by exposing all students to all four stages of the learning cycle, the learning styles variable may simply wash out in this particular sample.
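The ANOVA check described above can be sketched as follows. This is an illustrative example on synthetic data (not the study's data): a one-way F statistic for a learning outcome grouped by the four Kolb learning styles, computed directly from between- and within-group sums of squares; group sizes and score scale are hypothetical.

```python
# Illustrative one-way ANOVA F statistic for an outcome grouped by
# learning style, computed from between/within sums of squares.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic outcome scores for four style groups with equal true means,
# mimicking the reported "no significant variation" result.
groups = [rng.normal(loc=3.5, scale=0.6, size=50) for _ in range(4)]

def f_oneway(groups):
    """F = (SS_between / df_between) / (SS_within / df_within)."""
    all_obs = np.concatenate(groups)
    grand_mean = all_obs.mean()
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

F = f_oneway(groups)
# Under equal group means, F has no systematic tendency to exceed the
# F(3, 196) critical value (about 2.65 at alpha = .05), so a sample
# like this one typically leads to a nonsignificant result.
```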

CONCLUSIONS AND IMPLICATIONS

As a whole, the combined analyses using three different measures of learning outcomes imply that the use of preferred instructional methods will enhance each of the different measures of learning outcomes, while encouraging supportive class behaviors and limiting competing time activities can enhance self-report performance and course grades. Regardless of the performance outcome measure, only one of the five instructional technology variables proved significant, suggesting that in contrast to previous studies that examined technology in isolation, when analyzed relative to other learning factors, technology's influence is secondary. Also in contrast to the literature was the lack of influence of learning style on learning outcomes; however, once again, the issue of analyzing a single factor relative to multiple influences may account for these findings. Preliminary insight from this exploratory synthesis and extension of previous research suggests that caution should be used in interpreting findings based on technology tools or other antecedents of learning examined in isolation. Although more evidence is needed to draw a definitive conclusion, we believe these results indicate that learning is a two-way street where the primary contribution from the instructor is appropriate instructional methods and the primary contribution from the student is study time. Note that study time was significant in both performance-type outcomes. The role of technology is probably that of a moderator that can assist or distract from the instructional methods and the time students spend studying primary course concepts. Understanding learning styles can help instructors design appropriate instructional methods, while technology proficiency can leverage students' study time. From a marketing educator's perspective, the results lead to the following teaching implications.
First, we recommend that project-based instructional methods be used to enhance involvement and motivation leading to enhanced performance. In particular, our results suggest that group projects were significant in enhancing affect, self-report performance, and course grade. Most experiential learning techniques incorporate some form of projects, suggesting that the reported trend toward experiential learning in marketing education seems appropriate. Group project-based learning also encourages collaborative learning and can change the role of the instructor from a formal authority role to more of an informal coach, which facilitates student-faculty interaction. Second, the importance of student behavior, particularly study time, should be recognized, and efforts to create proper expectations of time on task and study habits should be a primary consideration in course design. In this study, we found that the use of computer simulations increased the number of hours students reported studying for a class. It may be that the simulations provided motivation for increased studying or simply required more out-of-class work.


Third, we recommend using instructional technology such as Blackboard software that will assist in communicating high expectations and can provide prompt feedback and monitoring of performance. The adage that "what gets measured gets attention" also seems to hold true in education. We believe that the lack of significant coefficients for learning styles in this study is a research artifact given our faculty's specific efforts to systematically incorporate a range of teaching methods, within courses and across the curriculum, that addresses the variety of learning styles. Therefore, our final recommendation based on the education literature is to design courses with a variety of pedagogical approaches to teach to the diversity of learning styles and to expose students to all four stages of Kolb's learning cycle for major concepts. The above recommendations provide a student-centered learning environment that incorporates Chickering and Gamson's (1987) seven principles for good practice in undergraduate education.

Although this article provides a first empirical attempt to incorporate multiple influences on learning outcomes, further research is clearly needed. Overcoming the potential limitations of this study provides guidance for further research. First, this study was based on a sample from one university that has relatively high exposure to instructional technology. Student samples with limited technology exposure may realize Hawthorne effects and skew the results either artificially high because of their technology focus or artificially low because of their perceived problems of technology adoption. Replicating this study in different educational environments with different levels of technology and instructional methods is needed to assess the generalizability of our findings. Student exposure to or familiarity with the technology should be explicitly accounted for in future studies.
Second, direct extensions of this study would be the inclusion of additional antecedent variables and the refinement of the measurement of existing variables. In particular, refinement of how a particular instructional technology is deployed and its interaction with the instructional methods should be developed. The choice of the dependent variable as an affect construct, a self-report performance construct, and the instructor-assigned course grade does provide different insight into the effects of various technologies, instructional methods, and behaviors. We expected to find learning performance to be significantly related to pedagogical affect; however, the insignificant relationship between instructor-assigned course grade and both self-report learning performance and pedagogical affect was unexpected. Whereas this study was not intended to address this particular issue, it does point out the sensitivity of the results to the selection of the dependent variable and the necessity for further investigation into the appropriateness of particular dependent variables for specific research questions. Third, and perhaps the most critical step to guide this stream of research, is the adoption or formulation of a broad-

based learning theory to direct systematic investigation and provide assistance in the interpretation of findings. Psychologists have suggested a variety of theories to understand and explain how people learn. Basic theoretical perspectives of learning include behaviorist theories, developmental theories, and cognitive theories. In particular, social cognitive theory provides a conceptual framework for clarifying the psychological mechanisms through which social-structural factors are linked to performance (Bandura 1986). Behavior, personal factors, and cognitions, as well as environmental events, interact bidirectionally so that people are both products and producers of their environment. Social cognitive theory provides not only explanatory and predictive power but also explicit guidelines about how to equip people with competencies and the sense of efficacy that will enable them to enhance their accomplishments (Wood and Bandura 1989).

The quest for enhancing our ability to teach effectively and increase student learning is gaining importance as an identifiable research stream and is evolving in its academic scholarship. Clearly defined outcomes and examining multiple influences simultaneously seem to be critical in advancing our understanding of technology and other educational pedagogies. As the conceptual rationale for technology and teaching pedagogies continues to develop and the empirical evidence grows, our understanding of their effects on learning and teaching will help prepare both faculty and students for their respective careers.

NOTES
1. The analysis was also conducted using only the Principles of Marketing students to determine if the findings and conclusions would differ given varying levels of exposure to the explanatory variables. Consistent with the total sample results, learning styles were insignificant for both performance and affect outcomes (not supporting Hypothesis 1), instructional technologies were insignificant for both outcomes (not supporting Hypothesis 2), instructional methods (group projects) were significant for both outcomes (supporting Hypothesis 3), and student behaviors (work hours negatively related to performance and study hours positively related to pedagogical affect) supported Hypothesis 4. Our conclusion that appropriate instructional methods and student behaviors are the primary determinants, with technology as a probable moderator, does not change.
2. The reported summary of overall model fit is the adjusted coefficient of determination (R2). This fit statistic not only represents the proportion of variability in the response variable that is accounted for by the regression model but also takes into account the number of predictors in the model. Whereas the multiple coefficient of determination (R2) can be artificially increased by adding explanatory variables, the adjusted R2 will only increase if the t-value of the newly added variable is greater than one (Dillon and Goldstein 1984). The magnitude of the reported R2s should be expected to be relatively low given that these models incorporate more than 20 variables with the majority of t-values less than 1. The intent of this analysis was to simultaneously examine the predictors and not to build a parsimonious model with a high R2.
In context, Davis, Misra, and Van Auken (2000) report R2s ranging from .24 to .38 after stepwise variable reduction in predicting pedagogical preference; Deeter-Schmelz, Kennedy, and Ramsey (2002) estimated an R2 = .07 for teamwork's prediction of performance; Adrian and Palmer (1999) used three variables to explain grades with an R2 = .59; and Nonis and Swift (1998) report R2s ranging from .06 to .35 in examining classroom behavior. Thus, the magnitude of our reported R2s is consistent with the literature.
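For reference, the adjustment described in note 2 is the standard form:

```latex
\bar{R}^{2} = 1 - \left(1 - R^{2}\right)\frac{n-1}{n-k-1}
```

where n is the number of observations and k the number of predictors. Each added predictor shrinks n - k - 1, so the adjusted value rises only when the gain in R2 outweighs the lost degree of freedom, which is equivalent to the added variable's t-value exceeding 1 in absolute value.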


REFERENCES
Adrian, C. Mitchell, and G. Dean Palmer. 1999. Toward enhancing the quality and quantity of marketing majors. Journal of Marketing Education 21 (1): 25-33.
Bandura, Albert. 1986. Social foundations of thought and action: A social-cognitive view. Englewood Cliffs, NJ: Prentice Hall.
Bransford, John D., Ann L. Brown, and Rodney R. Cocking, eds. 1999. How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.
Bridges, Eileen. 1999. Experiential learning and customer needs in the undergraduate marketing research course. Journal of Marketing Education 21 (1): 51-59.
Brokaw, Alan J., and Thomas E. Merz. 2000. The effects of student behavior and preferred learning style on performance. Journal of Business Education 1 (spring): 44-53.
Chickering, Arthur W., and Zelda F. Gamson. 1987. Seven principles for good practice in undergraduate education. Wingspread Journal 9 (2): 1-16.
Clarke, Irvine III, Theresa B. Flaherty, and Sandra Mottner. 2001. Student perceptions of educational technology tools. Journal of Marketing Education 23 (3): 169-77.
Compeau, Deborah, and Christopher Higgins. 1995. Computer self-efficacy: Development of a measure and initial test. MIS Quarterly 19 (2): 189-211.
Compeau, Deborah, and D. B. Meister. 1997. Measurement of perceived characteristics of innovating: A reconsideration based on three empirical studies. Paper presented at the annual meeting of the Diffusion Interest Group on Information Technology, 15 December, Atlanta, GA.
Davis, Richard, Shekhar Misra, and Stuart Van Auken. 2000. Relating pedagogical preference of marketing seniors and alumni to attitude toward the major. Journal of Marketing Education 22 (2): 147-54.
Deeter-Schmelz, Dawn R., Karen Norman Kennedy, and Rosemary P. Ramsey. 2002. Enriching our understanding of student team effectiveness. Journal of Marketing Education 24 (2): 114-24.
Devadoss, S., and J. Foltz. 1996. Evaluating factors influencing student class attendance and performance.
American Journal of Agricultural Economics 78 (August): 499-507.
Dillon, William R., and Matthew Goldstein. 1984. Multivariate analysis: Methods and applications. New York: John Wiley.
Dunn, Rita, Mary C. Giannitti, John B. Murray, and Ino Rossi. 1990. Grouping students for instruction: Effects of learning style on achievement and attitudes. Journal of Social Psychology 130:485-94.
Erekson, O. H. 1992. Joint determination of college student achievement and effort: Implications for college teaching. Research in Higher Education 33 (4): 433-46.
Fornell, C. R. 1982. A second generation of multivariate analysis. Vols. 1-2, Methods. New York: Praeger.
Frontczak, Nancy T. 1998. A paradigm for the selection, use and development of experiential learning activities in marketing education. Marketing Education Review 8 (3): 25-34.
Goodwin, Donna. 1996. Effects of matching student and instructor learning style preferences on academic achievement in English. Ph.D. dissertation, University of Arkansas.
Grasha, Anthony F., and Natalia Yangarber-Hicks. 2000. Integrating teaching styles and learning styles with instructional technology. College Teaching 48 (1): 2-10.
Hamer, Lawrence O. 2000. The additive effects of semistructured classroom activities on student learning: An application of classroom-based experiential learning techniques. Journal of Marketing Education 22 (1): 25-34.
Hill, T., N. D. Smith, and M. F. Mann. 1987. Role of efficacy expectations in predicting the decision to use advanced technologies: The case of computers. Journal of Applied Psychology 72 (2): 307-13.
Holley, J. H., and E. K. Jenkins. 1993. The relationship between student learning style and performance on various test question formats. Journal of Education for Business 68 (5): 301-8.

Honey, Peter, and Alan Mumford. 1992. The manual of learning styles. Maidenhead, UK: Peter Honey.
Jaju, Anupam, Hyokjin Kwak, and George M. Zinkhan. 2002. Learning styles of undergraduate business students: A cross-cultural comparison between the US, India, and Korea. Marketing Education Review 12 (2): 49-62.
Johnson, Rayneld R. 1996. An analysis of learner variables related to achievement in an introductory graduate statistics course. Ph.D. dissertation, Wayne State University, Detroit, MI.
Karns, Gary L. 1993. Marketing student perceptions of learning activities: Structure, preferences, and effectiveness. Journal of Marketing Education 15 (1): 3-10.
Kennedy, Ellen J., Leigh Lawton, and E. Leroy Plumlee. 2002. Blissful ignorance: The problem of unrecognized incompetence and academic performance. Journal of Marketing Education 24 (3): 243-52.
Kolb, D. A. 1984. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
———. 1988. Learning styles and disciplinary differences. California Management Review 18 (3): 22-31.
Lashinger, H. K., and M. W. Boss. 1984. Learning styles in nursing students and career choices. Journal of Advanced Nursing 9:375-80.
Marks, Ronald B. 2000. Determinants of student evaluations of global measures of instructor and course value. Journal of Marketing Education 22 (2): 108-19.
Matthews, Doris B. 1994. An investigation of students' learning styles in various disciplines in colleges and universities. Journal of Humanistic Education and Development 33 (2): 65-74.
McCloy, Rodney A., John P. Campbell, and Robert Cudeck. 1994. A confirmatory test of a model of performance determinants. Journal of Applied Psychology 79 (4): 493-516.
McCorkle, Denny E., James Reardon, Joe F. Alexander, Nathan D. King, Robert C. Harris, and R. Vishwanathan Iyer. 1999. Understanding marketing students, group projects, and teamwork: The good, the bad, and the ugly? Journal of Marketing Education 21 (2): 106-17.
McNeilley, Kevin M., and Frances J. Ranney. 1998. Combining writing and the electronic media in sales management courses. Journal of Marketing Education 20 (3): 226-35.
Mitchell, Andrew A., and Jerry C. Olson. 1981. Are product attribute beliefs the only mediators of advertising effects on brand attitude? Journal of Marketing Research 18:318-32.
Nonis, Sarath A., and Cathy Owens Swift. 1998. Deterring cheating behavior in the marketing classroom: An analysis of the effects of demographics, attitudes, and in-class deterrent strategies. Journal of Marketing Education 20 (3): 188-99.
Nulty, Duncan D., and Mary A. Barrett. 1996. Transitions in students' learning styles. Studies in Higher Education 21:333-45.
Nunnally, Jum C. 1978. Psychometric theory. 2d ed. New York: McGraw-Hill.
Okebukola, Peter Akinsola. 1986. The influence of preferred learning styles on cooperative learning in science. Science Education 70:509-17.
Petkus, Ed, Jr. 2000. A theoretical and practical framework for service-learning in marketing: Kolb's experiential learning cycle. Journal of Marketing Education 22 (1): 64-70.
Ritchie, Donn, and Chris Volkl. 2000. Effectiveness of two generative learning strategies in the science classroom. School Science and Mathematics 100 (2): 83-89.
Roach, S. S., M. W. Johnston, and J. F. Hair Jr. 1993. An exploratory examination of teaching styles currently employed in marketing education: Developing a typology and its implications for marketing students. Journal of Marketing Education 18 (3): 32-38.
Romer, D. 1993. Do students go to class? Should they? Journal of Economic Perspectives 7 (3): 167-74.
Schacter, John. 1999. The impact of education technology on student achievement. Milken Exchange on Education Technology. Retrieved from http://www.milkenexchange.org/

Stewart, Karen L., and Linda A. Felicetti. 1992. Learning styles of marketing majors. Educational Research Quarterly 15 (2): 15-23.
Stöttinger, Barbara, and Bodo B. Schlegelmilch. 2002. Information and communication technologies in tertiary education: A customer perspective. Marketing Education Review 12 (2): 63-72.
Strauss, Denise T., and Raymond D. Frost. 1999. Selecting instructional technology media for the marketing classroom. Marketing Education Review 9 (1): 11-20.
Tull, Donald S., and Del I. Hawkins. 1990. Marketing research: Measurement and method. 5th ed. New York: Macmillan.
Webster, J., and J. Martocchio. 1992. Microcomputer playfulness: Development of a measure with workplace implications. MIS Quarterly 16 (2): 201-26.
Williams, Tim. 1992. Validating a degree: Subject relevance and assessment issues in the development of a new degree. Education and Training 34 (3): 31-33.
Wood, Robert, and Albert Bandura. 1989. Social cognitive theory of organizational management. Academy of Management Review 14 (3): 361-84.
Young, Mark R. 2001. Windowed, wired, and webbed—now what? Journal of Marketing Education 23 (1): 45-54.
———. 2002. Experiential learning = hands-on + minds-on. Marketing Education Review 12 (1): 43-52.
