Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts and humanities
Melissa Bond
University of Oldenburg, Germany
Katja Buntins
University of Duisburg-Essen, Germany
Olaf Zawacki-Richter
University of Oldenburg, Germany
Michael Kerres
University of Duisburg-Essen, Germany
Introduction
With the documented need for higher education graduates to be proficient in using educational technology
(EdTech) in their professional lives (e.g., Organization for Economic Cooperation and Development
[OECD], 2015; Redecker, 2017), as well as acquiring twenty-first century skills during their studies (Claro
& Ananiadou, 2009; Oliver & Jorre de St Jorre, 2018), the use of EdTech in higher education has attracted
increased interest from researchers, for example, in the technologies that students find helpful or unhelpful
in their studies (Henderson, Selwyn, & Aston, 2017; Selwyn, 2016) and patterns of media usage of
Australasian Journal of Educational Technology, 2020, 36(4).
(non)traditional students (Dolch & Zawacki-Richter, 2018). Furthermore, research has found that the
pedagogically informed use of technology can also support student engagement (e.g., Schindler,
Burkholder, Morad, & Marsh, 2017), a concept that has been gaining importance recently, as it links the
individual student’s internal constitution and external environment, leading to overall improved student
outcomes (see Bond & Bedenlier, 2019).
Whilst there are ongoing conversations about the nature of student engagement, researchers agree that it is
an enigmatic and complex construct (Appleton, Christenson, & Furlong, 2008; Kahu, 2013), with three
generally accepted dimensions: behavioural, affective, and cognitive (Fredricks, Blumenfeld, & Paris,
2004). Each dimension houses facets (called indicators by some researchers) of engagement and
disengagement (Appendix A), which students experience on a continuum (Coates, 2007; Payne, 2017) of
high or low activation and positive or negative valence (Pekrun & Linnenbrink-Garcia, 2012). This study
is guided by the following definition:
Student engagement is the energy and effort that students employ within their learning
community, observable via any number of behavioral, cognitive or affective indicators across
a continuum. It is shaped by a range of structural and internal influences, including the
complex interplay of relationships, learning activities and the learning environment. The
more students are engaged and empowered within their learning community, the more likely
they are to channel that energy back into their learning, leading to a range of short and long
term outcomes, that can likewise further fuel engagement (Bond, Buntins, Bedenlier,
Zawacki-Richter, & Kerres, 2020, p. 3).
Given that the nexus between student engagement (SE) and technology use in higher education has not yet been comprehensively
researched, a systematic review was conducted into this topic, comprising a total of 243 empirical studies
(see Bond et al., 2020). A keyword-based search focused on “systematic review” OR “meta analysis” OR
“literature review” AND “educational technology”, conducted in April 2019, yielded a small number of
studies that addressed the specific context of English as a second language (ESL) and English as a foreign
language (EFL), as a discipline within arts and humanities (A&H) (United Nations Educational, Scientific
and Cultural Organization [UNESCO] Institute for Statistics, 2015). However, the
focus of three meta-analyses identified was not on the greater concept of student engagement, but rather
primarily on achievement (Chang & Lin, 2013; Chiu, Kao, & Reynolds, 2012; Cho, Lee, Joo, & Becker,
2018), whilst other reviews found blended learning in ESL/EFL to be one way to enhance motivation
(Albiladi & Alshareef, 2019), and another reviewed how language learning, mediated through digital
games, influenced student learning outcomes on different levels (Hung, Yang, Hwang, Chu, & Wang,
2018). Therefore, in order to gain a broader overview, and to deepen insights into technology use within
the field of A&H, the following research questions guide this analysis:
1. What are the characteristics (countries, educational settings, study population, technology tools
used) of and methods used in research on SE and EdTech in higher education, within the field of
A&H, and how do they compare to the overall sample?
2. How is research within the field of A&H theoretically grounded?
3. Which facets of student engagement and disengagement are affected as a result of using EdTech
in the field of A&H?
Method
As part of a larger research project, a systematic review was conducted into the relation of EdTech and
student engagement in higher education (Gough, Oliver, & Thomas, 2012). Clear inclusion and exclusion
criteria were applied (Table 1), limiting the search to publications from 2007, to ensure that technology
tools were not outdated. In order to provide transparency of the review process, a review protocol was
created, which can be retrieved from ResearchGate at https://www.researchgate.net/project/Facilitating-
student-engagement-with-digital-media-in-higher-education-ActiveLeaRn, alongside the full data set.
Screening 18,068 titles and abstracts led to 4,152 remaining references for potential inclusion (Figure 1).
Due to time limitations and the large number of relevant articles in the population, a sample size estimation
was carried out with the R package MBESS (Kelley, 2018), to identify a sample from
this reference corpus for further analysis (Kupper & Hafner, 1989). Assuming a 5% margin of error,
an expected proportion of 50%, and an alpha of 5%, 349 articles were sampled. Taking into consideration the
increased diversity of EdTech over the past years and the likewise increasing prominence of the concept of
student engagement (Zepke, 2018), the articles were stratified by their year of publication. The 349 articles
were screened on full text and 232 were then included, some of which reported on more than one study.
Subsequently, 243 individual studies were coded, applying an inclusive code scheme.
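The sampling step above follows a standard finite-population estimate for a proportion. A minimal sketch of that calculation (the authors used the R package MBESS, so the function below and its exact rounding are assumptions, not their code):

```python
import math

def sample_size(population, margin=0.05, proportion=0.5):
    """Sample size for estimating a proportion at a 95% confidence
    level, with a finite population correction (cf. Kupper & Hafner, 1989)."""
    z = 1.959964  # two-sided z-value for alpha = 0.05
    n0 = z ** 2 * proportion * (1 - proportion) / margin ** 2
    # finite population correction
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# 4,152 references remained after title/abstract screening
print(sample_size(4152))  # → 352, close to the 349 articles reported
```

The small gap between this approximation and the reported 349 presumably reflects the specific MBESS routine and the stratification by publication year.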
Table 1
Final inclusion and exclusion criteria
Inclusion criteria                             Exclusion criteria
Published between 2007–2016                    Published before 2007
English language                               Not in English
Higher education                               Not higher education
Empirical, primary research                    Not empirical, primary research (e.g., review)
Indexed in ERIC, Web of Science,               Evaluation or a description of a tool
Scopus or PsycINFO                             No educational technology
Educational technology                         No learning setting
Student engagement                             No student engagement
Figure 1. Systematic review PRISMA flow chart, slightly modified after Brunton and Thomas (2012, p.
86) and Moher, Liberati, Tetzlaff, and Altman (2009, p. 8)
Due to the sheer number of educational technology tools and applications identified across the 243 studies,
a problem also shared by other reviews (e.g., Lai & Bower, 2019), the decision was made to employ
Bower’s (2016) typology of learning technologies (Appendix B), in order to group tools that share the same
“structure of information” (Bower, 2016, p. 773). Whilst some of the tools could be classified into more
than one type within the typology (for example, wikis can be used for collaborative tasks, knowledge
organisation and sharing, or for individual website creation), “the type of learning that results from the use
of the tool is dependent on the task and the way people engage with it rather than the technology itself”;
therefore, “the typology is presented as descriptions of what each type of tool enables and example use cases
rather than prescriptions of any particular pedagogical value system” (Bower, 2016, p. 774). For a deeper
explanation of each category, please see Bower (2015).
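As an illustration of how such a typology can be applied in coding, a minimal sketch with a hypothetical, partial mapping (category names paraphrased from Bower (2016); the tool assignments are illustrative, not the authors' coding):

```python
# Hypothetical, partial mapping of tools onto typology categories;
# a tool may legitimately belong to more than one category.
TOOL_TYPES = {
    "wiki": ["knowledge organisation and sharing", "website creation"],
    "blog": ["website creation"],
    "discussion forum": ["text-based tools"],
    "Skype": ["synchronous collaboration tools"],
    "Second Life": ["virtual worlds"],
}

def tools_for_type(tool_types, category):
    """Return all tools coded under a given typology category."""
    return sorted(tool for tool, cats in tool_types.items() if category in cats)

print(tools_for_type(TOOL_TYPES, "website creation"))  # → ['blog', 'wiki']
```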
Results
Study characteristics
Of the 42 A&H studies, the sub-field of languages accounts for 36 studies (83.3%) (Appendix C). English
was the language most often researched (64.8%, n = 27), with six studies investigating other A&H subjects,
for example anthropology (Fukuzawa & Boyd, 2016) or women’s health and human rights (Carver, Davis,
Kelley, Obar, & Davis, 2012). The 42 studies were published in 41 articles, as Carver et al. (2012) reported
on four independent studies, two of which investigated disciplines pertaining to A&H. Studies in this
sample were cited 30.95 times on average (SD = 44.62).
Geographical characteristics
A total of 21.4% (n = 9) were undertaken in Taiwan, followed by China (16.7%, n = 7) and the United
States (12.0%, n = 5) (Figure 2). This represents a clear under-representation of the United States in the field
of A&H compared to the overall sample (-28.4%). A similar, though less striking, under-representation
also applies to European countries such as the United Kingdom (-10.1%), Spain (-4.0%), and Turkey
(-4.0%). Compared to the overall sample, it is the east Asian industrialised countries of Taiwan (17.0%),
China (15.7%), and Japan (9.5%) that are strikingly over-represented, so that it can be concluded that the
field of A&H consists primarily of studies on language learning in east Asian advanced economies.
Figure 2. Percentage deviation from the average relative frequencies of articles per country (≥ 3 articles
in the overall sample).
Note. NoS = not stated; AUS = Australia; CAN = Canada; CHN = China; HKG = Hong Kong; inter =
international; IRI = Iran; JAP = Japan; MYS = Malaysia; SGP = Singapore; ZAF = South Africa; KOR =
South Korea; ESP = Spain; SWE = Sweden; TWN = Taiwan; TUR = Turkey; GBR = United Kingdom;
USA = United States of America
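The deviations shown in Figure 2 are percentage-point differences between a country's share of the A&H sub-sample and its share of the remaining studies. A minimal sketch with hypothetical counts (the per-country raw counts for the non-A&H sample are not given above):

```python
def share_deviation(count_ah, n_ah, count_rest, n_rest):
    """Percentage-point difference between a country's share of the
    A&H sub-sample and its share of the non-A&H studies."""
    return 100 * (count_ah / n_ah - count_rest / n_rest)

# United States: 5 of 42 A&H studies vs. a hypothetical 81 of 201 non-A&H studies
print(round(share_deviation(5, 42, 81, 201), 1))  # → -28.4
```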
Half of the studies (50%, n = 21) took place in a blended learning format, with purely online settings used
in 23.8% (n = 10) and face-to-face settings used in 19.0% (n = 8). However, four studies did not allow for
identification of the mode of delivery used (9.5%, n = 4). Compared to the non-A&H sample, the share of
blended learning is higher in A&H by 6.2%, whilst online and face-to-face delivery occur less often (see
Figure 3). The share of studies not specifying their mode of delivery is higher by 5.5% in the A&H sample
and reflects the need for further explanation of study context within future empirical research in the field.
Social collaborative learning (SCL) was employed in 76.2% of studies (n = 32), whilst self-directed
learning (SDL) was used in less than half (38.1%, n = 16). In another three studies (14.3%), the learning
scenario was not specified. Game-based learning (GBL) was used in two studies (4.8%) and personal
learning environments (PLE) in one (2.4%). In order to determine how often learning scenarios occurred
together, the number of common occurrences (p_AB) was calculated relative to the maximum possible
number of common occurrences: the contingency-table cell indicating how often two learning scenarios
occurred together (A ∩ B) was divided by the smaller of the two learning scenario frequencies
(min{A, B}). Expressed as a formula:

p_AB = (A ∩ B) / min{A, B}    (Equation 1)
In 56% of possible cases, SCL and SDL appear in combination (n = 9). Half of the studies using GBL were
combined with SDL or SCL, with the study using PLE also having included SDL (Table 2). In the field of
A&H, SCL appears 21.1% more often than it does in the overall sample, whereas SDL occurs 6.2% less
often.
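Equation 1 translates directly into code; a minimal sketch, checked against the SDL/SCL figures reported above:

```python
def cooccurrence(joint, count_a, count_b):
    """Relative co-occurrence (Equation 1): the number of joint
    occurrences divided by the maximum possible number of joint
    occurrences, i.e., the smaller of the two scenario counts."""
    return joint / min(count_a, count_b)

# SDL (n = 16) and SCL (n = 32) appear together in 9 studies
print(round(cooccurrence(9, 16, 32), 2))  # → 0.56, i.e., 56% of possible cases
```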
Figure 3. Percentage deviation from the average relative frequencies of mode of delivery (n = 42).
Note. BL = blended learning; DE = distance education; F2F = face-to-face; NS_Mode = not stated; SDL =
self-directed learning; SCL = social collaborative learning; GBL = game-based learning; PLE = personal
learning environments; other_LS = other learning scenario; FC = flipped classroom; NS_LS = learning
scenario not stated
Study population
Interestingly, 31.0% of studies in this sample do not specify the study level of students. However, most
studies (59.5%, n = 25) were conducted with undergraduate students, whilst only five studies researched
graduate students (11.9%), and one study included both undergraduate and postgraduate students (Carver
et al., 2012). This distribution of study levels is significantly different from the distribution in the overall
sample (χ²(df = 3) = 9.346, p < 0.05). However, upon exclusion of the studies not specifying their study level,
it is evident that there are fewer courses with both graduate and undergraduate students.
Controlling for the studies not specifying study level, the share of courses at postgraduate level is 3.4% in
A&H and 12.7% in the overall sample. On this basis, the share of postgraduate courses in A&H is lower by
12.2% than in the overall sample, whereas the share of undergraduate courses is almost equal between
A&H and the overall sample (higher by 2.9% in A&H). However, these differences are not significant
(χ²(df = 2) = 2.523, p > 0.05) and might also have arisen due to differently structured higher education
systems across countries, which is not reflected in the coding of the studies.
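The tests above are standard Pearson chi-squared comparisons of frequency distributions. A self-contained sketch (the counts in the example table are hypothetical, not the authors' data):

```python
def chi_square(observed):
    """Pearson chi-squared statistic and degrees of freedom for an
    r x c contingency table of observed counts."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    dof = (len(row_totals) - 1) * (len(col_totals) - 1)
    return stat, dof

# hypothetical study-level counts: undergraduate, postgraduate, both, not stated
table = [[25, 5, 1, 11],      # A&H
         [110, 40, 14, 37]]   # rest of the sample
stat, dof = chi_square(table)
print(dof)  # → 3, matching the df = 3 reported above
```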
Table 2
Co-occurrence of learning scenarios across the sample (n = 42)
          SDL    SCL    GBL    PLE    Other_LS    FC    NS_LS
Sum A&H 16 32 2 1 0 0 3
SDL 0.56 0.5 1 0
SCL 0.43 0.5 0 0
GBL 0.25 0.33 0 0
PLE 0.50 0.50 0 0
Other_LS 0.33 0.33 0 0
FC 0.57 0.57 0 0 0
NS_LS 0 0 0 0 0 0
Sum not A&H 89 110 12 6 3 7 29
Note. SDL = self-directed learning; SCL = social collaborative learning; GBL = game-based learning; PLE
= personal learning environments; Other_LS = other learning scenario; FC = flipped classroom; NS_LS =
learning scenario not stated
Looking at the frequency of EdTech tools, text-based tools were used most frequently (71.4%, n = 30),
followed by knowledge organisation and sharing tools (35.7%, n = 15) and multimodal production tools
(28.6%, n = 12). Website creation tools and learning software were each used in 19.0% of studies (n = 8),
assessment tools and social networking tools appeared in 14.3% of studies (n = 6), and mobile learning and
specific hardware (e.g. iPads) were explored in 9.5% of studies (n = 4) across the A&H sample.
In order to determine how often tools occurred together, the same method as for the learning scenarios
was applied (Equation 1). In 93% of cases, text-based tools were used jointly with another technology, and in
100% of cases when assessment tools were being used (Table 3). Text-based tools and learning software
were used together in 88% of possible cases, and knowledge organisation and sharing tools and assessment
tools in 83% of possible cases. Comparing these results with the non-A&H studies, it is evident that text-
based tools were used 17.7% more often in A&H, as were website creation tools (+8.6%). A striking
difference between the two samples exists for learning software, which was used almost exclusively in
A&H (Figure 6). Out of the 201 studies that constitute the non-A&H sample, only one study (Gleason,
2012) used learning software. Thus, its share in A&H is higher by 18.6%. Social networking tools were
also used more frequently in A&H (+6.8%), whilst assessment tools (-15.1%) and multimodal production
tools (-9.7%) were used less frequently.
Methodological characteristics
Solely quantitative methods were used in 40.5% of studies (n = 17), followed closely by 38.1% combining
both qualitative and quantitative methods (n = 16), with the remaining 21.4% relying on solely qualitative
methods (n = 9). This means that the share of mixed methods studies is higher by 4.3% than in the overall
sample, whilst the shares of qualitative (-2.0%) and quantitative studies (-2.3%) are slightly smaller. These
differences are, however, not significant (χ²(df = 3) = 0.284). The most frequently used data collection
method was surveys (see Table 4), followed by ability tests and observations, which included behavioural
observation of student participation online, examined, for example, through the number of posts in
discussion forums (e.g., Kenny, 2008).
Similar to the overall sample (see Bond et al., 2020), document analysis was used in only 10 (23.8%) A&H
studies. Peterson (2012), for example, captured the chat logs of ESL students in Second Life and conducted
a discourse analysis, in order to explore how students interacted and whether use of the virtual world
promoted increased target language use. The majority of surveys were self-made and focused on, for
example, student satisfaction (e.g., Orawiwatnakul & Wichadee, 2016), student perceptions of the EdTech
used (e.g., Mejia, 2016) or course evaluations (e.g., Peterson, 2012), or were adapted from other studies
(e.g., Lu, Hou, & Huang, 2010).
Table 3
Co-occurrence of tools across the sample (n = 42) (≥ 3 articles)
          TBT    MPT    WCT    KO&S    DAT    DST    AT    SNT    SCT    ML    MOOCs    VW    LS    OL    Hardware    E-tutors    Games
Sum A&H 30 12 8 15 0 0 6 6 5 4 0 2 8 0 4 2 0
TBT 0.75 0.63 0.93 1.00 0.67 0.80 0.75 1.00 0.88 0.50 1.00
MPT 0.57 0.25 0.33 0.50 0.33 0.40 0.50 0.00 0.25 0.50 0.00
WCT 0.33 0.33 0.38 0.00 0.50 0.20 0.00 0.00 0.13 0.00 0.00
KO&S 0.69 0.53 0.52 0.83 0.17 0.20 0.50 0.00 0.25 0.50 0.50
DAT 0.00 0.00 0.00 0.50
DST 1.00 1.00 1.00 1.00 0.00
AT 0.51 0.42 0.10 0.49 1.00 1.00 0.00 0.20 0.00 0.00 0.17 0.25 0.00
SNT 0.60 0.20 0.20 0.53 0.00 0.00 0.00 0.00 0.00 0.00 0.17 0.00 0.00
SCT 0.64 0.64 0.18 0.45 0.00 1.00 0.36 0.18 0.25 0.00 0.00 0.25 0.00
ML 0.17 0.33 0.17 0.50 0.00 0.00 0.17 0.17 0.00 0.00 0.25 0.25 0.00
MOOCs 1.00 0.67 0.33 0.67 0.00 0.00 0.33 0.33 0.00 0.00
VW 0.38 0.38 0.06 0.19 0.00 0.00 0.31 0.00 0.18 0.17 0.00 1.00 0.00 0.00
LS 1.00 0.00 0.00 0.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
OL 0.50 0.50 0.10 0.40 0.00 0.00 0.50 0.10 0.00 0.17 0.00 0.00 0.00
Hardware 0.36 0.45 0.18 0.45 0.00 1.00 0.18 0.00 0.36 0.17 0.00 0.00 0.00 0.00 0.00
E-tutors 0.33 0.00 0.00 0.17 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.17 1.00 0.00 0.00
Games 0.00 0.33 0.00 0.33 0.50 0.00 0.67 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Sum not A&H 108 77 21 89 2 1 59 15 11 6 3 16 1 10 11 6 3
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools;
AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML =
mobile learning; VW = virtual worlds; LS = learning software; OL = online learning
Table 4
Data collection methods used
Method n Percentage
Surveys              23    54.8%
Ability tests        16    38.1%
Observations         11    26.2%
Document analysis    10    23.8%
Interviews            7    16.7%
Focus groups          4     9.5%
Figure 6. Percentage difference between A&H and non-A&H studies (n = 42)
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools;
AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; MLearning
= mobile learning; VW = virtual worlds; LS = learning software; OL = online learning.
Only two studies (4.8%) provided a definition of student engagement (Fukuzawa & Boyd, 2016; Lu &
Churchill, 2014). Fukuzawa and Boyd (2016, p. 1) cited a definition by Smith, Sheppard, Johnson, and
Johnson (2005, p. 87), that student engagement means “the frequency with which students participate in
activities that represent effective educational practice”. Lu and Churchill (2014) drew on a definition by
Chapman (2003), and then focused on social and cognitive engagement. A theoretical framework was
applied in 20 studies (47.6%), with constructivism referred to in four studies (Garcia-Sanchez & Rojas-
Lizana, 2012; Lu & Churchill, 2014; Lu et al., 2010; Shi & Luo, 2016), social constructivism in two (Yang,
Gamble, & Tang, 2012; Zhang, Song, Shen, & Huang, 2014) and socio-cultural theory in two studies (Lin
& Yang, 2013; Peterson, 2009). Other studies drew on Bandura’s theory of social learning (Carver et al.,
2012), social presence theory (Yildiz, 2009) and technological pedagogical content knowledge (TPCK)
(Asoodar, Marandi, Atai, & Vaezi, 2014). Specific to language learning in particular, Smith and Craig
(2013) drew on the “theoretical framework of learner autonomy (LA) [which] is informed by principles
underpinning CALL” (p. 2) and Grgurovic and Hegelheimer (2007) used interactionist second language
acquisition theory.
Behavioural engagement was by far the most prevalent dimension (Table 5), followed by cognitive and
affective engagement, and in 50% (n = 21) of studies, all three dimensions of student engagement were
identified. Another 19% (n = 8) found two dimensions, and the remaining 31% (n = 13) indicated one
dimension of engagement. The five most frequently cited facets of student engagement were
participation/involvement/interaction, achievement, positive interactions with peers and teachers, enjoyment
and motivation, closely replicating the top student engagement indicators from the overall sample.
Two studies (4.8%) found that using EdTech enhanced engagement overall, without specifying which
dimensions and/or facets it referred to, which were then coded separately to the other facets. For example,
in Cheung’s (2015) study of mobile learning in undergraduate language study, students were asked to rate
the question “I think my level of engagement using the mobile learning module for language learning was
high”, from strongly disagree to strongly agree.
Table 5
Student engagement frequency descriptive statistics
Frequency Percentage M SD
Behavioural Engagement 38 90% 1.63 0.79
Affective Engagement 24 57% 3.25 2.07
Cognitive Engagement 28 67% 1.79 1.20
The most frequently reported dimension of engagement was behavioural (Table 6), with
participation/interaction/involvement the most cited facet (52.4%, n = 22), which was particularly present
in studies using website creation tools (particularly blogs) and mobile learning (75%), closely followed by
assessment tools (Table 7). Studies often referred to the increased collaboration that online tools afforded
students, as well as the ability for students to see how others constructed their questions and responses,
which then enabled them to use the modelled language in their own contributions (Yang & Hsieh, 2015).
This was particularly helpful in a bilingual blog between undergraduate students at an Australian and a
Spanish university, where students could read the contributions of all participating students, which
“persuaded them to be more careful in their writing” (Garcia-Sanchez & Rojas-Lizana, 2012, p. 367), and
which resulted in most students interacting with each other beyond the required amount. This ability for
students to interact online, without the pressure of talking face-to-face, enhanced student confidence (Yang
& Hsieh, 2015; Yildiz, 2009), as did using the target language to describe their everyday life within
language courses (Mejia, 2016). Students in a Japanese ESL course were also more likely to interact with
and respond to discussion forum posts when fellow students replied to their posts (Nielsen, 2013), which
was found to be more important than how often the teacher contributed.
Table 6
Top five engagement facets across the three dimensions
Rank  Behavioural engagement       n   %      Affective engagement          n   %      Cognitive engagement              n   %
1     Participation/interaction/  22  52.4%   Positive interactions with   14  33.3%   Learning from peers               8  19.0%
      involvement                              peers/teachers
2     Achievement                 17  40.5%   Enjoyment; Motivation        10  23.8%   Self-regulation                   7  16.7%
3     Confidence                   6  14.3%   Interest                      7  16.7%   Deep learning                     6  14.3%
4     Assume responsibility        5  11.9%   Enthusiasm                    6  14.3%   Critical thinking; Understanding  5  11.9%
5     Study habits                 3   7.1%   Sense of connectedness;       5  11.9%   Staying on task/focus             4   9.5%
                                              Satisfaction; Excitement
Three of the four studies that focused on mobile learning indicated that
participation/interaction/involvement was positively affected as a result (Cheung, 2015; Ramamuruthy &
Rao, 2015; Shi & Luo, 2016), with 45% of students in Cheung’s (2015) study agreeing or strongly agreeing
that mobile learning can enhance their overall academic performance, whilst foreign language students who
used WeChat in Shi and Luo’s (2016) study scored significantly higher (t = 2.05, p = 0.039) than
those who did not. Achievement, however, was especially found when synchronous collaborative tools,
multimodal production tools (MPT) and assessment tools were used, with knowledge organisation and
sharing tools and MPT having relatively high values as well. In a study examining the use and effect of
voice over instant messaging on English speaking proficiency in Taiwan using Skype (Yang et al., 2012),
the use of structured discussions facilitated by English teaching assistants was particularly effective, as they
further scaffolded learning for students.
Table 7
Relative frequency (percentages) of behavioural engagement facets by technology type (in 3 or more
articles)
Facet                        All   TBT   MPT   WCT   KO&S   AT    SNT   SCT   ML    LS    Hardware   Not A&H
Participation/interaction/
involvement                  52%   53%   50%   75%   60%    67%   50%   60%   75%   50%   50%        48%
Study habits                  7%   10%    8%   13%    7%     0%   17%    0%    0%   38%    0%         8%
Confidence                   14%   17%    8%   25%   13%     0%   17%    0%   25%    0%   25%        15%
Assume responsibility        12%   10%    0%   25%    7%     0%   17%    0%    0%   13%   25%         6%
Achievement                  40%   40%   50%   25%   47%    50%   17%   60%   25%   38%   25%        44%
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; AT = assessment tools; SNT = social networking tools; SCT
= synchronous collaboration tools; ML = mobile learning; LS = learning software; not A&H = the rest of
the sample without arts and humanities
Affective engagement was noted through 10 different facets in relation to using EdTech (see Table 8), with
positive interactions with peers/teachers the most cited (33.3%, n = 14), and particularly prevalent when
using website creation tools and synchronous collaboration tools. Interestingly, affective engagement was
seldom reported when studies used assessment tools or m-Learning, although m-Learning did promote
enjoyment and motivation in half of the studies using it. In contrast, when using website creation tools,
affective engagement was reported relatively often, especially touching upon the facets of positive
interactions with peers/teachers, enjoyment and motivation. In comparison to the overall sample,
motivation was found more frequently in A&H, whilst no other striking differences exist between the two
samples.
Positive interactions with peers/teachers were identified in the studies by Garcia-Sanchez and Rojas-Lizana
(2012) and Peck (2012), based on how students addressed one another in a friendly manner in an
international, blog-supported language exchange course (Garcia-Sanchez & Rojas-Lizana, 2012) or
greeted their instructor more informally than would generally happen in a university course (Peck, 2012).
Using blogs to provide out of class peer feedback on each other’s EFL writing was very conducive to
fostering increased collaboration for in-class activities of Chinese undergraduates (Zhang et al., 2014).
Students from the Open University of Hong Kong evaluated mobile learning to be both motivating for self-
study and interesting, with 66% (n = 40) stating that they (strongly) agreed with this statement, and 86 out
of 87 students in the study by Garcia-Sanchez and Rojas-Lizana (2012) confirmed that the use of blogs
motivated them. Motivation, however, was also fostered through a continuously present teacher in the web-
based classroom (Lopera Medina, 2014), peer comments that refocused students on their written reflections,
and artifacts made available to the class online (Lu & Churchill, 2014), as well as students producing their
own videos in EFL learning to show both their creativity and their language skills (Mejia, 2016). One
student in the third study by Carver et al. (2012) commented that her level of enjoyment increased during
and due to contributing to a joint Wikipedia entry “because of working on something that was not limited
to our class” (p. 278) and also making it available to a larger audience. Tschirhart and Rigler (2009) found
that interactive language learning materials were enjoyed by most students in two different cohorts and
were perceived as helpful for their learning.
Table 8
Relative frequency (percentages) of affective engagement facets by technology type (in 3 or more articles)
Facet                        All   TBT   MPT   WCT   KO&S   AT    SNT   SCT   ML    LS    Hardware   Not A&H
Enthusiasm                   14%   17%   17%   38%   20%     0%    0%   20%    0%   25%    0%        10%
Interest                     17%   20%   25%   25%   20%    17%    0%   20%   25%   25%   25%        15%
Sense of belonging           10%   10%    0%   38%   13%     0%   33%    0%    0%    0%    0%         3%
Positive interactions with
peers & teachers             33%   37%   33%   75%   33%    33%   50%   60%    0%   38%   25%        43%
Positive attitude about
learning                     10%   10%    0%   25%   13%     0%   17%    0%    0%   13%   25%        16%
Sense of connectedness       12%   13%    8%   50%   13%     0%   33%   20%    0%   13%    0%        10%
Satisfaction                 12%   10%    8%   38%    7%     0%    0%   20%    0%   13%    0%         8%
Excitement                   12%   17%   17%   38%    0%     0%   33%   20%    0%   13%    0%         7%
Enjoyment                    24%   27%   33%   50%   20%     0%   17%   20%   50%   50%   25%        22%
Motivation                   24%   27%   25%   50%   33%    17%   17%   20%   50%   38%   25%        11%
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; AT = assessment tools; SNT = social networking tools; SCT
= synchronous collaboration tools; ML = mobile learning; LS = learning software; not A&H = the rest of
the sample without Arts & Humanities
Whilst cognitive engagement was found slightly more frequently than affective engagement (Table 5), the
frequency was more dispersed across the eight facets and technology tool types (Table 9). The most often
identified facet was learning from peers, which occurred especially when studies used website creation tools
and learning software, although the study by Lu and Churchill (2014) found that social interactions in the
course did not ultimately lead to higher cognitive engagement per se. Weaker students were able to benefit
from stronger ones when applying peer questioning in online discussion boards (Yang & Hsieh, 2015), and
learning from peers also happened the other way around, with tutors in EFL classes learning from their
online tutees (Lin & Yang, 2013) in the sense of feeling more prepared to be a teacher and even expanding
their English vocabulary.
Self-regulation was found in studies, for example, where students’ positive attitude towards technology
stemmed from their appreciation of learning on their own (Alshaikhi & Madini, 2016). The
integration of a self-study centre, as well as a reflective diary and mechanisms to track one’s individual
learning, led students to realise that “to find one's own way to study” (Smith & Craig, 2013, p. 9) is a central
feature of CALL. Deep learning was detected through the content analysis of student group discussions
(Kenny, 2008), with findings from an investigation into the knowledge construction patterns of Taiwanese
EFL students revealing that knowledge construction in online discussion occurs in relation to students’ learning styles, with learners showing a serialist style, rather than a holist style, displaying more variation in their collaborative construction of knowledge and negotiation of meaning and disagreement (Wu, 2016).
Critical thinking was found to increase both in an EFL class offered in a traditional setting and, much more so, in a Facebook-enhanced one (dppc2 = 0.545; Morris, 2008), after students were trained to answer questions developed on a revised Bloom’s taxonomy scale (Pattanapichet & Wichadee, 2015). Installing courseware in the laboratory to enhance teacher-centred instruction led students in the study by Tsai (2012) to develop more critical thinking skills, in the sense that they reported improved “abilities of thinking, analysis and problem-solving” (p. 56). However, Cohen’s d = .085 revealed no difference between the traditional and the enhanced settings.
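For readers unfamiliar with the two effect sizes reported above, they can be sketched in a few lines of code; the function and variable names here are illustrative only and are not drawn from the reviewed studies. Cohen's d standardises a difference between two group means by their pooled standard deviation, whilst Morris's (2008) dppc2 standardises the difference in pre-post gains between a treatment and a control group by the pooled pretest standard deviation, with a small-sample bias correction.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: difference between two group means divided by
    the pooled standard deviation."""
    sd_pool = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                        / (n1 + n2 - 2))
    return (m1 - m2) / sd_pool

def d_ppc2(pre_t, post_t, pre_c, post_c,
           sd_pre_t, sd_pre_c, n_t, n_c):
    """Morris's (2008) pre-post-control effect size: the difference
    in mean gains (treatment minus control), standardised by the
    pooled pretest standard deviation and multiplied by a
    small-sample bias correction factor c_p."""
    sd_pool = math.sqrt(((n_t - 1) * sd_pre_t**2 + (n_c - 1) * sd_pre_c**2)
                        / (n_t + n_c - 2))
    c_p = 1 - 3 / (4 * (n_t + n_c - 2) - 1)  # bias correction
    return c_p * ((post_t - pre_t) - (post_c - pre_c)) / sd_pool
```

Under this formulation, a dppc2 of 0.545 indicates that the Facebook-enhanced class gained roughly half a pretest standard deviation more than the traditional class.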
Table 9
Relative frequency (percentages) of cognitive engagement facets by technology type (in 3 or more articles)
Facet Hardware not A&H WCT MPT KOS TBT SNT SCT ML AT All LS
Deep learning 14% 13% 8% 0% 7% 17% 33% 20% 0% 0% 50% 19%
Self-regulation 17% 13% 25% 0% 13% 17% 33% 0% 0% 25% 0% 16%
Staying on task/focus 10% 13% 8% 25% 7% 0% 17% 20% 25% 25% 0% 9%
Positive perceptions of teacher support 7% 10% 17% 25% 7% 17% 17% 20% 0% 13% 0% 8%
Follow through/care/thoroughness 7% 10% 8% 13% 13% 0% 17% 0% 25% 0% 25% 7%
Learning from peers 19% 23% 8% 38% 13% 0% 17% 20% 0% 38% 0% 23%
Critical thinking 12% 7% 8% 0% 0% 0% 33% 0% 25% 0% 0% 10%
Understanding 12% 13% 8% 0% 13% 17% 17% 20% 25% 13% 0% 4%
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KOS =
knowledge organisation and sharing tools; AT = assessment tools; SNT = social networking tools; SCT =
synchronous collaboration tools; ML = mobile learning; LS = learning software; not A&H = the rest of the
sample without Arts & Humanities
Student disengagement and educational technology in the field of arts and humanities
Student disengagement (Table 10) was found considerably less often across the sample, which could be because studies sought to identify positive engagement, although it could also reflect a form of publication or self-selection bias, given that studies with negative results are published infrequently. The three
disengagement facets most often indicated were frustration (n = 5, 11.9%), half-hearted/task incompletion
(n = 3, 7.1%), and pressured (n = 3, 7.1%).
Table 10
Student disengagement frequency descriptive statistics
Frequency Percentage M SD
Behavioural Disengagement 5 12% 2 1.41
Affective Disengagement 10 24% 1.6 0.84
Cognitive Disengagement 6 14% 1.5 0.84
Behavioural disengagement was indicated by six facets (Table 11), with the most frequent of these being
half-hearted/task incompletion (Table 12), which was the only facet identified in three or more studies. This
was related to students not completing their share of group work in blended or online classes (Asoodar et al., 2014), not contributing sufficiently to group discussion forums (Wang, 2010), or using the provided forum, chat or e-mail only superficially (Lopera Medina, 2014). In the case of group work in Wang (2010), groups were composed of students from two different colleges, with students from one college posting little or nothing, which in turn led the students from the other college to interact among
themselves – thus, half-hearted participation did not stop interaction and dialogue, but rather gave it an
unintended direction. When looking at the facets of unfocused/inattentive and distracted, it is noteworthy,
albeit not surprising, that the Internet is cited as a prime reason for losing focus and being distracted, as one
EFL student in a blended learning course pointed out (Zhang & Han, 2012). The authors conclude that students’ ability to learn autonomously needs to be strengthened, alongside greater teacher guidance.
Table 11
Top five disengagement facets across the three dimensions
Rank BD n % AD n % CD n %
1 Half-hearted/task incompletion 3 7.1% Frustration 5 11.9% Pressured 3 7.1%
2 Unfocused; Distracted 2 4.8% Disinterest; Disappointment; Worry/Anxiety; Other 2 4.8% Opposition/Rejection; Other 2 4.8%
3 Giving up; Mentally withdrawn; Poor conduct 1 2.4% – Unwilling 1 2.4%
Note. BD = Behavioural disengagement; AD = affective disengagement; CD = cognitive disengagement.
Table 12
Relative frequency (percentages) of behavioural disengagement facets by technology type (in 3 or more
articles)
Facet Hardware not A&H KO&S WCT MPT TBT SNT SCT ML AT All LS
Half-hearted/task incompletion 7% 10% 8% 13% 13% 17% 0% 20% 0% 13% 0% 8%
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; AT = assessment tools; SNT = social networking tools; SCT
= synchronous collaboration tools; ML = mobile learning; LS = learning software; not A&H = the rest of
the sample without Arts & Humanities
Four affective disengagement facets were coded alongside other (Table 11), with frustration the most
frequent (Table 13). Students were particularly frustrated by technical issues experienced with technology
(Ducate, Anderson, & Moreno, 2011), such as Google Docs being occasionally unstable and lagging during
group work (Lin & Yang, 2013), and fellow students changing the background and font colours of online
group spaces, which made writing uncomfortable for some (Asoodar et al., 2014). For one student, the lack
of an option to hand in work without technology meant that “if you didn’t have easy access to technology
or had technological difficulties, you were disadvantaged” (Mejia, 2016, p. 90) and other students were
worried that late submission as a result might mean incurring a penalty. Students using Second Life in an
undergraduate ESL course found that interacting with fellow students could also be challenging, and noted
the need for good typing skills; however, they also felt that the setting was less stressful than a regular
language class (Peterson, 2012).
Table 13
Relative frequency of affective disengagement facets by technology type (in 3 or more articles)
Facet Hardware not A&H KO&S WCT MPT TBT SNT SCT ML AT All LS
Frustration 12% 17% 25% 13% 7% 0% 0% 20% 25% 13% 25% 14%
Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S
= knowledge organisation and sharing tools; AT = assessment tools; SNT = social networking tools; SCT
= synchronous collaboration tools; ML = mobile learning; LS = learning software; not A&H = the rest of
the sample without Arts & Humanities
Students who were disinterested in using EdTech included those who were negative or cynical about the
tools used, such as Ning (Peck, 2012). Students also expressed disappointment when interactions with peers
were made difficult through technology, due to not being able to read a person’s body language and interpret
their meaning in messages (Yildiz, 2009). Students in a web-based graduate ESL course “expressed a high
degree of anxiety when they did not receive an automated score from the exercises they had submitted”
(Lopera Medina, 2014, p. 97), as some of the tasks had to be graded manually by the instructor, which was
demotivating for students. Also causing concern for one student was the use of an interactive whiteboard
in a beginners’ Chinese course, as they did not want to embarrass themselves in front of other students (Xu
& Moloney, 2011), and a lack of technology skills and experience with online communication within
educational contexts caused difficulties for others (Lopera Medina, 2014; Wang, 2010).
As with affective disengagement, there were only four facets of cognitive disengagement coded in this
sample, alongside other (Table 11), with pressured the most cited (Table 14). Three studies that explored
blended learning in ESL classes (Asoodar et al., 2014; Sun, 2014; Zhang & Han, 2012) found that students
felt pressured because “they had to devote themselves to both the online environment and the classroom
environment and they had to spend much more time on English learning, which was stressful for them”
(Zhang & Han, 2012, p. 1967). This led one student to declare that “only in the traditional classroom, can
I acquire basic English” (Sun, 2014, p. 90). Another student felt overwhelmed, as their lack of a computer
at home meant they often had to go to the university campus, which “took a lot of [their] time” (Asoodar et
al., 2014, p. 540), and the use of M-Learning in a face-to-face beginners’ Spanish course was stressful for
some students, as technical problems caused difficulties in handing in assignments (Mejia, 2016). Students
were opposed to online group work in blended classes in two studies, as they preferred to “meet face-to-
face” (Asoodar et al., 2014, p. 538), with students in a blended undergraduate Linguistics course “openly
ridicul[ing]” (Peck, 2012, p. 83) the idea that they might develop offline friendships with fellow students,
and unwilling to engage in online discussions with the lecturer.
Table 14
Relative frequency (percentages) of cognitive disengagement facets by technology type (in 3 or more
articles)
Facet Hardware not A&H KO&S WCT MPT TBT SNT SCT ML AT All LS
Discussion
Grounding arts and humanities research in theory and methodologies used
A common theme within research on student engagement has been the complexity of the construct, and in
particular, its definition and measurement. This was particularly evident within the overall sample, but more
so within this subset, as only two studies (5%) included a definition. Whilst arguments will most likely
continue over the exact nature of student engagement, it is vital that each study investigating engagement
includes a definition of the authors’ own understanding, in order to locate and frame the findings, and to ensure
easier interpretation of results (Appleton et al., 2008; Christenson, Reschly, & Wylie, 2012). It is also
advised that studies relate the aspects of engagement under investigation to the wider framework of student
engagement (see e.g., Bond & Bedenlier, 2019), and to further consider the issue of disengagement when
using EdTech, given that engagement and disengagement exist on a continuum (Pekrun & Linnenbrink-Garcia, 2012), and few studies within this sample explored the negative effects of EdTech, despite the
valuable insight for educators that such investigations could provide.
More than half of the studies in this sample did not use a theoretical framework, which mirrors current
conversations and concerns within the wider field of EdTech (e.g., Crook, 2019; Hew, Lan, Tang, Jia, &
Lo, 2019). Studies that did drew heavily on constructivism and socio-cultural theories of learning and
practice, with three quarters using social collaborative approaches, also reflective of the trend in EdTech
research (Bond, Zawacki-Richter, & Nichols, 2019). Interestingly, given the high number of studies relating
to language learning in the sample, only two studies drew on language specific theories. This perhaps
emphasises the importance now placed within research on language within social contexts and language
learning as a social endeavour (Chapelle, 2009; Thorne, Black, & Sykes, 2009), or, as Garrett (2009)
suggests, perhaps this is due to the normalisation of technology use within higher education.
Whilst the number of studies that used qualitative methods was not greatly different from the overall sample, there was nevertheless a smaller number of studies that used, for example, interviews and focus groups.
Given the large number of studies that used text-based tools (71.4%), and the focus of this sample on
language learning and use, it was also surprising that fewer studies used document analysis. As student
engagement is complex, it is important to also use data collection methods that provide thick descriptions
of student and teacher perceptions of using EdTech, rather than solely relying on quantitative data, which
often focuses more on behavioural engagement (Fredricks et al., 2004; Henrie, Halverson, & Graham,
2015). This more quantitative approach towards EdTech in A&H could perhaps be partially explained in
this sample, given the large amount of Asian studies present, and the fact that quantitative methods were
heavily used in Asian countries in the overall sample. Caution is advised, however, with using self-
developed surveys over previously validated instruments, as this can pose questions of validity and
reproducibility (Döring & Bortz, 2016).
There were also issues of missing contextual data within the studies in this sample. Without full contextual
information in empirical research, readers are unable to fully gauge whether studies could be applied to
their own context (Bond, Zawacki-Richter, & Nichols, 2019; Pérez-Sanagustín et al., 2017), and more
explicit study details should be included in future research.
The synthesised findings from the studies in this sample can be broadly read as an affirmation of the
argument that the mere use of technology as such does not make learning better, but is rather only one
factor in the design of a course or module (Tamim, Bernard, Borokhovski, Abrami, & Schmid, 2011). This
can be gathered from the fact that authentic and meaningful tasks, such as using Wikipedia to create a
contribution extending the classroom (e.g., Carver et al., 2012), were perceived by students as more enjoyable, and from the finding that collaborative activities were effective when using, for example, digital flashcards in language learning (Hung, 2015).
Whilst blended learning was used in half of the studies, students in language learning courses (e.g., Zhang
& Han, 2012) reported being somewhat opposed to the online parts of the course, feeling pressured or
overwhelmed, and preferring the face-to-face environment for learning. However, studies also found that
using blogs and discussion forums allowed students to see examples of other students’ work and therefore
model language in their own responses, which reduced anxiety. They were also likely to contribute
more if other students responded to their posts. Interestingly, whilst affective engagement appeared less
often in this A&H sample, affective disengagement occurred more frequently than behavioural and
cognitive disengagement (see Table 10). Practitioners, therefore, need to be particularly mindful of the
potential for students to become disengaged, and to be proactive in taking preventative measures.
Frustration was primarily related to technical problems and failures (e.g., Ducate et al., 2011), such as
unstable connections or programs not functioning as expected (e.g., Lin & Yang, 2013), and disengagement
also particularly manifested in students not completing group work or contributing to discussion forums
(e.g., Wang, 2010). If technology and online learning are to be used in a course, it is therefore important to
make both tasks and the use of technology valuable to students and conducive to learning goals. It is also
important to then ensure that students understand the reasons behind utilising EdTech, are taught how to
use the tools involved, and are encouraged to engage with their peers as much as possible.
Conclusion
This study synthesised 42 studies in the field of A&H, the majority of which addressed language learning,
and ESL/EFL in particular. A considerable number of studies were undertaken in East Asian countries,
with Taiwan contributing seven studies alone. This raises questions as to how this heavy regional and
disciplinary focus can be explained, as well as how EdTech is employed in other disciplines within A&H,
such as history, performing arts or religious studies (UNESCO, 2015). Also, given that regions such as
continental Europe, Africa and Oceania were only minimally present in this sample, further investigation
is encouraged within those regions, as learning is rooted in cultural contexts and occurs against specific
institutional backgrounds and learner characteristics. With more countries, and a broader range of
institutional settings and disciplines explored, a more holistic picture can potentially be gained of how
EdTech can be effectively used to enhance student engagement.
The authors sought to adhere to the principles of conducting a systematic review as closely as possible.
However, the implicit bias of having searched only English language databases and the explicit restriction
to journal articles from the years 2007 to 2016 constitute a limitation to the results of this study. In order
to capture more recent and emerging technologies, including artificial intelligence, it is suggested that this
review be updated accordingly in the future.
Funding
This work was supported by the Bundesministerium für Bildung und Forschung (German Federal Ministry
of Education and Research - BMBF) [grant number 16DHL1007].
References
Albiladi, W. S., & Alshareef, K. K. (2019). Blended learning in English teaching and learning: A review
of the current literature. Journal of Language Teaching and Research, 10(2), 232–238.
https://doi.org/10.17507/jltr.1002.03
Alshaikhi, D., & Madini, A. A. (2016). Attitude toward enhancing extensive listening through podcasts
supplementary pack. English Language Teaching, 9(7), 32. https://doi.org/10.5539/elt.v9n7p32
Appleton, J., Christenson, S. L., & Furlong, M. (2008). Student engagement with school: Critical
conceptual and methodological issues of the construct. Psychology in the Schools, 45(5), 369–386.
https://doi.org/10.1002/pits.20303
Asoodar, M., Marandi, S. S., Atai, M. R., & Vaezi, S. (2014). Learner reflections in virtual vs. blended
EAP classes. Computers in Human Behavior, 41, 533–543. https://doi.org/10.1016/j.chb.2014.09.050
Bond, M. (2019). Flipped learning and parent engagement in secondary schools: A South Australian case
study. British Journal of Educational Technology, 50(3), 1294–1319.
https://doi.org/10.1111/bjet.12765
Bond, M., & Bedenlier, S. (2019). Facilitating student engagement through educational technology:
Towards a conceptual framework. Journal of Interactive Media in Education, 2019(1): 11, 1–14.
https://doi.org/10.5334/jime.528
Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in
student engagement and educational technology in higher education: A systematic evidence map.
International Journal of Educational Technology in Higher Education, 17(1), 1–30.
https://doi.org/10.1186/s41239-019-0176-8
Bond, M., Zawacki-Richter, O., & Nichols, M. (2019). Revisiting five decades of educational technology
research: A content and authorship analysis of the British Journal of Educational Technology. British
Journal of Educational Technology, 50(1), 12–63. https://doi.org/10.1111/bjet.12730
Bower, M. (2015). A typology of Web 2.0 learning technologies. EDUCAUSE Digital Library. Retrieved
from http://www.educause.edu/library/resources/typology-web-20-learning-technologies
Bower, M. (2016). Deriving a typology of Web 2.0 learning technologies. British Journal of Educational
Technology, 47(4), 763–777. https://doi.org/10.1111/bjet.12344
Brunton, J., & Thomas, J. (2012). Information management in systematic reviews. In D. Gough, S.
Oliver, & J. Thomas (Eds.), An introduction to systematic reviews (pp. 83–106). London: SAGE.
Carver, B. W., Davis, R., Kelley, R. T., Obar, J. A., & Davis, L. L. (2012). Assigning students to edit
Wikipedia: Four case studies. E-Learning and Digital Media, 9(3), 273–283.
https://doi.org/10.2304/elea.2012.9.3.273
Chang, M. M., & Lin, M.-C. (2013). Strategy-oriented web-based English instruction – A meta-analysis.
Australasian Journal of Educational Technology, 29(2), 203–216. https://doi.org/10.14742/ajet.67
Chapelle, C. A. (2009). The relationship between second language acquisition theory and computer-
assisted language learning. The Modern Language Journal, 93(s1), 741–753.
https://doi.org/10.1111/j.1540-4781.2009.00970.x
Chapman, E. (2003). Assessing student engagement rates. Retrieved from
http://www.ericdigests.org/2005-2/engagement.html
Chen, C.‑M., & Chang, C.‑C. (2014). Mining learning social networks for cooperative learning with
appropriate learning partners in a problem-based learning environment. Interactive Learning
Environments, 22(1), 97–124. https://doi.org/10.1080/10494820.2011.641677
Cheung, S. K. S. (2015). A case study on the students’ attitude and acceptance of mobile learning. In K. C. Li, T. L. Wong, S. K. S. Cheung, J. Lam, & K. K. Ng (Eds.), Technology in education: Transforming educational practices with technology. Communications in Computer and Information Science, vol. 494 (pp. 45–54). Berlin: Springer. https://doi.org/10.1007/978-3-662-46158-7_5
Chiu, Y., Kao, C., & Reynolds, B. L. (2012). The relative effectiveness of digital game-based learning
types in English as a foreign language setting: A meta-analysis. British Journal of Educational
Technology, 43(4), E104–E107. https://doi.org/10.1111/j.1467-8535.2012.01295.x
Cho, K., Lee, S., Joo, M-H., & Becker, J. B. (2018). The effects of using mobile devices on student
achievement in language learning: A meta-analysis. Education Sciences, 8(3), 1–16.
https://doi.org/10.3390/educsci8030105
Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.) (2012). Handbook of research on student
engagement. Boston, MA: Springer. https://doi.org/10.1007/978-1-4614-2018-7
Claro, M., & Ananiadou, K. (2009). 21st century skills and competences for new millennium learners in
OECD countries (OECD Education Working Papers No. 41). Paris: OECD.
https://doi.org/10.1787/218525261154
Coates, H. (2007). A model of online and general campus‐based student engagement. Assessment &
Evaluation in Higher Education, 32(2), 121–141. https://doi.org/10.1080/02602930600801878
Crook, C. (2019). The “British” voice of educational technology research: 50th birthday reflection.
British Journal of Educational Technology, 50(2), 485–489. https://doi.org/10.1111/bjet.12757
Dolch, C., & Zawacki-Richter, O. (2018). Are students getting used to learning technology? Changing
media usage patterns of traditional and non-traditional students in higher education. Research in
Learning Technology, 26(0). https://doi.org/10.25304/rlt.v26.2038
Döring, N., & Bortz, J. (2016). Forschungsmethoden und Evaluation in den Sozial- und
Humanwissenschaften. Berlin: Springer. https://doi.org/10.1007/978-3-642-41089-5
Ducate, L. C., Anderson, L. L., & Moreno, N. (2011). Wading through the world of wikis: An analysis of
three wiki projects. Foreign Language Annals, 44(3), 495–524. https://doi.org/10.1111/j.1944-
9720.2011.01144.x
Filsecker, M., & Kerres, M. (2014). Engagement as a volitional construct. Simulation & Gaming, 45(4-5),
450–470. https://doi.org/10.1177/1046878114553569
Fredricks, J., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state
of the evidence. Review of Educational Research, 74(1), 59–109.
https://doi.org/10.3102/00346543074001059
Fredricks, J. A., Filsecker, M., & Lawson, M. A. (2016). Student engagement, context, and adjustment:
Addressing definitional, measurement, and methodological issues. Learning and Instruction, 43, 1–4.
https://doi.org/10.1016/j.learninstruc.2016.02.002
Fukuzawa, S., & Boyd, C. (2016). Student engagement in a large classroom: Using technology to
generate a hybridized problem-based learning experience in a large first year undergraduate class.
Canadian Journal for the Scholarship of Teaching and Learning, 7(1), 1–14.
https://doi.org/10.5206/cjsotl-rcacea.2016.1.7
Garcia-Sanchez, S., & Rojas-Lizana, S. (2012). Bridging the language and cultural gaps: The use of
blogs. Technology Pedagogy and Education, 21(3), 361–381.
https://doi.org/10.1080/1475939x.2012.719396
Garrett, N. (2009). Computer-assisted language learning trends and issues revisited: Integrating
innovation. The Modern Language Journal, 93(s1), 719–740. https://doi.org/10.1111/j.1540-
4781.2009.00969.x
Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size
on student outcomes in undergraduate mathematics courses. College Teaching, 60(3), 87–94.
https://doi.org/10.1080/87567555.2011.637249
Gough, D., Oliver, S., & Thomas, J. (Eds.) (2012). An introduction to systematic reviews. Thousand
Oaks, CA: SAGE.
Grgurović, M., & Hegelheimer, V. (2007). Help options and multimedia listening: Students’ use of
subtitles and the transcript. Language Learning and Technology, 11(1), 45–66. Retrieved from
https://scholarspace.manoa.hawaii.edu/bitstream/10125/44088/1/11_01_grgurovic.pdf
Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’
digital technology in university teaching and learning. Studies in Higher Education, 42(8), 1567–
1579. https://doi.org/10.1080/03075079.2015.1007946
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-
mediated learning: A review. Computers & Education, 90, 36–53.
https://doi.org/10.1016/j.compedu.2015.09.005
Hew, K. F., Lan, M., Tang, Y., Jia, C., & Lo, C. K. (2019). Where is the “theory” within the field of
educational technology research? British Journal of Educational Technology, 50(3), 956–971.
https://doi.org/10.1111/bjet.12770
Huang, C-T., & Yang, S. C. (2015). Effects of online reciprocal teaching on reading strategies,
comprehension, self-efficacy, and motivation. Journal of Educational Computing Research, 52(3),
381–407. https://doi.org/10.1177/0735633115571924
Hung, H-T. (2015). Intentional vocabulary learning using digital flashcards. English Language Teaching,
8(10), 107–112. http://dx.doi.org/10.5539/elt.v8n10p107
Hung, H-T., Yang, J. C., Hwang, G-J., Chu, H-C., & Wang, C-C. (2018). A scoping review of research
on digital game-based language learning. Computers & Education, 126, 89–104.
https://doi.org/10.1016/j.compedu.2018.07.001
Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5),
758–773. https://doi.org/10.1080/03075079.2011.598505
Kelley, K. (2018). Package “MBESS.” Retrieved from https://cran.r-project.org/web/packages/MBESS/MBESS.pdf
Kenny, M. A. (2008). Discussion, cooperation, collaboration: The impact of task structure on student
interaction in a web-based translation exercise module. Interpreter and Translator Trainer, 2(2), 139–
164. https://doi.org/10.1080/1750399X.2008.10798771
Krasnova, T., & Vanushin, I. (2016). Blended learning perception among undergraduate engineering
students. International Journal of Emerging Technologies in Learning (iJET), 11(01), 54–56.
https://doi.org/10.3991/ijet.v11i01.4901
Kupper, L. L., & Hafner, K. B. (1989). How appropriate are popular sample size formulas? The American
Statistician, 43(2), 101–105. Retrieved from https://www.jstor.org/stable/2684511
Lai, J. W.M., & Bower, M. (2019). How is the use of technology in education evaluated? A systematic
review. Computers & Education, 133, 27–42. https://doi.org/10.1016/j.compedu.2019.01.010
Lin, W-C., & Yang, S. C. (2013). Exploring the roles of Google.doc and peer e-tutors in English writing.
English Teaching: Practice and Critique, 12(1), 79–90. Retrieved from
http://files.eric.ed.gov/fulltext/EJ1017168.pdf
Lopera Medina, S. (2014). Motivation conditions in a foreign language reading comprehension course
offering both a web-based modality and a face-to-face modality. PROFILE, 16(1), 89–104.
https://doi.org/10.15446/profile.v16n1.36939
Lu, J., & Churchill, D. (2014). The effect of social interaction on learning engagement in a social
networking environment. Interactive Learning Environments, 22(4), 401–417.
https://doi.org/10.1080/10494820.2012.680966
Lu, Z., Hou, L., & Huang, X. (2010). A research on a student-centred teaching model in an ICT-based
English audio-video speaking class. International Journal of Education and Development Using
Information and Communication Technology, 6(3), 101–123. Retrieved from
https://files.eric.ed.gov/fulltext/EJ1085029.pdf
Mahatmya, D., Lohman, B. J., Matjasko, J. L., & Farb, A. F. (2012). Engagement across developmental
periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 45–63). Boston, MA: Springer.
Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and
student engagement: A critical review of the literature. International Journal of Educational
Technology in Higher Education, 14(1). https://doi.org/10.1186/s41239-017-0063-0
Selwyn, N. (2016). Digital downsides: exploring university students’ negative engagements with digital
technology. Teaching in Higher Education, 21(8), 1006–1021.
https://doi.org/10.1080/13562517.2016.1213229
Shi, Z. J., & Luo, G. F. (2016). Application of WeChat teaching platform in interactive translation
teaching. International Journal of Emerging Technologies in Learning, 11(9), 71–75.
https://doi.org/10.3991/ijet.v11i09.6113
Skinner, E., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and
everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on
Student Engagement (pp. 21–44). Boston, MA: Springer US. https://doi.org/10.1007/978-1-4614-
2018-7_2
Smith, K., & Craig, H. (2013). Enhancing the autonomous use of CALL: A new curriculum model in
EFL. CALICO Journal, 30(2), 252–278. Retrieved from
https://journals.equinoxpub.com/index.php/CALICO/article/viewFile/22957/18963
Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement:
Classroom-based practices. Journal of Engineering Education, 94(1), 87–101.
https://doi.org/10.1002/j.2168-9830.2005.tb00831.x
Song, M., & Liu, J. N. (2013). The interpersonal interaction in computer supported collaborative learning
environment. International Journal on E-Learning, 12(3), 329–351. Retrieved from
https://www.learntechlib.org/primary/p/36159/
Sun, L. (2014). Investigating the effectiveness of Moodle-based blended learning in college English
courses. International Journal of Information Technology and Management, 13(1), 83–94.
https://doi.org/10.1504/IJITM.2014.059152
Sun, Y-C. (2012). Examining the effectiveness of extensive speaking practice via voice blogs in a foreign
language learning context. CALICO Journal, 29(3), 494–506. Retrieved from
https://journals.equinoxpub.com/CALICO/article/viewFile/23722/19727
Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years
of research says about the impact of technology on learning: A second-order meta-analysis and
validation study. Review of Educational Research, 81(1), 4–28.
https://doi.org/10.3102/0034654310393361
Thorne, S. L., Black, R. W., & Sykes, J. M. (2009). Second language use, socialization, and learning in
internet interest communities and online gaming. The Modern Language Journal, 93(s1), 802–821.
https://doi.org/10.1111/j.1540-4781.2009.00974.x
Tsai, S.-C. (2012). Integration of multimedia courseware into ESP instruction for technological purposes in
higher technical education. Journal of Educational Technology & Society, 15(2), 50–61. Retrieved
from https://www.semanticscholar.org/paper/Integration-of-Multimedia-Courseware-into-ESP-for-
Tsai/95adcf4b68ea4bf60ad23841317a3ae98c12c35d
Tschirhart, C., & Rigler, E. (2009). LondonMet e-Packs: A pragmatic approach to learner/teacher
autonomy. Language Learning Journal, 37(1), 71–83. https://doi.org/10.1080/09571730802404394
United Nations Educational, Scientific and Cultural Organization Institute for Statistics (2015).
International standard classification of education: Fields of education and training 2013.
https://doi.org/10.15220/978-92-9189-179-5-en
Wang, M. (2010). Online collaboration and offline interaction between students using asynchronous tools
in blended learning. Australasian Journal of Educational Technology, 26(6), 830–846.
https://doi.org/10.14742/ajet.1045
Wu, S. Y. (2016). The effect of teaching strategies and students’ cognitive style on the online discussion
environment. Asia-Pacific Education Researcher, 25(2), 267–277. https://doi.org/10.1007/s40299-
015-0259-9
Xu, H. L., & Moloney, R. (2011). “It makes the whole learning experience better”: Student feedback on
the use of the interactive whiteboard in learning Chinese at tertiary level. Asian Social Science, 7(11),
20–29. https://doi.org/10.5539/ass.v7n11p20
Yang, Y-F., & Hsieh, P-Y. (2015). Negotiation of meaning to comprehend hypertexts through peer
questioning. Language Learning & Technology, 19(2), 69–84. Retrieved from
https://scholarspace.manoa.hawaii.edu/bitstream/10125/44418/1/19_02_yanghsieh.pdf
Yang, Y-T. C., Gamble, J., & Tang, S-Y. S. (2012). Voice over instant messaging as a tool for enhancing the oral proficiency and motivation of English-as-a-foreign-language learners. British Journal of Educational Technology, 43(3), 448–464. https://doi.org/10.1111/j.1467-8535.2011.01204.x
Yildiz, S. (2009). Social presence in the web-based classroom: Implications for intercultural communication. Journal of Studies in International Education, 13(1), 46–65. https://doi.org/10.1177/1028315308317654
Zepke, N. (2014). Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education, 19(6), 697–708. https://doi.org/10.1080/13562517.2014.901956
Zepke, N. (2018). Student engagement in neo-liberal times: What is missing? Higher Education Research & Development, 37(2), 433–446. https://doi.org/10.1080/07294360.2017.1370440
Zhang, H., Song, W., Shen, S., & Huang, R. (2014). The effects of blog-mediated peer feedback on learners’ motivation, collaboration, and course satisfaction in a second language writing course. Australasian Journal of Educational Technology, 30(6), 670–685. https://doi.org/10.14742/ajet.860
Zhang, W., & Han, C. (2012). A case study of the application of a blended learning approach to web-based college English teaching platform in a medical university in eastern China. Theory and Practice in Language Studies, 2(9), 1961–1970. https://doi.org/10.4304/tpls.2.9.1961-1970
Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under a Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.
Please cite as: Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts and humanities. Australasian Journal of Educational Technology, 36(4), 126–150. https://doi.org/10.14742/ajet.5477
Appendix A
Facets of engagement and disengagement
Facets of student engagement
Appendix B
Educational technology tool typology, based on Bower (2016)
Text-based tools: Discussion forums, Collaborative writing tools, Readings, Newsletter, Text, RSS, Interactive textbook, Annotation tools, Email, Chat, Instant messaging, Wikis
Multimodal production tools: Animations, Tutorials, Recorded lectures, Videos, Podcast/Vodcast, Screencast, Authoring tools, Voice recorder
Website creation tools: Blogs, ePortfolios, LMS
Knowledge organisation and sharing tools: Cloud storage, Bookmarking
Data analysis tools: Learning analytics dashboard, Diary tool in Moodle
Digital storytelling tools: Storyboards
Assessment tools: eAssessment, Quizzes, ARS, Open badges, Games
Social networking tools: Social platforms, Microblogging
Synchronous collaboration tools: Audio-video conferencing
Mobile learning tools: Apps, mLearning, Games
Appendix C
Arts and humanities study characteristics in this sample
Author | Year | Journal | Citations | Field of study
Alshaikhi & Madini | 2016 | English Language Teaching | 2 | English
Asoodar et al. | 2014 | Computers in Human Behavior | 7 | English
Carver et al. (Study 1) | 2012 | Learning and Media | 7 | Introduction to the Study of the Arab World
Carver et al. (Study 2) | 2012 | Learning and Media | 7 | Woman Health & Human Rights
Chen & Chang | 2014 | Interactive Learning Environments | 28 | Chinese
Cheung | 2015 | Communications in Computer and Information Science | 11 | Various languages
Ducate et al. | 2011 | Foreign Language Annals | 77 | Various languages
Fukuzawa & Boyd | 2016 | The Canadian Journal for the Scholarship of Teaching and Learning | 5 | Anthropology
García-Sánchez & Sol Rojas-Lizana | 2012 | Technology, Pedagogy and Education | 23 | English
Grgurović & Hegelheimer | 2007 | Language Learning and Technology | 175 | English
Huang & Yang | 2015 | Journal of Educational Computing Research | 18 | English
Hung | 2015 | English Language Teaching | 13 | English
Kenny | 2008 | Interpreter and Translator Trainer | 19 | Translation studies
Krasnova & Vanushin | 2016 | International Journal of Emerging Technologies in Learning | 9 | English
Lin & Yang | 2013 | English Teaching: Practice and Critique | 20 | English
Lopera Medina | 2014 | PROFILE: Issues in Teachers' Professional Development | 13 | English
Lu & Churchill | 2014 | Interactive Learning Environments | 50 | Literature
Lu et al. | 2010 | International Journal of Education and Development using Information and Communication Technology | 47 | English
Mejia | 2016 | Lfe-Revista De Lenguas Para Fines Especificos | 1 | Spanish
Nielsen | 2013 | JALT CALL Journal | 11 | English
Orawiwatnakul & Wichadee | 2016 | Turkish Online Journal of Educational Technology | 5 | English
Pattanapichet & Wichadee | 2015 | Turkish Online Journal of Distance Education | 7 | English
Peck | 2012 | Asian Social Science | 21 | Linguistics
Peterson | 2012 | ReCALL | 76 | English
Peterson | 2009 | Computer Assisted Language Learning | 83 | English
Ramamuruthy & Rao | 2015 | Malaysian Online Journal of Educational Technology | 11 | English
Sauro | 2009 | Language Learning & Technology | 225 | English
Shi & Luo | 2016 | International Journal of Emerging Technologies in Learning | 6 | English
Smith & Craig | 2013 | CALICO Journal | 20 | English
Song & Liu | 2013 | International Journal on E-Learning | 0 | Literature
Sun | 2012 | CALICO Journal | 11 | English
Sun | 2014 | International Journal of Information Technology and Management | 33 | English
Tsai | 2012 | Journal of Educational Technology & Society | 22 | English
Tschirhart & Rigler | 2009 | Language Learning Journal | 10 | Foreign language
Wang | 2010 | Australasian Journal of Educational Technology | 77 | English
Wu | 2016 | Asia-Pacific Education Researcher | 1 | Chinese
Xu & Moloney | 2011 | Asian Social Science | 25 | Chinese
Yang et al. | 2012 | British Journal of Educational Technology | 34 | English
Yang & Hsieh | 2015 | Language Learning & Technology | 2 | English
Yildiz | 2009 | Journal of Studies in International Education | 54 | English
Zhang & Han | 2012 | Theory and Practice in Language Studies | 16 | English
Zhang et al. | 2014 | Australasian Journal of Educational Technology | 18 | English