
Education Tech Research Dev (2013) 61:563–580

DOI 10.1007/s11423-013-9305-6

DEVELOPMENT ARTICLE

Flipping the classroom and instructional technology integration in a college-level information systems spreadsheet course

Randall S. Davies • Douglas L. Dean • Nick Ball

Published online: 11 June 2013


© Association for Educational Communications and Technology 2013

Abstract The purpose of this research was to explore how technology can be used to
teach technological skills and to determine what benefit flipping the classroom might have
for students taking an introductory-level college course on spreadsheets in terms of student
achievement and satisfaction with the class. A pretest posttest quasi-experimental mixed
methods design was utilized to determine any differences in student achievement that
might be associated with the instructional approach being used. In addition, the scalability
of each approach was evaluated along with students’ perceptions of these approaches to
determine the effect each intervention might have on a student's motivation to learn. The
simulation-based instruction tested in this study was found to be an extremely scalable
solution but less effective than the regular classroom and flipped classroom approaches in
terms of student learning. While students did demonstrate learning gains, the process focus
of the simulation’s instruction and assessments frustrated students and decreased their
motivation to learn. Students' attitudes towards the topic, their willingness to recommend the
course to others, and the likelihood that they would take another course like this were
considerably lower than those of students in the flipped or regular classroom situations.
The results of this study support the conclusion that a technology enhanced flipped
classroom was both effective and scalable; it better facilitated learning than the simulation-
based training and students found this approach to be more motivating in that it allowed for
greater differentiation of instruction.

R. S. Davies (✉)
Instructional Psychology and Technology, McKay School of Education, Brigham Young University,
Provo, UT, USA
e-mail: randy.davies@byu.edu

D. L. Dean · N. Ball
Information Systems, Marriott School of Management, Brigham Young University, Provo, UT, USA
e-mail: doug_dean@byu.edu
N. Ball
e-mail: nick_ball@byu.edu


Keywords Technology integration · Technology simulations · Computer-aided
instruction · Differentiated instruction

Introduction

The primary goal of education is for students to learn. In order to maximize learning, most
educators hope to personalize instruction for their students, which generally includes
identifying the needs and capabilities of individual learners, making instruction relevant
and meaningful, and providing flexibility in scheduling, assignments, and pacing (Keefe
2007). The goal of personalizing instruction usually means replacing traditional models of
education with customized instruction. Traditional classrooms cannot always provide this
type of differentiation, which has led some educators to recommend a blended learning
environment (Dziuban 2004; Garrison and Kanuka 2004; Cornelius and Gordon 2008;
Verkroost et al. 2008; Patterson 2012), which incorporates technology in an effort to flip
the classroom (Bergmann and Wilie 2012; Friedman and Friedman 2001).
A simple description of flipping the classroom is to replace after-lecture homework with
the expectation that students study course material prior to class (Alvarez 2011; Moravec
et al. 2010). Class time is then dedicated to practice assignments, targeted remedial help, or
activities designed to promote higher order thinking skills (Khan 2012). As an instructional
technique, not all aspects of flipping the classroom are particularly new. For example, in
some traditional classroom-based teaching approaches, teachers expect students to come to
class prepared (i.e., having read assigned materials prior to class). What has recently
captured the imagination of many teachers who have considered the benefits of flipping is
the fact that advances in instructional technology have greatly enhanced our ability to
make this a functional reality by increasing ubiquitous access to learning resources and
thus the capacity to provide enriched resources to all students (Woolf 2010). For example,
some traditional textbooks are being supplemented with internet resources, videos, and
simulations. Many educators are promoting both the use of technology and some version of
an inverted classroom often in a blended learning environment (Alvarez 2011; Berrett
2011; Dziuban 2004; Fulton 2012; Graham 2006; Hughes 2012; Khan 2012; Kleiman
2000; Novak 2011; Talbert 2012). However, much of the research regarding the specific
pedagogical approach of flipping the classroom is only beginning to be published and is
often based on contextually situated learning circumstances (see Lage et al. 2000; Moravec
et al. 2010; Strayer 2007, 2012). Certainly, teachers need to know how to best integrate
technology into the learning process because how it is used matters. Moreover, it is
important to understand the challenges and what benefits are derived from each approach
(Bergmann and Sams 2012; Davies 2011).
The purpose of this research was to explore how technology can be used to teach tech-
nological skills and to determine what benefit flipping the classroom might have for students
taking an introductory-level college course on spreadsheets. We were particularly interested
in how technology might be utilized to effectively promote learning in this type of course.

Literature review

Basic concept of a flipped classroom

The idea of flipping the classroom is not new (Pardo et al. 2012), but the idea has recently
gained prominence due to advances in technology and increased ubiquitous access to
computers and other mobile devices (Davies and West 2013). Advances in technology
allow teachers to provide online instructional videos and to benefit from online assessment
systems (Friedman and Friedman 2001; Woolf 2010). While many online resources are
being developed (see for example Carnegie or Cengage Learning), one of the most
prominent examples of online resources for flipping the classroom is the Khan Academy, a
generously funded project that provides open educational video resources on a variety of
subjects (see http://www.khanacademy.org/).
The idea of flipping the classroom with resources like the Khan Academy is simple
(Khan 2012). Rather than the teacher providing synchronous in-class group instruction,
students are expected to use the video resources provided, along with other materials, to
learn concepts and complete tasks on their own at their own pace and at locations
convenient to the student. Individual students can focus their efforts on their individual
learning needs so that they are not left behind by class discussions that go too fast or
become bored by class time that is spent covering content they already know.
This approach allows the teacher to use class time in different ways, such as adapting
time allocation based on reports of where students need help. Students can participate in
in-class discussions or receive remedial assistance on things they were not able to learn on
their own. Moreover, students need attend class only if they need help beyond what is
provided by the other learning resources, resulting in time savings for those who do not
need in-class help.
There are a variety of ways that teachers implement a flipped classroom (Hughes 2012),
but the concept is basically the same (Bergmann and Wilie 2012; Berrett 2011; Talbert
Direct instruction is blended with constructivist learning pedagogies so that
individualized differentiated learning is facilitated. Learning is not limited to the classroom,
and students can move at their own pace and direct their efforts based on their own
individual needs, thus personalizing instruction. Students are expected to take
responsibility for their own learning. The teacher's role as a course designer shifts somewhat from
structuring in-classroom time to providing learning resources that can be consumed
asynchronously as needed.
Switching from a traditional classroom to a flipped classroom can be daunting because
of the lack of accessible, effective models for accomplishing it. However, effective flipped
classrooms share a few important characteristics: (1) Students transform from passive
listeners to active learners, (2) technology often facilitates the endeavor, (3) class time and
traditional homework time are exchanged so that homework is done first and class time
takes on a fluid structure to help personalize instruction, (4) content is given context as it
relates to real-world scenarios, and (5) class time is used either to help students grasp
especially challenging concepts or to help students engage in higher orders of critical
thinking and problem solving (Bergmann and Wilie 2012).
One of the unintended side effects of current accountability policies for teachers in the
USA is that students have been released from much of the responsibility to learn (Price
2012). Flipping the classroom assumes students will take control of their learning in terms
of the pace of study, mastery of content, and responsibility for coming to class prepared
(Alvarez 2011; Fulton 2012).

Role of technology in flipping the classroom

As an instructional tool, technology can facilitate learning in a number of ways. Two
specific areas in which technology can be extremely valuable are presenting content and
assessing achievement. Because one of the instructional approaches in this study used a
simulation, and one of the purposes of this study was to determine the degree to which
flipping the classroom might help personalize or differentiate instruction, we have provided
some background on these topics.
The internet and advances in streaming video have greatly improved students’ access to
information, as they are able to access instruction from anywhere, something that was
previously relegated to the classroom. Because of these technological advances, educators
are also able to utilize a variety of instructional tools students can use outside the
classroom, including computer simulations. A simulation is a representation or model of a
real-world system. Simulations are needed when working in an authentic real-world setting is
unsafe or impractical (Alessi 2000; Jacobs and Dempsey 1993). To make a simulation
educational it is necessary for the designer to consider the instructional objectives of the
instruction (Quinn 2005). The U.S. Department of Defense (1997) identified three types of
simulations for a designer to consider: live simulations (i.e., real people acting out
situations), virtual simulations (i.e., real people controlling avatars), and constructive
simulations (i.e., people interacting within a simulated system). In addition to the type of
simulation required, an instructional designer must determine the kind of fidelity required
to best support the learning objectives. Fidelity refers to the degree to which the simulator
accurately represents the objects and tasks in the real world. Fidelity can be applied to
different aspects of the simulation. For example, deciding how realistic the physical
environment needs to be is referred to as the physical fidelity of the simulation. Cognitive
fidelity, on the other hand, refers to how realistic the tasks need to be (Jacobs and Dempsey
1993; Hays and Singer 1989; Alessi 1988, 2000). Instructional decisions should be
informed by the nature of the learning outcomes (e.g., whether the learning requires the
student to perform a task in a specific environment or whether they must become proficient
at a cognitive task unbounded by the physical setting). If the learning objective requires
performance in a specific setting, the simulator must have high physical fidelity (i.e., the
situation should be very similar to the real-world setting). If the learning objectives are
primarily cognitive in nature, it is important that the simulator have high cognitive fidelity
(i.e., the type of problem the student is asked to engage in is more important than the
authenticity of the physical setting in the simulation). The success of the simulation will be
determined by whether it helps students accomplish the intended learning outcomes (Quinn
2005; Gatto 1993).
One way to improve learning (i.e., help students accomplish intended learning
objectives) is to differentiate the instruction. The purpose of differentiation is to meet the
learning needs of individual students. Personalized instruction for students generally
includes identifying the capabilities of individual learners, providing flexibility in the pace
and content being presented, and making instruction meaningful for the individual student
(Keefe 2007). The idea of differentiated instruction is not new (Keefe and Jenkins 2002;
Tomlinson 2003); however, the potential for technology to facilitate differentiation is
appealing (Woolf 2010). In terms of technology integration, technology-enabled
assessment can help instructors obtain diagnostic and formative information about students in
order to customize instruction (Cizek 2010; Keefe 2007; Marzano 2009). This allows the
instruction to be differentiated or personalized for individual students. When feedback is
received on students’ mastery of specific skills, students can better direct their own
learning efforts. As teachers utilize technologies to automate or eliminate time-consuming
tasks, they are able to more effectively differentiate the instruction.
Many factors are required for technology-enabled differentiated instruction to become a
reality. Much of the educational software currently being used in schools focuses on
content delivery. It does not necessarily customize instruction to the individual needs of the
learner. Computer programs used in schools have primarily involved drill and practice
programs for math. Improving basic word processing skills (i.e., typing) is also a prevalent
technology-facilitated instructional activity taking place in schools (Ross et al. 2010).
These educational software programs are intended to supplement the work of teachers
rather than replacing them and are typically not integrated directly into classroom
instruction. The current efforts to personalize instruction with technology have focused on
managing learning (Woolf 2010).

Research purpose and questions

The purpose of this study was to examine the effectiveness and feasibility of flipping a
college course designed to teach introductory spreadsheet skills when compared to the
traditional classroom approach. The first research question was: Does the instructional
approach impact learning effectiveness? Learning effectiveness was determined by stu-
dents' achievement in the course. The second research question examined students'
perceptions of a flipped approach relative to a traditional classroom approach. We specifically
considered: (a) How much did students perceive they learned, (b) how much value did they
attribute to the course, and (c) did the course impact their attitudes towards the course and
the topic? The motivation for conducting this study was based on the expectation that
flipping the classroom would benefit student learning.

Methods

This research used a pretest posttest quasi-experimental research design with a cross-case
comparative approach to the data analysis. Descriptive and inferential statistics were used
to determine the magnitude of any differences found between and within groups. Survey
data supplemented assessment data to help researchers better interpret and understand the
results. The setting for the study was an introductory MS Excel class taught by the
Information Systems Department in the Marriott School of Management at Brigham
Young University.

Description of instructional approaches

This section describes the three instructional approaches tested as treatments in this study:
(1) traditional instruction in the form of large-group classroom-based lectures, (2)
technology-driven independent study using MyITLab videos and software simulation, and (3)
technology-enabled independent study using videos with classroom support (a flipped
classroom).
Some aspects of the three treatments were similar: All three covered the same MS
Excel functions and features, and all used a textbook, syllabus, assignments, and
exams. Aspects of the treatments that differed are shown in Table 1. It should be
noted that while there are other software programs which simulate or model MS
Excel, MyITLab is the software solution the Information Systems (ISys) Department
has been using for the past several years. Its selection was one of convenience for
this study, but it is a well-established learning management system which has been
available for many years.


Table 1 Differences in treatments in this study

Treatment   | Motivation & conceptual enrichment lectures | Functions and menus | Optional remedial | Focus of automated assessment | Work environment
Traditional | Provided in classroom                       | Classroom           | No                | Results                       | MS Excel
Simulation  | None                                        | Videos              | No                | Process                       | Simulation
Flipped     | Videos                                      | Videos              | Yes               | Results                       | MS Excel

Form and content of instruction

In the traditional approach, instruction was provided in the classroom. In the simulated and
flipped treatments, instruction was provided through videos. In the traditional and flipped
treatments, additional lecture material was included to provide motivation and supply
instruction for thinking logically about the Excel problems from a broader perspective: (1)
In the classroom for the traditional treatment, and (2) as part of the video-based lectures in
the flipped treatment. In the simulation approach, motivational materials and additional
problem-solving instruction were not provided. In the flipped treatment, lectures were held
that were optional for students. Those who attended had questions or wanted help.

Focus of automated assessment

In all three treatments, students were assessed through an automated grader. This grading
system has been used by the ISys Department for several years as the primary assessment
of student achievement in the traditional classroom sections of this course. Some tasks in
Excel require the user to enter the right formula with the right parameters to get the correct
outcome: thus there was one correct way to do the work. Other tasks, like applying
formatting to a range of cells, can be accomplished in multiple correct ways. The
automated graders used in this study were either process-based or results-based. The simulated
environment used a process-based approach: The automated grader gave credit only when
a student used the approach prescribed by the training materials for accomplishing a task.
The traditional and flipped treatments used a results-based approach by which credit was
given for the right outcome even when a student used an approach different from that
prescribed in the instructional materials.
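To make the distinction concrete, here is a minimal sketch, in Python, of the two grading philosophies. It is illustrative only: the paper does not describe MyITLab's internals, and the task, function names, and action log below are hypothetical.

```python
# Illustrative sketch only: contrasts process-based and results-based grading
# using a hypothetical cell-formatting task. Not MyITLab's actual logic.

def grade_by_process(action_log, prescribed_steps):
    """Process-based: credit only if the student's actions match the prescribed steps."""
    return action_log == prescribed_steps

def grade_by_results(workbook_state, expected_state):
    """Results-based: credit whenever the final workbook state is correct."""
    return all(workbook_state.get(cell) == value
               for cell, value in expected_state.items())

# A student bolds cells via the keyboard shortcut instead of the Home ribbon.
actions = ["select A1:A10", "press Ctrl+B"]
prescribed = ["select A1:A10", "open Home ribbon", "click Bold"]
final_state = {"A1:A10 bold": True}

print(grade_by_process(actions, prescribed))                  # False: path differs
print(grade_by_results(final_state, {"A1:A10 bold": True}))   # True: outcome correct
```

Under a process-based grader the first check fails even though the workbook ends up correct, which is exactly the mismatch the simulation students complained about later in the paper.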

Student-work environment

In the traditional and flipped treatments, students did homework and exams in the MS
Excel program. In the simulated treatment, students did homework assignments and
exams in the MyITLab environment, being expected to learn how to perform
MS Excel tasks through the simulation, not the actual software program.

Class process

In all three treatments, students were asked to read the textbook materials before
attempting to complete the homework problems. In the traditional classroom treatment, the
teacher provided instruction in the classroom. Students were expected to come to class,
listen to the presentations, and ask questions. Then before the next class, they completed
homework tasks on their own in MS Excel. In the simulation-based treatment, students did
not attend class but completed their homework problems and assessments in the simulated
environment. In addition to reading the textbook, students could watch videos
demonstrating how to accomplish a task. In the flipped classroom approach, in addition to reading
the textbook, students could watch videos that demonstrated how to accomplish tasks. The
videos also provided motivation segments and additional instruction about how to think
about the problems. In the flipped classroom, students had the option to attend class to
receive assistance and instruction as needed. In each situation, students’ participation was
voluntary. Students were free to participate to the degree they deemed necessary in order to
complete the required assignments and pass the course assessments.

Participants

Subjects in this research were undergraduate students taking the introductory spreadsheet
course at Brigham Young University during the winter semester of 2012, which for this
course was divided into two 5-week terms. Participants in the regular classroom experience
were taught during the first term; participants in the flipped classroom and simulation
groups were enrolled during the second. Institutional review board procedures prohibited
us from compelling students to participate in each and every aspect of the study. As a
result, not all students completed all the weekly surveys and course assessments. Table 2
presents participation rates, including the number of students taking the course as a
required part of their program and the number who selected it as an optional course. Of the
original 301 students invited to participate, 207 completed at least some elements of the
data collection. Of these, 190 completed both the pretest and posttest opinion surveys; 92
completed all five of the weekly effort-tracking surveys; and 188 completed both the
pretest and posttest MS Excel assessments.
Group similarities were determined based on the amount of previous training students
reported. Based on this analysis it was concluded that student participants in each of the
three groups had similar amounts of previous spreadsheet training, χ²(6, n = 209) = 5.1,
p = 0.531.
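For readers who want to reproduce this kind of group-similarity check, the sketch below runs a chi-squared test of independence with SciPy. The reported df = 6 is consistent with a 3 × 4 contingency table (three treatment groups by four levels of prior training), but the counts themselves are invented placeholders, since the paper does not publish the underlying table.

```python
# Hypothetical sketch of the group-similarity check. The paper reports
# χ²(6, n = 209) = 5.1, p = 0.531; df = 6 is consistent with a 3 x 4 table.
# The counts below are invented placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([
    # none, little, moderate, extensive prior training (hypothetical)
    [20, 25, 15, 5],   # regular classroom
    [18, 22, 14, 6],   # flipped classroom
    [28, 30, 20, 6],   # Excel simulation
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}, n = {counts.sum()}) = {chi2:.1f}, p = {p:.3f}")
```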

Data collection and analysis

To answer the primary question regarding students' achievement, a repeated measures
ANOVA was used to identify any statistically significant differences. The pretest and
posttest assessments were identical tests designed to assess student realization of the
learning outcomes of the course. This test was the summative assessment instructors from
the regular classroom group typically gave to students upon completion of the course.
Table 2 Participation rates

Treatment type    | Invited to participate | Number of respondents | Response rate (%) | Course was required | Course was optional
Regular classroom | 67                     | 65                    | 97                | 49                  | 16
Flipped classroom | 61                     | 53                    | 87                | 34                  | 19
Excel simulation  | 173                    | 89                    | 51                | 49                  | 40
Total             | 301                    | 207                   | 69                | 132                 | 75


Following IRB requirements, the pretest was optional for all students. The posttest
assessment was required for the regular and flipped classrooms as it was a regular part of
the instruction; however, the posttest was optional for the Excel simulation group, as the
simulation software typically administered its own version of the final exam. Only the data
from those students who voluntarily agreed to participate in the study were included.
Pretest and posttest surveys along with weekly online surveys were used to answer the
question of how students perceived the experience. In addition, results from the end-of-
course official course evaluations were obtained. In the post-test survey, students were
asked to rate their perceptions about how much they had learned in the course, how
valuable they thought the course was, whether they would recommend the course to
another student, and whether they would take another Information Systems course.
Differences in response distributions were compared using either an ANOVA or a
non-parametric Chi-squared analysis, depending on the type of data being evaluated. In weekly
effort-tracking surveys, students were asked how much time they had spent on various
learning activities (study time, homework time, and time getting help) and how valuable
these activities had been in terms of learning the material. Survey results were disaggre-
gated for comparison by instructional approach.

Limitations

As is the case with all research, several limitations in the design of this research exist. For
example, the inability to require students to participate is an unavoidable limitation in
educational research. We did however obtain a 69 % response rate, which is respectable.
Still, to maintain a valid comparison, only those who completed the assessments
administered for this study were used in the analysis. That is, only those who agreed to participate
and completed both the pre and posttests were included.
There is also some controversy over whether it is appropriate to use parametric sta-
tistical methods with the type of data obtained from the response scale used (i.e., whether
the response scale produces ordinal or interval-level data). Given this concern, we ran both
parametric and non-parametric statistical analyses. The results were very similar (i.e., they
yielded identical interpretations), so we decided to report the more powerful parametric
analysis in the paper where appropriate.
Another issue facing researchers when attempting to attribute an effect to a specific
intervention is that of potential confounding variables. It is possible that results obtained
from this study may have been affected by any number of extraneous variables (e.g.,
student engagement or effort). While random assignment was impossible and we did not
have direct empirical data on these factors, we did obtain self-reported information from
students regarding their class attendance for those sections that held class. We also
collected self-report data regarding how much time students spent studying learning materials
each week. These may not be adequate measures of student effort (i.e., the amount of time
students spent engaged fully in the specific learning materials) nor do they represent all the
potential confounding variables. However, they do give us a basis for considering the
comparability of the groups. The fact that students indicated that they spent similar
amounts of time on each type of learning activity across all three treatments reduced
concerns over potential confounding factors.
Another potential limitation regarding results analysis was the fact that comparison
groups were of unequal sample size. Fortunately the statistical tests used are fairly robust.
We also examined whether the students in the comparison groups differed with respect to
their achievement on the pretest and found no significant difference between groups. This
reduced our concerns about possible biases based on unequal sample sizes.
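A pretest-equivalence check of this kind can be sketched as a one-way ANOVA. The example below uses SciPy; the simulated scores are placeholders loosely shaped like the pretest summary later reported in Table 3, not the study's data.

```python
# Hypothetical sketch of the pretest-equivalence check described above.
# The arrays mimic Table 3's pretest summary (regular n = 61, flipped n = 51,
# simulation n = 76) but are randomly generated placeholders.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
regular = rng.normal(31.8, 21.1, 61).clip(0, 100)
flipped = rng.normal(38.9, 27.1, 51).clip(0, 100)
simulation = rng.normal(37.7, 26.2, 76).clip(0, 100)

f_stat, p_value = f_oneway(regular, flipped, simulation)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # a non-significant p suggests comparable groups
```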
While we could not design around these limitations, we tested for potential bias where
possible and acknowledge the fact that limitations exist. We also recognize that response
patterns from this type of action research can often reveal fairly accurate indications of
value and benefit (Nolen and Putten 2007).

Results

An explanatory mixed methods approach was used to explore and interpret research
findings for this study. Quantitative results were used to identify patterns and themes,
followed by an analysis of qualitative evidence from surveys and observations to help
explain and better understand the findings.

Student achievement

A 3 × 2 mixed-design, repeated measures ANOVA was used to examine the effects of
instruction type (i.e., regular classroom, flipped classroom, and Excel simulation) and time
(i.e., pretest and posttest) on test scores. The assumptions of independence, normality and
homogeneity were tested and found to be adequate for using this procedure. The main
effects of instruction type were not significant, F(2,185) = 1.5, p = 0.223, indicating that
scores on the pretest and posttest were statistically similar for students participating in
these three modes of instruction. However, the main effect of time and the time by
instruction type interaction were both significant, F(1,185) = 731.2, p < 0.001, η² = 0.80
and F(2,185) = 3.53, p = 0.031, η² = 0.04, respectively. Test scores did improve
significantly from pretest to posttest for all students, but scores from the Excel simulation
group did not increase to the same degree as those from students in the regular classroom
and flipped classroom groups (see Fig. 1).
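As a hedged illustration of this analysis, the sketch below runs a 3 × 2 mixed-design ANOVA (instruction type between subjects, test occasion within subjects) using the pingouin library. The library choice is an assumption, since the paper does not name its statistical software, and the scores are simulated placeholders.

```python
# Hypothetical sketch of a 3 x 2 mixed-design ANOVA like the one reported above.
# pingouin is an assumed tool; the simulated scores are placeholders.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
rows = []
for group, n, pre_mu, post_mu in [("regular", 61, 31.8, 85.4),
                                  ("flipped", 51, 38.9, 88.9),
                                  ("simulation", 76, 37.7, 80.4)]:
    for i in range(n):
        sid = f"{group}-{i}"
        rows.append({"id": sid, "group": group, "time": "pretest",
                     "score": rng.normal(pre_mu, 20)})
        rows.append({"id": sid, "group": group, "time": "posttest",
                     "score": rng.normal(post_mu, 15)})
df = pd.DataFrame(rows)

# Between-subjects factor: instruction type; within-subjects factor: test occasion.
aov = pg.mixed_anova(data=df, dv="score", within="time",
                     subject="id", between="group")
print(aov[["Source", "F", "p-unc", "np2"]])
```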
While pretest scores for the Excel simulation group were similar to or slightly higher
than those of the regular classroom and flipped classroom groups, posttest scores for the
Excel simulation group averaged 8.5 and 5 points lower than posttest scores from the
flipped classroom and the regular classroom, respectively (see Table 3).

[Figure 1: line chart of average pretest and posttest scores (y-axis: Average Score, 20-100)
for the Regular Classroom, Flipped Classroom, and Excel Simulation groups; the simulation
group's line rises less steeply from pretest to posttest.]

Fig. 1 Comparison of pretest and posttest results by instruction type showing a different rate of
improvement for students in the MS Excel simulation group

Student perceptions of the course

Table 4 shows the results of students' end-of-semester evaluation of the overall course
and overall instructor. An ANOVA comparison of the three means for overall course
was significant, F(2,256) = 10.45, p < 0.001. Tukey's HSD pairwise test, adjusted for
unequal sample sizes, found the regular and flipped treatments were not significantly
different, but both treatments were significantly higher than the simulated treatment
(p < 0.05). The comparison of the three means for overall instructor was also
significant, F(2,256) = 19.57, p < 0.01. Tukey's HSD pairwise test, adjusted for unequal
sample sizes, again found the regular and flipped treatments were not significantly
different, but both treatments were significantly higher than the simulated treatment
(p < 0.05).
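The pairwise follow-up can be sketched with statsmodels' Tukey HSD, which applies the Tukey-Kramer adjustment for unequal group sizes, matching the correction the authors describe. The ratings below are placeholders on the paper's 8-point scale, not the study's data.

```python
# Hypothetical sketch of the Tukey HSD follow-up test described above.
# pairwise_tukeyhsd uses the Tukey-Kramer adjustment for unequal group sizes.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(2)
ratings = np.concatenate([
    rng.normal(6.8, 1.0, 35),   # regular classroom (placeholder ratings)
    rng.normal(7.0, 0.9, 37),   # flipped classroom
    rng.normal(6.1, 1.4, 198),  # Excel simulation
]).clip(1, 8)
groups = ["regular"] * 35 + ["flipped"] * 37 + ["simulation"] * 198

print(pairwise_tukeyhsd(ratings, groups, alpha=0.05))
```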
An analysis of students’ perception of learning confirms that students using the Excel
simulation approach were less likely to feel they had learned a lot from the course (see
Table 5). Students in the regular and flipped classrooms responded similarly to each other
regarding how much they felt they had learned.

Table 3 Pretest and posttest results by instruction type

Test occasion     | n   | Mean | SD   | Min. score | Max. score
Pretest
All               | 188 | 36.1 | 25.0 | 0          | 100
Regular classroom | 61  | 31.8 | 21.1 | 0          | 96
Flipped classroom | 51  | 38.9 | 27.1 | 0          | 100
Excel simulation  | 76  | 37.7 | 26.2 | 0          | 100
Posttest
All               | 188 | 84.3 | 17.5 | 17         | 100
Regular classroom | 61  | 85.4 | 16.1 | 27         | 99
Flipped classroom | 51  | 88.9 | 15.3 | 17         | 100
Excel simulation  | 76  | 80.4 | 19.2 | 28         | 100

Table 4 Results from students' end-of-semester course evaluations by instruction type

                   | n   | Mean | SD
Overall course
Regular classroom  | 35  | 6.8  | 1.0
Flipped classroom  | 37  | 7.0  | 0.9
Excel simulation   | 198 | 6.1  | 1.4
Overall instructor
Regular classroom  | 35  | 7.2  | 0.9
Flipped classroom  | 37  | 7.4  | 0.9
Excel simulation   | 198 | 6.3  | 1.3

Responses based on an 8-point scale, with 1 as extremely poor and 8 as outstanding


Table 5 Posttest results for the question "I learned a lot taking this course"

Response options       | Regular classroom (n = 59) | Flipped classroom (n = 54) | Excel simulation (n = 77) | Total (n = 190)
Very strongly agree    | 29 (49 %) | 30 (56 %) | 18 (23 %) | 77 (41 %)
Strongly agree         | 21 (36 %) | 15 (28 %) | 31 (40 %) | 67 (35 %)
Agree                  | 7 (12 %)  | 6 (11 %)  | 17 (22 %) | 30 (16 %)
Somewhat agree         | 2 (3 %)   | 2 (4 %)   | 9 (12 %)  | 13 (7 %)
Somewhat disagree      | 0 (0 %)   | 1 (2 %)   | 0 (0 %)   | 1 (1 %)
Disagree               | 0 (0 %)   | 0 (0 %)   | 2 (3 %)   | 2 (1 %)
Strongly disagree      | 0 (0 %)   | 0 (0 %)   | 0 (0 %)   | 0 (0 %)
Very strongly disagree | 0 (0 %)   | 0 (0 %)   | 0 (0 %)   | 0 (0 %)

Response distributions were significantly different by group, χ²(10, n = 190) = 24.2, p = 0.007

Over 80 % of the students from these classes marked strongly agree or very strongly agree to indicate that they had learned a lot,
compared to only 63 % from the Excel simulation group who indicated these responses.
Student responses to this question were consistent with the results from the end-of-
semester course evaluations collected for these courses by the university. In response to the
question ‘‘How much did you learn from this course?’’ the mean was 6.9 for both flipped
and regular treatments and was 6.1 for the simulation treatment (on a scale on which 1
represents extremely little and 8 indicates a great deal). An ANOVA comparison of the
three means was significant, F(2,256) = 8.77, p < 0.001. Tukey's HSD pairwise test,
adjusted for unequal sample sizes, found that the regular and flipped treatments were not
significantly different, but both were significantly higher than the simulated treatment
(p < 0.05).
Based on results from the posttest survey for this study, students in the regular and
flipped classrooms were much more likely to feel the class was extremely valuable
compared with students in the Excel simulation class (see Table 6).
Students in the regular and flipped classrooms were much more likely to indicate a
willingness to recommend this course to another student compared with students in the
Excel simulation class (see Table 7).
Students in the flipped classroom were much more likely to indicate a willingness to
take another Information Systems course than were students in both the regular and
simulation-based classes (see Table 8).

Table 6 Posttest results for the question "How valuable was this course?"

Response options       | Regular classroom (n = 58) | Flipped classroom (n = 54) | Excel simulation (n = 75) | Total (n = 187)
Extremely valuable     | 26 (45 %) | 29 (53 %) | 16 (21 %) | 71 (38 %)
Valuable               | 31 (53 %) | 16 (30 %) | 43 (57 %) | 90 (48 %)
Somewhat valuable      | 1 (2 %)   | 8 (15 %)  | 13 (17 %) | 22 (12 %)
Somewhat waste of time | 0 (0 %)   | 0 (0 %)   | 3 (4 %)   | 3 (2 %)
Waste of time          | 0 (0 %)   | 1 (2 %)   | 0 (0 %)   | 1 (1 %)
Extreme waste of time  | 0 (0 %)   | 0 (0 %)   | 0 (0 %)   | 0 (0 %)

Response distributions were significantly different by group, χ²(8, n = 187) = 29.5, p < 0.001


Table 7 Posttest results for the question "Would you recommend this course to another student?"

Response options   | Regular classroom (n = 58) | Flipped classroom (n = 54) | Excel simulation (n = 75) | Total (n = 187)
Absolutely yes     | 36 (62 %) | 36 (67 %) | 23 (31 %) | 95 (51 %)
Probably           | 20 (35 %) | 13 (24 %) | 33 (44 %) | 66 (35 %)
Somewhat likely    | 1 (2 %)   | 5 (9 %)   | 14 (14 %) | 20 (11 %)
Somewhat unlikely  | 1 (2 %)   | 0 (0 %)   | 2 (3 %)   | 3 (2 %)
Unlikely           | 0 (0 %)   | 0 (0 %)   | 2 (3 %)   | 2 (1 %)
Extremely unlikely | 0 (0 %)   | 0 (0 %)   | 1 (1 %)   | 1 (1 %)

Response distributions were significantly different by group, χ²(10, n = 187) = 28.5, p = 0.002

Table 8 Posttest results for the question "Would you take another ISys course?"

Response options   | Regular classroom (n = 58) | Flipped classroom (n = 54) | Excel simulation (n = 75) | Total (n = 187)
Absolutely yes     | 16 (28 %) | 24 (44 %) | 15 (20 %) | 55 (29 %)
Probably           | 24 (41 %) | 19 (35 %) | 26 (35 %) | 69 (37 %)
Somewhat likely    | 11 (19 %) | 6 (11 %)  | 19 (25 %) | 36 (19 %)
Somewhat unlikely  | 4 (7 %)   | 5 (9 %)   | 5 (7 %)   | 14 (8 %)
Unlikely           | 2 (3 %)   | 0 (0 %)   | 7 (9 %)   | 9 (5 %)
Extremely unlikely | 1 (2 %)   | 0 (0 %)   | 3 (4 %)   | 4 (2 %)

Response distributions were significantly different by group, χ²(10, n = 187) = 19.0, p = 0.041

Table 9 Average weekly response for the question "How much time in hours did you spend on learning
activities this week?"

Learning activity | Regular classroom Hours (SD) | Flipped classroom Hours (SD) | Excel simulation Hours (SD) | Total Hours (SD)
Study             | 1.1 (0.72), n = 55 | 1.1 (0.60), n = 49 | 1.0 (0.67), n = 85 | 1.1 (0.65), n = 189
Homework          | 1.3 (1.3), n = 63  | 1.1 (0.65), n = 39 | 0.7 (0.67), n = 54 | 1.1 (0.98), n = 156
Getting help      | 0.4 (0.55), n = 23 | 0.4 (0.46), n = 20 | 0.7 (0.45), n = 25 | 0.5 (0.50), n = 68
Total             | 2.5 (1.5), n = 63  | 2.3 (1.2), n = 52  | 2.2 (1.4), n = 90  | 2.3 (0.69), n = 205

Total reported time spent on course was not significantly different by group, F(2,202) = 1.39, p = 0.252.
Only those students who reported participating in these activities were included in this analysis

Time spent on course learning activities

Table 9 presents response data on the question of how much time students spent on the
various learning activities in the course. There were no significant differences in the
amount of time spent on the course overall. Students from the Excel simulation group did,
however, report spending less time on homework, F(2,153) = 6.4, p = 0.002, than those
in the regular and flipped classrooms. This might be explained by the fact that the
simulated course instruction and homework were essentially the same learning activity.


These results include only those students who reported spending some time on these
activities. Of interest is the fact that few students reported getting outside help; when they
did seek help, they tended to spend only about half an hour per week doing so.
Those students in the Excel simulation group who reported getting help tended to spend
slightly more time on the activity. But based on evidence from students’ contact with
instructors, these students were often concerned that the simulation failed to give them
credit for a correct solution to a problem that did not follow the prescribed process
delineated by the system.

Value of course learning activities

Table 10 presents students’ perceptions of the value of the various learning activities in
which they participated. Responses were reported weekly and averaged for each student.
The response scale ranged from 1, extremely unhelpful (or an extreme waste of time), to 6,
extremely valuable. An analysis of the data was done for each of the specific learning
activities (i.e., study, required homework tasks, and help received). All of the results were
statistically significant. In all cases, an analysis of the data showed that students in the
Excel simulation group consistently reported a lower value for the learning activities than
students in the regular and flipped classrooms. In fact, few students in the Excel simulation
group indicated they felt the learning activities required in the course were extremely
valuable (i.e., less than 10 %), contrasting with about 30 % in the regular and flipped
classrooms.
Results from the end-of-semester course evaluations were consistent with these results:
In the course evaluations students rated the effectiveness of course materials and activities
higher in the regular and flipped treatments than in the simulation treatment. In response to
the request ‘‘Rate the effectiveness of the course materials and learning activities,’’ the
mean was 7.0 for the flipped treatment and 6.9 for the regular treatment; the simulation
treatment group rated the course materials and learning activities at 6.0 (where 1 represents
extremely ineffective and 8 indicates extremely effective). An ANOVA comparison of the
three means was again significant, F(2,256) = 12.49, p < 0.001. Tukey's HSD pairwise
test, adjusted for unequal sample sizes, found that the regular and flipped treatments were
not significantly different, but that both of these treatments were significantly higher than
the simulated treatment (p < 0.01).
On the end-of-semester course evaluations, students in the regular and flipped treatments
also rated the intellectual skills they developed in the course higher than did students
in the simulation treatment. In response to the request "Rate the course on how effective it
was in terms of helping you develop intellectual skills," the mean was 6.9 for the flipped
treatment and 6.5 for the regular treatment; however, students in the simulation treatment
group gave an average rating of only 5.8. An ANOVA comparison of the three means was
significant, F(2,256) = 11.08, p < 0.001. Tukey's HSD pairwise test, adjusted for unequal
sample sizes, found that the regular and flipped treatments were not significantly different,
but that both of these treatments were significantly higher than the simulated treatment
(p < 0.01).

Table 10 Average weekly results for the question "How valuable were these learning activities?"

Learning activity | Regular classroom Mean (SD) | Flipped classroom Mean (SD) | Excel simulation Mean (SD) | Total Mean (SD)
Study             | 4.2 (0.67), n = 57 | 4.2 (0.58), n = 50 | 3.7 (0.67), n = 87 | 4.0 (0.69), n = 194
Homework          | 4.3 (0.51), n = 63 | 4.2 (0.76), n = 51 | 3.8 (0.77), n = 89 | 4.1 (0.74), n = 203
Getting help      | 4.2 (0.64), n = 35 | 4.0 (0.84), n = 24 | 3.5 (0.71), n = 39 | 3.9 (0.79), n = 98

Only those students who reported participating in these activities were included in this analysis

Discussion and conclusions

This research explored how technology can be used to teach technological skills and what
benefit flipping the classroom might have for students taking an introductory-level college
course on spreadsheets. The criteria for evaluating these instructional approaches
included both academic achievement data and student perception data regarding the value
of various learning experiences provided.
Comparing the regular, flipped, and simulation-based courses demonstrated that how
technology is integrated into a course makes a difference. While somewhat less
time-consuming (i.e., students spent less time doing homework), using the Excel simulation
software to teach spreadsheet skills was not as effective as other methods of instruction
tested in this study. This should not be seen as a condemnation of all simulation software,
as students did learn using this approach. However, students using the software simulation
approach did not learn as much as students instructed with the actual software program.
Students considered the instructional activities less valuable. They also indicated they
would be less likely to take another Information Systems class or to recommend this course
to a fellow student.
These results support recommendations made by Davies and West (2013) as well as
Koehler and Mishra (2008) that the U.S. Department of Education’s (2010) expectation
that technology should be used in schools must be regulated by an expectation that it be
used in an instructionally sound manner (i.e., one that takes into consideration the intended
learning and the contextual factors that might affect learning). Instructors may choose to
use technology for various reasons. One reason often cited for using technology in
delivering introductory Excel classes is implementation efficiencies. As an introductory
Excel class is often a general requirement for all students, a scalable teaching approach is
needed. To deliver this course in a traditional classroom environment would be somewhat
impractical because of the number of faculty and classroom computers needed. If the
objective of the learning method is instructional efficiency and scalability, then the flipped
and simulation-based course models can be considered. Both offer alternatives that can be
implemented with few faculty and technology resources. In this regard the simulation-
based method has a slight advantage over the flipped model because using it does not
require the institution to install the Excel software on university computers; the instruction
and assessment can be conducted through an internet-based simulation.
Despite the implementation advantages of the simulation-based approach, this study
found it not as effective as either the flipped or regular approaches in terms of student
learning, student perceptions of the learning process, and student attitudes toward the
course. This appears to result at least in part from limitations in using a simulation of an
existing software application. In many ways the simulation was not rich enough to provide
students with the same experience as the actual software, despite the fact that this particular
software is commercially published and recognized as the most sophisticated Excel
simulation software available. In particular, the simulation was not complex enough to allow
for multiple ways to achieve a correct result. This was a particular deficiency given the
difficult nature of the tasks students completed as part of the assigned work in the class.
Had more trivial tasks been assigned, this issue would not likely have been as important
because fewer paths would have led to the correct result. In this study we found that the
use of a simulated environment (i.e., technology simulating the actual software application)
was less effective and less satisfying for the learners.
Compared to the simulated treatment, the flipped classroom approach provided a
method for delivering the class that was both scalable and effective. The flipped classroom
approach allowed students to learn course content at their own pace, and students were not
required to come to class unless they needed help beyond the asynchronous materials,
allowing them to make better use of their time and improving their perception of the class.
Class time was more effective because it was used to provide remedial assistance to the
few students who needed extra help. Another efficiency in terms of the implementation of
this approach was the scalability of the course: A flipped classroom of this type can easily
accommodate a larger number of students in each section of the course. Based on course
assessments, this approach was also more effective. Students not only made greater
academic gains but were also more satisfied with the learning environment. Moreover,
students completed their work in the MS Excel environment, so the limitations associated
with the simulation environment were not present.
The flipped approach was also better than the regular approach for delivering this course,
though not significantly so. This finding was somewhat surprising. We expected to find that the
flipped approach would be more scalable than the regular approach for this class and hoped
that the flipped approach would be as effective in terms of student learning and attitudes
about the class materials. We found no statistical differences between the flipped and
regular approaches for student evaluations of the course and instructor, student reports on
how much they learned in the class, student assessments of the value of the class, students’
reported willingness to recommend the course to another student, and student evaluations
of the learning activities in the class. However, in each area the mean scores for the flipped
approach were slightly more favorable than those of the regular approach. We also found
that students were significantly more willing to take another Information Systems class and
that student learning in the flipped class was higher than in the regular class. The evidence
suggests that the flipped approach is at least as effective as the regular approach for
delivering this class and somewhat more scalable.
One reason that the flipped approach may be better than the regular approach relates to a
dilemma regular classroom instructors face when delivering this type of content. Our
experience suggests that students enter an introductory Excel class with a wide range of
technological backgrounds, and instructors face a dilemma in pacing the classroom
instruction. Pacing that is too fast may cause those with limited technological backgrounds
to fall behind and perform poorly. But pacing that is too slow may result in those with
moderate to extensive technological backgrounds finding lower satisfaction with the
learning process. Instructors typically choose a pace that is too slow for some and too fast
for others—a suboptimal pace for many if not nearly all students. The flipped approach
allows students to pace themselves through the subject material within the context of the
due dates. Those with extensive technological backgrounds can move more quickly
through the materials than those with limited backgrounds.
We also expected to find that the flipped approach might be inferior to the regular
approach in motivating students to recognize the importance of the class materials. We
thought that the ability of good instructors to communicate the importance of the subject
matter and provide effective conceptual frameworks to help students organize and
contextualize the educational experience would be difficult to reproduce in the flipped
approach. But the results of this study seem to suggest that this is not the case. The course
materials created for the flipped approach, particularly the set of instructional videos, were
as effective at motivating students about the subject matter as the instructors delivering the
regular approach.
In summary, our findings suggest that the flipped approach and simulation-based
approach were both more instructionally efficient and scalable than the regular classroom
approach for the Excel course studied. Achieving this scalability, however, would
require greater upfront investment in the development of the video resources or simulation
software. Although more scalable, the simulation-based approach was somewhat inferior to
the other two instructional approaches in terms of student learning effectiveness.
In addition, these findings lead us to believe that the use of technology to simulate a
specific software application like Excel has several other disadvantages for developing
skills. While students did demonstrate learning gains, the process focus of the simulation
(i.e., assessing students’ ability to follow a specific process in order to complete tasks) was
frustrating for students and decreased their motivation to learn. Students’ attitudes towards
the topic, their willingness to refer the course to others, and the likelihood that they would
take another course like this were considerably lower than those of students in the flipped
or regular classroom situations.

Limitations and future research

While our study provides compelling evidence of the efficacy of the flipped approach over
both the regular and simulation-based approaches, caution should be used in generalizing
the findings beyond the context of our study. The class used for this study was
a short (5-week), well-structured course that teaches students basic and intermediate skills
with a software application. Student performance on class activities can be precisely
defined, and the assessment of learning objectives has been automated. These circum-
stances enable the efficiencies gained by the flipped and simulation-based approaches over
the regular approach. Instruction in other contexts may not be as amenable to the flipped or
simulation-based approaches. Future research in this area is required.

References

Alessi, S. M. (1988). Fidelity in the design of instructional simulations. Journal of Computer-Based
Instruction, 15(2), 40–47.
Alessi, S. M. (2000). Simulation design for training and assessment. In H. F. O’Neil & D. H. Andrews
(Eds.), Aircrew training and assessment (pp. 199–224). Mahwah: Lawrence Erlbaum Associates.
Alvarez, B. (2011). Flipping the classroom: Homework in class, lessons at home. Learning First. Retrieved
4 June 2013 from http://www.learningfirst.org/flipping-classroom-homework-class-lessons-home.
Bergmann, J., Overmyer, J., & Wilie, B. (2012). The flipped class: Myths versus reality. The Daily Riff.
Retrieved 4 June 2013 from http://www.thedailyriff.com/articles/the-flipped-class-conversation-689.php.
Bergmann, J., & Sams, A. (2012). Before you flip, consider this. Phi Delta Kappan, 94(2), 25.
Berrett, D. (2011). How ‘Flipping’ the classroom can improve the traditional lecture. The chronicle of higher
education. Retrieved 4 June 2013 from http://chronicle.com/article/How-Flipping-the-Classroom/130857/.
Cizek, G. J. (2010). An introduction to formative assessment: History, characteristics, and challenges. In
G. J. Cizek & H. L. Andrade (Eds.), Handbook of formative assessment (pp. 3–17). New York: Routledge.
Cornelius, S., & Gordon, C. (2008). Providing a flexible, learner-centered programme: Challenges for
educators. Internet and Higher Education, 11, 33–41.


Davies, R. (2011). Understanding technology literacy: A framework for evaluating educational technology
integration. TechTrends, 55(5), 45–52. doi:10.1007/s11528-011-0527-3.
Davies, R., & West, R. (2013). Technology integration in school settings. In M. Spector, D. Merrill, J. Elen,
& M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th
ed.). New York: Taylor & Francis Ltd.
Dziuban, C. D. (2004). Blended learning. In C. O. Boulder (Ed.), Educause center for applied research.
Retrieved 4 June 2013 from http://net.educause.edu/ir/library/pdf/ERB0407.pdf.
Friedman, H., & Friedman, L. (2001). Crises in education: Online learning as a solution. Creative
Education, 2(3), 156–163.
Fulton, K. P. (2012). 10 reasons to flip. Phi Delta Kappan, 94(2), 20–24.
Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher
education. Internet and Higher Education, 7(2), 95–105.
Gatto, D. (1993). The use of interactive computer simulations in training. Australian Journal of Educational
Technology, 9(2), 144–156.
Graham, C. R. (2006). Blended learning systems: Definition, current trends, and future directions. In C.
J. Bonk & C. R. Graham (Eds.), Handbook of blended learning: Global perspectives, local designs.
San Francisco: Pfeiffer Publishing.
Hays, R. T., & Singer, M. J. (1989). Simulation fidelity in training system design: Bridging the gap between
reality and training. New York: Springer-Verlag.
Hughes, H. (2012). Introduction to flipping the college classroom. In T. Amiel & B. Wilson (Eds.),
Proceedings from world conference on educational multimedia, hypermedia and telecommunications 2012
(pp. 2434–2438). Chesapeake: AACE.
Jacobs, J. W., & Dempsey, J. V. (1993). Simulation and gaming: Fidelity, feedback, and motivation. In J.
V. Dempsey & G. C. Sales (Eds.), Interactive instruction and feedback (pp. 197–227). Englewood
Cliffs: Educational Technology Publications.
Keefe, J. (2007). What is personalization? Phi Delta Kappan, 89(3), 217–223.
Keefe, J., & Jenkins, J. (2002). Personalized instruction. Phi Delta Kappan, 83(6), 440–448.
Khan, S. (2012). The one world schoolhouse: Education reimagined. London: Hodder and Stoughton.
Kleiman, G. M. (2000). Myths and realities about technology in K-12 schools. In The Harvard Education
Letter report. The digital classroom: How technology is changing the way we teach and learn.
Retrieved 4 June 2013 from http://www.edletter.org/dc/kleiman.htm.
Koehler, M. J., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and
Technology (Ed.), The handbook of technological pedagogical content knowledge (TPCK) for educators.
New York: American Association of Colleges of Teacher Education and Routledge.
Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: a gateway to creating an inclusive
learning environment. Journal of Economic Education, 31(1), 30–43.
Marzano, R. J. (2009). Formative versus summative assessments as measures of student learning. In
T. J. Kowalski & T. J. Lasley (Eds.), Handbook of data-based decision making in education
(pp. 261–271). New York: Routledge.
Moravec, M., Williams, A., Aguilar-Roca, N., & O’Dowd, D. K. (2010). Learn before lecture: a strategy that
improves learning outcomes in a large introductory biology class. CBE Life Sciences Education, 9,
473–481.
Nolen, A. L., & Putten, J. V. (2007). Action research in education: Addressing gaps in ethical principles and
practices. Educational Researcher, 36(7), 401–407. doi:10.3102/0013189X07309629.
Novak, G. M. (2011). Just-in-time teaching. New Directions for Teaching and Learning, 2011(128), 63–73.
doi:10.1002/tl.469.
Pardo, A., Pérez-Sanagustin, M., Hugo, A., Parada, H. A., & Leony, D. (2012). Flip with care. Proceedings
of SoLAR southern flare conference. Retrieved 4 June 2013 from http://www.researchgate.net/
publication/232906379_Flip_with_care.
Patterson, G. A. (2012). An interview with Michael Horn: Blending education for high-octane motivation.
Phi Delta Kappan, 94(2), 14–18.
Price, J. (2012). Textbook bling: An evaluation of textbook quality and usability in open educational
resources versus traditionally published textbooks (Unpublished master’s project). Provo: Brigham
Young University.
Quinn, C. N. (2005). Engaging learning: Designing e-learning simulation games. San Francisco: Pfeiffer
Publishing.
Ross, S. M., Morrison, G., & Lowther, D. L. (2010). Educational technology research past and present:
Balancing rigor and relevance to impact school learning. Contemporary Educational Technology, 1(1),
17–35.


Strayer, J. F. (2007). The effect of the classroom flip on the learning environment: A comparison of learning
activity in a traditional classroom and a flip classroom that used an intelligent tutoring system
(Unpublished doctoral dissertation). Columbus: Ohio State University.
Strayer, J. F. (2012). How learning in an inverted classroom influences cooperation, innovation and task
orientation. Learning Environments Research, 15(2), 171–193.
Talbert, R. (2012). Inverted classroom. Colleagues, 9(1), Article 7.
Tomlinson, C. (2003). Fulfilling the promise of the differentiated classroom: Strategies and tools for
responsive teaching. Alexandria, VA: Association for Supervision and Curriculum Development.
U.S. Department of Defense. (1997). DoD modeling and simulation (M&S) glossary, DoD 5000.59-M.
Retrieved 4 June 2013 from http://www.dtic.mil/whs/directives/corres/pdf/500059m.pdf.
U.S. Department of Education. (2010). Transforming American education: Learning powered by technol-
ogy. National education technology plan 2010. Washington, DC: Office of Educational Technology.
Verkroost, M., Meijerink, L., Lintsen, H., & Veen, W. (2008). Finding a balance in dimensions of blended
learning. International Journal on E-Learning, 7(3), 499–522.
Woolf, B. P. (2010). A roadmap for education technology. Retrieved 4 June 2013 from http://www.
coe.uga.edu/itt/files/2010/12/educ-tech-roadmap-nsf.pdf.

Randall S. Davies Ph.D., is an Assistant Professor of Instructional Psychology and Technology in the
McKay School of Education at Brigham Young University. His research involves program evaluation in
educational settings with the general objective of understanding and improving the teaching and learning
process.

Douglas L. Dean Ph.D., is an Associate Professor of Information Systems in the Marriott School of
Management at Brigham Young University. His research interests include online communities, knowledge
management, scientometrics, and collaborative tools and methods.

Nick Ball Ph.D., is an Assistant Professor of Information Systems in the Marriott School of Management at
Brigham Young University.
