Perception by Picciano (2002)
ABSTRACT
The research literature on Web-based learning supports the assumption that interaction is important for a
successful course, yet questions exist regarding the nature and extent of the interaction and its effects on
student performance. Much of the research is based on student perceptions of the quality and quantity of
their interactions and how much they have learned in an online course. The purpose of this study is to
examine performance in an online course in relationship to student interaction and sense of presence in
the course. Data on multiple independent (measures of interaction and presence) and dependent (measures
of performance) variables were collected and subjected to analysis. An attempt was made to go beyond
typical institutional performance measures such as grades and withdrawal rates and to examine measures
specifically related to course objectives.
KEYWORDS
Distance learning, Interaction, Presence, Social presence, Learning effectiveness, Outcomes, Student
performance, Asynchronous learning, Computer-mediated learning, Computer-mediated communications,
Education administration
I. INTRODUCTION
As access to the Internet and World Wide Web has continued to grow, Web-based learning has continued
to expand. With approximately half of the households in the United States (or 150 million people)
connected to the Internet, an estimated 2 million students are taking post-secondary courses that are fully
delivered online [1]. Millions of other students at all educational levels (primary, secondary, post-
secondary, continuing education) participate online in hybrid, mixed-mode, and Web-enhanced face-to-
face courses. However, the effectiveness of online courses, particularly in relation to individual student
needs, perceptions, and student outcomes, is sometimes questioned [2], [3].
A common element of learning in a typical classroom environment is the social and communicative
interaction between student and teacher and among students [4]. Asking a question, sharing an opinion
with a fellow student, or disagreeing with the point of view in a reading assignment are all fundamental
learning activities. Web-based learning requires adjustments on the part of students and teachers for
successful interactions to occur. Many online courses provide students and faculty the ability to interact
with one another via an electronic bulletin board, discussion board, email, or synchronous chat area. The
success of these courses frequently depends upon the nature of this interaction. It is not unusual for
instructors to encourage, and in some cases require, a certain
amount of participation in the form of postings per week in online discussions as part of the grade for the
course [5]. The research literature on Web-based learning supports this approach. Yet issues exist
regarding the nature and extent of the interaction and its effects on student performance.
In examinations of interaction, the concept of "presence" or a sense of being in a place and belonging to a
group also has received attention. A student's physical presence in a face-to-face course assumes that she
or he has a sense of belonging to the class or group of students enrolled in the course. He or she listens to
the discussion and may choose to raise a hand to comment, to answer, or to ask a question. Furthermore,
this same student may develop a relationship with other students in the class and discuss topics related to
the class during a break, at the water fountain, or in the cafeteria. However, this is an assumption and not
always true. For a variety of reasons, some students can also feel alienated in a face-to-face class and not
feel part of a group. Presence in an online course has been the subject of a number of articles redefining
and categorizing this concept. In an online course, the simplest definition of presence refers to a student's
sense of being in and belonging in a course and the ability to interact with other students and an instructor
although physical contact is not available. However, as this concept is studied, the definition is expanding
and being refined to include telepresence, cognitive presence, social presence, teaching presence, and
other forms of presence. The term "community" is related to presence and refers to a group of individuals
who belong to a social unit such as students in a class. In an online course, terms such as communities of
inquiry, communities of learners, and knowledge-building communities have evolved.
As the definition of presence has expanded and evolved, a distinction is being made between interaction
and presence, emphasizing that they are not the same. Interaction may indicate presence but it is also
possible for a student to interact by posting a message on an electronic bulletin board while not
necessarily feeling that she or he is a part of a group or a class. If they are different, then it is also possible
that interaction and presence can affect student performance independently.
Student performance is open to many definitions. Successful completion of a course, course withdrawals,
grades, added knowledge, and skill building are some of the ways that performance is measured,
depending upon the content of the course and the nature of the students. Courses may also have multiple
performance outcomes, each of which might be measured separately through testing, written assignments,
or the completion of individual and group projects. Many studies of student performance in face-to-face
and online courses rely on student perceptions of their learning experiences including "how well" or "how
much" they have learned. Ultimately, student perceptions of their learning may be as good as other
measures because these perceptions may be the catalysts for continuing to pursue coursework and other
learning opportunities. Student performance is well understood to be a multivariable phenomenon
affected by study habits, prior knowledge, communication skills, time available for study, teacher
effectiveness, etc. The purpose of this study is to examine performance in an online course in terms of
student interaction and sense of presence. Data on multiple independent (measures of interaction and
presence) and dependent (measures of performance) variables were collected and subjected to analysis.
Shea, Fredericksen, Pickett, Pelz, and Swan, in a study of students enrolled in courses through the SUNY
Learning Network (SLN), conclude that the relationship among satisfaction, interaction, and performance
(grades) was as follows:
The greater the percentage of the course grade that was based on discussion, the more satisfied
the students were, the more they thought they learned from the course, and the more interaction
they thought they had with the instructor and with their peers. [13]
Hartman and Truman-Davis, in a survey of faculty teaching online courses, found "statistically significant
correlations (Amount of Interaction - r = .726; Quality of Interaction - r = .807)" and concluded that
interaction was critical to faculty satisfaction [14].
Dziuban and Moskal, in a paper entitled Emerging Research Issues in Distributed Learning, likewise
report very high correlations and relationships between interaction in online courses and student
satisfaction [15]. Their conclusions are based on a questionnaire (N=52,218) collected by the Research
Initiative for Teaching Effectiveness at the University of Central Florida over a three-year period, from
students enrolled in Web-based (fully online) courses, mixed-mode (some online, some face-to-face)
courses, and Web-enhanced, face-to-face courses. Among their findings were statistically significant
correlations between the quantity and quality of the interaction and student satisfaction in all three types
of courses. However, in Web-based courses, the relationship of interaction to perceived success appears
to be a more critical factor than in the other (mixed-mode or Web-enhanced) courses. They recommended
that more research is needed specifically on the question: What is the impact of class interaction in Web-
based courses?
While most of the research supports the relationship of interaction and satisfaction in Web-based courses,
some observers have cautioned that this is not always the case. Ruberg, Taylor, and Moore, for example,
observe that in order to interact successfully, students must adjust to the non-linear, asynchronous nature
of Web-based learning [16]. Typical face-to-face situations tend to be linear, focusing on a single
discussion thread. Asynchronous, Web-based learning sessions on an electronic bulletin board can have
multiple threads with several discussions and interactions progressing simultaneously. Students respond
to the teacher but also to other students, depending on their interest and points of view. Students can
initiate a new discussion as easily as the teacher. Sproull and Kiesler caution about discussions that
continue based on misinformation because in asynchronous mode an instructor cannot immediately
correct or clarify a comment [17]. As a result, students need to have the experience and knowledge base
to sift the discussion for misinformation. In asynchronous learning, the amount of student interaction and
the number of comments can easily lead to what Mackay described as information overload [18].
Furthermore, comments in on-line discussions tend to be lengthier than in face-to-face situations. With
more information from many sources, students need to be more attentive to both the who and what of a
discussion. Herbert Simon, economist and Nobel Prize laureate, succinctly cautions, “a wealth of
information can create a poverty of attention” [19].
Michael Beaudoin, in a paper entitled Learning or Lurking? Tracking the ‘Invisible’ Online Student,
examines the relationship between student interaction and learning [20]. In the study, he divides an online
class into three groups (high interaction, moderate interaction, and low interaction). He found that while
the high interaction students achieved the highest performance, the low interaction group performed
better than the moderate interaction group. Most faculty have probably observed similar situations in
many classes. While much of the research relates student satisfaction and performance to active
participation in online course activities, faculty teaching these courses face a small dilemma in
establishing requirements for interacting online because some students may not need to participate as
much as others in order to perform well.
Related to the research on interaction is the concept of presence. Students who feel that they are part of a
group or "present" in a community will, in fact, wish to participate actively in group and community
activities. Presence has a social psychology basis related to how individuals respond and interact using
different forms of media [21], [22]. In one of the most extensive recent reviews of the literature on the
subject, Lombard and Ditton define presence as the perceptual "illusion of nonmediation" [23]. An
"illusion of nonmediation" occurs when a person fails to perceive or acknowledge the existence of a
medium in his/her communication environment and responds as he/she would if the medium were not
there. Furthermore, because it is a perception, presence can and does vary from individual to individual. It
can also be situational and vary across time for the same individual, making it a complex subject for
research. While the literature on presence has existed for a number of years, Lombard and Ditton
concluded that the "research on presence was in its infancy" [23]. Specifically, little was known about the
"characteristics of a medium's form and content," "the characteristics of medium users that encourage a
sense of presence," and "the effects of presence once it is evoked." These critics recommend that
research on presence be conducted within the specific context of a medium and within six categories,
which they refer to as conceptualizations.
Interaction and presence in an online course can be studied for many reasons, including the vibrancy of a
discussion, students' willingness to share ideas, participation in collaborative activities, and group
projects, all of which can support productive learning environments. Ultimately, however, student
performance outcomes need to be evaluated to determine the overall success of a course. An extensive
amount of literature exists on performance outcomes as related to distance learning. Keegan comments
that measuring student success is a "preoccupation" in distance learning, especially where adults are
concerned [29]. While much work has been done in this area, student outcomes are not easy to define in
higher education; even experienced researchers have characterized them as "messy" [15]. For example,
while grades and their derivatives such as grade point averages are common student performance
measures, they can be problematic particularly in light of concerns about issues such as grade inflation. At
Harvard University, for example, the Boston Globe reported that "48.5 percent of the grades last year
[2000] were A's and A-minuses, ...B grades were 45 percent.… Grades in the three C categories [were]
4.9 percent... D's and failing grades accounted for less than 1 percent each" [30]. Susan Pedersen,
Harvard's dean of undergraduate education, commented that with such a narrow range of effective grades
available [essentially, B, B+, A-, and A], faculty find it difficult to distinguish adequately between work
of differing quality.
Course completion and attrition rates are considered to be important student performance measures
especially as related to adult and distance learning [25], [31], [2], [11]. Moore and Kearsley have reported
student attrition rates as high as 50 percent in some distance learning programs [25]. However, attrition is
a complex phenomenon dependent on a myriad of academic, social, and personal factors including the
academic program (graduate, undergraduate, continuing education), admissions criteria (selective, open
admissions), and the nature of the student (mature, motivated, command of basic skills).
The literature on quality issues in distance learning suggests that multiple measures related to individual
academic program and course objectives should be used in studying student performance [3], [31], [13],
[15]. Performance data can be in the form of tests, written assignments, projects, and satisfaction surveys.
For the purposes of this research, this multiple-measure approach is adopted. Data on
student perceptions of their learning, as well as other actual measures specifically related to course
objectives, are collected and analyzed below.
In summary, a good deal of research has been conducted on interaction, presence and student
performance in Web-based learning. While researchers can draw from the past for insight, new situations
created through new technology require new study and evaluation. As educators attempt to develop and
implement these technologies in instruction, on-going evaluation and study involving multiple measures
will be necessary.
The major research questions that guided this study are as follows:
1. What is the relationship between student interaction in an online course and performance on measures
specifically related to course objectives?
2. What is the relationship between a student's sense of presence in an online course and performance on
those same measures?
For purposes of this study, presence was defined as an "illusion of nonmediation" which occurs when a
person fails to perceive the existence of a medium in his/her communication environment [23].
Furthermore, the social component of this definition refers to a student's sense of belonging in a course or
group and the ability to interact with others, although physical contact is not available.
IV. METHODOLOGY
A. Program/Course
The methodology used for this study was a descriptive analysis of interaction, presence, and performance
data collected in a graduate course in an education administration program at Hunter College in New
York City. The Education Administration and Supervision Program at Hunter College is a thirty-credit
graduate program leading to New York State certification as a school administrator. New York State
requires a minimum of eighteen graduate credits plus an internship. The program at Hunter requires
twenty-four credits (eight courses) plus a six-credit internship. Web-based courses have been offered in
this program since 1997, and students can complete a majority of the coursework for the program online.
For the past ten years, the program has maintained an enrollment of 100 to 125 students, almost all part-
time. Because of funding and a desire to ensure academic quality, the enrollment in the program has been
limited.
The course, Administration and Supervision (ADSUP) 722 - Issues in Contemporary Education,
is an elective. It is designed to provide a forum for the presentation and discussion of
issues in contemporary education. It also is designed to provide future administrators with an appreciation
of differences in points of view and the ability to approach issues that can be divisive in a school or
community. Thirteen contemporary issues in education such as charter schools, teacher unionization,
bilingual education, and special education form the content of the course. The course is structured around
readings and a weekly discussion. In addition, written assignments are required which are designed to put
the student in the position of an administrator making a decision or recommending a course of action
related to one of the issues.
B. Students
To enroll in the graduate program in Education Administration and Supervision at Hunter College, all
students must have at least five years of teaching experience and an earned MA. All of the students are
education professionals already certified as teachers by New York State who are seeking further
certification as school administrators. More than 80% are women. Approximately 25% are students from
minority groups. Approximately 75% of these students work in New York City public schools,
while the remaining 25% work in private schools or in public schools outside of New York City. The
students in this program recognize the importance of technology, and the vast majority of the enrollees
have access to computer and Internet technology in their homes. Many are also professionally curious
about an alternative pedagogical experience, such as Web-based learning using the Internet and other
current technological tools.
All of the students balance full-time jobs, families, parenthood, and higher education in a carefully
planned day, which includes rushing for subways and buses to meet the next commitment. They are a
mature group who organize their daily lives around lesson plans, making sure their children get to the
babysitter or day care center, maintaining a home, and when time permits, completing homework
assignments. Courses that can be taken at any time or in any place have a good deal of appeal. These
students are able to fit their graduate studies into their busy lives, eliminating the need to travel several
times per week to the College. These students typify the mature, self-directed, and busy “students” who
could take advantage of and benefit from this form of instruction.
From the group described above, twenty-three (N=23) students enrolled in ADSUP 722 for Fall 2001.
Their average age was thirty-seven years. Sixteen were female and seven were male. The ethnic
composition was as follows: three African-American, three Latino, and seventeen White/Caucasian.
Eight of the twenty-three students had previously taken one or more online courses; the remaining fifteen had not.
C. Instructional Components
A completely asynchronous model was used for delivering this course via a course Web site utilizing the
BlackBoard course management system (CMS). To connect to the course Web site, most students used a
commercial Internet and e-mail provider such as America Online or CompuServe in their homes. Several
students also used Internet facilities available in their schools.
The course was organized into thirteen weekly themes and topics. The Web site for the course included a
syllabus, reading assignments, weekly discussion topics and questions, supplementary reading material,
and related links. These materials were always available and served as the organizational anchors for the
course. Each topic was organized for an asynchronous discussion on an electronic discussion board
during a specific week and was based on assigned readings and case studies. Four students were selected
each week to work with the instructor as discussion facilitators. The use of students as facilitators was
designed to encourage them to be contributors to and not simply receivers of learning activities. Once the
discussion of a topic commenced on Sunday morning, any student could contribute to the discussion, ask
a question of another student or the instructor. At the end of the week’s discussion on the following
Saturday, the instructor summarized the topic, added additional notes and comments, and posted these to
the Web site for access by the entire class.
Techniques to encourage social presence and a sense of community were used throughout the course.
Rourke and others provide an excellent review of some of the techniques that can be used to foster a sense
of presence and community building including: complimenting students, self-disclosure, warmth, and
activities that build and sustain a sense of group commitment [28]. In this course, many of these
techniques were used. For example, first names were used in all online discussions. Discussion questions
were designed to encourage students to relate the material to their experiences in their own schools and
environments. Students were used as facilitators each week to encourage them to assume some ownership
of the online discussion and to reduce—but not eliminate—dependence on the instructor. An Internet cafe
where students could interact on non-instructional issues was also available. This facility was used, for
instance, for comments and discussions following the terrorist attack on the World Trade Center on
September 11th, which occurred during the second week of the course, as well as at the time of the birth of
a daughter to one of our students during the eighth week of the course.
It is accepted that the instructional model presented above for an online course is highly dependent upon
faculty-to-student and student-to-student interactions via electronic discussions. As a result, the findings
may only relate to similar instructional models. Other online course models using intelligent software
tutoring or programmed instruction techniques that are dependent upon far fewer person-to-person
interactions may have different issues requiring study methodologies not provided for here.
Data on actual student participation in online discussions were collected throughout the semester.
Students also completed a satisfaction survey (see Appendix) at the end of the course, which asked a
series of questions addressing their overall experiences, especially as related to their learning and
interaction with others and the technology used. A series of questions (16A through 16K) relating
to social presence was included as part of this survey. These questions are based on the Inventory of
Presence Questionnaire developed by the Presence Research Working Group (http://www.presence-
research.org) at the Technische Universiteit Eindhoven, Netherlands and on a questionnaire developed by
Chih-Hsiung Tu [32].
In addition to student perceptions of their learning as collected on the student satisfaction survey, two
student performance measures were collected: scores on an examination and scores on a written
assignment. These measures relate to the course's two main objectives: to develop and add to the
student's knowledge base regarding contemporary issues in education, as well as to provide future
administrators with an appreciation of differences in points of view and an ability to approach issues that
can be divisive in a school or community. The examination was designed to assess knowledge of the
course subject matter and was based on the thirteen issues explored during the semester. An objective,
multiple choice question and answer format was used. The written assignment was a case study that
required the students to put themselves in the position of a newly appointed principal who has to consider
implementing a new, controversial academic program. For purposes of this study, this assignment was
graded by an independent scorer who used content analysis techniques to identify phrases and concepts to
determine student abilities to integrate multiple perspectives and differing points of view in deciding
whether and how to implement the academic program. These performance measures corresponded
specifically to the objectives of the course as established by the instructor. Other performance measures
such as grades were not used because of the difficulty of distinguishing adequately between student work of
differing quality where letter grades (A, B, C) are used. In addition, student participation or interaction
was included as part of the overall grading criteria, bringing into question the use of these grades in
relationship to interaction. Withdrawal or attrition data also were not a factor in this study since all of the
students completed the course. This is not unusual for graduate programs with selective admissions
requirements.
Because of the small sample size of the student population, no attempt is made to use formal statistical
significance or sample-size techniques to infer that the results of this study represent larger populations.
Instead, basic descriptive analyses using means and correlations are used.
V. RESULTS
A. Student Perceptions of Interaction and Learning
To determine the relationship between student perceptions of their interaction and performance, the
student satisfaction survey contains questions about each of these. Questions 9A through 9D (see
Appendix) ask students to compare the amount and quality of their interactions with other students and
the instructor to those in traditional courses. Responses were formatted on a Likert scale with values
ranging from 1 to 5 (Decreased - Somewhat Decreased - No Change - Somewhat Increased - Increased).
The responses to these questions were scored and combined into an overall perception of student
interaction variable that ranged from 1 to 5. The mean for all students on this perception of interaction
variable was 4.00 (Somewhat Increased).
Questions 9E and 9F (see Appendix) on the student satisfaction survey refer to the quality and quantity of
the students' learning experiences. Responses were formatted on a Likert scale in the same way as the
interaction questions. The responses to these questions were scored and combined into an overall
perception of student learning variable that ranged from 1 to 5. The mean for all students on this
perception of learning variable was 4.32 (Somewhat Increased, plus). A simple correlation of these two
variables yielded a coefficient that was positive (.6732) and statistically significant at the .05 level.
These results indicated that there is a strong, positive relationship between student perceptions of their
interaction in the course and their perceptions of the quality and quantity of their learning.
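Since the overall perception variables are simple composites of 1-to-5 Likert items, the scoring and correlation described above can be illustrated with a short script. The sketch below is not the author's analysis code: the question labels (q9a through q9f) and the response values are hypothetical placeholders, and only the general approach of averaging items into a composite and computing a Pearson coefficient is shown.

```python
# Minimal sketch (not the study's actual analysis): build composite Likert
# variables and correlate them.  Question keys and response values are
# hypothetical placeholders.
from math import sqrt
from statistics import mean

def composite(responses, items):
    """Average the 1-5 Likert responses for the given survey items."""
    return mean(responses[item] for item in items)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One dict of Likert responses (1 = Decreased ... 5 = Increased) per student.
surveys = [
    {"q9a": 4, "q9b": 4, "q9c": 5, "q9d": 4, "q9e": 4, "q9f": 5},
    {"q9a": 3, "q9b": 4, "q9c": 4, "q9d": 4, "q9e": 4, "q9f": 4},
    {"q9a": 5, "q9b": 5, "q9c": 5, "q9d": 5, "q9e": 5, "q9f": 5},
]

interaction = [composite(s, ["q9a", "q9b", "q9c", "q9d"]) for s in surveys]
learning = [composite(s, ["q9e", "q9f"]) for s in surveys]
print(round(pearson(interaction, learning), 4))
```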
[Figure: Number of student postings by week of the course (Weeks 1 through 14).]
[Figure: Number of postings by individual student (Students S1 through S23).]
Student postings constituted one indicator of actual participation in the course, since they show the
number of times students read and responded in writing to the instructor's or to another student's posting.
Correlations of actual student postings with actual student performance scores on the examination and
the written assignment were positive (.1318 and .4577, respectively) but not statistically significant at the
.05 level. While positive, these correlations, especially the one with the exam, were somewhat weaker
than the coefficient (.6732) for student perceptions of their interaction and their learning presented in
Section A above.
In pursuing the relationship between actual interaction and student scores on the examination and the
written assignment, the data on interaction were sorted by the number of student postings and divided into
thirds representing low interaction, moderate interaction, and high interaction student groups. Mean
scores on the exam and on the paper were then calculated for each group (see Table 1).
Table 1. Mean Student Scores on Exam and Written Assignment Controlling for Interaction Group (N=23)
Interaction Group    Mean Exam Score    Mean Written Assignment Score
Low                  85.0               63.7
Moderate             85.7               64.4
High                 85.6               81.1
The data in Table 1 indicate that there were no differences among the three interaction groups in terms of
actual performance on the examination. On the written assignment, the high interaction group scored
significantly higher than the low and moderate interaction groups. The overall conclusion was that actual
student interaction as measured by the number of postings on the discussion board had no relationship to
performance on the examination. Actual student interaction as measured by the number of postings on the
discussion board did have a relationship to the written assignment for students in the high interaction
grouping.
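The grouping procedure just described, sorting students by their number of discussion-board postings, splitting the sorted list into thirds, and computing mean exam and written-assignment scores for each third, can be sketched as follows. The records and values below are illustrative placeholders, not the study's data.

```python
# Minimal sketch (illustrative data only): split students into low/moderate/
# high interaction terciles by posting count and report group mean scores.
from statistics import mean

# (postings, exam score, written assignment score) for each hypothetical student.
students = [
    (12, 82, 60), (18, 88, 65), (25, 85, 62), (30, 84, 66),
    (33, 86, 63), (38, 85, 64), (41, 87, 65), (45, 84, 66),
    (52, 86, 80), (58, 85, 82), (64, 86, 81), (70, 85, 83),
]

students.sort(key=lambda record: record[0])      # order by posting count
third = len(students) // 3
groups = {
    "Low": students[:third],
    "Moderate": students[third:2 * third],
    "High": students[2 * third:],
}

for name, group in groups.items():
    exam_mean = mean(exam for _, exam, _ in group)
    paper_mean = mean(paper for _, _, paper in group)
    print(f"{name:<9} exam={exam_mean:.1f}  written={paper_mean:.1f}")
```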
In comparing student perceptions of social presence with actual performance measures, the results are
somewhat different. The correlation between student perception of social presence and the written
assignment was statistically significant (.05 level) and positive (.5467); however, the correlation between
student perception of social presence and the examination was negative (-.3570) and not statistically
significant at the .05 level.
In pursuing the relationship between student perception of social presence and student scores on the
examination and written assignment, the data on social presence were sorted by the overall perception of
social presence variable and divided into thirds representing low perception of social presence, moderate
perception of social presence, and high perception of social presence groups. Mean scores on the exam
and on the paper were then calculated for each group (see Table 2).
Table 2. Mean Student Scores on Exam and Written Assignment Controlling for Social Presence (N=23)
Social Presence Group    Mean Exam Score    Mean Written Assignment Score
Low                      87.5               55.5
Moderate                 87.5               70.1
High                     80.0               80.0
The data in Table 2 indicate that while there is no difference between the low and moderate social
presence groupings on the examination, there is a significant difference, with a lower mean for the high
social presence group compared to the other two groups. On the other hand, on the written assignment,
the differences between the three groups support the high correlation between student perception of
social presence and performance on the written assignment, with each group scoring progressively higher
(55.5 -> 70.1 -> 80.0).
The overall conclusion is that student perception of social presence did not have a statistically significant
relationship to performance on the examination, while student perception of social presence had a
positive, statistically significant relationship to performance on the written assignment.
Table 3. Mean Actual and Perceived Postings Controlling for Interaction Group (N=23)
Interaction Group    Mean Actual Postings    Mean Perceived Postings
Low                  24.87                   36.75
Moderate             41.14                   38.00
High                 60.62                   49.00
The data in Table 3 indicate that while the perceptions of the number of postings of the moderate
interaction group of students are consistent with their actual postings, the low interaction group perceived
themselves to have made a higher number of postings than they actually did and the high interaction
group perceived themselves to have made fewer postings than they actually did. The results indicate that
student perceptions of their interaction in a course need to be viewed with a bit of caution.
VI. DISCUSSION
A. Interaction
The results of this study support the findings of other research that establish a strong relationship
between students' perceptions of the quality and quantity of their interaction and their perceived
performance in an online course. However, in comparing student interaction as defined by actual postings
on a discussion board to actual performance measures designed specifically to measure course objectives,
the results are not consistent.
The data in Table 1 indicate that there were no differences among the three (low, moderate, high)
interaction groups in terms of actual performance on the examination. This study did not attempt to
answer the “why?” for this phenomenon, but speculation is possible. For instance, it is likely that all
students, and especially the low interaction group, studied for the examination. The questions on the
examination were derived mostly from the weekly discussions and instructor notes that were available on
line. The low interaction students may have read much of the material posted during the weekly
discussion but simply chose not to comment. This is not unlike a student in a regular face-to-face class
who listens attentively but does not raise his or her hand, yet still does well on a test or exam.
On the written assignment, the high interaction group scored significantly higher than the low and
moderate interaction groups, which scored about the same. The written assignment was based on a case
study and designed to determine students' ability to integrate multiple perspectives and differing points of
view in deciding whether and how to implement an academic program. The assignment was similar to
situations presented on the weekly discussion board in that students were posting their comments and
opinions on educational issues taking into consideration what already had been posted by their colleagues
in the class. Hence, a relationship might exist for students who interact extensively on a discussion board
and who are required to respond to similar situations such as that presented in the case study. On the other
hand, students in the high interaction group may be especially sensitive to differing points of view.
Whether these students already possessed these abilities or whether the abilities were honed as part of the
weekly discussion board activities is difficult to assess.
To expand on the relationship between high interaction and high scores on the written assignment, what
might be at work is that the "everyone has a right to an opinion" format of an interactive discussion board
environment rewards a student's facility with ad hoc discussion. The written project required recognizing
and including multiple perspectives rather than reflecting knowledge of objective content. Students may
even perceive themselves as learning more, but that does not necessarily mean they do learn more in these
environments.
B. Social Presence
Student perception of social presence had a small, inverse, but not statistically significant relationship to
performance on the examination, while student perception of social presence demonstrated a strong,
positive, and statistically significant relationship to performance on the written assignment. Social
presence in this class depended upon participation in the weekly discussions, which encouraged an
appreciation for the points of view of others. It was in the weekly discussions that students could
"socialize," identify with, learn something about the other students, and relate to the personal experiences
of their colleagues, who were all educators. Those who felt the "presence" of their colleagues as a result
of what was read and written on the discussion board perhaps could relate better to an activity such as the
written assignment that was similar to the discussion board activity. On the other hand, their sense of
"presence" possibly did not relate to an objective, multiple-choice examination because the examination
was not an expressive activity but an asocial, impersonal one.
VII. CONCLUSION
This study examined performance in an online course in relationship to student interaction and sense of
presence in the course. An attempt was made to go beyond student perceptions of interaction and
performance and to include perceptions of social presence as well as actual participation in class
activities. In addition, data were collected on performance measures that related specifically to course
objectives. Typical institutional performance measures such as grades and withdrawal rates were not
included. While much of the research, including this study, supports the strong relationship between
students' perceptions of interaction and perceived learning, the results of this study indicated that the
relationship between actual measures of interaction and performance is mixed and inconsistent, depending
upon the measures used.
The results of this study should not be interpreted to indicate that interaction is not a key course
component in instructional design. To the contrary, by design, the success of many online courses is
dependent upon the nature of student-to-student and student-to-faculty interaction. However, how
interaction affects learning outcomes, and what the relationships between the two are, is a complex
pedagogical phenomenon in need of further study.
VIII. APPENDIX
2. Age: ________
5. Where did you most frequently use a computer for this course?
Home___ Work___ Other ___ If other, specify: ______________________
6. How easy/difficult was it for you to use technology to participate in this course?
Easy___ Somewhat Easy ___ Somewhat Difficult ___ Difficult___
7. How would you rate your overall educational experience in taking this course?
Poor___ Satisfactory___ Good___ Very Good___ Excellent___
For questions 9A through 9H, in comparison to traditional classroom instruction, in this course:
Increased    Somewhat Increased    No Change    Somewhat Decreased    Decreased
9A. The amount of interaction with other students ___ ___ ___ ___ ___
9B. The quality of interaction with other students ___ ___ ___ ___ ___
9C. The amount of interaction with the instructor ___ ___ ___ ___ ___
9D. The quality of interaction with the instructor ___ ___ ___ ___ ___
9E. The quantity of your learning experience ___ ___ ___ ___ ___
9F. The quality of your learning experience ___ ___ ___ ___ ___
9G. The motivation to participate in class activities ___ ___ ___ ___ ___
9H. Your familiarity with computer technology ___ ___ ___ ___ ___
10A. On average, regardless of whether you posted a message or not, how often did you access the course Web site
each week?
a. once a week
b. twice a week
c. three times a week
d. four times a week
e. five or more times a week
10B. On average, how often did you post a message to the Discussion Board each week?
a. once a week
b. twice a week
c. three times a week
d. four times a week
e. five or more times a week
11. Would you rate your experiences to date with this course as Successful____ Not Successful____
If successful, what aspect of the course most contributed to its success:
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
12. Should the Hunter College ADSUP Program offer more internet (asynchronous learning) courses?
Yes___ No____
If yes, because:
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
If no, because:
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
14. To provide materials and to communicate online, a software system called BlackBoard was used. Please
rate how easy/difficult it was for you to use the BlackBoard software.
Easy___ Somewhat Easy ___ Somewhat Difficult ___ Difficult___
15. During this course, you had several tools available to you at the Course Web site for accessing information and
for communicating with colleagues and the instructor. Please rate the following:
Not Used    Somewhat Important Tool    Important Tool    Very Important Tool    Critical Tool
15A. Course Information ___ ___ ___ ___ ___
15G. Other_________________
For questions 16A through 16K, please circle the answer which best describes your opinion of the ADSUP 722
online course.
16B. Even though we were not physically together in a traditional classroom, I still felt like I was part of a group in
the online course.
16F. An online course allows me to express my feelings, and to learn the feelings of others.
16J. I felt I got to learn a great deal about the instructor in the online course.
16K. I felt I got to learn a great deal about the other students in the online course.
IX. ACKNOWLEDGEMENTS
The author of this paper is grateful to the students in the Administration and Supervision Program at
Hunter College who are always willing to try new things, who provide valuable feedback, and who have
made me a better teacher.
X. REFERENCES
1. Galt Global Review. Education news: Virtual classrooms booming. (December 2001).
http://www.galtglobalreview.com/education/virtual_classrooms.html
2. Phipps, R.A. and Merisotis, J. P. What's the difference: A review of contemporary research on
the effectiveness of distance learning in higher education. Washington, D.C.: The Institute for
Higher Education Policy, 1999. http://www.chea.org/Events/QualityAssurance/98May.html
3. Phipps, R.A., Wellman, J.V., and Merisotis, J. P. Assuring quality in distance learning: A report
prepared for the Council for Higher Education Accreditation. Washington, D.C.: The Institute for
Higher Education Policy, 1998.
4. Stubbs, M. Language, Schools, and Classrooms. London: Methuen, 1976.
5. Sener, J. Bringing ALN into the mainstream: NVCC case studies. In Online Education:
Proceedings of the 2000 Sloan Summer Workshop on Asynchronous Learning Networks. Volume 2
in the Sloan-C series, J. Bourne and J. Moore, Editors, Needham, MA: Sloan-C Press, 2001.
6. Chickering A.W. and Gamson, A.F. Seven principles for good practice in undergraduate
education. Racine, WI: The Johnson Foundation, Inc., 1987.
7. Kumari, D.S. Connecting graduate students to virtual guests through asynchronous discussions:
analysis of an experience. Journal of Asynchronous Learning Networks, 5(2), 2001.
http://www.aln.org/alnweb/journal/Vol5_issue2/Kumari/Kumari.htm
8. Fulford, C.P. and Zhang, S. Perceptions of interaction: The critical predictor in distance education.
The American Journal of Distance Education, 7(3), 8-21, 1993.
9. Kearsley, G. The nature and value of interaction in distance education. Distance Education
Symposium 3: Instruction. University Park, PA: American Center for the Study of Distance
Education, 1995.
10. Sherry, L. Issues in distance learning. International Journal of Distance Education, 1(4), 337-365,
1996.
11. Picciano, A.G. Distance Learning: Making Connections across Virtual Space and Time. Upper
Saddle River, NJ: Prentice-Hall, 2001.
12. Picciano, A.G. Developing an asynchronous course model at a large, urban university. Journal of
Asynchronous Learning Networks, 2(1), 1998.
http://www.aln.org/alnweb/journal/vol2_issue1/picciano.htm
13. Shea, P., Fredericksen, E., Pickett, A., Pelz, W., and Swan, K. Measures of learning
effectiveness in the SUNY Learning Network. In Online Education: Proceedings of the 2000 Sloan
Summer Workshop on Asynchronous Learning Networks. Volume 2 in the Sloan-C series, J.
Bourne and J. Moore, Editors, Needham, MA: Sloan-C Press, 2001.
14. Hartman, J. L. and Truman-Davis, B. Factors related to the satisfaction of faculty teaching online
courses at the University of Central Florida. In Online Education: Proceedings of the 2000 Sloan
Summer Workshop on Asynchronous Learning Networks. Volume 2 in the Sloan-C series, J.
Bourne and J. Moore, Editors, Needham, MA: Sloan-C Press, 2001.
15. Dziuban, C. and Moskal, P. Emerging research issues in distributed learning. Orlando, FL: Paper
delivered at the 7th Sloan-C International Conference on Asynchronous Learning Networks, 2001.
16. Ruberg, L. F., Taylor, C.D., and Moore, D.M. Student participation and interaction on-line: A
case study of two college classes: Freshman Writing and Plant Science Lab. International Journal of
Educational Telecommunications, 2(1), 69-92, 1996.
17. Sproull, L.S. and Kiesler, S. Connections: New Ways of Working in the Networked Organization.
Cambridge, MA: MIT, 1991.
18. Mackay, W.E. Diversity in the use of electronic mail: A preliminary inquiry. ACM Transactions
on Office Information Systems, 6(4), 380-397, 1989.
19. Varian, H. “The Information Economy.” Scientific American, 273(3), 200-202, 1995.
20. Beaudoin, M. Learning or lurking? Tracking the ‘invisible’ online student. Orlando, FL: Paper
delivered at the 7th Sloan-C International Conference on Asynchronous Learning Networks, 2001.
21. Mehrabian, A. Some referents and measures of nonverbal behavior. Behavior Research Methods
and Instrumentation, 1(6), 205-207, 1969.
22. Short, J., Williams, E. and Christie, B. The Social Psychology of Telecommunications. London:
John Wiley and Sons, 1976.
23. Lombard, M. and Ditton, T. At the heart of it all: The concept of presence. Journal of Computer
Mediated Communications, 3(2), 1997. http://www.ascusc.org/jcmc/vol3/issue2/lombard.html
24. Tammelin, M. From telepresence to social presence: The role of presence in a network-based
learning environment. In Aspects of Media Education: Strategic Imperatives in the Information
Age. Tella, S., Editor. Media Education Centre. Department of Teacher Education. University of
Helsinki. Media Education Publications 8, 1998.
25. Moore, G. Sharing faces, places, and spaces: The Ontario Telepresence Project Field Studies. In
Video- Mediated Communication. Finn, K. E., Sellen, A. J. and Wilbur, S. B., Editors. Mahwah,
NJ: Lawrence Erlbaum, 301–321, 1997.
26. Buxton, W. A. S. Telepresence: Integrating shared task and person spaces. In Readings in
Groupware and Computer-Supported Cooperative Work. Baecker, R. M. Editors. San Mateo, CA:
Morgan Kaufmann, 816–822, 1993.
27. Biocca, F. Presence. Presentation at a workshop on Cognitive Issues in Virtual Reality, VR ‘95
Conference and Expo, San Jose, CA, 1995.
28. Rourke, L., Anderson, T., Garrison, D.R., and Archer, W. Assessing social presence in
asynchronous text-based computer conferencing. Journal of Distance Education/Revue de
l'enseignement à distance, 2001. http://cade.athabascau.ca/vol14.2/rourke_et_al.html
29. Keegan, D. Foundations of Distance Education (3rd ed.). London: Routledge, 1996.
30. Healy, P. Harvard figures show most of its grades are A's or B's. The Boston Globe, p. B6,
(November 21, 2001).
31. Hanson, D., Maushak, N.J., Schlosser, C.A., Anderson, M.L., Sorenson, C., and Simonson, M.
Distance education: Review of the literature, 2nd Edition. Washington, D.C.: Association For
Educational Communications and Technology, 1997.
32. Tu, Chih-Hsiung. How Chinese perceive social presence: An examination of interaction in an
online learning environment. Educational Media International, 38(1), 45-60, 2001.
Dr. Picciano is a professor in the Education Administration and Supervision Program in the School of
Education at Hunter College. His teaching specializations include educational technology, organization
theory, and research methods.
He has been involved with a number of major grants from the National Science Foundation, the Alfred P.
Sloan Foundation, the US Department of Education, and IBM. He has collaborated with The American
Social History Project and Center for Media and Learning at CUNY on a number of instructional
multimedia projects dealing with subjects such as Irish immigration in the 1850s, women's rights and
labor issues at the turn of the century, and school integration in the 1950s. One of these programs, The
Five Points: A Multimedia Experience in Social History, was selected to be part of a New Learning
Technologies Exhibit, held in San Diego in 1992. His present research interests are centered on distance
learning technologies including asynchronous learning using Internet tools and media distribution
systems.
Dr. Picciano has served as a consultant for a variety of public and private organizations including the New
York City Board of Education, the New York State Department of Education, Commission on Higher
Education/Middle States Association of Colleges and Universities, the US Coast Guard, and CITICORP.
He is the author of four books on education and technology including Educational Leadership and
Planning for Technology, 3rd Edition (Prentice-Hall, 2002) and Distance Learning: Making Connections
Across Virtual Space and Time (Prentice-Hall, 2001). His articles on educational technology have
appeared in journals such as the Journal of Asynchronous Learning Networks, Journal of Educational
Multimedia and Hypermedia, Computers in the Schools, The Urban Review, Equity and Choice, and
EDUCOM Review.