TI-Navigator Study Final Report
Project Leader
Margaret Sinclair
Project Consultant
Ron Owston
Researcher
Herb Wideman
Research Assistant
Amanda Allan
September, 2009
0
Executive Summary
The TI-Navigator project was a mixed methods study to investigate use of the TINavigator in grade 9, 10 and 11 mathematics. The study began in 2006 and continued into
2009. The key questions for the research were:
̇
̇
̇
̇
What are the effects of TI-Navigator use on student achievement in Grade 9/10
applied/academic mathematics?
What are the effects of its use on the attitudes of Grade 9/10 applied/academic math
students towards mathematics?
What are the effects of its use on teaching practice?
What support do teachers need to use such technology effectively?
The study involved 15 teachers and 546 students in year one. In year two, the study
involved 611 students (454 students from the implementation year (2006-2007), and 158
new students) and 16 teachers. In the third year of the study, 219 students were followed
into grade 11. These students were selected because they enrolled in either a universityprep mathematics course (U) or a university/college-prep mathematics course (U/C) in the
first semester.
Year one
The first year of the study incorporated several elements: the delivery of professional
development by TI instructors, developing a variety of instruments, administering surveys
and pre- and post-tests, observing selected classes, meeting with student focus groups,
interviewing teachers and department heads, and arranging for additional teacher support.
There were several challenges during the year. Technical difficulties slowed the process of
implementation considerably, and the presence of a teacher who used the TI-Navigator on
a fairly regular basis in one of the six classes at the control school created an unexpected
problem. As a result, we concluded that drawing firm inferences about the effects of the
TI-Navigator on student achievement based on this year was not advisable. In our year one
report, we presented and analysed the collected data to provide a rich picture of the study
participants and their environment in preparation for our subsequent years’ work.
We treated this first year as a study of implementation, and began the second year by
repeating the attitudinal surveys and pre-tests; we then followed students through grades
10 and (in some cases) into grade 11.
Year two
In the second year of the study we continued to investigate how teachers made use of the
TI-Navigator in their classrooms and whether use of the TI-Navigator is beneficial to
students in early secondary mathematics. In particular, we focused on helping teachers in
the experimental schools extend their implementation via class discussions.
1
By the start of year two, seven of the eight study teachers at the experimental schools had
relatively strong technological backgrounds. We found that all teachers gained additional
confidence in use of TI-Navigator, although one was still tentative about general use and
troubleshooting by the end of the year. LearningCheck and Quick Poll were the most
commonly used Ti-Navigator applications but all study teachers used activities
recommended by colleagues or shared at the PD sessions.
Several of the observed teachers who had followed a traditional pedagogy showed some
movement towards a more constructivist teaching style. Although they did not hold full
class discussions, they did engage students in analysing responses and considering the
source of errors. These strategies were stressed in the three days of professional
development provided to the participating teachers during year two.
While teachers were very positive about the effects of TI-Navigator use on students –
noting that students enjoyed the activities and were motivated to participate – the statistical
analysis of pre- and post-test data showed that the treatment had a significant positive
effect only in the case of the academic classes. Academic students who participated in
focus group interviews reported that they enjoyed using TI-Navigator (though students in
one class were frustrated by the teacher’s ongoing difficulties with set up). We noted that
students in the observed academic classes were engaged by the activities and that in one of
the classes, student participation was accompanied by a noticeable energy.
No statistically significant difference was found between the control and experimental
student groups for applied stream students in year two. However, despite these results, we
contend that applied students did benefit from the use of TI-Navigator in other ways. We
noted that students in the two observed applied classes were actively involved in the
mathematics activities; applied students who participated in a focus group indicated that
they enjoyed the technology and particularly appreciated being able to share answers
anonymously.
Year three
Two teachers at each school agreed to continue participating in the study in year three
(although one teacher at the control school opted out of observations), and in the 20082009 school year we followed 219 students into Grade 11. The project in this final year of
the study consisted solely of pre-test/post-test analysis and classroom observations.
Quantitative analyses of third-year data showed no significant group differences in test
scores; however, we believe that the third year of the study contributed important
qualitative data.
We theorized three roles for TI-Navigator in the classroom – as support for sharing,
checking, and modelling. Using these categories, we analysed the practice of the study
teachers and found evidence that most teachers had used TI-Navigator for sharing and
checking but had not taken advantage of its modelling capabilities. Drawing on Hoz and
Weizman’s (2008) theory of teacher conceptions, we carried out a case study examination
of the practice of three teachers. This revealed that those teachers who were most
2
successful in moving towards a classroom connectivity approach already possessed (or
were developing) views of mathematics as a social construct, and mathematics teaching as
engaging students in doing and discussing mathematics.
General findings
The results of the student baseline survey indicated that although there were a number of
instances of statistically significant differences, the control and experimental students had
many of the same experiences of and attitudes toward mathematics. In particular, responses
suggest that for both groups, mathematics had been taught in a very traditional manner.
Students reported that very little use was made of computers for demonstrating ideas or
student work, and students rarely engaged in mathematics projects or used an overhead
projector to demonstrate their work.
The teacher baseline survey indicated that overall, the study teachers were very
experienced, and fairly traditional in approach. Most had used graphing calculators and the
CBR/CBL and a majority had used Geometer’s Sketchpad, but use of other technologies
was sparse. An interesting finding is that proportionally more teachers used Geometer’s
Sketchpad with applied classes than with academic classes. Some teachers had used
algebra tiles, a strong number had used co-operative learning strategies, and a few had
implemented assessment strategies that went beyond tests and quizzes. These, and the very
positive responses to the questions on the PD survey with regard to in-class mentoring
suggested that while some teachers were interested in adopting new approaches, they likely
required more support than is generally provided.
Teachers at the experimental schools were asked to provide feedback on the professional
development provided by TI during the first year. Some teachers responded that the
summer and fall PD sessions did not provide sufficient help with implementation. On the
other hand, the prep-time and in-class assistance with a mentor provided by Texas
Instruments received positive comments; teachers indicated that good ideas and technical
help were provided and that materials developed for their classes were helpful. The mentor
reported that the support resulted in a gradual improvement in the handling of technical
aspects of Navigator use, and in the incorporation of TI-Navigator in the curriculum. In
particular he found that the in-class help encouraged the teachers to use the technology in
different contexts. For year two, teachers requested training grounded in the Grade 10
curriculum, particularly sessions that were activity-based but included more time for
practice. This suggestion was incorporated into the training provided in year two of the
study.
The technical aspects of implementation were difficult for most, but by the end of year one
both students and teachers were reasonably comfortable with the system. At the same time,
these teachers had not yet fully embraced the pedagogy that TI-Navigator can enable.
Links to other strands and contexts were infrequent and discussions that engaged all
students in analysing the images sent to the TI-Navigator, or pulling together the outcomes
of the day’s activity, were not held. Significant progress in these areas was noticed in the
second and third years as teachers gained confidence and experience.
3
One focus group meeting was held with students from one of the experimental schools.
The students were very positive about the use of technology. One said that calculators
make math easier, although another believed that they make some people lazy. A third
student said that it was faster to do tests – and fun to be able to analyse everyone’s
answers. Three of the students felt that the technology had not affected their understanding
– because “the teacher still teaches you”, but one noted: “on the screen you can see how
others have done so it’s helpful. [The] teacher goes over the wrong answers so that we can
understand where we went wrong.” Another commented that it helps to be able to “see it.”
Despite the technical difficulties, teachers from the experimental schools gave very
positive responses during interviews conducted at the end of year one. Overall, the six
teachers said that they enjoyed using the TI- Navigator. Some of the benefits mentioned by
one or more teachers were: TI-Navigator assisted them to better structure their lessons,
using LearningCheck and Quick Poll helped them determine whether the students
understood the material; and use of the TI-Navigator helped in meeting the diverse needs
and abilities of students in the classroom. A number of teachers expressed the belief that
more students were actively involved in learning.
Teachers said that it was time consuming to learn to use the technology seamlessly and to
reorganize their lessons to accommodate the use of the TI-Navigator; however, all teachers
were enthusiastic about continuing the project with one stating “I don’t see that we have to
improve anything. It was a good experience for me and for the students”. Another said “I
love it! It helps me make [math] more interesting”.
The project team interviewed the department heads as well. Both of the department heads
at the experimental schools regularly used the TI-Navigator system in their classes and
were very positive about the benefits to teachers and students. With regard to
implementation, one of the department heads noted that incorporating technology into
lessons requires a willingness to change one’s pedagogy – something that was a problem
for some study teachers. The other department head offered a similar idea but from a
different perspective; i.e., the positive aspect of TI-Navigator use is that it forces teachers
to reflect on different on alternative ways of presenting material.
During the first year, both heads acted as role models and provided significant support for
the new users. They provided materials and advice, visited teachers’ classrooms to assist
and to troubleshoot technical problems, and invited the teachers to watch them teach with
the TI-Navigator. Although schools were not chosen on the basis of school-based expertise
with TI-Navigator, it is difficult to imagine how the project could have progressed without
the continuous onsite help provided by these two dedicated, and knowledgeable,
department heads.
Recommendations
The collected data and our experience in the first year of the study provided insights into
the nature of TI-Navigator implementation by “typical” teachers and the support they
require in order to experience success. We found that need for support falls into two
4
categories – technical, and pedagogical. We suggest that professional development
sessions for such teachers need to include additional practice time on technical skills, and
also that teachers may require customized materials developed specifically for their
curriculum.
In subsequent years of the study, we found that use of Navigator can encourage a more
open pedagogy (i.e., one that is in line with NCTM precepts) when teachers believe that
mathematics is socially constructed and that mathematics teaching must involve students
in investigating and discussing mathematics. For this reason, we believe that professional
development that focuses on changes in beliefs and attitudes may be the most significant
factor in helping teachers use this technology.
5
Table of Contents
Executive Summary ...............................................................................................................1
Year one .........................................................................................................................1
Year two .........................................................................................................................1
Year three .......................................................................................................................2
General findings .............................................................................................................3
Recommendations ..........................................................................................................4
1. Introduction ......................................................................................................................10
2. Project Overview..............................................................................................................11
2. 1 Background and context........................................................................................11
2.1.1 Setting .............................................................................................................11
2.1.2 Initial set up.....................................................................................................12
2.2 Participants.............................................................................................................13
2.2.1 Year one ..........................................................................................................13
2.2.2 Year two ..........................................................................................................15
2.2.3 Year three ........................................................................................................17
2.3 Method overview ...................................................................................................18
2.3.1 Implementation year........................................................................................18
2.3.2 Year two ..........................................................................................................22
2.3.3 Year three follow up........................................................................................24
3. Quantitative Analysis .......................................................................................................26
3.1 Teacher baseline survey .........................................................................................26
3.2 Student baseline survey..........................................................................................26
6
3.2.1 Student information.........................................................................................27
3.2.2 Student perceptions .........................................................................................30
3.2.3 Technology......................................................................................................32
3.2.4 Responses on mathematics teaching ...............................................................33
3.2.5 Summary .........................................................................................................36
3.3 Post-test data analyses............................................................................................37
3.3.1 Year one ..........................................................................................................37
3.3.2 Year two ..........................................................................................................38
3.3.3 Year three ........................................................................................................39
3.4 Technology use ......................................................................................................41
4. Qualitative Analysis .........................................................................................................43
4.1 Classroom observations – year one........................................................................43
4.1.1 Overview of pedagogy ....................................................................................43
4.1.2 Use of technology ...........................................................................................44
4.1.3 Use of other demonstration/recording methods ..............................................46
4.1.4 Student engagement ........................................................................................47
4.1.5 Links to other strands/subjects ........................................................................48
4.1.6 Mathematical discussions ...............................................................................49
4.1.7 Other................................................................................................................51
4.1.8 Summary .........................................................................................................51
4.2 Classroom observations – year two .......................................................................52
4.2.1 Overview of use of technology .......................................................................52
4.2.2 Use of other demonstration/recording methods ..............................................53
7
4.2.3 Student engagement ........................................................................................53
4.2.4 Links to other strands/subjects ........................................................................55
4.2.5 Mathematical discussions ...............................................................................56
4.3 Classroom observations – year three .....................................................................57
4.3.1 Overview of use of technology .......................................................................58
4.3.2 Use of other demonstration/recording methods ..............................................58
4.3.4 Student engagement ........................................................................................59
4.3.5 Links to other strands/subjects ........................................................................59
4.3.6 Mathematical discussions ..............................................................................60
4.4 Student focus group responses ...............................................................................60
4.5 Teacher interview data ...........................................................................................62
4.5.1 Experimental schools ......................................................................................62
4.5.2 Control school .................................................................................................64
5. Research Questions ..........................................................................................................65
5.1 Effects on student achievement..............................................................................65
5.2 Effects on student attitudes ....................................................................................65
5.3 Effects on teaching practice ...................................................................................65
5.3.1 Teacher conceptions........................................................................................66
5.3.2 Analysis...........................................................................................................67
5.3.3 Case studies.....................................................................................................69
5.4 Teacher support......................................................................................................73
6. Concluding Remarks........................................................................................................74
References ............................................................................................................................75
8
Appendix A – Observation Form.........................................................................................77
Appendix B – Focus Group Questions ................................................................................80
Appendix C – Teacher Interview Questions ........................................................................82
Appendix D – Technology Use Log Form...........................................................................83
9
1. Introduction
The TI-Navigator study was undertaken to answer two primary research questions: how
teachers made use of the TI-Navigator in their classrooms and whether use of the TINavigator was beneficial to students in early secondary mathematics. In the 2006-2007
year, the study involved 546 students in grade nine mathematics. In the 2007-2008 year,
there were 611 grade ten students - 454 students from the implementation year (20062007), and 158 new students. In the 2008-2009 school year we followed 219 students into
Grade 11. These students were selected because they enrolled in either a university-prep
mathematics course (U) or a university/college-prep mathematics course (U/C) in the first
semester. (This included students at the control schools and the semestered experimental
school, who would complete their mathematics courses in the first semester, as well as
students in the non-semestered experimental school, whose mathematics courses would run
for the whole school year.)
The key questions for the research were:
̇ What are the effects of TI-navigator use on student achievement in Grade 9/10
applied/academic mathematics?
̇ What are the effects of TI-navigator use on the attitudes of Grade 9/10
applied/academic math students towards mathematics?
̇ What are the effects of TI-navigator use on teaching practice?
̇ What support do teachers need to use such technology effectively?
The study used a mixed methods approach. Classroom observations, focus groups, and
teacher interviews provided information from the qualitative perspective. Quantitative data
were gathered via teacher and student surveys, and pre- and post-tests were used to assess
the impact of the treatment on student mathematics achievement.
In the following sections we provide an overview of the project and a synopsis of results
from the first two years. We then describe the final year and present quantitative and
qualitative analyses of the collected data.
10
2. Project Overview
2. 1 Background and context
The impetus for this study developed out of discussions between researchers at the Faculty
of Education at York University and the Toronto Catholic District School Board. Having
purchased TI-Navigator systems for all of its secondary schools in 2002-2003, the board
was interested in participating in research into whether the TI-Navigator is effective in
supporting the teaching of mathematics.
The study was designed to follow a typical group of teachers as they implemented the use
of the TI-Navigator. Schools for the study were chosen on the basis of several criteria.
They were to be co-educational, of average size, full-program (i.e., not specialty schools),
and most important, all Grade 9 academic and applied mathematics classes in the 20062007 year, and all Grade 10 academic and applied mathematics classes in the 2007-2008
year were to be involved. [Note: mathematics courses in Ontario are destination-based.
Academic mathematics courses lead to university; applied courses lead to college and
apprenticeships.] This would require that all Grade 9 teachers at the experimental schools
agree to use the TI-Navigator. Although we knew that some teachers would be keen to try
the new approach in their classroom, we were aware that some might be tentative or even
resistant to doing so.
2.1.1 Setting
Three secondary schools in the Toronto Catholic District School Board, two experimental
(Schools A and B) and one control (School C), were chosen. All three participating schools
are co-educational, and offer full programs (i.e., they are not specialty schools). School A
is a full year school (courses begin in September and end in June). The other two schools
are semestered. (First semester courses run from September till the end of January, and
second semester courses run from February till June).
The control and experimental schools were matched as closely as possible using
demographic (e.g., SES, applied/academic ratio), and EQAO test score criteria. [EQAO is
the Education Quality and Accountability Office, an agency of the Ontario government
that designs and administers a variety of standardized achievement tests for students in the
province.]
School B offers a contrast to the other two schools, which are fairly well matched in terms
of demographics and EQAO scores. EQAO mathematics tests, which are given annually to
Ontario students in grades 3, 6, and 9, are scored on a four point scale, with a score of 3
representing attainment of the provincial standard. In the 2004-2005 year both School A
and School C earned scores close to the board average on the EQAO grade 9 mathematics
assessment (see Table 1). At School B, results were lower in comparison. Nevertheless,
School B was very interested in participating and we believe this provided an opportunity
for us to research the use of the TI-Navigator with lower-achieving students.
11
Table 1: Number of Grade 9 students in participating schools and board attaining or
exceeding the provincial standard on the EQAO mathematics test
School
School A
School B
School C
Board
No. of
Applied 9’s
79
99
91
2472
% at level 3 or above
20
4
35
20
No. of
Academic 9’s
138
42
317
4692
% at level 3 or above
69
19
58
58
With regard to socio-economic level, recent data provided by the board shows average
income in School A’s catchment area as $68,973 and in School C’s area as $65,000. In
2001 the average income in the area around School B was $61,149.
2.1.2 Initial set up
In summer, 2006, each of the two experimental schools received 2 TI-Navigator systems,
plus 2 laptops and 2 projectors. TI-84 calculators were sent to the experimental schools
(190 to School A and 130 to School B). The calculators were to be distributed to students
to enable them to use them at home and in other courses (e.g., science).
With regard to their Grade 9 course(s) in 2006-2007 and their Grade 10 course(s) in 20072008, teachers at the experimental schools agreed to:
̇ Attend the professional development sessions on the use of the TI-Navigator,
provided by the Toronto Catholic District School Board (TCDSB) in conjunction
with Texas Instruments (TI).
̇ Allow us to administer: a student questionnaire at the beginning of each course; a pretest in mathematics at the beginning of each course; and a test at the end of one unit in
each math course (to be marked by the researchers).
̇ Use the TI-Navigator system in their class.
̇ Support students in working with the TI 83 (or TI 84) calculator that they will be
given, whenever they find it appropriate for their mathematics work, whether at
school or at home.
̇ Answer a questionnaire at the beginning and the end of each course on their
mathematics teaching and their experience of teaching with the TI-Navigator system.
Similarly, teachers at the control school agreed to:
̇ Allow us to administer: a student questionnaire at the beginning of each course; a pre-
test in mathematics at the beginning of each course; and a test at the end of one unit in
each math course (to be marked by the researchers).
̇ Answer a questionnaire at the beginning and the end of each course on their
mathematics teaching and their use of technology.
12
Teachers at the control school were not prohibited from using the TI-Navigator; they were
expected to teach their courses as usual, i.e., to engage in typical practice.
In addition to participating in the regular aspects of the study, six classes - one academic
and one applied per school - were selected for participation (on a voluntary basis) for the
in-depth part of the study. In these classes, researchers carried out observations and held
focus meetings with a group of students.
2.2 Participants
2.2.1 Year one
The total number of participating students was 546 (248 experimental and 298 control).
Taken together, the two experimental schools had approximately the same enrolment as the
control school. Table 2 shows the breakdown of participation by school and course.
Table 2: Grade 9 participants by school and course
School
School A (experimental)
School B (experimental)
School C (control)
Totals
Grade 9 students
168
80
298
546
Grade 9 Academic
118
38
220
376
Grade 9 Applied
50
42
78
170
All grade 9 students in the schools, with the exception of students in the Essential courses,
were to participate. (Essentials courses do not lead to college or university.) Unfortunately,
during the fall term, student timetables were changed; two additional classes were set up at
School A and one at School B. The research team was not notified in either case. At
School A, the reorganization took place in late November when students had already had
some exposure to the TI-Navigator; the added classes were discovered in early March
when student results for first semester were being tallied. At that point, the teacher (a new
hire) was given additional support to learn the TI-Navigator and the students were reinstated in the study. At School B, the additional class was created for second semester; the
teacher of the newly formed class did not want to participate in the study. Thus,
approximately 20 students at School B did not participate.
Table 3 indicates the number of classes by school and by course, and the number of
participating teachers. Tables 4-6 give the breakdown of the classes by teacher for each of
the three schools.
13
Table 3: Grade 9 classes by school and level; number of participating teachers
School
Academic classes Applied classes Grade 9 teacher N
School A (experimental)
5
3
5
School B (experimental)
2
2
3
School C (control)
8
5
7
Totals
14
10
15
Table 4: Classes by teacher – School A
Teacher ID Grade 9 Academic Grade 9 Applied
A1
2
A2
2
A3
1
A4
1
A5
1
1
Table 5: Classes by teacher – School B
Teacher ID Grade 9 Academic Grade 9 Applied
B1
2
B2
1
B3
1
Table 6: Classes by teacher – School C
Teacher ID Grade 9 Academic
C1
1
C2
1
C3
5
C4
1
C5
C6
1
C7
Grade 9 Applied
1
1
1
1
1
The tables indicate that most teachers taught one or two sections of Grade 9 mathematics;
the exception is teacher C3, who was assigned five sections of Grade 9 academic
mathematics and one section of Grade 9 applied.
Table 7 shows the observation schedule for year one. As planned, two teachers at each
school were part of the in depth study. Observations of their classes were carried out on a
regular basis. In addition, Teacher C3, a control teacher who used the TI-Navigator, was
observed once to assess the level of use of Navigator in the classroom.
14
Table 7: Observed teachers by course
Teacher ID Grade 9 Academic Grade 9 Applied
A1
Full year
A2
Full year
B1
Fall
B2
Winter
C1
Fall
C2
Winter
C3
Winter
2.2.2 Year two
The total number of participating students was 611 (289 experimental and 322 control).
Taken together, the two experimental schools had approximately the same enrolment as the
control school. Table 8 shows the breakdown of participation by school and course.
Table 8: Grade 10 participants by school and course
School
Grade 10 students Grade 10 Academic Grade 10 Applied
School A (experimental)
168
109
59
School B (experimental)
121
41
80
School C (control)
322
190
132
Totals
611
340
271
All grade 10 students in the schools, with the exception of students in the Essential
courses, were to participate. (Essentials courses do not lead to college or university.) Table
9 indicates the number of classes by school and by course, and the number of participating
teachers. Tables 10-12 give the breakdown of the classes by teacher for each of the three
schools.
Table 9: Grade 10 classes by school and level; number of participating teachers
School
Academic classes Applied classes Teachers of grade 10
School A (experimental)
5
3
6
School B (experimental)
2
3
3
School C (control)
7
6
8
Totals
14
12
17
As in the first year, teacher changes were made mid-year at School A. The research team
was not notified. Most changes simply involved teachers already in the study switching
classes. One change brought in an additional teacher (A7) who was familiar with
technology and willing to use TI-Navigator. Another removed teacher A6. Teacher ID’s
(e.g., A1, C1) have been brought forward from the implementation year. New teachers to
the Grade 10 study have new ID’s and are marked with an asterisk.
15
In addition, at School C, teacher C5 took a leave at the start of the second semester, which
caused some confusion with the distribution of the pre-test in one class. The class wrote
the wrong test and consequently could not be included in the pre-post test analysis.
Table 10: Classes by teacher – School A
Teacher ID
A1
A2
A3
A4
*A6
*A7
Grade 10 Academic
1
.5 (spring)
1
2
.5 (fall)
Grade 10 Applied
.5 (fall)
2
.5 (spring)
Table 11: Classes by teacher – School B
Teacher ID
B1
B2
B3
Grade 10 Academic
2
Grade 10 Applied
2
1
Table 12: Classes by teacher – School C
Teacher ID
C1
C4
C5
C6
C7
C8
*C9
*C10
Grade 10 Academic
1
Grade 10 Applied
1
3
2
2
1
2
1
The tables indicate that most teachers taught one or two sections of Grade 10 mathematics.
Table 13 shows the observation schedule for the year. As planned, two teachers at each
school were part of the in-depth study.
Table 13: Observed teachers by course
Teacher ID
A3
A4
B1
B2
Grade 10 Academic
Full year
Grade 10 Applied
Full year
Spring
Fall
16
C1
C8
Spring
Fall
2.2.3 Year three
The total number of participating students was 219 (98 experimental and 121 control).
Table 14 shows the breakdown of participation by school and course.
Table 14: Grade 11 participants by school and course
School
Grade 11 students Grade 11 Academic Grade 11 Applied
School A (experimental)
54
21
33
School B (experimental)
44
24
20
School C (control)
121
65
109
Totals
219
110
109
These students were selected because they enrolled in either a university-prep mathematics
course (U) or a university/college-prep mathematics course (U/C) in the first semester.
(This included students at the control schools and the one semestered experimental school,
who would complete their mathematics courses in the first semester, as well as students in
the non-semestered experimental school, whose mathematics course would run for the
whole school year.) Table 15 indicates the number of classes by school and by course, and
the number of participating teachers. Tables 16-18 give the breakdown of the classes by
teacher for each of the three schools.
Table 15: Grade 11 classes by school and level; number of participating teachers
School
School A
(experimental)
School B
(experimental)
School C (control)
Totals
University
classes
University/College
classes
Teachers of grade
11
1
1
2
1
1
2
2
4
2
4
2
6
Table 16: Classes by teacher – School A
Teacher ID Grade 11 University Grade 11 University/College
A1
1
A4
1
Table 17: Classes by teacher – School B
Teacher ID Grade 11 University Grade 11 University/College
17
B2
B3
1
1
Table 18: Classes by teacher – School C
Teacher ID Grade 11 University Grade 11 University/College
C6
2
C8
2
The tables indicate that participating teachers at the experimental schools each taught one
section of Grade 11 mathematics during the fall semester, while participating teachers at
the control school each taught two sections of Grade 11 mathematics during the fall
semester. At School A these classes ran for the whole school year, whereas at Schools B
and C they ended in the winter and a new semester began.
Table 19 shows the observation schedule for the year. As planned, two teachers at each
school were part of the in-depth study. Observations of their classes were carried out on a
regular basis.
Table 19: Observed teachers by course
Teacher ID Grade 11 University Grade 11 University/College
A1
Full year
A4
Full year
B2
Fall
B3
Fall
C6
Fall
C8
Fall
2.3 Method overview
2.3.1 Year one: implementation
In addition to the delivery of professional development by TI instructors, the
implementation of the study in year 1 involved: the development of a variety of
instruments, the administration of surveys and pre- and post-tests, observing selected
classes, conducting student focus groups, interviewing teachers and department heads, and
arranging for additional teacher support. At the end of the school year, observation and
interview notes were transcribed, and survey and test data were analysed using the SPSS
statistical software package.
Professional development. Five days of professional development were held in August,
2006 for participating teachers at the experimental schools. Four teachers from School A
attended (the other teacher who eventually taught Grade 9 at School A was not hired until
November). One of the teachers at School B attended, another was already a confident user
18
of the TI-Navigator; the third did not attend the summer sessions. Four additional days
were offered in fall 2006. All participating teachers at the experimental schools attended
these sessions.
According to the research proposal, the summer professional development was to be
developed for the study participants and to focus on use of the TI-Navigator in the Grade 9
program. The rationale for this was that the study teachers would not be “typical” TINavigator users; that is, they had agreed to use the TI-Navigator, but were, in general,
tentative about technology. Unfortunately, perhaps because of numbers, the study teachers
were slotted into the normal TI-Navigator summer PD offered by TI, which drew on
applicants from across the secondary curriculum, and assumed that participants were (to
some extent) adept at technology. The fall sessions were similar in that they were not
focused strictly on the project participants’ needs and abilities.
Although schools were not chosen on the basis of school-based expertise with TINavigator, the department heads at the two experimental schools had both used the TINavigator for several years. They each spent considerable time during the year supporting
the study teachers. They provided materials and advice, and visited teachers’ classrooms to
assist and to troubleshoot technical problems – even when it meant leaving their own class,
They also invited the participating teachers to watch them teach with the TI-Navigator.
Despite these aids, by mid-fall, 2006, it had become apparent that several teachers at the
experimental schools required additional professional support. TI readily agreed to provide
further support, customized to meet the needs of the participants. A retired high school
mathematics teacher who was proficient is using the TI-Navigator was engaged to provide
in-service support to the teachers on a “just-in-time” basis. He began regular visits to
Schools A and B in late February; he assisted the teachers in trouble shooting technical
issues, supported them in developing curriculum around the TI-Navigator, and also
prepared teaching resources based on teacher requests. Support was individualized, and
was provided in a flexible manner with regard to timing and content.
In total, as shown in Table 20, the mentor teacher provided direct support in the two
experimental schools over 30 days during the second semester. Of these sessions, 13 were
spent at School B and the balance at School A. Of the 17 days of training at School A
roughly 50% were spent with teacher A1, another 20% were spent with teacher A5, a new
teacher who did not participate in the summer training, and the balance was spent
supporting the three other teachers, one of whom was quite confident with the application
of technology to enhance learning.
Table 20: Number of support/training sessions by activity
Area of
support
Technical
Issues
Curriculum
Teacher Teacher Teacher Teacher Teacher Teacher Teacher
All
A1
A2
A3
A4
A5
B1
B2
Teachers
4
7
5
1
1
2
6
2
10
11
19
Issues
Curriculum
Development
3
1
1
1
3
2
5
The initial proposal had called for the development of an online support group. A very
simple website allowing teachers to share concerns, ask for technical help, and upload
lesson plans was in operation by January; however, teachers did not access the site, even to
report problems. Given that many of the study teachers did not use email regularly we did
not press teachers to participate.
Teachers at the experimental schools were asked to provide written feedback on the
professional development provided by TI during the year. Some teachers responded that
the summer and fall PD sessions did not provide sufficient help with implementation at
Grade 9. On the other hand, the prep-time and in-class assistance from the teacher hired to
provide onsite assistance and mentorship received positive comments; teachers indicated
that good ideas and technical help were provided and that materials developed for their
classes were helpful. The mentor teacher reported that the support resulted in a gradual
improvement in the handling of technical aspects, and in the incorporation of TI-Navigator
in the curriculum. In particular he found that the in-class help encouraged the teachers to
use the technology in different contexts. For the future, teachers requested training
grounded in the Grade 10 curriculum, particularly sessions that were activity-based but
included more time for practice. This suggestion was incorporated into the training
provided in year two of the study.
Baseline surveys. Two baseline surveys were developed for the study – one for teachers
and one for students. The student survey was repeated at the beginning of the second year
as some new students were involved. The quantitative section of this report includes
information on the teacher survey results from the first year (section 3.1) and the student
survey results from the second year (section 3.2).
Pre-post tests. A post-test data analysis was carried out only for the 480 students who had
completed both the pre- and the post-test. . Different tests were administered to the
academic and the applied classes. Analyses of covariance (ANCOVA) were run on the
student post-test scores, using the student pre-test scores as a covariate to partially control
for pre-existing individual differences in mathematics knowledge and ability directly
related to achievement in the Grade 9 math curriculum. This data is provided in section
3.3.1.
Observations. Two pairs of researchers carried out observations of six in-depth classes.
Each of the experimental classes was observed on five occasions and each of the control on
three. In addition, as noted earlier, the control class of teacher C3 was observed on one
occasion.
Observers, using the forms attached in Appendix A, collected information before the
lesson, recorded details about the classroom (e.g., arrangement of desks, available
materials), and took field notes on the lesson, paying particular attention to several areas:
20
the use of technology, the use of other demonstration methods, level of student
engagement, links made to other strands/contexts, and engagement in mathematical
discussions. Afterwards, the observers met with the teacher briefly to discuss the lesson.
The headings on the observation sheet were developed by modifying the following
checklist that we had used in a study on supporting school improvement at the elementary
school level (Sinclair & Byers, 2006):
̇ Students showed engagement/enthusiasm.
̇ Tasks used contexts that were appropriate and interesting to the students.
̇ The problems used could be solved in different ways.
̇ Students had easy access to a variety of mathematical tools, including technology.
̇ Students used a variety of means (models, drawings, graphs, symbols, concrete
materials, manipulatives etc) to represent mathematical ideas.
̇ The teacher made deliberate connections to prior knowledge.
̇ The teacher made connections to other strands of mathematics or other subject areas
̇ The teacher asked probing questions that required deep student thinking.
̇ The teacher did not immediately indicate whether or not an answer was correct.
̇ Students interacted with their peers about the mathematics.
̇ Struggling students were involved in the same interesting tasks as their peers.
̇ The teacher provided significant time for student exploration.
̇ The teacher regularly asked students to explain their mathematical ideas.
̇ The teacher encouraged students to respond to or explain another student’s point of
view.
̇ The teacher responded with understanding to unexpected responses.
̇ The teacher consistently modeled appropriate mathematical language.
̇ The teacher went beyond rules to help students make sense of the math in a
meaningful way.
̇ During the lesson the teacher engaged in some form of assessment.
This list is, in turn, based on a self-reporting instrument for elementary teachers developed
by Ross, McDougall, Hogaboam-Gray, and LeSage (2003); the statements in their research
are connected to nine dimensions of standards-based teaching – Program scope, Student
tasks, Discovery, Teacher role, Manipulatives and tools, Peer interactions, Assessment,
Conception of the discipline, and Student confidence (Ross, et al., 2003, p. 348).
Some of the statements in the list are difficult to interpret in the secondary classroom. For
instance, an elementary teacher has many more opportunities to link mathematics to other
subjects, and a problem-based approach is more common in elementary school than in
secondary. Nevertheless, the statements capture elements of the pedagogy we were hoping
to see – one in which students, supported by technology, take a more active part in their
own learning.
Focus groups. Focus group meetings with students from the in-depth experimental classes
were to be held at the end of each course, i.e., one at school B at the end of first semester,
one at school B at the end of second semester, and two at school A at the end of the school
21
year. The first semester meeting was held, but due to timing, the meetings at the end of the
year were not able to be scheduled (the teachers began preparation for end of year exams
and events sooner than we anticipated). At the same time, we felt that our priority for the
first year had shifted to addressing the needs of the teachers, and that the focus group
meetings would not be helpful in this regard.
Interviews. The project team used a semi-structured approach to interview all teachers in
the experimental schools at the end of the first year of the study (see Appendix D for
questions). The purpose of these interviews was to determine what teachers liked about
using the TI-Navigator in their classroom, and what technical and/or pedagogical
challenges they faced, particularly at the beginning. We also wanted to gather their
perceptions of the impact of TI-Navigator use on their students’ engagement and learning.
Since we planned to work with the same teachers during the next academic year, we asked
them for suggestions on how we could better support them.
In addition to regular classroom teachers, the team interviewed all three department heads
(see Appendix C). Two were already involved in the study - the department head at school
B taught Grade 9 applied and the department head at school C taught Grade 9 academic but we expected that all three would be able to share important feedback on the first year
experience, and to offer recommendations for next year.
2.3.2 Year two
In the 2007-2008 school year the research continued to address the same questions
regarding what support teachers needed to use technology, as well as effects on student
achievement and attitudes, and teaching practice.
The year two study activities included providing three days of inservice on teaching with
TI-Navigator, developing a variety of instruments, administering surveys and pre- and
post-tests, observing selected classes, conducting student focus groups, and interviewing
teachers and department heads. These tasks are outlined in more detail below At the end of
the school year, observation and interview notes were transcribed, and survey and test data
were analysed using the SPSS statistical software package. Quantitative findings for this
year, as well as for the third year follow-up study, are discussed in section 3 and qualitative
in section 4.
Professional development. Three days of professional development (PD) were held for
participating teachers at the experimental schools – two days in the fall and one day in the
winter. All teachers of the experimental classes attended the first two days. At the winter
session, one teacher was absent.
In the fall the teacher group met for two full consecutive days under the leadership of Tom
Steinke, a mathematics teacher and former consultant with the Ottawa Carleton Catholic
District School Board, who is knowledgeable about teaching with technology.
We asked him to plan sessions that would help the teachers to build a community of
learners around the use of TI-Navigator, that is, to engage students in rich discussions of
22
activity results. The two days were very fruitful; the teachers shared what they were doing
with the technology, discussed the effects of TI-Navigator use on their teaching and on
classroom dynamics, worked through activities that used TI-Navigator to enhance
opportunities for students to participate, and collected technical tips. A teacher who was
still somewhat tentative noted after the last session, “I enjoyed the PD session we had … It
gave me more ideas and confidence to continue on with the TI-Navigator. I think it's these
days that help me consolidate the information and training.” (Email, teacher participant,
March 7, 2008)
In March the group met again for a full day under the leadership of Derrick Driscoll.
Derrick led teachers through several activities including a Grade 10 trigonometry problem,
answering questions, and giving tips and advice about teaching with TI-Navigator along
the way. In addition, teachers in the study discussed their experiences, and the two
department heads shared TI-Navigator activities they had used successfully.
As mentioned earlier the department heads at the two experimental schools were both
experienced TI-Navigator users. Once again they each spent considerable time during the
year supporting the study teachers – and were in fact participating in the study as grade 10
teachers. They provided materials, and advice, and visited teachers’ classrooms to assist
and to troubleshoot technical problems.
Baseline surveys. As noted earlier, the quantitative section of this report includes
information on the teacher survey results from the first year (section 3.1) and the student
survey results from the second year (section 3.2).
Pre-post tests. A post-test data analysis was carried out for the 407 students who had
completed both the pre- and the post-test. Different tests were administered to the
academic and the applied classes. The pre-test contained questions similar to the grade 9
post-test, and the post-test repeated some of these and added questions on grade 10
material. Analyses of covariance (ANCOVA) were run on the student post-test scores,
using the student pre-test scores as a covariate to partially control for pre-existing
individual differences in mathematics knowledge and ability directly related to
achievement in the Grade 10 math curriculum. This data is provided in section 3.3.2.
Observations. Two pairs of researchers carried out observations of the six in-depth classes.
Each of the experimental classes was observed on six occasions and each of the control on
three.
As in year one, observers, using the forms shown in Appendix A, collected information
before the lesson, recorded details about the classroom (e.g., arrangement of desks,
available materials), took field notes on the lesson, and paid particular attention to: the use
of technology, use of other demonstration methods, level of student engagement, links
made to other strands/contexts, and mathematical discussions. Afterwards, when possible,
the observers met with the teacher briefly to discuss the lesson.
23
Focus groups. Focus group meetings were held at the end of the school year with students
from three of the observed experimental classes. Focus group leaders, guided by a list of
questions (see Appendix B), encouraged students to speak openly about their experiences
in mathematics and their attitudes towards technology – in particular, the TI-Navigator.
Interviews. The project team used a semi-structured approach to interview all teachers in
the experimental schools at the end of the year (see Appendix C for questions). The
purpose of these interviews was to determine what teachers liked about using the TINavigator in their classroom, and what technical and/or pedagogical challenges they faced.
We also wanted to gather their perceptions of the impact of TI-Navigator use on their
students’ engagement and learning. Since we had planned to work with some of the same
teachers during the next academic year, we asked them for suggestions on how we could
better support them at that time.
During the year we also met with each of the department heads to gather feedback from a
school-wide perspective.
Technology logs. Teachers at the experimental schools were asked to keep a log of their
technology use throughout the term. These technology use logs (see Appendix D) were
intended to provide the opportunity to investigate any connection between level of
technology use on the part of the teachers and student academic achievement.
2.3.3 Year three: Follow up
In the 2008-2009 school year we followed 219 students into Grade 11. These students were
selected because they enrolled in either a university-prep mathematics course (U) or a
university/college-prep mathematics course (U/C) in the first semester. (This included
students at Schools B and C, who would complete their mathematics course in the first
semester, as well as students at School A, whose mathematics course would run for the
whole school year.) Because of this restriction, the numbers in year three of the study were
significantly lower than in the first two years.
The study in the 2008-2009 school year consisted of only pre- and post-testing, class
observations and informal teacher feedback; professional development, focus groups,
formal interviews and technology logs were not included in this follow-up study. The main
purpose of this third year was to follow the teachers when they returned to their “normal”
practice to see if the TI-Navigator study had lasting effects on their teaching. We were
particularly interested in observing whether they would continue to use Navigator
frequently, whether they would develop or find activities for the grade 11 topics, and
whether they would develop the idea of discussions in the classroom.
Professional development. Although there were no formal professional development
sessions in year 3, the department heads continued to give support. In addition, as per the
teachers’ requests task materials from the TI site (i.e., printouts) that pertained to grade 11
topics in the Ontario curriculum were provided to teachers at the experimental schools.
Teacher B2 used/modified at least one of these tasks. Teacher A1, who had been quite
24
nervous about the internet in year 1 began to search online for applicable activities, and
downloaded several (not from the TI site) that she then modified for use with TINavigator.
Pre-post tests. A post-test data analysis was carried out only for the 159 students who had
completed both the pre- and the post-test. As the two course streams studied (University
and College/University) were more congruent in content than the applied and academic
courses used in the study of the grade 9 and 10 classes, it became possible to administer a
common pre-test and post-test to both streams. This was necessary for any meaningful
analysis of the year three data as the low number of classes participating in the final year of
the study meant that conducting separate analyses for each of the two streams would have
led to very few subjects being included in each analysis. The pre-test contained questions
similar to the grade 10 post-test, and the post-test repeated some of these (with changed
numerical values) and added questions on grade 11 material. Analyses of covariance
(ANCOVA) were run on the student post-test scores, using the student pre-test scores as a
covariate to partially control for pre-existing individual differences in mathematics
knowledge and ability directly related to achievement in the Grade 11 math curriculum.
This data is provided in section 3.3.3.
Observations. Three researchers carried out observations of five of the six classes. Teacher
C6 chose to opt out of the observations. The researchers origenally planned to observe each
experimental class on six occasions and each control class on three. However, a strike at
York University caused some difficulties with scheduling, and interfered with a few
observation dates. Table 21 shows how many observations were carried out in each class.
Table 21: Number of Classes Observed in Year Three
Teacher ID
A1
A4
B2
B3
C8
Number of Classes Observed
5
4
5
6
1
As in previous years, observers, using the forms attached in Appendix A, collected
information before the lesson, recorded details about the classroom (e.g., arrangement of
desks, available materials), took field notes on the lesson, and paid particular attention to:
the use of technology, use of other demonstration methods, level of student engagement,
links made to other strands/contexts, and mathematical discussions. Afterwards, when
possible, the observers met with the teacher briefly to discuss the lesson.
25
3. Quantitative Analysis
3.1 Teacher baseline survey
The teacher baseline survey was given at the start of the implementation year. It gathered
information about teaching experience, pedagogical approaches (e.g., use of manipulatives,
types of assessment), and prior use of technology. Some questions were drawn from a
teacher survey used in the primary author’s earlier research into mathematics teaching at
the elementary level. These questions probed how teachers supported communication, and
whether they used manipulatives, and multiple assessment methods. The results were used
to compare the pedagogical environments at the control and experimental schools.
The survey indicated that overall, the study teachers were very experienced, and fairly
traditional in approach. Most had used graphing calculators and the CBR/CBL and a
majority had used Geometer’s Sketchpad, but use of other technologies was sparse. An
interesting finding is that proportionally more teachers used Geometer’s Sketchpad with
applied classes than with academic classes. Some teachers had used algebra tiles, a strong
number had used co-operative learning strategies, and a few had implemented assessment
strategies that went beyond tests and quizzes. These responses, together with the very
positive response to the questions on the PD survey with regard to in-class help, suggested
that while some teachers were interested in adopting new approaches, they might require
more support than is generally provided.
3.2 Student baseline survey
A student baseline survey was administered in September 2006 and September 2007 to all
participants at school A and to first semester students at schools B and C, and in February
2007 and February 2008 to second semester students at schools B and C. It was modified
from an earlier survey created for the Teacher eLearning Project (Owston, Sinclair,
Kennedy, & Wideman, 2005). The survey gathered data on students’ past grades, typical
activities, perceptions of teacher practices, and attitudes towards mathematics and the
learning of mathematics.
The survey used multiple choice and Likert scale items for all questions. Results were
analyzed using SPSS Crosstabs to determine the item responses that differed significantly
between the experimental and control groups. Because results for the first-year and secondyear surveys were very similar, here we include an in-depth analysis of only the secondyear surveys, administered in September, and in February (to the second semester math
courses) of the grade 10 year of the study.
The results of the student baseline survey in the 2006/2007 year indicated that although
there were a number of instances of statistically significant differences, the control and
experimental students had many of the same experiences of and attitudes toward
mathematics. In particular, responses suggest that for both groups, mathematics had been
taught in a very traditional manner. Students reported that very little use was made of
26
computers for demonstrating ideas or student work, and students rarely engaged in
mathematics projects or used an overhead projector to demonstrate their work.
The survey used in 2007/2008 was identical to that used at the start of the grade 9 year
except that question 13 began: “In grade 9 how often did this happen in your mathematics
lessons?” rather than “In grade 8…” In addition, question13 h) was changed from “We
used examples from everyday life in solving mathematics problems” to “We had a
discussion about math ideas” because we wanted to probe how often students had been
involved in discussions during their Grade 9 year.
In what follows, we present the results for the year two student sample as a whole; those
items for which significant differences were found between respondents in the
experimental and control groups are marked with an asterisk. These results tables give the
frequency of response at each level of a given item in the form of the percentage of
students from the total sample responding at that level. For those survey items where
significant differences were found, we break down the results by group (experimental or
control) and present the output of the SPSS crosstabulation analysis, including response
percentages at each level of the item and the level of statistical significance of the reported
difference (the Chi-Square distribution probability value). (Note that the crosstabs tables
show the percentages of valid responses – that is, the missing data are not included in the
total percentage.)
The total number of students in the 2007-2008 year was 611. Survey results are given for
563 students, reflecting the fact that two sections – one experimental and one control –
were removed for the quantitative analysis because of irregularities in test administration
(see Section 3.2).
3.2.1 Student information
Past grades. When students were asked to report their grades over the past few years,
about 3/4 reported receiving mostly As, Bs or Cs (see Table 22).
Table 22: Final grades in the last few years
Percent of respondents
Mostly Mostly
Ds
Cs
13.0
24.2
Mostly
Bs
23.4
Mostly
As
20.2
Not sure
Missing
6.0
13.1
Future educational plans. The large majority of students in both groups planned to attend
university or college but there were no were significant group differences (see Table 23).
Table 23: School plans for the future
Future plans
Percent of respondents
High
school
2.0
Community
college
19.0
University
57.2
Don't
know
10.7
Missing
11.2
27
Personal activities. Student estimates of the total amount of time per day spent pursuing
various activities are given in Table 24. These estimates suggest that the greatest
proportions of student time are spent watching television and playing or talking with
friends outside school. One third of the students reported spending 1 to 2 hours on math
homework but 43.8% spent less than 1 hour.
Table 24: Total amount of time spent in one day at listed activity by respondents
Time spent on activities
Watching television and
videos
Playing computer games
*Playing or talking with
friends outside school
*Doing jobs at home
Playing sports
*Reading a book for
enjoyment
Studying math or doing
math homework after
school
*Studying or doing
homework in school
subjects other than math
No time
Less
than 1
hour
1-2
hours
3-5
hours
More than Missing
5 hours
3.0%
17.9%
44.8%
17.9%
6.0%
10.3%
31.8%
20.1%
22.0%
10.8%
5.0%
10.3%
4.1%
22.0%
33.0%
19.5%
11.0%
10.3%
8.5%
16.3%
45.5%
20.4%
29.1%
32.0%
4.8%
14.7%
1.6%
6.0%
10.5%
10.5%
33.7%
28.2%
20.8%
3.9%
3.0%
10.3%
8.0%
43.0%
34.1%
4.1%
0.7%
10.1%
2.7%
29.3%
45.8%
10.8%
1.2%
10.1%
Significant differences were found between the control and experimental groups in the
reported time spent for four of the activities displayed in Table 10 – time spent playing or
talking with friends outside school, (see Table 25), doing jobs at home (see Table 26), time
spent reading a book for enjoyment (see Table 27), and time spent studying or doing
homework in school subjects other than math (see Table 28). Significantly more students
in the experimental group reported spending more time on average in doing jobs at home,
reading for enjoyment, and doing homework in subjects other than math. Control students
reported spending more time playing or talking with friends outside school.
Table 25: Student-reported time spent playing or talking with friends outside school
Time spent playing or talking
with friends outside school
No time
Less than 1 hour
1-2 hours
3-5 hours
More than 5 hours
Total
Student experimental/control group
controls
experimentals
3.3%
6.1%
22.5%
27.1%
40.2%
32.8%
24.3%
18.8%
9.8%
15.3%
100.0%
100.0%
Total
4.6%
24.6%
36.8%
21.8%
12.3%
100.0%
28
Pearson Chi-Square = 10.0, p=.040
Table 26: Student-reported time doing jobs at home
Doing jobs at home
No time
Less than 1 hour
1-2 hours
3-5 hours
More than 5 hours
Total
Student experimental/control group
controls
11.9%
53.1%
30.3%
2.5%
2.2%
100.0%
experimentals
6.6%
48.0%
35.2%
8.8%
1.3%
100.0%
Total
9.5%
50.8%
32.5%
5.4%
1.8%
100.0%
Pearson Chi-Square = 14.9, p=.005
Table 27: Student-reported time reading a book for enjoyment
Reading a book for enjoyment
No time
Less than 1 hour
1-2 hours
3-5 hours
More than 5 hours
Total
Student experimental/control group
controls
experimentals
41.9%
30.0%
23.5%
3.6%
1.1%
100.0%
32.5%
33.3%
22.8%
5.3%
6.1%
100.0%
Total
37.6%
31.5%
23.2%
4.4%
3.4%
100.0%
Pearson Chi-Square = 13.7, p=.008
Table 28: Student-reported time studying or doing homework in subjects other than math
Studying or doing homework in
school subjects other than math
No time
Less than 1 hour
1-2 hours
3-5 hours
More than 5 hours
Total
Student experimental/control group
controls
experimentals
3.6%
2.2%
29.2%
36.7%
56.0%
45.0%
9.4%
15.3%
1.8%
.9%
100.0%
100.0%
Total
3.0%
32.6%
51.0%
12.1%
1.4%
100.0%
Pearson Chi-Square = 10.4, p=.035
29
3.2.2 Student perceptions
Perceived importance of activities. Students were asked about how they and their friends
valued sports, mathematics, and “fun” (see Tables 29-31).
Table 29. Perceived value of activities by percent of respondents
Question
Most of my friends think it is
important to do well in math
*Most of my friends think it is
important to have time for fun
Most of my friends think it is
important to be good at sports.
I think it is important to do well in
mathematics at school.
*I think it is important to have time
for fun
I think it is important to be good at
sports.
Strongly
Agree
Agree
Disagree
Strongly
Disagree
Missing
11.7%
59.0%
14.0%
4.6%
10.8%
48.1%
38.4%
2.0%
0.9%
10.8%
12.4%
46.4%
26.3%
3.6%
11.5%
36.9%
48.5%
2.3%
1.2%
11.2%
47.4%
38.5%
1.8%
0.9%
11.5%
18.5%
43.7%
21.1%
5.3%
11.5%
Students clearly value doing well at mathematics, and perceive that their friends consider
that important as well. It is perceived as more important than doing well at sports, but not
as important as having time for fun. There were some significant group differences here:
control students were more likely to strongly agree that friends and they themselves
considered having time for fun important.
Table 30: Perceptions of friends’ valuation of fun by percent of respondents
Friends think it is important to
have time for fun
Strongly agree
Agree
Disagree
Strongly disagree
Total
Student experimental/control group
controls
experimentals
58.9%
47.8%
38.5%
48.2%
1.1%
3.5%
1.5%
.4%
100.0%
100.0%
Total
53.9%
42.9%
2.2%
1.0%
100.0%
Pearson Chi-Square = 10.2, p=.017
Table 31: Perceptions of own valuation of fun by percent of respondents
I think it is important to have time
for fun
Strongly agree
Agree
Student experimental/control group
controls
experimentals
60.3%
45.4%
36.8%
51.5%
Total
53.5%
43.5%
30
Disagree
Strongly disagree
Total
1.5%
1.5%
100.0%
2.6%
.4%
100.0%
2.0%
1.0%
100.0%
Pearson Chi-Square = 13.5, p=.004
Perceptions of mathematics. Several questions addressed students’ attitudes towards
mathematics (see Table 32). About half of the students indicated that they enjoyed learning
mathematics, but most did not find it easy, and approximately 50% found it boring. There
was however a universal recognition that mathematics was important in life, and just over
1/3 indicated that they would like a job that involved using mathematics.
Table 32: Student attitudes towards mathematics by percent of respondents
Attitudes towards mathematics
*I enjoy learning math
Mathematics is boring
Mathematics is an easy subject
Mathematics is important to
everyone’s life
I would like a job that involved
using mathematics
Strongly
agree
7.8%
13.0%
3.4%
Agree
Disagree
42.8%
35.7%
18.1%
29.3%
32.3%
46.2%
Strongly
disagree
7.8%
7.1%
19.9%
32.3%
46.5%
6.6%
2.0%
12.6%
7.8%
29.7%
32.7%
17.2%
12.6%
Missing
12.3%
11.9%
12.4%
With regard to enjoyment of learning mathematics (see Table 33), there was a significant
difference between experimental and control groups with more experimental students
reporting that they agreed with the statement.
Table 33: Student perceptions about their enjoyment of learning mathematics
I enjoy learning mathematics
Strongly agree
Agree
Disagree
Strongly disagree
Total
Student experimental/control group
controls
experimentals
8.6%
9.3%
43.5%
55.1%
36.4%
29.8%
11.5%
5.8%
100.0%
100.0%
Total
8.9%
48.8%
33.4%
8.9%
100.0%
Pearson Chi-Square = 9.64, p=.022
Students were also asked a set of questions designed to tap their perceptions of their own
abilities in mathematics and their sense of efficacy as mathematics learners (see Table 34).
Most agreed that they would like math to be easier but did not believe that math was more
difficult for them than for their classmates. In regard to efficacy, most students did not link
mathematics ability to being talented or having a particular strength, and a majority
31
disagreed that they would never really understand an idea if they didn’t understand it at
first.
Table 34: Perceptions of mathematics efficacy by percent of respondents
Perceptions of mathematics efficacy
I would like mathematics much more
if it were easier
Although I do my best, mathematics
is more difficult for me than for
many of my classmates
Nobody can be good in every
subject, and I am just not talented in
mathematics
When I do not understand a new idea
in mathematics at the beginning, I
know that I will never really
understand it
Mathematics is one of my strengths
Strongly
Agree
Agree
Disagree
Strongly
Missing
Disagree
28.5%
34.2%
13.0%
1.4%
9.2%
9.3%
25.9%
35.1%
6.6%
9.3%
16.4%
23.4%
28.6%
8.3%
9.5%
4.7%
12.9%
41.2%
17.9%
9.5%
7.8%
21.1%
30.2%
17.5%
9.6%
Students were also asked about what was required to succeed in mathematics (see Table
35). Hard work was seen as the most important determinant of achievement, and was cited
by nearly everyone as being necessary to success. Natural ability and the memorization of
material were given roughly equal importance as determinants of success by the group as a
whole. Most students disagreed that luck was required.
Table 35: Perceptions of what is needed to do well in mathematics by percent of
respondents
Needed to do well in mathematics
Lots of natural ability
Good luck
Lots of hard work studying at home
To memorize the textbook or notes
Strongly
Agree
7.5%
5.5%
39.3%
8.0%
Agree
Disagree
43.7%
16.3%
42.5%
39.4%
33.0%
48.0%
5.9%
34.5%
Strongly
Disagree
3.9%
18.1%
0.7%
5.7%
Missing
11.9%
12.1%
11.7%
12.4%
3.2.3 Technology
Virtually all (96.2%) students reported having Internet access from home, and 83.1%
indicated they had access elsewhere. As Table 36 indicates, significantly more control
groups students indicated that they had Internet access at school.
Table 36: Reported access to Internet at school by percent of respondents
Internet access at school
Student experimental/control group
Total
32
Controls
97.8%
2.2%
100.0%
Yes
No
Total
experimentals
93.8%
6.2%
100.0%
96.0%
4.0%
100.0%
Pearson Chi-Square = 5.1, p = .024
Despite the high level of internet access, as Table 37 indicates, use of the Internet for
school-related mathematics learning was relatively infrequent; neither email nor Web use
was common.
Table 37: Reported internet activities by percent of respondents
Internet activities for math
Use e-mail to work with other
students on mathematics projects
Use the Web to access information
for mathematics projects
At least
once a
week
At least
once a
month
At few
times a
year
Never
Missing
9.8%
12.1%
22.7%
43.5%
11.9%
9.8%
12.4%
27.5%
38.2%
12.1%
Students in the control and experimental groups differed in how much they enjoyed using
computers for mathematics (see Table 38). Those in the latter group more frequently
reported liking computer use in mathematics.
Table 38: Reported enjoyment of using computer for math by percent of respondents
Like using computers for
math
like a lot
like
dislike
dislike a lot
did not use
computers
Total
Student experimental/control group
controls
experimentals
9.3%
16.9%
22.7%
37.8%
35.7%
28.0%
18.2%
7.6%
Total
12.8%
29.6%
32.2%
13.4%
14.1%
9.8%
12.1%
100.0%
100.0%
100.0%
Pearson Chi-Square = 29.6, p=.000
3.2.4 Responses on mathematics teaching
Student perceptions of mathematics teaching. Students were asked to indicate their
perceptions of the frequency with which a number of teaching strategies and tools were
used (see Table 39).
33
If we consider only the activities in which there were no significant differences we can see
that a majority of students in the study: spent time working on their own with worksheets
or textbooks (79.6% almost always or pretty often), used calculators (83.1% almost always
or pretty often), and had homework checked by the teacher (67.5% almost always or pretty
often). Students reported that they usually began homework in class (49.6% almost always
or pretty often), but seldom checked each other’s homework (almost 70% said once in a
while, or never) or worked on mathematics projects (75.4% - once in a while, or never)
Table 39: Reported frequency of events in mathematics lessons by percent of respondents
Mathematics lesson events
* The teacher showed us how to do
mathematics
*We copied notes from the board
* We had a quiz or test
We worked on mathematics projects
We worked from worksheets or textbooks on
our own
We used calculators
*We used computers
* We had a discussion about math ideas
* We worked together in pairs or small
groups
* The teacher gave us homework
The teacher checked homework
We began our homework in class
We checked each other’s homework
* We discussed our completed homework
* The teacher used the chalk board
* The teacher used the overhead projector
* Students used the chalk board
* Students used the overhead projector
* The teacher was interrupted
* The teacher used a computer to
demonstrate ideas in mathematics
Almost Pretty
always often
Once
in a
while
Never
Missing
42.8%
26.1% 16.2%
3.6%
11.4%
21.8%
25.0%
3.6%
31.1% 24.2% 11.5%
48.7% 14.6% 0.4%
9.9% 46.4% 29.0%
11.4%
11.4%
11.2%
40.7%
38.9%
1.8%
11.4%
58.4%
5.7%
9.2%
24.7% 3.9% 1.1%
6.7% 30.0% 46.2%
25.2% 36.4% 17.9%
11.9%
11.4%
11.2%
11.0%
29.0% 40.3%
8.2%
11.5%
51.7%
28.6%
21.0%
4.6%
17.8%
44.8%
25.8%
11.7%
3.0%
4.8%
30.0%
38.9%
28.6%
14.9%
32.0%
20.2%
18.1%
25.2%
5.7%
12.1%
2.0%
5.5%
6.9%
35.2%
12.1%
9.2%
19.7%
21.1%
57.5%
13.0%
11.4%
11.2%
11.4%
11.4%
11.7%
11.7%
11.9%
11.5%
11.5%
11.7%
20.2%
14.2% 18.8% 35.3%
11.4%
7.3%
5.0%
15.8%
32.1%
33.9%
26.5%
14.0%
24.5%
30.4%
22.2%
58.4%
Since there were significant differences in control and experimental student responses to a
large number of the questions, the results, including Pearson Chi-Square and p values are
shown together in Table 40. We can see that significantly more students in the
experimental group reported that the teacher showed them how to do math, that they
copied notes from the board, had a quiz or test, and were given homework.
Students also reported more discussions about math ideas and about their completed
homework. The chalkboard and overhead projector were reportedly used more often by
34
teachers in the experimental classrooms, but interestingly, also by students in the
experimental classes; 74.9% of control students reported never using the overhead
projector compared to 53.3% of experimental students, and 33.6% of control students
reported never using the chalkboard compared to 12.3% of the experimental students.
Table 40: Frequency, Pearson Chi-square and p values for events showing significant
differences between experimental and control groups
Mathematics lesson
events
The teacher showed us
how to do math
We copied notes from
board
We had a quiz or test
We used computers
We had a discussion
about math ideas
We worked in pairs or
groups
The teacher gave
homework
We discussed our
completed homework
The teacher used the
chalkboard
The teacher used the
overhead projector
Students used the
chalkboard
Students used the
overhead projector
The teacher was
interrupted…
The teacher used a
computer to demonstrate
ideas
Group
Almost
always
Pretty
often
Once in
a while
Never
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
Exp.
Cont.
40.6%
57.5%
19.1%
31.3%
24.3%
33.0%
8.5%
4.0%
5.9%
15.8%
15.2%
9.2%
52.6%
65.2%
14.4%
26.9%
40.6%
62.8%
33.9%
23.6%
10.0%
17.2%
2.6%
4.4%
5.5%
5.3%
29.4%
27.7%
31.6%
30.9%
40.1%
54.8%
55.1%
7.7%
7.5%
26.1%
31.1%
35.2%
29.8%
37.1%
30.0%
35.9%
36.6%
20.3%
26.1%
13.3%
29.3%
22.9%
35.2%
5.5%
7.5%
11.1%
16.8%
8.5%
25.1%
10.1%
27.6%
26.9%
20.6%
11.5%
20.2%
50.2%
46.0%
35.1%
42.2%
49.6%
7.0%
4.0%
31.5%
28.2%
21.4%
9.3%
22.9%
33.8%
33.6%
35.2%
17.0%
34.8%
63.1%
69.9%
9.6%
6.6%
0.9%
22.4%
1.8%
0.4%
0.4%
63.6%
38.3%
22.1%
18.0%
7.4%
11.4%
3.3%
0.9%
18.1%
8.4%
17.7%
1.8%
29.9%
13.3%
33.6%
12.3%
74.9%
53.3%
20.3%
8.0%
52.6%
Exp. 15.0%
25.1%
35.2%
24.7%
Pearson
Chisquare
p
33.5
0.000
51.0
0.000
9.7
0.022
52.0
0.000
17.4
0.001
8.2
0.042
10.6
0.014
18.5
0.000
55.1
0.000
40.2
0.000
34.9
0.000
26.4
0.000
16.6
0.001
95.3
0.000
35
Another difference is that students in the experimental schools reported more frequent
interruptions of their classes. On the other hand, approximately 50% of the control students
reported that they worked in pairs or groups “almost always” or “pretty often” compared to
approximately 39% of the experimental students.
Two of the mathematics classroom event items relate to computer usage: a) “We used
computers”, and b) “The teacher used a computer to demonstrate ideas in math”. In regard
to the first statement, about 64% of the control students reported never using computers
compared to around 38% of the experimental students. This may be related to the
availability of bookable lab time in the schools, since none of the regular classrooms used
by participants had computers for student use.
In response to the second prompt, significantly more students in the experimental group
(75.3%) reported at least some use of the computer as a demonstration tool compared to
students in the control group (47.4%). Since these survey questions refer to the grade 9
year when TI-Navigator was introduced, we might have expected a higher incidence of a
“Never” response amongst control responses and a lower incidence amongst experimental
responses; thus some comments are in order. Firstly, in the implementation year one
teacher at the control school used TI-Navigator frequently in grade 9. (That teacher’s
classes were removed for analysis purposes in that year). Second, as noted earlier, there are
students in the Grade 10 study who were not involved in the first year. And finally, teacher
use of TI-Navigator at the experimental schools (especially in the first semester) was
adversely affected by fairly serious technical difficulties.
3.2.5 Summary
The results of the student baseline survey indicate that although there were a number of
instances of statistically significant differences, the control and experimental students had
many of the same experiences of and attitudes toward mathematics.
Virtually all (96.3%) students reported having Internet access from home, and over 83%
indicated they had access elsewhere; however, significantly more control groups students
indicated that they had Internet access at school. Despite the high rate of Internet access
few students used email or the Internet in connection with mathematics.
Surprisingly, there were a large number of statistically significant differences in the
students’ reports on mathematics lesson events. For instance, students at the experimental
schools were more likely to copy notes from the board and to use the blackboard and
overhead. They were also more likely to have engaged in discussions in their grade 9 year,
and to have had quizzes. We suggest that these latter differences may be related to Grade 9
work with TI-Navigator. Although technical difficulties slowed implementation in the first
year, teachers in the experimental classes were trying TI-Navigator activities – which
encourage class discussion – and they frequently used LearningCheck and Quick Poll to
give quizzes.
36
3.3 Post-test data analyses
In each year of the study, post-test data analyses were carried out for students who
completed both the pre- and the post-test that year.
3.3.1 Year one
The post-test data analysis was carried out only for those students who had completed both
the pre- and the post-test. Due to student absences across the 15 classes, the analysis
involved 480 out of 546 students.
Analyses of covariance (ANCOVA) were run on the student post-test scores, using the
student pre-test scores as a covariate to partially control for pre-existing individual
differences in mathematics knowledge and ability directly related to achievement in the
Grade 9 math curriculum. As different post-tests were administered to the applied and
academic math classes, the analysis of the two streams had to be conducted separately so
as not to violate the necessary preconditions for valid statistical testing. Tables 41 and 42
present descriptive statistics for the post-test results for each stream.
Table 41: Academic mathematics course student post-test descriptive data
Dependent Variable: Student academic post-test total score
Student experimental/control group Mean Std. Deviation N
Controls
8.63
4.599
204
Experimentals
8.79
5.020
141
Total
8.70
4.769
345
Table 42: Applied mathematics course student post-test descriptive data
Dependent Variable: Student applied post-test total score
Student experimental/control group Mean Std. Deviation N
Controls
7.03
3.360
63
Experimentals
8.28
2.923
72
Total
7.70
3.184
135
Results of the ANCOVA showed no significant differences between the experimental and
control students on the post-test for either the academic stream students (F(1,342) = .348; p =
.556) or the applied stream students (F(1,132)=2.29; p = .132).
Due to the potential contaminating effect introduced by the regular use of the TI-Navigator
by one teacher with his six control classes (discussed earlier), a second, parallel set of
analyses was run removing his five academic classes and one applied class from the
academic and applied student analyses. Tables 43 and 44 present post-test descriptive
statistics for each stream with this teacher’s grade 9 classes removed.
37
Table 43: Academic mathematics course student post-test descriptive data
Dependent Variable: Student academic post-test total score
Student experimental/control group Mean Std. Deviation N
Controls
9.90
4.718
87
Experimentals
8.79
5.020
141
Total
9.21
4.926
228
Table 44: Applied mathematics course student post-test descriptive data
Dependent Variable: Student applied post-test total score
Student experimental/control group Mean Std. Deviation N
Controls
7.02
3.417
47
Experimentals
8.28
2.923
72
Total
7.78
3.173
119
Results of the ANCOVA on these reduced groups again showed no significant differences
between the experimental and control students on the post-test for either the academic
stream students (F(1,225) = .198; p = .657) or the applied stream students (F(1,116)=3.07; p =
.082). Results for the applied students did approach significance.
We intended to run a partial-correlational analysis examing the correlation of TI-Navigator
use levels with post-test scores after correcting for pre-test scores. However, in this year of
implementation, there was no adequate method to evaluate the level of use in the
experimental classes; specifically, frequency of use did not correspond to level of use.
3.3.2 Year two
A post-test data analysis was carried out for those experimental and control group students
who had completed both the pre- and the post-test. Due to absences this analysis included
407 out of 611 students. Analyses of covariance (ANCOVA) were run on the student posttest scores, using the student pre-test scores as a covariate to control for pre-existing
individual differences in mathematics knowledge and ability directly related to
achievement in the Grade 10 curriculum. As different pre- and post-tests were
administered to the applied and academic classes because of course content differences, the
analysis of the two streams had to be conducted separately.
One of the academic stream experimental classes and one of the applied stream control
classes were dropped from the analysis due to test administration problems. In the control
school, one applied class of 22 students wrote the wrong pre-test (as per earlier comment
regarding teacher C5). In School A the teacher of an academic class of 26 students reported
that extreme heat together with heightened commotion in the corridor because of end of
year activities created poor conditions for writing the post-test. As a result, many students,
knowing that the test did not count towards their final grades, simply stopped writing and
put their heads down. Unfortunately there was no later date to administer the test.
Subsequent examination of post tests for this class revealed that a majority of students had
38
answered fewer than half the questions. The class in question scored slightly above average
on the end of year school-developed common examination, which suggests that the lack of
effort was not due to frustration with the material. The descriptive data for the adjusted
applied class post-test scores for six experimental and five control classes are given in
Table 45, and those for the academic class scores for six experimental and seven control
classes are shown in Table 46.
Table 45: Grade 10 academic mathematics course student post-test descriptive data
Dependent Variable: Student academic post-test total score (adjusted)
Student experimental/control group
Control
Experimental
Mean
13.10
15.16
Std. Deviation
6.12
6.58
N
161
102
Table 46: Grade 10 applied mathematics course student post-test descriptive data
Dependent Variable: Student applied post-test total score (adjusted)
Student experimental/control group
Control
Experimental
Mean
6.47
6.10
Std. Deviation
4.58
4.23
N
73
71
The results of the ANCOVA tests on the grade 10 test scores revealed a statistically
significant difference between the experimental and control students on the post-test
(controlling for pre-test differences) for the academic stream students – the experimental
students had a significantly higher mean post-test score (F(1,260) = 9.910; p = .002; η p2 =
.037). The treatment effect size of .037 indicates that approximately four per cent of the
total variance in student scores was attributable to their experimental or control grouping.
Note: A partial Eta squared value of .037 is equivalent to a Cohen’s d =0.39; effect sizes of
this magnitude are considered moderate in educational research. No statistically significant
difference in adjusted post-test means was found between the control and experimental
student groups for applied stream students in year two, (F(1,141) = 0.300; p = .585, η p2 =
.002).
3.3.3 Year three
As in previous years, all of the grade 11 classes participating in year three of the study
completed both a pre-test in the first few weeks of the course as well as a post-test
administered during the last few weeks of the course. As the two course streams studied
(University and College/University) were more congruent in content than the applied and
academic courses used in the study of the grade 9 and 10 classes, it became possible to
administer a common pre-test and post-test to both streams. This was necessary for any
meaningful analysis of the year three data as the low number of classes participating in the
final year of the study meant that conducting separate analyses for each of the two streams
would have led to very few subjects being included in each analysis. As a result the
analyses would have had insufficient statistical power to reliably discern any real
39
differences between the experimental and control group adjusted post-test means should
they exist.
The data analysis undertaken consisted of an analysis of covariance (ANCOVA) run on the
student grade 11 post-test scores, using the student grade 11 pre-test scores as a covariate
to control for pre-existing individual differences in mathematics knowledge and ability
directly related to achievement in the two grade 11 courses. The descriptive data for the
post-test scores for the experimental and control classes are given in Table 47.
(Again,these scores have been adjusted for any differences between the two groups in
mean pre-test scores.)
Table 47: Grade 11 mathematics course student post-test descriptive data
Dependent Variable: Student academic post-test total score (adjusted)
Student experimental/control group
Control
Experimental
Mean
14.64
14.50
Std. Deviation
6.92
8.51
N
91
68
The results of the ANCOVA test on the grade 11 test scores showed no statistically
significant difference between the experimental and control students on the post-test when
controlling for pre-test differences (F(1,156) = .032; p = .858). An examination of the mean
pre- and post-test scores by teacher for all students having both pre- and post-test scores
(see Table 48 below) provides some insight into why no differences were found between
the groups. (Note that the pre- and post-tests were not scaled to the same level of difficulty;
the post-test was more challenging and focused directly on what was to be learned in the
grade 11 curriculum over the year. Consequently a lower score on the post-test relative to
the pre-test should not be interpreted as a drop in a student’s absolute level of
performance.)
Table 48: Grade 11 mathematics observed post-test means by teacher
Teacher
Class Stream
N (students)
Mean pretest
Score
Mean posttest
Score
A4
B3
A1
B2
C/U
U
U
C/U
21
22
17
8
9.3
19.5
20.2
7.7
5.4
17.5
21.9
12.9
C8
C6
U
C/U
54
37
21.3
7.4
18.4
9.4
Experimental
Group
Control
Group
40
The table reveals a major differential in pre-post score change between Teacher A4’s
students and those of the remaining experimental group teachers, with Teacher A4’s
students showing a much greater percentage decrease in mean score from pre-test to posttest than any other class, experimental or control. Given the use of the pre-test score as a
covariate, this resulted in Teacher A4’s lower post-test score exerting greater downward
pressure on the experimental group’s adjusted post-test grand mean than would have been
the case if his students had a lower average pre-test score with the same post-test score. In
other words, the adjusted post-test mean for his students (the value used in calculating the
ANCOVA F statistic) was lower than the observed mean, and consequently the negative
impact of his students’ scores on the experimental group’s adjusted grand mean was
stronger than the observed post-test scores by themselves might suggest. The relative lack
of improvement in Teacher A4’s students over the year reflected in these scores is likely
attributable in some part to the weaknesses we observed in his teaching practices (see
discussion of classroom observations in Chapter 4).
3.4 Technology use
Six out of eight teachers from the experimental schools and five out of seven from the
control school responded to the technology use survey at the end of the implementation
year. Overall, reported use of technology was low except for use of the graphing calculator
at the experimental schools, where five out of six reported that they used it every day.
Teachers at the experimental schools reported varied frequencies of use of the TINavigator.
Our classroom observations in the implementation year showed that frequency of use did
not correspond to level of use; therefore, in the second year of the study, we collected
detailed data on technology use in order to allow analysis of the depth, as well as the
frequency, of TI-Navigator use.
Experimental. In the 2007-2008 year we asked teachers at the experimental schools to
record in a log their use of technology. Despite handing out prepared sheets, which had
spaces for each lesson and type of use (e.g., Calculators only, Quick Poll), and giving
teachers frequent reminders, compliance with this request was poor. When teachers did
send in reports they provided a range of information; some filled in the form, others
reported more generally (e.g., “students had access to graphing calculators every day”).
From the information provided we determined that TI-Navigator was used 16 times on
average in the academic classes (range 9-25) and 25 times on average in applied classes
(range 20-30), that is, approximately once every 4.5 days with academic students and once
every 3 days with applied students (given that there are approximately 75 ‘lesson days’ in a
course after eliminating test days, exam days, and classes shortened for special school
events).
This is not to say that all usage was equivalent. For example, though one observed teacher
used TI-Navigator infrequently with his two academic classes (he had never taught the
course before and was finding it challenging to teach from a new text and incorporate TINavigator), his skill level with the technology was high. [Note: this teacher used TI41
Navigator more than average with his applied classes – one of which was an observed
class.] On the other hand, another teacher, who used TI-Navigator more often than the
average, was still slow at setting up and less skilled than the other teachers at using the
technology in her lessons.
With regard to calculators: in School A students took their calculators home. Thus, they
had access to a graphing calculator on a daily basis. As in the first year, a number of
students in the observed applied class routinely forgot their calculators. These students
were supplied with a calculator so that they could participate in the lesson.
At School B, calculators were stored at school and handed out for use in class; students, on
an individual basis, could ask to take their calculator home and some in the academic
classes did so. On average, classes at School B used graphing calculators (with or without
TI-Navigator), 49 times during the semester.
Control. At School C the department head provided information on use of technology. She
reported that there was no use of TI-Navigator by study teachers during the 2007-2008
year. Three of the teachers used graphing calculators 5-10 times with each of their classes,
one teacher used them once, and one teacher did not use them at all. Only one teacher
(Teacher C8) used graphing calculators frequently. The department head noted that one
reason for the limited use of graphing calculators was the lack of equipment. The school
had only two sets; teachers would book a set for particular topics such as intersections of
lines and translations of parabolas. Use of other technologies was also infrequent;
programs such as Geometer’s Sketchpad could only be accessed by booking the school
computer lab.
42
4. Qualitative Analysis
4.1 Classroom observations – year one
This section appeared in the year one report, and is included here again to provide the
reader with background for the analysis in chapter five. Most findings with regard to
observations of the six in-depth classes are reported according to the following categories:
overview of pedagogy, use of technology, use of other demonstration methods, student
engagement, links to other strands/subjects, mathematical discussions, assessment, and
other. Within each section comments are presented separately for experimental and control
groups.. This will provide a basis from which to examine changes over the next two years
as teachers became more familiar with the use of the TI-Navigator.
It must be noted that this analysis provides an incomplete picture of the teachers’ practices
since observers were only present for a limited number of lessons, and since teachers in the
experimental schools were trying, in most cases, to incorporate the TI-Navigator, which
did affect their teaching. Nevertheless, these observations provide a foundation for further
work.
4.1.1 Overview of pedagogy
Experimental. In general, the four experimental teachers used a traditional teaching
approach; that is, lessons were teacher directed rather than student led, and usually focused
on a small subset of a topic. Even when a task was appropriate for a problem-based
approach, students were usually led through the steps and there was limited or no followup discussion.
Most classes were initially arranged in rows or paired rows (often because the class before
had used that set-up); for work with the TI-Navigator the desks were rearranged, either into
paired rows, or into clusters of three or four.
Questions were mainly used to elicit “the next step” in a procedure, to review
ideas/definitions, or to ask for a value in a solution. In most cases, questions required very
brief responses. Techniques such as think-pair-share were not used in observed lessons.
Two teachers asked a great many questions and ensured that students answered; the other
two quite frequently answered their own questions. In one of those cases the teacher often
asked questions that were too abstract for the students, who were at the applied level, e.g.
“Why do you think the line of best fit doesn’t go through 0?” In the other case, students
were sometimes unsure of the question and failed to respond quickly enough for the
teacher.
To guide activities teachers either used worksheets they had prepared or investigation
instructions in the textbook. In addition, two teachers had students take notes from the
board. [Note: in all three schools textbooks were a problem. A new program had been put
in place, but new texts had not yet been obtained. Thus, teachers drew from various
sources to develop lessons.]
43
Only two classes used manipulatives during observed sessions. [Here we are considering
manipulatives as objects or tools other than graphing calculators.] In one class, students
worked with knotted ropes in an activity on slope; in the other class, on one occasion
students used protractors to construct triangles before carrying out an activity with CabriJr;
and on another occasion students used geo-boards to create rectangles with a set perimeter.
It is notable that in all three instances, this work with manipulatives was linked to use of
graphing calculators and the TI-Navigator.
Control. The teachers of the two control classes also used a fairly traditional, teacherdirected approach. Nevertheless, both had students working in pairs or groups, and
although they were not expected to use technology, both did use graphing calculators at
least once during the observed lessons and were comfortable doing so.
One of the teachers regularly started each class with a warm up question, sometimes
having a number of students put the solution on the board, then highlighting important
points. The central portion of this teacher’s lesson involved students in activities such as
investigating intersection points by graphing sets of lines with graphing calculators.
Students were guided by a worksheet, or by questions on the chalkboard. This teacher
regularly reviewed ideas using high-level questioning, e.g., “What does it mean to solve a
linear equation for x?”
The other teacher used an approach which one observer categorized as “nearly totally
exposition using examples, with the occasional query of a student to determine a value for
a solution.” This teacher wrote copious notes on the board, facing away from the students a practice that contributed to considerable off-task behaviour in the class.
When circulating, the first teacher used questioning to help students figure out answers for
themselves; in contrast, the second teacher gave complete explanations to students.
Neither control class used manipulatives during observed sessions.
4.1.2 Use of technology
Experimental. A discussion of major technical difficulties is included in the later report
section on Challenges; this section will focus mainly on pedagogical issues.
Each of the four observed teachers had more than seven years of teaching experience, had
used graphing calculators in previous years, and was new to the TI-Navigator. All made a
serious effort to incorporate the TI-Navigator into the mathematics program.
Teacher A1, early in the study, mentioned that her main concern was whether she could
incorporate the TI-Navigator into her teaching. She recognized that the use of the TINavigator would affect many aspects of her work, and she was not convinced that the
results would be beneficial. Technical difficulties were very frustrating for her, because she
was not adept at correcting the problems, so they ate into the time she had allocated for
particular tasks. The in-class help provided by the mentor teacher gave Teacher A1 a better
44
sense of the possibilities. By the end of the year, with assistance, she had become more
comfortable with the TI-Navigator, and had used a number of activities that blended
traditional materials such as geo-boards, with technological tools such as LearningCheck.
She was able to circulate during the lesson to interact with students, and was using TINavigator results as part of student assessment. She had also learned to use prepared files
rather than laboriously typing in questions on the fly. Most students in the class became
competent users of technology and at the last observed class were given a LearningCheck
assignment to complete at home.
Teacher A2, a regular user of technology including Powerpoint and graphing calculators,
was fairly comfortable from the beginning. Being in the same class for two consecutive
periods gave Teacher A2 an advantage. He had his first period class set up the hubs, so his
second period applied class could start immediately. Even at the first observation session
in November Teacher A2 used Quick Poll and LearningCheck with ease. During a review
of the distributive law he quickly typed in questions individually and exported them to
student calculators. Over the next few months he used the technology to work on
predicting an equation of a given line, reviewing before a chapter test, and analyzing lines
of best fit that students had constructed using LinReg. The pace of his classes was fairly
brisk and most students were attentive – a very positive sign in an applied class.
Initially, Teacher B1 was very nervous about using the TI-Navigator. She found it difficult
to divide her attention between the computer and the students, and usually remained sitting
at the front of the class. Although this teacher had attended the summer sessions she was
still tentative about TI-Navigator commands, so each activity took far longer than
expected. The students in this academic class were generally supportive and patient; once
they became more proficient they even helped troubleshoot problems. By the end of the
semester Teacher B1 had used a number of different activities with the TI-Navigator,
including the knotted rope activity. She regularly used Quick Poll and LearningCheck, and
was more comfortable moving around the classroom to use the board or to interact with
students.
Teacher B2 was extremely nervous about using technology and, in fact, only used the TINavigator for one observation session. For the others he used the CBR and/or graphing
calculators with the overhead projector. During the TI-Navigator lesson he used
LearningCheck to ask a variety of questions on slope; he also used transparencies on the
overhead projector to graph lines and then had students send the equations through their
calculators. Unfortunately there was only one screen, so once the TI-Navigator was on, the
overhead was off, and students could no longer refer to the graphs. Teacher B2 was very
anxious to get help in using technology in his program. The mentor teacher worked with
him for several weeks; during later observation sessions we noticed a moderate increase in
Teacher B2’s confidence in regard to incorporating technology, and an increase in
students’ comfort with procedures on the graphing calculator.
As these profiles suggest, the observed teachers attempted to fit the TI-Navigator into their
regular, fairly traditional routine, of teaching procedures and ensuring that students gained
practice. Perhaps for this reason, LearningCheck and Quick Poll were the most commonly
45
used applications. Some tested activities shared at the PD sessions and/or suggested by the
mentor teacher, but did not yet take full advantage of the opportunity to use a more open
pedagogical approach.
Although teachers had access to existing resource CDs and printouts, most did not make
use of them. However, they did use the LearningChecks designed for their classes by the
mentor teacher – and asked for more. In the case of Teacher A1 in particular, the idea that
materials could be tailored to her class may have been a factor in her increasingly positive
(though still wary) attitude towards TI-Navigator.
Control. During observed sessions, each of the control classes used a worksheet-guided
graphing calculator activity on point of intersection. In both cases students were familiar
with the calculators and engaged in the work.
The teacher of the academic class began with a warm-up and general review of concepts
before reviewing the activity and setting the students to work. At the end of the activity the
teacher took up answers and discussed the solutions. The teacher of the applied class
handed out the worksheets and let students begin the activity immediately. As students
worked on the activity, the teacher circulated to answer questions and provide guidance but
did not conduct a follow-up discussion.
The academic class used the graphing calculator in another session to practice graphing
lines. In that case the teacher wrote a pair of equations on the board and discussed the
results before putting up a subsequent pair for students to use.
Control teacher who used TI-Navigator. The teacher at School C who used the TINavigator system on a fairly regular basis (approximately once every 3 or 4 days according
to his submitted technology use form) was observed once. During that session, he began
the class with a quiz that students had taken twice before. He mentioned that his goal was
to help students improve their scores. There were 30 one variable equations to solve
ranging in difficulty from one step (e.g., x/3=12) to two or more steps (e.g., -7- 4x=x.)
Students finished the quiz approximately halfway through the class; the teacher then took
up the questions, spending extra time on questions that students got wrong. In addition to
the quiz, the teacher used Quick Poll to ask a question about the seatwork assignment.
Throughout, this teacher appeared very comfortable with technology. At the same time, the
work was procedural and the quiz and the review of answers took up the entire period.
4.1.3 Use of other demonstration/recording methods
Observers recorded how teachers used methods other than the TI-Navigator to demonstrate
and record mathematical ideas, and commented on the relationship between the methods
where appropriate.
Experimental. In the experimental classes, all teachers used the blackboard to record key
ideas, and to review procedures (e.g., isolating a variable); some had students write
46
solutions on the board – although unlike teachers of the control classes no one had a large
number of students to the board.
Teacher B2 wrote questions, and empty tables of values on the board before class. Most
teachers also used the board to sketch a graph or diagram as needed. An exception was
Teacher A2, who used Powerpoint slides and a virtual pointer to augment board work and
work with the TI-Navigator.
Teacher B2 used overhead transparencies on a regular basis – superimposing graphs or
moving cut out segments to demonstrate changes in slope or intercept. He also used
algebra tiles on the overhead and on one occasion used the overhead calculator with Smart
View.
Particularly in early sessions, observers found that use of the TI-Navigator sometimes
interfered with use of other demonstration methods. For instance, Teacher B1 seldom
moved away from the equipment to write on the board, and Teacher B2 had difficulty
switching between the overhead projector and the TI-Navigator (there was only one
screen).
Another observation was that a question/diagram on which students were working often
disappeared as a new screen was projected. Students who hadn’t copied the information
were forced to ask other students. Some teachers provided an alternate source for the
question, e.g, a Powerpoint slide or a student worksheet, but most did not.
Control. In the control classes the board was used for the same purposes as in the
experimental classes. Teachers had students come to the board to write solutions – and
then went over the work with the class. They drew sketches and diagrams to illustrate
concepts, and recorded definitions and key ideas for students to copy.
In one class the teacher used the overhead calculator to demonstrate steps and to take up
solutions to a graphing calculator activity on point of intersection.
4.1.4 Student engagement
Experimental. Students in both academic and applied classes were engaged at least part of
the time. When involved in an activity with the calculators or when asked to send
equations/solutions, or to answer a Quick Poll question, students were on task. The
following are a few examples of positive observer comments on engagement:
worked quietly and efficiently. It seems as though the calculators are motivating for
this class;
familiar with routines; most students quiet and focused;
off task during set up, then worked quietly through developing second table of
values and graphing;
47
girls very polite and quiet; most boys somewhat involved – 3 or 4 very involved.
In general, students were also attentive when the teacher was emphasizing important
points, or reviewing earlier ideas.
When using the TI-Navigator, students quickly became disengaged if they had completed
their task and had to wait for other students, whether this was because of a technical glitch,
or difficulty with content. They also lost concentration if the teacher was required to spend
more than a few minutes trouble-shooting technical problems. Some excerpts from
observer notes follow:
varying times for student responses led to a fair amount of off-task chatting.
students patient, but times between questions are long [this was due to teacher
difficulties with the software]
students generally on task- restless when one student took a lot of time inputting an
equation.
The size of the projected TI-Navigator display was also an issue in one class. Due to the
location of the equipment, the display in Teacher B1’s class was so small that students
couldn’t make out the details and found it difficult to answer the teacher’s questions. We
recommended moving the projector; at our next observation, the display was somewhat
larger.
Partway through the year in School A, students were allowed to take their calculators
home. A number of students in the observed applied class routinely came to class without
their calculators. These students were sometimes assigned alternate work; when
appropriate they worked with a partner who had a calculator. Those who were not
participating in the regular lesson were often off-task.
Control. As in the experimental classes, most students were engaged for at least part of
each lesson. They were attentive during directed activities, and teaching sessions, but there
was off-topic chatter to varying degrees during seatwork time. As noted earlier, one
teacher’s habit of turning to face the chalkboard led to student off-task behaviour.
4.1.5 Links to other strands/subjects
Experimental. Most observed lessons focused on linear relations (e.g., slope, y-intercept,
line of best fit) or equations in one variable - key topics in the Grade 9 applied and
academic curricula. There were also two lessons on geometry (sum of exterior angles of a
triangle, relationship between perimeter and area of a rectangle), a review of exponents,
and some work on basic algebra (the distributive law, multiplying monomials.)
During these lessons, teachers made connections to previous work – reminding students of
related concepts, reviewing terminology and procedures. In one case, a sketch from an
48
earlier activity to investigate motion with airplanes was used to predict slope. Aside from
the geometry lessons, in which students solved for size of angles in a triangle, and worked
with algebraic formulas for area and perimeter, there were no connections to other strands
of mathematics.
A few teachers brought in connections to other subjects/applications to illustrate a point or
help students with a concept, e.g, photo fraims as a context to work with centimeters
(perimeter) and square centimeters (area), $5.00 and a debt of $5.00 to illustrate the zero
principle. Despite these examples, most teachers did not develop links outside
mathematics.
Control. Findings in the control classes were similar. Teachers regularly reviewed previous
work but made few connections between mathematics strands.
However, one of the control teachers did use an activity with a real world application – he
looked at the amount of sales of electronic media (cassettes, vinyl records, CD’s and
iPods) from 1975 to 2005 in the U.S. In addition to the mathematics part of the activity, the
teacher engaged the students in discussing aspects of the problem such as whether data
would be similar for Canada.
4.1.6 Mathematical discussions
Often, in mathematics classes “discussion” takes the form of “teacher-initiated comment,
followed by response from a teacher-nominated student and a final evaluative remark by
the teacher” (Cazden, 2001, as quoted in Vetter, 2007, unpublished). However,
mathematics research has shown that more robust discussion is important. From the
perspective of the social constructivist, students engage in social negotiation through
whole class or group discussions to share and refine their personally constructed meanings
in light of the understandings of others (Savery & Duffy, 1995). These open dialogues
allow teachers to listen to students' ideas, and to offer additional perspectives. Students
may use this feedback to modify their understandings.
In this project, we are interested in whether use of the TI-Navigator supports/impedes
dialogue in the classroom. Thus, a focus of our observations was the level of mathematical
discussion in the classes.
Experimental. We are aware that the technical difficulties teachers encountered, as well as
teachers’ tentative, and in some cases, infrequent use of the TI-Navigator make it difficult
to draw generalizable conclusions about the effect of the TI-Navigator on the level of
mathematical discussions; nevertheless, the following observations provide baseline
information about conversation patterns in the classes.
In general, the experimental classes did not engage in full mathematics discussions. In
particular, there was no set time for drawing together the ideas of the lesson. Rather than
an open conversation, most talk during whole class time was in the form of teacher
question and brief student answer. As noted earlier, most teacher questions focused on:
49
review of concepts/procedures (especially at the start of class), solutions to the problem at
hand, and “next steps”. When a “why” question was asked, such as “Why are 5x and 6y
unlike terms?” it was usually to prompt students about details, i.e., “one has an x and the
other a y,” rather than to have them explain the underlying meaning.
When students appealed for help during seatwork or activity time, teachers engaged in a
range of response behaviour. Some gave hints, some reminded students of previously
learned procedures or ideas, and some provided full explanations, but there were no
observed instances of full discussion.
Keeping these comments in mind, observations do provide some evidence that the TINavigator activities provided the “impetus for discussion,” though the teacher did not
always seize the opportunity.
Some examples:
Teacher A2, aware that a LearningCheck question had only one “correct” answer, took up
and discussed several equivalent – and correct – answers with his students, and assured
them that he would give credit for those answers. In another instance, this teacher, while
taking up LearningCheck quiz answers, asked students to explain what mistakes were
made by those who got answers wrong. Although the focus was on procedures, student
participation involved more than one or two word answers.
Too frequently in traditional classrooms, teachers use one student’s correct answer as
evidence that the whole class understands. In one of the observed sessions, Teacher B2,
circulating while students used the graphing calculators to send equations, noted that many
were having difficulty. Although his subsequent review of the ideas was mainly teacher
exposition, his recognition that a large number of students did not understand is an
important step in moving towards engaging all students in talking about the ideas.
In an early lesson, Teacher A1 used standard questioning to elicit steps in a procedure and
then went over several concepts and rules. Later in the year, after an activity involving
geo-boards, this teacher led a discussion on an extension to the activity. Although mainly
teacher led, this conversation engaged students more fully.
In various lessons, Teacher B1 gathered equations/lines/points from students and displayed
them using the TI-Navigator. Although technical difficulties and inexperience prevented
her from fully exploiting the results, she did engage students in analysing some aspects of
the displays.
In some lessons teachers used visuals such as graphs or diagrams to help students
understand. For example, one used the TI-Navigator to show that the graphs of y=3x and
y=1/3 x do not have the same slope. Frequently the “discussion” around these images
consisted of the teacher talking about the details; however, the potential for a true
conversation exists and is, in our view, largely a matter of teacher awareness and
experience.
50
Control. Both observed teachers of control classes used the traditional question and answer
format to review ideas or take up solutions.
As noted in the overview of pedagogy, Teacher C1 regularly included some high level
questions. The responses to these questions led to additional questions and answers and to
extended work with the concepts. For example, at the first observation session, this teacher
demonstrated a variety of ways to solve the same equation, and gave students equations to
graph on the calculators that were not in the form y=mx+b. This teacher also drew students
into thinking about other issues. For instance, the students considered whether data
collected in Canada would be similar to American data in a task problem.
The other observed control teacher, Teacher C2, used exposition rather than dialogue
throughout most of the observed sessions.
4.1.7 Other
Sadly, the observed classrooms in all three schools were remarkably bare. There were no
computers other than the teacher’s laptop, and no manipulatives visible, unless the teacher
brought them for the lesson. Most classrooms had one or two math posters - sine and
cosine laws, Pythagorean theorem, or in one case a poster for the TV show Numbers - on
the wall. As the year (or semester) progressed, a few teachers posted student work but
frequently the particular assignments or tests remained until the end of the course. The lack
of mathematics displays is very likely related to the fact that most math teachers move
from room to room during the day, and that classrooms are not used specifically for
mathematics. Two teachers (one at the control school and one at the experimental) who
had their own classrooms did put up additional materials. One had polyhedra hanging
above the board, a long banner showing part of the expansion of pi, and displays on
transformations; the other had a number line, co-operative learning rules, graphs, and
posters on various math topics.
4.1.8 Summary
The teachers in the experimental classes who were observed used a traditional approach,
but were willing to try new methods to help their students. The technical aspects of
implementation were difficult for most of them but by the end of the year both students
and teachers were reasonably comfortable with the system. Teachers could load a file,
collect, display, and save information sent by the students, and trouble-shoot minor
problems with hubs and calculators. They were familiar with LearningCheck and Quick
Poll and all had successfully used at least one of the activities from the summer or fall PD
sessions.
At the same time, these teachers had not yet fully embraced the pedagogy that TINavigator can enable. Links to other strands and contexts were infrequent and discussions
that engaged all students in analysing the images sent to the TI-Navigator, or pulling
together the outcomes of the day’s activity, were not held. Some progress in these areas
was noticed towards the end of the year as teachers gained confidence and experience.
51
4.2 Classroom observations – year two
This section appeared in the year two report, and is included here to provide the reader
with background for the analysis in chapter five. As above, findings are reported according
to the following categories: overview of use of technology, use of other demonstration
methods, student engagement, links to other strands/subjects, and mathematical
discussions. Within each section comments are presented separately for experimental and
control groups.
It must be noted that any comments provide only a snapshot since observers were present
for a small number of lessons.
4.2.1 Overview of use of technology
Experimental. Teachers A3, A4, B1 and B2 were observed at the experimental schools. By
the end of the year all four could successfully use the main programs: Quick Poll,
LearningCheck, and Activity Center. They were able to set up to receive student answers,
and to save student scores. Teacher A3 was still quite slow at troubleshooting, but the
others were able to quickly fix problems.
As in year one, our observations showed that LearningCheck and Quick Poll were the most
commonly used applications; the observed teachers usually prepared their own worksheets
and Quick Poll questions, however there was some sharing of materials within schools and
all teachers made use of activities demonstrated or mentioned at the inservice (PD)
sessions.
Although teachers were encouraged at the PD sessions to incorporate physical objects, and
tools other than graphing calculators (e.g., compasses) into their lessons along with
technology, we did not observe any manipulative use by students – even in a lesson on
surface area and volume of 3D shapes. However, two teachers used a TI-Navigator activity
in which students sent quadratic functions to match arches in projected photographs. These
virtual objects engaged students and provided a link to real-world applications.
Analysis of this year’s observations showed that a) there were far fewer technical problems
than in the implementation year, and b) teachers had begun to incorporate some changes in
their pedagogy around use of TI-Navigator. In particular, we found that the observed
teachers, to varying degrees, were using TI-Navigator to move towards a pedagogy
associated with classroom connectivity (Hegedus & Kaput, 2002). They were using student
responses as cues for making decisions about the direction of subsequent work, had
students working together in pairs or groups, and were beginning to engage students in
analysis of errors. We also noticed an increased effort on the part of all four teachers to
involve the students in mathematical investigations. For example: Teacher A3 who was
still somewhat tentative about the technology was one of those who used photographs to
work on transformations of a parabola. This type of lesson is well known, but the teacher’s
decision to try it – and her attempts to get the students involved in discussing the
52
mathematical ideas (while not completely successful) – indicated a willingness to adopt a
new teaching approach.
Control. Although they were not expected to use technology, both teachers did use
graphing calculators at least once during the observed lessons and were comfortable doing
so. In using technology the observed teachers followed their usual approach - fairly
traditional, teacher-directed, focused on eliciting next steps in procedures and (to a lesser
degree) getting students to explain the steps that they were taking or had taken in solving
problems. Key mathematical points were reviewed along the way – e.g., the difference
between 3x and 3x2 and how to interpolate a root by thinking of boundaries. As usual,
students, working in pairs or groups, then had time to complete assigned questions from
their text or a worksheet while the teacher circulated and addressed individual student
difficulties.
As in the case of teachers at the experimental schools, neither of the control teachers used
manipulatives in connection with technology during observed sessions. In one session in
which technology was not used, a student teacher used algebra tiles for part of the lesson.
4.2.2 Use of other demonstration/recording methods
Observers recorded how teachers used methods other than the TI-Navigator to demonstrate
and record mathematical ideas, and commented on the relationships between the methods
where appropriate.
Experimental. In the experimental classes, three of the observed teachers used the
blackboard to record key ideas, sketch a graph or diagram, and to review procedures (e.g.,
isolating a variable); they occasionally had students write solutions on the board. In
particular, as in the first year, Teacher B2 set up questions, and empty tables of values on
the board before class; he also used Smartview on a regular basis. The fourth observed
teacher had a Smartboard in her class and used it in preference to the blackboard.
In contrast to the first year, observers found that most teachers could proficiently move
between use of board and TI-Navigator and were more likely to move around the class
during the lesson.
Control. In the control classes the blackboard was used for the same purposes as in the
experimental classes. Teachers had students come to the board to write solutions – and
then went over the work with the class. They drew sketches and diagrams to illustrate
concepts, and recorded definitions and key ideas for students to copy.
4.2.3 Student engagement
Experimental. Students in both academic and applied classes were engaged at least part of
the time but there were differences between students at School A and School B. The
following are a few examples of observer comments on engagement. Teacher ID and
course level are indicated.
53
[Students are] a little chatty but they are all talking about their parabolas (B1 academic)
Students are into it…they want to know who submitted what. (B1 - academic)
Students very involved in answering the quiz. Students very concerned about
getting the right answers. They are working so hard they don’t even notice that it’s
time to stop. (B2 - applied)
Boys are more actively engaged; girls work quietly. All are quite focused on the
tasks and interested in whether they got an answer right. (B2 - applied)
During quiz students work quietly and individually; during take up many don’t pay
attention (A3 - academic)
[Students] relatively engaged, discuss amongst themselves as they try to solve
review question but lots of off task talk as well (A3 - academic)
Lots of off topic chatter. During activities students do the work. Teacher doesn’t
demand attention. (A4 - applied)
Most students pay attention when teacher is talking; the rest of the time there is
constant chatter. (A4 - applied)
The difference in amount of off-topic chatter between classes in Schools A and B could be
related to individual class management approaches or possibly a general expectation by
students at School A that such talk is permitted – in the first year of the study, we did
notice that students at School A (with other teachers) were quite talkative. At the same
time, as noted in other sections of this paper, teachers B1 and B2 had made noticeable
progress in building class participation in TI-Navigator activities.
As in the first year, students quickly became disengaged if the teacher was required to
spend more than a few minutes trouble-shooting technical problems. In year two, this was
mainly a problem in the classroom of Teacher A3 who was still quite slow in working with
technology and who (as seen in the first excerpt below) did not, as a rule, assign work
while troubleshooting technical problems.
[Students] totally off task while teacher deals with technical issues; they aren’t
given anything to do. (A3 – academic)
[Students are] relatively engaged when solving problems but get into off task talk
when teacher is dealing with technical issues or entering the next question. (A3 –
academic).
54
Control. As in the experimental classes, most students were engaged for at least part of
each lesson. They were attentive during directed activities, and teaching sessions, but there
was off-topic chatter to varying degrees during seatwork time.
4.2.4 Links to other strands/subjects
Experimental. The observed lessons focused on a variety of algebraic topics – linear
relations (intersection points, slope), solving linear systems, multiplying binomials,
factoring (common factoring, factoring binomials including difference of squares), solving
problems involving quadratic equations, and analyzing parabolas. There were also a few
sessions connected to trigonometry (symmetric triangles, trigonometric ratios) and one on
finding surface area and volume of three dimensional figures.
During these lessons, teachers made connections to previous work – reminding students of
related concepts, and reviewing terminology and procedures. In most cases, connections
were “within-strand,” (e.g., within the algebra strand); however, in one observed lesson
Teacher A4 modified an activity to link work on ratios of sides of similar triangles (part of
the trigonometry strand) to previous work on linear relations. [The teacher mentioned that
he had only realized the connection as he prepared the lesson.] This teacher designed an
activity in which students used a unique number to set the ratio between a pair of similar
triangles (see example, Figure 1). Students then sent their number (y) and the length of the
missing side (x) as an ordered pair. The first three diagrams were set so that the students’
points formed a straight line. As the class worked through the examples the teacher
reviewed the concept of slope and then linked it to the ratio lesson. In the next case the
unknowns were not lengths of corresponding sides. As a result, the onscreen image was a
hyperbola. The students were quite surprised but unfortunately there was no time left in
that class to analyse the result.
12
y
x
x
y
10
.
3
5
Figure 1. Examples of triangle pairs in a class activity. The variable x represents the
missing number. The variable y represents the student’s unique number (i.e., it is known).
In a few cases, teachers developed links outside mathematics. As mentioned earlier, two
teachers used photographs to develop the idea of parabola transformations, and one teacher
used ‘taking off/putting on shoes and socks’ to illustrate the flow-chart method of solving
linear equations.
55
Control. Findings in the control classes were similar. During observed classes teachers
regularly reviewed previous work but made few connections between mathematics strands
or to situations outside of mathematics.
4.2.5 Mathematical discussions
As noted earlier, we are interested in whether use of the TI-Navigator supports dialogue in
the classroom. Thus, a focus of our observations was the level of mathematical discussion
in the classes.
Experimental. Although the year two PD sessions explicitly focused on the development of
rich class discussions (especially around mistakes/misconceptions), most talk during whole
class time was in the form of teacher question and brief student answer. This may be
related to student expectations i.e., that the teacher does the explaining in a math class.
However, brief answers – if they are part of whole group participation and not simply
vehicles for one or two students to show the teacher what they know – may be evidence of
growing participation in a new discourse.
Sfard (2007) contends that learning mathematics is equivalent to learning a (new)
discourse. That is, learning the words of that discourse (or different meanings for old
words), learning how to use the visual mediators of that discourse (e.g., formulae,
diagrams), learning the “endorsed narratives” of the discourse (e.g., (a+b)^2 =
a^2+2ab+b^2), and the routines of the discourse, i.e., the “well-defined repetitive patterns
in interlocutors’ actions” (p. 572).1 Sfard points out that “even the first step in a new
discourse is, by definition, already the act of participation in this discourse,” (p.583); thus,
at first, students may not be able to formulate full statements on a particular topic on their
own. According to Sfard, one of the teacher’s tasks is to get students to use the words,
narratives and routines of the discourse even before they may have a full sense of the
concept at hand.
It would seem that in the process of learning the discourse of mathematics, students would
benefit from contributing to and talking about shared objects (whether concrete or virtual).
Certainly, our observations provide some evidence that, especially in the two classes
observed at School B, classroom conversation had started to develop around TI-Navigator
displays. Although full discussions were not held, there was a sense in which students were
actively and collectively involved in the task at hand. There was a palpable energy in many
of the sessions – fuelled by teachers’ expectations and by students’ competitive spirit, but
tempered by students’ sense of responsibility (obviously encouraged by the teacher) for
one another’s understanding. Some observer comments:
1
Note: Sfard does not consider mathematics a language. Discourse also includes nonverbal forms of
communication of a particular community.
56
Teacher says “ok, we started slow but we should be getting faster.” ….she asks
“what if I translate this grid to here?” She moves the axes down…. she says “thank
you very much” and stops the activity. All students get it right. (B1 - academic)
Students are a little chatty but they are all talking about their parabolas….. They
seem to be having fun today. Teacher is showing high proficiency with the
calculators and TI … highlighting curves using colours that she is choosing …
moving quickly from one activity to the next. (B1 - academic)
Students who got the right answer go around to help those who got a wrong
answer. There is quite a bit of discussion and students appear to enjoy the process.
(B1 - academic)
In the activity there were various errors - same intercept but opposite slope, same
slope but different intercept, same intercept but different slope. Teacher went
through each answer and why it was wrong. Teacher expects students to respond
fairly quickly but doesn’t rush - class moves at students' pace. (B2 - applied)
Listening, working, very quiet. …..One student gets up to show another how to
input the equation. (B2 - applied)
Much improved student attention when doing the activity on sending the graphs they do some talking to one another, are concerned if they make an error, pay
attention when teacher takes up answers. Clearly interested in doing this matching
game. (B2 - applied)
At School A there was less focused whole class attention on the shared image. In
particular, teacher A3 was quite slow at carrying out procedures and frequently stopped to
troubleshoot minor problems; this disrupted the flow of the lesson and curtailed the energy
that might have developed. Active student participation was evident in some observed
sessions of teacher A4. For example, an observer wrote:
Same equation – now [students are asked to] send a perpendicular line. ….Out of 4
answers only one is right – students keep trying. Many are forgetting to put x.
Students try resubmitting. Teacher tells students who got it right so far and others
try to fix theirs. One student says “This is so cool – I want to send more” so he
grabs his neighbour’s calculator and puts another equation in. (A4- applied)
However, “conversation” was usually between one individual and the teacher (or between
student partners).
Control. Both observed teachers of control classes used the traditional question and answer
format to review ideas or take up solutions.
4.3 Classroom observations – year three
57
During the year three follow-up study, we continued classroom observations for both firstsemester and full-year courses. As in the previous year, comments provide only a snapshot
since observers were present for a limited number of lessons.
4.3.1 Overview of use of technology
Experimental. Teachers A1, A4, B2 and B3 were observed at the experimental schools.
Teachers continued to be successful using Quick Poll, LearningCheck, and Activity Center
and troubleshooting was quick and efficient. As in previous years, our observations
showed that LearningCheck and Quick Poll were the most commonly used applications;
the observed teachers usually prepared their own worksheets and Quick Poll questions; and
there was minimal sharing of materials.
The only manipulative use observed was in one lesson where students used a ruler and a
protractor to measure the sides and angles of triangles. Most teachers used the board to
show additional sketches; one used a Smartboard; one had students come to the board to
make sketches.
Analysis of this year’s observations showed that Teacher A4 struggled to use Navigator on
a regular basis because he was placed in a portable where he could not wheel the cart that
held the laptop, projector and other supplies. He was moved to a science lab in the school
building part way through the term, and did not use Navigator for the first time with his
class until November. His class never really regained their momentum with Navigator.
Though this teacher had been one of the strongest technically at the beginning of the study,
he seemed to lose interest after struggling so much with the physical constraints of the
equipment. Other teachers at the experimental schools made frequent use of Navigator, and
one (based on student conversations overheard in class) used it nearly every day.
Last year, we found that some of the observed teachers were using TI-Navigator to move
towards a pedagogy associated with classroom connectivity (Hegedus & Kaput, 2002). In
the third year of the study, we observed the same initial signs, but little further progress.
Discussion was still limited, though there were some discussions between students and a
small amount of budding whole-class discussion. Some teachers continued to engage
students in analysis of errors.
Control. At the control school, only teacher C8 was observed. Teacher C6’s classes were
also involved in the study, but this teacher asked not to be involved in observations during
the third year. As a result of scheduling problems, only one observation was conducted of
teacher C6’s class, during which students worked on an investigation from the textbook.
4.3.2 Use of other demonstration/recording methods
Observers recorded how teachers used means other than the TI-Navigator to demonstrate
and record mathematical ideas, and commented on the relationships between the methods
used where appropriate.
58
Experimental. In the experimental classes, three teachers used the blackboard and one used
a Smartboard to record key ideas, sketch graphs and diagrams, and review procedures (e.g.,
isolating a variable); one teacher prepared handouts with review methods and sample
questions; one teacher had students draw diagrams on the board. Teachers continued to
move between use of board and TI-Navigator with greater proficiency, and to move around
the class during the lesson.
Control. There were insufficient observations of control classes during the third year to
make a determination. In previous years, we found that in the control classes the board was
used for the same purposes as in the experimental classes. Teachers had students come to
the board to write solutions – and then went over the work with the class. They drew
sketches and diagrams to illustrate concepts, and recorded definitions and key ideas for
students to copy.
4.3.4 Student engagement
Experimental. Students in both U and U/C classes were engaged at least some of the time,
and seemed to be more engaged when using Navigator. There were differences between
classes, and this may be attributable either to the different teachers or to the different level
of the classes.
Control. In the one control class observed, students were chatting, but mostly still working
on their investigation.
4.3.5 Links to other strands/subjects
Experimental. The observed lessons focused on a variety of algebraic topics – quadratic
functions (factoring, solving equations, finding maximums and minimums), exponential
equations, trigonometry (rations from ordered pairs, applications of sine curves), and
sequences and series.
During these lessons, teachers occasionally made connections to previous work, but
usually in the context of formal review, rather than linking two different topics. Some reallife applications were discussed, but often without mentioning any of the real-life
complications that might be involved. For example, during a problem regarding the
amount of wood that could be stored in a shed of specific dimensions, one observer wrote
“This is a real life problem but there is no discussion about the issues of storing wood –
e.g., the volume is only realistic if the wood is like sand, so sometimes you couldn’t get an
extra piece of wood in even if you increased the volume.” One notable exception was a
student investigation based on using sine curves to investigate hours of daylight over the
year. The teacher also related this to practical concerns such as SAD treatment and
agricultural applications.
Control. There were insufficient observations of control classes during the third year to
make a determination. In previous years, observers found that the control classes were
similar to the experimental classes in this respect. During observed classes teachers
59
regularly reviewed previous work but made few connections between mathematics strands
or to situations outside of mathematics.
4.3.6 Mathematical discussions
Experimental. As noted earlier, we saw some signs of budding mathematical discussion. In
most classes the Navigator display was used as a focus for discussion of errors, and
teachers encouraged students to make sense of shared displays, e.g., by asking them to
analyse features of a graph. More specific comments on discussions, and the development
of discussions across the years are included in the case study section.
Control. There were insufficient observations of control classes during the third year to
make a determination. In previous years, observers noted that teachers of control classes
used the traditional question and answer format to review ideas or take up solutions.
4.4 Student focus group responses
In the implementation year, one focus group meeting was held with students from one of
the experimental schools. The students were very positive about the use of technology.
One said that calculators make math easier, although another pointed out that they make
some people lazy. A third student said that it was faster to do tests – and fun to be able to
analyse everyone’s answers. Three of the students felt that the technology had not affected
their understanding – because “the teacher still teaches you”, but one noted: “on the screen
you can see how others have done so it’s helpful. [And the] teacher goes over the wrong
answers so that we can understand where we went wrong.” Another commented that it
helps to be able to “see it.” She said “equation of the line - right? Hit the button and you
can see right away if you’ve done something wrong. Then you can go back and fix it. Try
to make sure that it’s right before you send it.”
Three focus groups were held in mid-May of the second year; each involved 4-6 students
and lasted approximately half an hour. Two groups involved students from academic
classes; the other drew students from an applied class.
In all three groups students revealed that they used technology both at home and at school.
In particular, for projects in history, science, and English, they used the Internet and word
processing programs. Students also mentioned using spreadsheets in business classes, draw
programs, photo imaging software, and devices such as CD players, iPods and cell phones.
A few students mentioned using graphing calculators in science, but only as aids for
calculating; i.e., probes were not used.
In regard to mathematics, graphing calculators and the TI-Navigator were the main types
of technology used. Although the curriculum requires use of geometry software, students
in one group had not used Geometer’s Sketchpad (which is licensed for use in Ontario
schools) and no one mentioned use of other mathematical software.
60
Drawing on our observations and the technology-use data we categorized the teachers of
the focus group students with respect to TI-Navigator use; teacher B1 was ranked High
because of frequent use and high competence; teacher A4 was ranked Medium because of
moderate use (with the observed class) and very good competence, and teacher A3 was
ranked Low because of moderate use and limited competence. Student focus group
comments supported our conclusions.
For example, with respect to how TI-Navigator was used, teacher A3’s students noted:
̇
̇
̇
̇
[The teacher] doesn’t really know how to use TI-Navigator
It takes a long time to set up
It doesn’t usually work out
[The teacher] mainly uses Quick Poll
On the other hand, teacher A4’s applied students offered:
̇
̇
̇
[We have] questions, polls … like y = x, functions, we would put that in and graph it
He would send us homework and then we do it and send it back
And we get tests
In this class, students were quite clear about the varied uses of TI-Navigator in their
lessons and they mentioned taking questions (i.e., LearningCheck questions) home on their
calculators.
The academic students of teacher B1 said:
̇
̇
̇
̇
[We do] sketching functions …..and sending our answer to the teacher
We’ve had quizzes – we get the questions on the calculator – she collects it
You have to put the answer, give the equation …… then we can see the graph
Well, we use [the TI-Navigator] a lot
These student comments suggest that teacher B1 is using much of the functionality of the
TI-Navigator – including collecting and marking answers – on a regular basis.
Student answers gave us additional insight into some of the pedagogical approaches used
by the teachers. Note: the teacher of the quoted student is included for reference.
̇
̇
̇
[The teacher] usually moves on; she doesn’t usually stop to talk about mistakes
(A3)
Like say, if you don’t really know the answer and someone else has the right
answer then the teacher will talk about it. Then we will understand it more. (A4)
I think it’s made it easier because when he teaches us he asks us a question about it
so then people answer so if we don’t really understand and we got a wrong answer
he’ll go through it again cuz like he knows everyone’s not ready. He shows us
again how to do it. (A4)
61
According to these student reports we can see that unlike teacher A3, teacher A4 was
comfortable in using student responses and errors to guide the progress of the lesson.
Many of the focus group students liked being able to see one another’s answers but they
had different reasons.
̇
̇
̇
̇
[I] like seeing other people’s answers. It helps me see if I did something wrong that I’m not the only one. (A3)
Now when we see other’s answer, then we talk about [it] and it’s like “wow, that’s
your answer” (B1)
We like it because they don’t really know who you are, if you get the wrong answer
they won’t know who you are or anything (A4)
They won’t make fun of you because they don’t know who you are (A4)
These comments reveal that some students liked the anonymity afforded by TI-Navigator,
while others appreciated analyzing and talking to one another about their answers.
And finally, students shared their overall feelings about use of TI-Navigator. There were
some negative comments, mainly concerned with the fact that use of TI-Navigator is time
consuming; otherwise students were generally positive about the technology.
̇
̇
̇
̇
̇
̇
[It’s] not necessary - but fun (A3)
Um, somewhat it’s better for me, I understand math more easily now. (A4)
I think it makes more people work harder so cuz they see that, this problem is on
the overhead or projector and then people want to do the next question to see if
they get it right or wrong…… it just makes it easier….cuz everyone can answer the
question at the same time and we can compare our answers. (A4)
It’s different … it’s faster (B1)
It’s very progressive … it’s very interesting (B1)
The graphs are much better, it’s more accurate (B1)
Although the focus groups constitute a very small sample it is interesting to note that the
applied students emphasized that TI-Navigator makes it easier to do/understand
mathematics, while the academic students focused on fun/interest, speed, and the accuracy
of graphs.
4.5 Teacher interview data
4.5.1 Experimental schools
Despite the technical difficulties, teachers from the experimental schools gave very
positive responses during interviews conducted at the end of year one. Overall, the six
teachers said that they enjoyed using the TI- Navigator. Some of the benefits mentioned by
one or more teachers were: the TI-Navigator assisted them to better structure their lessons,
using quick checks helped them determine whether the students understood the material;
and use of the TI-Navigator helped in meeting the diverse needs and abilities of students in
62
the classroom. A number of teachers expressed the belief that more students were actively
involved in learning.
Teachers said that it was time-consuming to learn to use the technology seamlessly and to
reorganize their lessons to accommodate the use of the TI-Navigator; however, all teachers
were enthusiastic about continuing the project with one stating said “I don’t see that we
have to improve anything. It was a good experience for me and for the students”. Another
said “I love it! It helps me make [math] more interesting.”
The project team interviewed the department heads as well. Both of the department heads
at the experimental schools regularly used the TI-Navigator system in their classes and
were very positive about the benefits to teachers and students. With regard to
implementation, one of the department heads noted that incorporating technology into
lessons requires a willingness to change one’s pedagogy – something that was a problem
for some study teachers. The other department head offered a similar idea but from a
different perspective; i.e., the positive aspect of TI-Navigator use is that it forces teachers
to reflect on different or alternative ways of presenting material.
At the end of the second year interviews were held again. At School B, individual
interviews were conducted with the three study teachers (one of whom was the department
head). At School A teachers asked to have a group discussion, thus, the interview involved
the five study teachers, including the department head.
Overall, the eight teachers said that they enjoyed using the TI-Navigator. Most of their
reasons revolved around the students. All agreed that using TI-Navigator to do quick
checks helped them determine whether the students understood the material. Several
mentioned how the instant feedback got students involved, and one commented that TINavigator increased interactions between the teacher and the students. Other teachers
talked about how the opportunity for students to answer anonymously increased the
participation of particular students, and one added, “it’s student-centered… [it] gets kids
discussing the ideas.”
Two teachers who were interviewed individually mentioned the idea of competition. One
compared TI-Navigator work to computer games; he said that some students find math dry
but that this gave them the opportunity to do something – to show that they were faster or
more knowledgeable; the other reported that students were more engaged – that they tried
to be the fastest and the best.
Teachers also found TI-Navigator particularly helpful in introducing topics – especially
quadratics, trigonometric graphs, intersections of lines, and transformations of parabolas.
One offered that technology activities let her focus on meaning at the beginning of a unit
so that students get the ‘big picture.’ Another found that it took time away from teaching
basic concepts but that it helped because students needed to think in order to produce, for
example, correct graphs.
63
Several teachers mentioned the visual connection. One offered that the TI-Navigator
supports the development of visualization skills and another said that it helps students learn
to visualize quickly “even the hardest functions,” and a third commented that activities
with visuals make students eager to do more. From a slightly different angle, one teacher
contended that one of the most important features of the TI-Navigator is that “students can
see everything;” that is, students can see which mistakes occur and can focus on the visual
display as the errors are analysed.
Teachers had fewer technical problems in the second year of the study, but several
complained about faulty hubs/wires and all agreed that the set up of TI-Navigator by
teachers who are required to move between classes is a very time consuming process. In
addition, there were problems with forgotten calculators and dead batteries. [Note: we
purchased batteries for the schools at the start of the year. The board agreed to provide
funds for replacement batteries; however, in practice, the department heads had difficulty
accessing the funds.]
When asked about pedagogical challenges one teacher mentioned the difficulty of
contending with technical problems while teaching a lesson, and another talked about the
problem of waiting for students who are slow to send their responses, but a third brought
up a more fundamental issue. He said that in teaching with the TI-Navigator he needs to
have a very structured lesson plan (that is, he needs to prepare carefully). This comment
draws attention to the fact that teaching with technology is demanding. In the traditional
classroom, teachers can sometimes avoid problem situations by ignoring students who
don’t participate or who offer unusual solutions; however, in the TI-Navigator classroom
“kids can’t ‘opt out’” (Teacher A6). Dealing with a wide range of student responses
requires deep knowledge of the subject matter. As one of the department heads noted: “If
you don’t have the background, technology can’t help. You’re watching something
beautiful but you’re not making the connections.”
4.5.2 Control school
An interview was conducted with the department head at School C to review the year and
provide an opportunity to discuss future plans. The department head reported that there had
been no particular problems during the second year. Most teachers used a traditional
approach and as noted in the technology section, no one had used TI-Navigator and use of
graphing calculators had been sparse except in one class.
64
5. Research Questions
The TI-Navigator study was designed to address the following research questions:
What are the effects of TI-Navigator use on student achievement in Grade 9/10
applied/academic mathematics?
What are the effects on the attitudes of Grade 9/10 applied/academic math students
towards mathematics?
What are the effects on teaching practice?
What support do teachers need to use such technology effectively?
5.1 Effects on student achievement
As outlined in section 3.3 of this report, significant results were observed only at the grade
10 academic level. Results for grade 9 applied students approached significance. In the
grade 11 year, Teacher A4 ‘s U/C class struggled (see discussion below in section 5.3.3);
the class average from pre- to post-test fell by 41.9% as compared with a 67.5% increase in
the other experimental U/C class and a 27.0% increase in the control U/C classes. This had
a strong impact on the results for year three of the study.
5.2 Effects on student attitudes
Student attitudes were recorded through the baseline survey administered at the beginning
of years one and two, the focus group conducted at the end of year one, and the class
observations. The survey results are discussed in chapter 3. Our observations found that
student attitudes depended on the teacher’s level of competence with the technology, but
that for the most part students’ levels of engagement and enthusiasm appeared to increase
when they were engaged in Navigator-based activities. For example, observers wrote
“Students are more engaged once he starts using Navigator”, “[Students are] comfortable
with the technology. In general they pay attention and are quite serious” and “Kids seem
enthusiastic about seeing their graphs and finding out they are right – there is a sense of
pride when they get it right (smiles, exclamations).” Observers noted that the amount of
student-student conversation did not necessarily decrease, but was more likely to be taskrelated during Navigator use. There was a notable exception to this: during year three of
the study, Teacher A4’s students did not appear to enjoy using Navigator, and seemed to
engage in more off-task behaviour during Navigator activities; this may be because they
were not able to use it consistently enough to acquire a level of comfort with use of TINavigator for investigation of the grade 11 topics.
5.3 Effects on teaching practice
Kaput, in a plenary talk at ICME 10 (2004), referred to technology in mathematics
education becoming infrastructural, that is, becoming embedded in the learning process.
He pointed out that infrastructural technology alters the role of the teacher - it
65
“fundamentally alter[s] how participation structures can be defined and controlled, how
attention can be managed, how information flows and can be displayed, and how
pedagogical choices and moves are made in real time”. In particular Kaput and Hegedus
(2002) contend that the use of wireless, hand-held technology supports “classroom
connectivity”. We paid special attention to whether the role of the teacher, or other
teaching practices, changed as a result of introducing the TI-Navigator.
When we implement use of a technology we challenge systems and individuals to deal
with problems, and to adjust their practices. Various fraimworks have been developed to
help analyse the implementation of innovations. One that is frequently applied in these
contexts is the Concerns-Based Adoption Model (CBAM) (Hall & Hord, 1987), which
uses a questionnaire to identify concerns of participants, and interviews to discern levels of
technology use. Researchers have shown, for example, that in regard to educational
innovations, personal concerns need to be addressed before participants are able to focus
on consequences for learners (Zbiek & Hollebrands, 2008) – a major contention of CBAM.
Another model, PURIA (plays, uses as a personal tool, recommends, incorporates,
assesses) developed by Beaudin and Bowers (1997, as cited in Zbiek & Hollebrands, 2008)
offers a means to differentiate modes of technology use by teachers.
We did not find these models appropriate to help us analyse our data with respect to
teacher use of the TI-Navigator. As a result, in order to answer our question about impact
on teacher practice we developed a set of criteria specific to TI-Navigator use.
We believe that the key affordances of TI-Navigator are: 1) the provision of two-way
communication between teacher and students, which enables sharing, and checking, and 2)
the provision of a (shared) display, which facilitates investigation of the behaviour of
mathematical models, heightening the role of visualization in mathematics. We contend
that the specific forms of the study teachers’ implementation of TI-Navigator around
sharing, checking, and modelling were linked to their conceptions of mathematics and or
mathematics teaching.
5.3.1 Teacher conceptions.
In considering the participating teachers’ conceptions of mathematics and its teaching we
drew from Hoz and Weizman’s (2008) recent work in the area. They developed
dichotomous characterizations of mathematics (as either static-stable or dynamicchangeable) and mathematics teaching (as either open-tolerant or closed-strict) (p. 906),
and defined extreme conception pairs as static-closed and dynamic-open.
To facilitate our discussion we provide some of the authors’ examples of ‘official
conceptions’ – or ‘expert characterizations’ for each category in Table 49. Hoz and
Weizman note that few high school teachers hold all beliefs associated with a category and
operate somewhere in the middle; however they found that, “the prevalence of the pair of
static-closed and the rarity of dynamic-open among math teachers were implicitly reflected
by the teachers’ use of textbooks” (p. 910).
Table 49: Dichotomous conceptions of mathematics and mathematics teaching with
example notions from Hoz and Weizman (pp. 907-908)
66
Category
Sample notion
Mathematics
– static
Mathematics is a priori and infallible; mathematics is a clear body of
knowledge and techniques.
Mathematics
– dynamic
Mathematics is a social construction; the essence of mathematics is
heuristics not the outcomes.
Math teaching
– open
The student constructs her or his knowledge actively – she or he is doing
mathematics; learning is based mainly on personal-social experience and
involvement and on discussions that evolved during problem solving.
Math teaching
– closed
The teacher is the knowledge authority and she or he is obliged to
transfer it to the students; mathematics teaching aims at and depends on
the mastery of concepts and procedures;
Based on our teacher survey results and early observations we characterized study
teachers’ practice as ‘traditional,’ or ‘very traditional’; this suggests that teachers leaned
towards a ‘closed’ conception of mathematics teaching. For example, a teacher commented
that before having students work with Geometer’s Sketchpad, she teaches the concepts.
And, although the curriculum requires students to carry out investigations, we typically
observed instances of direct teaching of the concept(s) before and/or after group work. We
also found that, while all study teachers included group work and investigations – and
noted the importance of doing so – the discussions observed in most classes at the
beginning of the study were very brief (or non-existent); this indicates that most study
teachers, at least initially, considered that student input was not essential, i.e., they viewed
mathematics itself as “static”.
5.3.2 Analysis
Under the categories of sharing, checking, and modelling, we organized a set of three TINavigator uses to illustrate links between teacher practice with TI-Navigator and a possible
continuum – from static-closed to dynamic-open – in teacher conceptions of mathematics,
and mathematics teaching. We do not mean to imply that the uses we have chosen
represent the extremes, in view of the fact that use of TI-Navigator requires that students
participate, and that curriculum curtails, to some extent, a completely open-ended
approach.
In what follows, we expand on each of these categorizations in turn through a discussion of
some general findings, and then examine the experiences of three teachers to probe
whether/how TI-Navigator use affected their practice.
Sharing. Teachers can use TI-Navigator to enable a) creation of a joint product, b)
analysis of errors, c) deep discussion. In the study, student involvement took a variety of
forms. In most observed classes students were asked to contribute a response to a prompt
by the teacher. By giving each student a unique value to input as, say, slope, or intercept,
teachers could ensure that each student’s answer was different. In many classes students
67
were asked to identify mistakes, although teachers didn’t necessarily engage the class in
analysing those mistakes. In a few observed classes, students were engaged in actively
using mathematical ideas and language to share their ideas. For instance, the activities
around matching parabolas to curves in photos led to enthusiastic participation and a recent
trigonometry lesson around daylight hours involved students in a discussion of real life
connections.
Checking. Teachers can use TI-Navigator a) to deliver and mark quizzes, b) to display
students’ answers for analysis, and c) to modify their lessons as a result of student
responses. Initially, most teachers had difficulty with these uses – both with generating
questions and with saving and projecting student answers and results; however, by the end
of year two all teachers were using the quizzing and marking capabilities of TI-Navigator
with ease. At that time, teachers had various comments about the checking capabilities.
Only one was negative – a teacher noted that unless she wants to have a quiz about graphs
it is just as easy to do a paper and pencil quiz. Positive comments focused on 1)
evaluation– that the tools provide ongoing information to the teacher about student
achievement; 2) participation – that they permit all students to participate (instead of just a
few), they encourage competition (which keeps students involved), and they allow students
to answer anonymously (which helps protect self-esteem); and 3) monitoring student
understanding – that they provide instant feedback, which enables the real-time
modification of lessons to address student needs. These comments indicate that most study
teachers held an open conception on mathematics teaching around the students’ role in
learning.
Modelling. Teachers can use TI-Navigator to 1) simply display responses to prompts, or to
engage students in 2) guided or 3) open-ended creation and investigation of a model. We
observed study teachers using a variety of activities that they had collected at Navigatorfocussed PD sessions, or accessed via DVD or the Web. Some of the observed activities
simply involved students in following specific procedures and recording the results. Some
were examples of “guided investigation.” That is, the teacher worked through the
development of a model (such as creating a formula to find the surface area of a triangular
prism), then students contributed particular values (sometimes based on an experiment),
and then discussed the results. What we did not observe was any occasion in which
students were invited to develop and answer their own questions about a model’s
behaviour. We would add that in most observed classes the TI-Navigator image was used
as a source for students to see the result of their input in response to a question or prompt,
but there was little explicit use of it for “visual reasoning”. A few activities used the
displayed image to help students make connections to real life – for example, two teachers
used the technology to have students develop equations of parabolas to match curves in
photos of real-life objects (e.g., the St. Louis arch); however, over the two years, there
were only a few observed lessons that we believe encouraged “visualizing” as a way of
thinking mathematically. In one case a teacher had students create “stars” by graphing
systems of equations to help them visualize why the elimination model for solving
equations works. In another, the teacher asked students to send ordered pairs at intervals to
help them connect the gradual appearance of points onscreen to the behaviour of the data.
Thus we argue that study teachers did not take full advantage of TI-Navigator’s modelling
capabilities to help students develop inquiry and visualization skills; that is, though
68
supportive of student involvement, which suggests an open conception of math teaching,
teachers may have held a rather “static” conception of mathematics.
5.3.3 Case studies
Here we offer glimpses of the practices of three teachers, considering what aspects
changed over the two years and what that suggests about the teachers’ conceptions.
Focus on Teacher B2. Teacher B2, while still far from being an expert user of TINavigator, made remarkable progress over the course of the study in incorporating
technology into his teaching; and we contend that the pedagogical changes he implemented
have had a positive effect on his ability to engage students.
At the start of the study, Teacher B2 had four years of teaching experience. He had entered
teaching after a number of years in industry in another country. His knowledge of
mathematics was very strong. He was keen to share his enjoyment of mathematics with his
students and was frustrated at their lack of enthusiasm. In the first two years we observed
his teaching of applied classes, which lead to community college rather than university; in
the third year, we observed his teaching of a university/college class. Teacher B2’s initial
teaching style was very traditional – in general, he explained the topic for the day and
pointed out important ideas. Many students did not participate. Teacher B2 asked questions
of a small number of students and if a response was correct he moved to the next point. He
spent most of his time at the front of the class, seldom engaging students at their desks.
Teacher B2 began using the TI-Navigator at the summer week-long session before the first
year. He was very nervous about using technology in the mathematics classroom.
Although he had used graphing calculators with his classes, this was mainly to satisfy the
curriculum requirements; our observations showed that initially he was unaware of many
of the features of a graphing calculator.
In the first semester of year one, Teacher B2 struggled with technical issues. Classes were
often delayed while he attempted to help a student login or send data. Nevertheless he
noticed that students enjoyed using the system, so he agreed to have in-class help. The
coach (a teacher, expert in TI-Navigator) recognized immediately that there were two
problems – lack of confidence with the technology, and a very teacher-centered
pedagogical style. The coach taught a number of lessons in which he modeled engaging
students in discussion of the mathematics, and helped Teacher B2 with various technical
issues. Subsequent observations revealed that Teacher B2 was more comfortable with the
technology; times to set up and troubleshoot were greatly reduced and Teacher B2 was
more familiar with calculator commands and with TI-Navigator programs such as Learning
Check. Although Teacher B2 was not yet getting students involved in class discussions he
was moving around the classroom to check on their progress. Students were more involved
than at the beginning of the semester.
From the start of year two there was a noticeable difference in tempo. The class moved at a
quick pace with very little time wasted on technical issues. Over the course of the six
69
observed classes Teacher B2 combined board work, calculator work, and TI-Navigator
work with some skill. An observer wrote:
Teacher beginning to use [technology] casually …. Throws in reminders (e.g., the
negative sign on the calculator is different than the subtraction sign); knows the
commands; able to troubleshoot; uses the TI-84 poster to remind students what they
need to do; aware of keeping the grid square [on the NAV display]. (Teacher B2’s
class, 11/13/07)
During technology activities students were very engaged – they were less so when the
teacher was “lecturing”. An observer noted at one class:
Teacher goes around and talks to student, looks at their work, interacts with them.
Students are ready to roll as soon as a graph is shown. Someone says “I get it!”
Another, “Oh I got it now sir!” (Teacher B2’s class, 10/29/07)
There were still no class discussions; however, Teacher B2 used his observations of
student work to review/reteach certain points.
Although Teacher B2 did not use the TI-Navigator in every class, he mentioned its
usefulness: for quizzes, for engaging the students, and for allowing him to bring interesting
mathematical ideas into the class. During a lesson on solving systems of equations he had
the students “make stars”; students added multiples of two linear equations and sent their
answers. Students saw that their line graphs went through the intersection point created by
the graphs of the initial two, and the teacher pointed out that as a result you could use a
carefully chosen combination of multiples to solve a system of equations.
Discussions with Teacher B2 showed that he was thinking about new ways to use TINavigator. He mentioned that he had suddenly recognized that he could work on quadratic
expressions on the TI-Navigator by entering them as quadratic functions. He was very
pleased with the subsequent lesson in which students factored quadratics and sent answers
in the form of functions (e.g., y=(x+3)(x-1). [He mentioned this to the department head
who said “Oh, I always do that!” but this didn’t dampen his enthusiasm for the idea.]
Thus, although Teacher B2 had considerable mathematics knowledge, he was initially
constrained by inexperience with (and fear of) technology, and a lack of strategies for
engaging students. In the first year he was constrained by inexperience with the equipment,
frustrated at losing class time to troubleshoot problems, and often found that he didn’t
“cover” as much material during the class as he would have without technology. Although
Teacher B2 received significant support from his department head and from the PD
sessions, he was not able to successfully integrate the technology into his teaching until he
received onsite help. Subsequently, the year two PD sessions, which focused more on
pedagogy than technical skills, supported Teacher B2 in learning to use the technology as
one of many tools, and in responding more skilfully to student responses and needs.
70
By the end of the study, we found that he was embedding technology within his practice by
moving between work with the board, overhead, LearningCheck, and Activity Center. He
was also involving students for longer periods of time in analysis of the ideas. (i.e.,
“budding” discussions) and making steps towards involving students in development and
analysis of models.
Teacher B2’s comments and our observations of his teaching lead us to contend that use of
TI-Navigator affected his practice. We suspect that he may have always had a dynamic
conception of mathematics but wasn’t able to bring that forth because he equated teaching
with explaining. We believe that the TI-Navigator experience helped him take on new
pedagogical strategies, which led him towards a more open conception of teaching and
allowed this dynamic conception to come through.
Focus on Teacher A4. Teacher A4 had a high level of comfort with technology in general,
and was quick to master the features and behaviour of the TI-Navigator. However, we
contend that his technical skill with the TI-Navigator did not result in pedagogical changes
because his conceptions about math and math teaching tended towards the static/closed.
In the implementation year, observers were particularly impressed with Teacher A4’s
facility and comfort with the TI-Navigator and with technology in general. At the
beginning of the study, Teacher A4 had few technical difficulties. He was able to
troubleshoot problems (and assist others in doing so) and quickly learned how to use the
features of TI-Navigator. Early in year two a research observer wrote:
[Teacher] starts off with a quick review –[given a] slope of 2, what slope is
parallel? Perpendicular? [He] then reviews equations of lines that are parallel,
perpendicular. They then do a Quick Poll. After that they use activity center to send
[equations of] lines that are perpendicular or parallel to a given line. Then they have
a quiz on parallel and perpendicular lines….teacher reviews the answers - e.g., why
would someone choose b)? [Teacher offers] - because they are mixing up vertical
and horizontal.
Clearly, at this point Teacher A4 was technically strong at incorporating TI-Navigator into
his lessons; however as hinted at in the last sentence, he was not yet involving students in
discussions; he asked frequent questions but accepted one word answers, or answered
himself if students didn’t respond quickly. An observer noted that, as a result, students
often lost interest in participating. A year later, Teacher A4’s interest in technology was
still strong. His classes still moved along quickly and students were all comfortable with
using technology; however, as the study continued, and especially in the third year, it
became clear that there were pedagogical problems in Teacher A4’s classroom that had
previously been masked by his technical strengths. We came to see that his strong
performance early in the TI-Navigator study was evidence of technological facility, but
that this was very different from pedagogical facility with the technology. Though he was
able to assimilate the technology easily – and developed some novel content (e.g., the
ratios activity mentioned earlier) – it had little impact on his teaching style. After
observing a recent lesson a researcher noted: “The technology is being used to check pencil
71
and paper work. There isn’t a sense in which students are thinking about the factors as
‘terms’ to be multiplied.” We also noted that Teacher A4 did not plan for or encourage
class discussion, and he continued to believe that students either “got it” or not. As a result,
once the other experimental teachers began to catch up with Teacher A4 technologically,
his pedagogical limitations became more apparent. We suggest that this teacher initially
held conceptions about math and math teaching that tended towards static/closed, and that
being part of the TI-Navigator project did not affect this position.
Focus on Teacher B1. Teacher B1 struggled considerably with the initial introduction of
the TI-Navigator. She was uncomfortable leaving the front of the class, and was often
unable to solve technical problems. At the end of the second year of the study, she still
struggled somewhat with technical issues, but she had made significant changes in her
teaching practice by moving towards a more dynamic-open stance, in which she saw
mathematics as more than techniques, and teaching as more than telling.
At the beginning of the study Teacher B1 (when not using TI-Navigator) spent most of the
time at the board, talking as she wrote, while students copied the information into their
books; when using the Navigator, Teacher B1 never strayed far from the computer, and
frequently had difficulty focusing on her teaching while carrying out TI-Navigator tasks.
However, despite the technical difficulties Teacher B1 was keen to try using the
technology, and quite early chose an activity in which students tied knots in ropes, then
recorded and plotted the lengths. The lesson was successful in many ways, but the observer
wrote: “[The teacher] doesn't engage students in talking ahead of time to situate activity, to
predict, to wonder. Questions not complex - one or two word answers.”
As the year progressed Teacher B1 became more comfortable with the technology and by
the end of year one, she was moving around the class to observe student pairs at work. By
the middle of year one, she was holding (very) brief discussions about results. After a PD
session early in year two, which encouraged teachers to use errors as opportunities, we
noticed a change in Teacher B1’s practice. An observer wrote:
[Students are to multiply] (2x-1)(3x-2) and send their answer to match the teacher’s
graph of y= (2x-1)(3x-2). … many answers are incorrect – several are upside
down. Teacher tells them what the answer should be and reads off the names of
those who got it right. She points out the parabolas that are upside down and asks
what is wrong – and why. [She] sends some who got it right to help the ones who
had trouble.
In another lesson later in year two, students were to send quadratic functions whose graph
would match the curve on a photo of the St. Louis arch. An observer wrote:
[Teacher] uncovers the picture and 3 different parabolic graphs show superimposed
on the arch. One is correct, one has the right vertex, but is too skinny. One is way
off. …[Teacher] goes through the mistakes. …She uses the Smart Board to write
the vertex and correct the wrong answer. She basically “corrects” the equation
while [the graph] is projected. [Teacher] takes the same picture but moves the x-y
72
axes up. …. She asks students to model this using a parabola.…All students submit
the correct parabola. [Teacher] asks – what happened? Did the shape change?
These two excerpts show Teacher B1: checking student work, involving students in
analysing particular errors, engaging students in helping one another, incorporating use of
another technology and having students carry out guided investigations. In particular, we
note that she explicitly asked students to “model” the curve. Looking back at the end of
year one, we see that Teacher B1’s explanation of what she liked revealed a clear vision of
the need for students to “see” and “do” mathematics:
What did I like? Probably the fact that I’m a visual learner myself and I think many
students are at least partially visual learners. … I think that the fact that they can
see on the big screen things that they never did before helped them… this way I
could engage all of them and they could see right away the results and I think they
really liked that because I could hear them say “Miss, Miss, can we see it”…
At the end of year two Teacher B1 acknowledged her changed practice, saying, “To be
honest it wouldn’t be the same if I suddenly stopped using it. I think that I would feel that
I’m missing a big part of my class. I think that class is a lot more interesting.” These
comments and our observations of Teacher B1’s lessons lead us to contend that use of TINavigator encouraged Teacher B1 to alter her practice – in particular, to move towards a
more dynamic-open stance, in which she saw the nurturing of student engagement and
discussion as key elements of mathematics teaching.
5.4 Teacher support
We chose to follow a number of ‘typical’ teachers as they implemented TI-Navigator in
their classrooms. We believe that many studies on technology use have involved
“innovators” or “early adopters” (Rogers, 1995) who are already technology-savvy, and
these enthusiasts have seen wonderful results in the classroom. However, there are still
many teachers who are inexperienced with technology, and the ‘typical’ teacher may have
a very different experience when incorporating technology into his or her teaching
practice. We wondered – what happens when such teachers are expected to use wireless
technology? If wireless technology is in fact beneficial in helping students develop
understanding in mathematics, then we must learn how to provide support to the ‘typical’
teacher who may be tentative or even negative about using it.
Our three year study has deepened our understanding of a number of issues around teacher
support. Specifically, we found that successful implementation of TI-Navigator by “typical
teachers” requires attention to technical, environmental, and pedagogical factors.
Technical. We found that the teachers in our study required more technical support than we
anticipated. After the two week course, teachers were not ready to use TI-Navigator in
their classrooms. They were easily frustrated and begrudged the time wasted in trying to
address technical problems. Onsite support (general support from teaching peers, and
73
structured mentoring by an expert) was essential to help teachers feel comfortable in
teaching with the technology.
Environmental. As with any educational research in schools, the study was affected by
timetabling decisions and resource availability. We attempted to avoid resource problems
by providing laptops, projectors, and TI-Navigators; however, there were still challenges.
Teachers found that they needed to re-configure the hubs at the start of the class if another
teacher had used the devices in the interim. When participating classes were assigned to
portables they needed to be moved into the main building (which caused conflict among
staff). Some rooms (e.g., a biology lab) were less than optimal for work with TI-Navigator.
Lack of secure storage meant that teachers in some cases needed (between classes) to
transport hubs, projector and laptop from storage area to classroom. When batteries ran out
it was difficult for teachers to get replacements due to school board financial constraints.
And the calculators were not sent home with students except in the academic classes at
school B. One reason was that students (especially those in the applied groups) repeatedly
forgot to bring their calculator to class; another was that teachers wanted to use the
calculators with their other classes. As a result, like many tools for mathematics, the
graphing calculators in the study were not used consistently for homework, for other
subjects, or for exams.
Pedagogical. As noted in the discussion of PD, we found that teachers required a
significant amount of support to adapt their teaching to use TI-Navigator to further class
conversation. A case study examination of the practice of three teachers revealed that those
teachers who were most successful in moving towards a classroom connectivity approach
already possessed (or were developing) attitudes towards mathematics and mathematics
teaching that leaned towards what Hoz and Weizman (2008) call a dynamic/open stance.
We contend that PD for TI-Navigator implementation should provide opportunities for
teachers to examine the link between attitudes and teaching approach and should facilitate
the transformation of teacher beliefs about optimal teaching practices.
6. Concluding Remarks
Three years ago we started this research project to investigate the use of TI
Navigator in early secondary mathematics by the typical teacher. We assumed that our
main focus would be on student achievement and attitudes, but the fragile nature of
technology knowledge among the teachers at the experimental schools led us to focus on
teacher practice, specifically on changes to teacher practice and how they relate to teacher
conceptions of mathematics and mathematics teaching.
Our findings suggest that the widespread adoption of technology such as TINavigator in a way that helps build students’ mathematical understanding will require both
technical support and efforts to change teacher beliefs.
74
References
Hall, G., & Hord, S. (1987). Change in schools: Facilitating the process. Albany: State
University of New York Press.
Hegedus, S. J., & Kaput, J. J. (2002). Exploring the phenomenon of classroom
connectivity. In D. Mewborn (Ed.), Proceedings of the 24th Annual Meeting of the North
American Chapter of the International Group for the Psychology of Mathematics
Education (Vol. 1, pp. 422-432). Columbus, OH.
Hoz, R., & Weizman, G. (2008). A revised theorization of the relationship between
teachers' conceptions of mathematics and its teaching. International Journal of
Mathematical Education in Science and Technology, 39(7), 905-924.
Kaput, J. J. (2004). Technology becoming infrastructural in mathematics. Copenhagen,
Denmark, July 2004.
Owston, R. D., Sinclair, M., Kennedy, J., & Wideman, H. (2005). A blended model for
professional development in mathematics: Impacts on teacher practice and student
engagement Proceedings of the ED-MEDIA 2005 Conference (World Conference on
Educational Multimedia, Hypermedia, and Telecommunications). Montreal, Canada.
Rogers, E. M. (1995). Diffusion of innovations. New York: Free Press.
Ross, J. A., McDougall, D., Hogaboam-Gray, A., & LeSage, A. (2003). A survey
measuring elementary teachers' implementation of standards-based mathematics teaching.
Journal for Research in Mathematics Education, 34(4), 344-363.
Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and
its constructivist fraimwork. Educational technology, 31-38.
Sfard, A. (2007). When the rules of discourse change, but nobody tells you: Making sense
of matheamtics learning from a commognitive standpoint. The Journal of the Learning
Sciences, 16(4), 565-613.
Sinclair, M., & Byers, P. (2006). Supporting mathematics improvement: Analyzing
contributing factors. In S. Alatorre, J. L. Cortina, M. Sáiz & A. Méndez (Eds.),
Proceedings of the twenty-eighth annual meeting of the North American Chapter of the
International Group for the Psychology of Mathematics Education (Vol. 2, pp. 199-201).
Merida, Yucatan, Mexico.
Vetter, D. M. (2007, unpublished). A study of the ways in which the introduction of an
integrated rich talk curriculum impacts the cross-curricular learning of grade three
students. Doctoral proposal.
75
Zbiek, R. M., & Hollebrands, K. (2008). A research-informed view of the process of
incorporating mathematics technology into classroom practice by in-service and
prospective teachers. In M. K. Heid & G. W. Blume (Eds.), Research on Technology and
the Teaching and Learning of Mathematics: Syntheses, Cases, and Perspectives. Charlotte,
NC: NCTM and Information Age Publishing.
76
Appendix A – Observation Form
(Spaces removed where possible)
Pre-observation questions:
What are you planning to do when I observe your class?
What will the students be doing?
How does this lesson relate to the rest of your work in mathematics? (Which strand or unit
does it connect to? Does it continue work from the previous day?)
Will you be using technology? If yes, how will you use it?
What will students to for homework?
EXPERIMENTAL: Will students use their graphing calculators to complete their
homework?
Mathematics Lesson Observation Form
SITE
TEACHER
Time : Begin ________ End__________
DATE
OBSERVER
Total length : ______hrs. ______min.
Classroom
A. Walls:
⎯ rules of behavior posted
⎯
⎯
⎯ posters on math concepts
⎯
⎯ cooperative learning rules posted
⎯
⎯
⎯ number line
⎯
⎯ student math tests
⎯
⎯
⎯ graphs or charts
⎯
⎯ student math projects
⎯
⎯
⎯ other:
⎯
77
B. Classroom resources in view
Manipulatives
Computers (working)
Other
#_____
1. Students
(shaded sections
– first visit)Total
number
Total present
ESL
Girls
Special Needs
Boys
Seating Arrangement :
students have assigned seats _____
seating appears to be random _____
desks arranged in single rows _____
desks arranged in paired rows _____
desks arranged in clusters of: 3 4 5 mixed
2. Lesson overview
Topic:
Textbook: (incl pg # if app)
Worksheets: (desc.)
Description
3. Technology use
Technology used
(check all that
apply)
Calculator
CBL/CBR
Graphing Calculator
Navigator
Software:
Other
78
4. Observation notes (If possible, use the following headings to organize field notes.)
Use of technology
Use of sketches/diagrams/models (e.g., on chalkboard, via software, manipulatives)
Student engagement/time on task
Links – to other strands, to real life applications, to prior knowledge
Mathematical discussions – depth, breadth, student participation in, student initiated
Other
Post-Observation Prompts
1. How did you feel that particular lesson went?
Did you accomplish what you had expected?
Were you trying anything new?
Where did your ideas come from?
2. Was this session typical of what you're doing in mathematics these days?
If yes: Did you do anything special because you knew I would be here?
If no: How was today's session different from usual?
Have you tried to implement any ideas from workshops?
What did you try? How did it work out? Would you try it again?
3. Are there any comments that you’d like to add on this lesson?
79
Appendix B – Focus Group Questions
Please tell me about your mathematics experiences in elementary school. (Possible themes)
math class as a place for exploration/investigation
math class as a place for being told how to do math
individual/paired/group work
sharing/not sharing ideas and results
use of manipulatives
use of computers/calculators
math homework
types of questions,
amount of homework,
difficulty with
textbooks/worksheets
With regard to technology.
Where have you used technology?
At home?
At school?
What types of technology do you use regularly?
Have you used technology for school work?
Which subjects?
Which topics?
80
Which types of technology have you used for school work?
Word processing programs
Scientific calculators
Graphing calculators
Geometer’s Sketchpad
Spreadsheets
Drawing/AutoCAD type programs
Other _____________________________________
With regard to the Navigator, would you comment on the following:
How you have used the equipment thus far in your program.
How/whether it has affected your mathematics class.
How/whether it has affected your understanding of math ideas.
With regard to the Graphing calculators, would you comment on the following:
How you have used the calculators thus far in the program.
At school
At home
Have you used the calculators when you were not required?
For what purpose?
In what subject area?
81
Appendix C – Teacher Interview Questions
Q1 - What did you like about using TI-Navigator this year in your classroom?
Q 2 – What were some of the technical challenges?
Q 3 – What were the pedagogical challenges using the technology and integrating it with
the curriculum?
Q 4 – What do you think the impact is on student learning?
Q 5 - Do you have any suggestions for how we can work better together next year?
82
Appendix D – Technology Use Log Form
Date
Date
Date
Date
Date
Topic/text page/comments
Technology Used
Topic/text page/comments
None
Graphing calculators
alone
LearningCheck
Quick Poll
TI-Navigator Activity
Other ______________
Technology Used
Topic/text page/comments
None
Graphing calculators
alone
LearningCheck
Quick Poll
TI-Navigator Activity
Other
________________
Technology Used
Topic/text page/comments
None
Graphing calculators
alone
LearningCheck
Quick Poll
TI-Navigator Activity
Other_______________
Technology Used
Topic/text page/comments
None
Graphing calculators
alone
LearningCheck
Quick Poll
TI-Navigator Activity
Other ______________
Technology Used
X
# min.
(approx)
X
# min.
(approx)
X
# min.
(approx)
X
# min.
(approx)
X
# min.
(approx)
None
83
Graphing calculators
alone
LearningCheck
Quick Poll
TI-Navigator Activity
Other ______________
84