Marshall, Petrosino Martin (2010)
DOI 10.1007/s10956-010-9206-y
J. A. Marshall · A. J. Petrosino · Taylor Martin
Abstract  We present results of an investigation of preservice secondary mathematics and science teachers' conceptions of project-based instruction (PBI) and their enactments of PBI in apprentice (student) teaching. We evaluated their thinking and implementations within a composite framework based on the work of education researchers. We analyzed survey responses, both qualitatively and statistically, from three cohorts of preservice teachers both before and after apprentice teaching. In addition we interviewed and observed a subset of these future teachers. We found that in general the preservice teachers held superficial views of PBI, as compared to the researcher framework. Participants reported time and curriculum restrictions as major barriers; however, teachers for whom enactment of PBI was presented as an explicit goal, and who were given support toward that end, were more likely to enact authentic implementations, regardless of previous reservations about PBI. Without this additional scaffolding, even teachers with high affinity for PBI were unlikely to implement it authentically.

Keywords  Project-based instruction · Preservice teachers · Teacher preparation · Project-based learning

Background and Significance

Today's science educators are called upon to engage their students in authentic, context-rich activities (Brown et al. 1989) and inquiry focused on "questions generated from student experiences" and "real phenomena" (National Research Council 1996, p. 31). Project-based instruction (PBI) focuses on an authentic task as a vehicle for learning (Kliebard 1995). During the past few decades, PBI has enjoyed a resurgence, in which implementers have sought to address concerns that the method is too concentrated on students' immediate interests rather than on future coursework and employment (Krajcik et al. 1999; Polman 2000a).

Project-based instruction centers on an authentic task but is distinguished from other forms of inductive learning by its focus on the creation of a product, often a report detailing the student's response to a driving question, as a driver for learning (Prince and Felder 2006). A recent development in PBI is a shift in focus from students' immediate interests toward supporting long-term learning goals. Barron et al. (1998) described the challenges of implementing PBI and presented four design principles that reflect this emphasis on broader learning goals: (a) defining learning-appropriate goals that lead to deep understanding, (b) providing scaffolds, (c) providing opportunities for self-assessment and revision, and (d) developing social structures that promote participation and a sense of agency. The first two are aimed primarily at developing content knowledge, the second two at general educational skills.

Many studies have demonstrated the efficacy of PBI in both science and mathematics (Boaler 2002; Krajcik et al. 1998; Lehrer et al. 2001; Petrosino et al. 2003), especially in cases where the instruction tended to include the elements prescribed by Barron et al. (1998). However, with some notable exceptions, little work has concentrated on preservice teachers' learning of PBI. Windschitl and Thompson (2006) reported on preservice, secondary-level science teachers' engagement in project-based learning, but focused primarily on their development, use, and understanding of

J. A. Marshall (✉) · A. J. Petrosino · T. Martin
The University of Texas at Austin, Science and Mathematics Education, 1 University Station D5705, Austin, TX 78712-0382, USA
e-mail: marshall@mail.utexas.edu
J Sci Educ Technol
models in the course of doing projects and whether they planned to invoke the modeling paradigm explicitly in their own classrooms in the future.

Likewise, little is known about how novice teachers, apprentice (student) teachers in particular, implement PBI and the challenges and benefits they encounter in doing so. An extensive review of literature found that new science teachers face many challenges in understanding and implementing effective instructional strategies, but none of the studies included in the review dealt explicitly with PBI (Davis et al. 2006). Polman (2000b) reported issues that arose as an experienced teacher implemented PBI in a high school earth science classroom: ambiguity and risk of unfamiliar, open-ended practices; students' adherence to transmission epistemology; students' perception of time available for a task; and the limited time available to the teacher. He concluded, "The promise of projects in schools will only be realized when educators acknowledge and deal effectively with the realities of their problems as well as their potential benefits" (Polman 2000b, p. 12).

Our study was designed to expand our understanding of what preservice teachers gain from professional development concerning PBI, and how their attitudes toward PBI influence whether and how they implement it in their own classrooms, in particular as student teachers. As such, it is a limited step toward addressing the need for longitudinal evidence on the effectiveness of particular elements of teacher preparation programs and toward understanding the transitions between different phases of professional development (Windschitl 2005), specifically the transition from coursework to apprentice teaching. Understanding how experience with reform pedagogies in professional development courses translates into practice in the classrooms of novice teachers has become more relevant in light of legislation emphasizing accountability systems and testing, which may have the inadvertent consequence of privileging certain pedagogies over others. This leads to challenges in terms of curriculum implementation, planning, and assessment, all of which interact in complex dynamics when attempting to forge sites for apprentice teachers (ATs) to hone their craft under the guidance of practicing mentor teachers.

As a framework for interpreting our students' conceptions of PBI, we first consider conceptions of PBI as articulated by science educators who specialize in this area (Barron et al. 1998; Polman 2000a; Singer et al. 2000; Petrosino n.d.; Linn et al. 2003; WISE n.d.). Identifying the definitive components of PBI is not straightforward, as the instruction given this label can vary widely. In our approach, we identified major research-based instantiations of PBI that have been documented in peer-reviewed literature. As might be expected, essential elements were varied, but there was considerable overlap. Common elements were: (a) a driving question; (b) a tangible product; (c) an investigation; (d) use of cognitive tools; (e) collaboration; (f) assessments; (g) scaffolding; and (h) length. Table 1 summarizes the essential elements as described by each program.

With an overarching goal of informing our preservice teacher preparation in PBI, we were interested in finding out how our students viewed PBI after coursework in the subject, and how those views affected whether and how they went on to implement PBI as ATs (student teachers) in their own classrooms. We were also interested in discovering what these preservice teachers perceived as likely barriers to the implementation of PBI, what barriers they actually encountered if implementing PBI, and how those experiences changed their views of PBI. The goal was to provide programmatic support to these beginning teachers. Our study was guided by research questions in three broad areas:

1. Preservice teachers' conceptions of PBI:

• What do preservice teachers in our secondary science and math preparation program identify as the key elements of PBI?
• How do their descriptions align with the views of education researchers who have developed this method?
• Do their descriptions change as a result of the student teaching experience?

2. Perceived efficacy of PBI:

• What do these preservice teachers perceive and report as barriers or drawbacks to implementation of PBI?
• How do these preservice teachers perceive the efficacy of PBI with regard to the stated learning goals in the classrooms where they were planning to teach?
• Do these perceptions change as a result of the apprentice teaching experience?

3. Implementation of PBI:

• Do preservice teachers (ATs) implement PBI in their own classrooms during apprentice (student) teaching?
• If so, what do their implementations look like?
• Finally, how do their perceptions of PBI relate to whether and how they implement PBI as ATs?

Method

Prior to the start of research, a protocol describing the intended study was submitted to, and approved by, the Institutional Review Board of the large research university where the authors are faculty members and where all the participants were enrolled in teacher certification coursework. The participants were drawn from three cohorts of
Table 1 Essential elements of project-based instruction, as described by each program

Barron et al. [a]
  Driving question: connects activities and underlying concepts.
  Product: tangible, real-world outcome.
  Investigation: complex inquiry.
  Assessment: frequent formative assessment, self-assessment, and revision.
  Cognitive tools: SMART tools.
  Collaboration: collaborative activity.
  Scaffolding: simulated, constrained problems; contrasting cases; social organization to promote agency.
  Length: extended.

CoVis [b]
  Driving question: open-ended questions.
  Product: new knowledge or artifacts.
  Investigation: authentic practice, learning by doing.
  Assessment: reflection.
  Cognitive tools: collaboration and visualization principles.
  Collaboration: access to scientist mentors, individual or group.
  Scaffolding: seeding inquiry with powerful ideas.
  Length: long term.

PBS [c]
  Driving question: broad, open-ended, authentic; encompasses worthwhile science content.
  Product: artifact or product that represents a response to the driving question.
  Investigation: sustained, situated in the life of the student; students gather and analyze data and draw, share, and support conclusions.
  Cognitive tools: learning technologies that mirror those used by scientists.
  Collaboration: learning communities, including more knowledgeable community members.
  Scaffolding: teacher (benchmark lessons), learning materials, technology.
  Length: relatively long term.

Petrosino [d]
  Driving question: driving, real-world problem spanning multiple content areas.
  Product: requires culminating product.
  Investigation: active investigation; learn concepts, apply information, represent knowledge in multiple ways.
  Cognitive tools: cognitive tools.
  Collaboration: learning community.

WISE [e]
  Driving question: provided by web site.
  Investigation: accessible, independent activities requiring sustained reasoning.
  Assessment: online assessments.
  Cognitive tools: computer tools for visualization and modeling.
  Collaboration: online communities.
  Scaffolding: reflection notes, online discussions, online support, hints.
  Length: about a week.

[a] Barron et al. (1998)
[b] http://www.covis.northwestern.edu
[c] http://www.umich.edu/~pbsgroup/whatPBS.html
[d] http://college.hmco.com/education/pbl/background.html
[e] http://wise.berkeley.edu/pages/about.php
preservice, secondary-level science and mathematics teachers who were engaged in apprentice (student) teaching at the time of the study. Approximately half of these future teachers were being certified to teach science of some kind and half to teach mathematics, the majority at the high school level but some also in middle school. All had previously completed a professional development course in PBI as part of the required sequence of courses for certification.¹ During this course, students developed an extensive PBI unit as part of course requirements. The first two cohorts were from the same year. In that year there was an acknowledged effort on the part of the teacher preparation program staff to coordinate the PBI course and Apprentice Teaching course, in particular to facilitate and encourage the implementation of units developed in the PBI course as part of the AT experience. Due to conditions in some of the precollege classrooms where ATs were placed, it was not possible to make this an absolute requirement, but efforts were made to facilitate implementation of these units where possible. In the first cohort all ATs attended a seminar on implementing PBI during apprentice teaching. ATs in Cohort 2 were offered the opportunity to work with their prospective mentor teacher as they designed their PBI units in the PBI course, although none of those interviewed for this study actually did so.

The third cohort was from another year. Cohort 3 was required to design a PBI unit in the PBI class, but program staff did not encourage implementation of a PBI unit during apprentice teaching. All three cohorts were required to submit a portfolio, including a unit of lesson plans. Preservice teachers often submitted the unit they had designed in the PBI course. Acceptance of the portfolio was a requirement for certification.

Preservice teachers in all three cohorts were informed that the ultimate goal of the study was to improve the teacher preparation program in which they were enrolled, particularly with regard to PBI, but they were not given details of the research questions. They were informed that participation was voluntary, and informed consent was obtained from all those who volunteered to participate.

In order to determine how these preservice teachers conceptualized PBI, and how they perceived its efficacy with regard to learning objectives, a survey (see "Appendix A") was developed and administered to volunteers just before they started apprentice teaching. To determine whether the AT experience had any effect on their thinking, the same survey was administered at the end of the apprentice teaching semester. A total of 79 preservice teachers responded to the presurvey, 65 of whom also responded to the postsurvey. In addition, five individuals who did not respond to the presurvey responded to the postsurvey, for a total of 70.

The survey consisted of both Likert-scale and open-ended items (see "Appendix A"). Each Likert-scale item presented an evaluative statement about PBI paired with an implication for classroom instruction, such as, "PBI represents best practices in secondary math and science instruction; all instruction should be done in this format." Thus, agreeing with a statement required students to endorse both a view of PBI and its consequences for the classroom. Students were asked to rate their agreement or disagreement with each statement. Likert-scale items were analyzed quantitatively using a statistics package (SPSS). Respondents without both pre- and postsurveys were omitted from the statistical analysis.

For each AT, an affinity score for PBI was calculated based on how much the individual agreed or disagreed with the Likert-scale statements from the survey. There were six statements, two that were positive about PBI as a teaching method, two that were negative, and two that represented a more balanced view. Responses to the individual statements were combined to create a composite affinity score for PBI, from 0, indicating a complete lack of belief in the efficacy of the method, to 12, indicating the highest enthusiasm.

The open-ended questions asked students to describe what they perceived as the critical elements of PBI, what they perceived as drawbacks to PBI, and either their plans for PBI (presurvey) or their implementation of PBI (postsurvey). Open-ended responses were coded according to themes. At least two independent coders categorized each open-ended response, with typically a 70–80% agreement between independently generated codes. Codes were then consolidated and standardized, and the final code assigned to each response was negotiated between the raters. Codes were recorded in a spreadsheet and frequencies were determined based on the entries.

To gain more insight into the nature of instruction in the cases where the ATs felt that they had implemented PBI, at least in part, and into the reasons why they did or did not implement PBI during their apprentice teaching semester, all respondents to the survey were invited to be interviewed about their implementation of PBI in apprentice teaching. They were assured that the intent of the interviews was simply to inform the design of the preparation program and that their responses would not affect their status in the program in any way and would not be revealed to any of their instructors. All volunteers for whom a suitable interview time could be arranged (N = 17) were interviewed. They were roughly equally distributed among the three cohorts, representing 10–20% of each cohort. Those who were interviewed were given token compensation for their time in the interview. The interviews were conducted

¹ Two of the participants were taking the course concurrently with the study.
by the authors and graduate students working under their supervision. Interviews followed a flow chart indicating a course of questions for different response options and allowing latitude for interviewers to follow up new threads that arose in the interview (see "Appendix C"). All interviews were audio recorded and transcribed.

Classroom observations, when available, were used to aid in interpreting the interview and survey results. Collected artifacts included lesson plans, student handouts, and in some cases video of implementation. Field notes during each observation provided detailed descriptions of the activities, patterns of student engagement and participation, and any assessments of learning outcomes.

The interviews, observations, and artifacts were used to characterize the preservice teachers' implementation of PBI in the classroom. Implementations were assigned a score based on a rubric developed by the authors (see "Appendix B"). Two independent evaluators used interview transcripts, artifacts, and observation field notes to rank each implementation on a three-point scale according to each of seven elements: (a) driving question; (b) tangible product; (c) investigation; (d) use of cognitive tools; (e) collaborative nature of the activity; (f) the nature of assessments for the task; and (g) the scaffolding provided for the task. Because of time constraints imposed by the apprentice teaching context, we did not evaluate the final key element identified by education researchers, i.e., length. The ATs only taught completely autonomously for 6 weeks, and were required to meld with their mentor teacher's curriculum at the beginning and end of that time. Therefore the range of reported project lengths only varied from a few days to a couple of weeks, the lower end of the length range considered optimal by experts.

The independent scoring for each of the components was then compared, and any discrepancies were discussed and final scores negotiated. The scores for the seven individual elements were then averaged to create a composite implementation score.

Results

Preservice Teachers' Conceptions of PBI

What did preservice teachers in our secondary science and math preparation program identify as the key elements of PBI, and how did their descriptions align with the views of education researchers who have developed this method? How did their descriptions change as a result of the apprentice teaching experience?

The first question on our survey ("What do you consider to be the key elements of PBI? In other words, how would you recognize PBI in a secondary math or science classroom?") addressed the first research question explicitly. Table 2 lists all the codes that were generated by responses from more than one preservice teacher in any cohort, along with the percentage of responses citing that code in the presurvey (prior to apprentice teaching) and the postsurvey (after apprentice teaching). As some respondents listed more than one key element, they were assigned more than one code, and responses that were only given by a single respondent are not included. Thus, the percentages do not necessarily add to 100%.

Table 2 Key elements of project-based instruction: percentage of responses

Response                    Presurvey (%)   Postsurvey (%)
Projects                    34              44
Discovery, inquiry          32              39
Group work                  25              17
Long term                   25               6
Teacher behavior            18              29
Student driven              18              36
Theme/unit                  16              11
Research                    11               3
Authentic, real-life task   10               7
Driving question             9              11
Scaffolding                  9               3
Product                      8               7
Student interest             5               7
Open-ended                   5               6
Assessment                   4               7

As can be seen in Table 2, the most common description for PBI given prior to apprentice teaching (34% of responses) was simply that it involved something labeled as a "project," with no qualifications about the nature of the project or how it would be implemented. This was followed closely by descriptions indicating that PBI involved inquiry or discovery learning (32% of respondents), group work (25%), or involved an extended time period (25%). The next most commonly listed attributes were descriptions of how a teacher should behave in PBI (i.e., not lecturing, 18%), that the work should be student driven (also 18%), and that there should be a "unit," a series of lessons characterized by a theme (16% of respondents). Students did not indicate how this theme should be used to enhance instruction or how it differed from the label of a curriculum unit, as in the standard curriculum unit on DNA or on quadratic equations.

In comparing these student characterizations with those of science educators who specialize in this area (Table 1), students tended to focus more on superficial aspects of PBI than did the experts. For example, students focused on the
existence of something called a "project" and the length of the activity, rather than what a project should consist of, or why an extended time period is required and how it should be used. Only 9% of students felt that a driving question was essential and only 8% felt that a tangible product was necessary, whereas experts identify both of these as key components. Students agreed with experts that PBI should involve hands-on, discovery learning, but fewer than 20% went on to align with experts in stating that this also should be student driven and involve a complex task. Collaborative activity, a critical element in the expert framework, was only mentioned by one-fourth of the preservice teachers who responded to the survey, as noted above. Scaffolding and formative assessment, also critical for PBI instruction (Barron et al. 1998), were mentioned by fewer than 10% of respondents. The use of cognitive tools, the final element of the expert description, was not mentioned at all.

These descriptions of PBI were fairly stable from the beginning to the end of the apprentice teaching experience, as can be seen from Table 2. The top two characterizations, project and inquiry/discovery, actually increased in frequency from the pretest to the posttest, as did descriptions of PBI as student centered and characterizations involving teacher behavior (not lecturing or acting as a guide, for example). The most dramatic change was the decrease in responses describing PBI as involving an extended time period. This decrease was consistent within each of the cohorts as well as in the group as a whole. As noted above, ATs were constrained to implement PBI on shorter time scales than in the books and articles they had read in the PBI class; this may have contributed to their perception that an extended time period was not such an essential element of PBI.

Barriers to Implementation of PBI

What did preservice teachers perceive as barriers or drawbacks to implementation of PBI? Another question on the pre- and postsurveys asked the preservice teachers to describe what they saw as possible barriers to implementing their plans for PBI (Question 5 in "Appendix A"). Table 3 lists the most common responses, along with the percentage of surveys that listed each of them. Again, the percentages do not necessarily sum to 100%.

Table 3 Possible barriers to implementing project-based instruction: percentage of responses

Barrier                                 Presurvey (%)   Postsurvey (%)
Time                                    37              30
(High school) students                  37              34
Specified curriculum                    32              21
Resources                               15              11
State testing requirements              14              15
School culture                          10              13
Cooperating teacher                     10               4
Policy/administration                    9               6
Project-based instruction won't work     8               0
Parents                                  8               2
Lack of experience                       6               2
Difficulty of curriculum design          6               4

The most common barriers on the presurvey were inadequate time to implement PBI, lack of initiative or discipline on the part of precollege students, and curriculum specifications by the department or school, all of which were cited by more than one-third of the ATs. These were followed by resource limitations, state testing requirements, school culture, and mandates from cooperating teachers, all cited by 10% or more of the responding ATs. Some ATs (fewer than 10%) also cited "PBI won't work," objections from parents, their own lack of experience, and the difficulty of designing PBI curriculum as barriers to PBI implementation. Responses were very similar on the survey given after apprentice teaching; students, time, and a specified curriculum were once again the most commonly listed barriers. The perception of state testing requirements and school culture as barriers increased slightly, whereas all the other perceived barriers were cited with less frequency on the postsurvey. Notably, the notion that PBI simply would not work was not listed as a barrier to implementation by any AT on the postsurvey.

Perceived Efficacy of PBI

How did these preservice teachers perceive the efficacy of PBI with regard to the stated learning goals in the classrooms where they were planning to teach? As described above, we computed an affinity score for each preservice teacher, using responses from the pre- and postsurvey characterizing the respondents' belief in the efficacy of PBI. Ranked agreements with statements 2a and 2b from the survey (from 1 to 4 for strongly disagree to strongly agree, see "Appendix A") were summed as the positive-PBI score for each survey, likewise statements 2c and 2d as neutral-PBI, and statements 2e and 2f as negative-PBI. Positive scores had a possible range of 2–8, as did negative scores. We computed the composite score as the difference between the positive-PBI and the negative-PBI scores, so that the composite score had a possible range of −6 to 6. We added 6 to all the composite scores, shifting the possible range from 0 to 12 for the convenience of the statistical analysis. We performed initial analyses that showed no significant effect of cohort (i.e., the three cohorts were not significantly different on their affinity scores).
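The composite arithmetic described above reduces to a few lines. The following is a minimal sketch, not the authors' actual analysis (which was done in SPSS); the helper function and dictionary keys are illustrative:

```python
def affinity_score(ratings):
    """Composite PBI affinity from the six Likert ratings (1-4 each).

    `ratings` maps survey item to rating; 2a/2b are the positive-PBI
    statements and 2e/2f the negative-PBI ones. The neutral items
    (2c/2d) do not enter the composite.
    """
    positive = ratings["2a"] + ratings["2b"]  # possible range 2-8
    negative = ratings["2e"] + ratings["2f"]  # possible range 2-8
    # The difference ranges from -6 to 6; shifting by 6 gives 0-12.
    return positive - negative + 6

# A respondent who strongly agrees (4) with both positive statements
# and strongly disagrees (1) with both negative ones scores the maximum:
print(affinity_score({"2a": 4, "2b": 4, "2e": 1, "2f": 1}))  # -> 12
```

Under this scheme a uniformly neutral respondent (all ratings equal) lands at the midpoint of 6, consistent with the 0-to-12 scale described above.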
without this factor. A 2 9 3 repeated-measures analysis of implementation due to constraints of the teaching environ-
variance (ANOVA) was performed on the composite affinity ment. The other 16 reported various levels of project
scores. There were two between-subjects factors: time (pre implementation. Table 4 lists the driving questions or tasks,
vs. post) and statement type (positive-PBI, neutral-PBI, and descriptions of what the projects entailed for the high school
negative-PBI). No significant difference between pre- and students, mathematics or science learning goals, whether
posttests was found. the project involved collaboration, and the approximate
There was a significant difference between the positive- length. As seen in Table 4, about half of the projects were of
PBI, neutral-PBI, and negative-PBI scores, F(2, 122) = 1 week or less in duration and could not be considered the
126.68, MSE = 1.62, p \ 0.001. Thus, the respondents as a type of extended inquiry required for full implementation of
group viewed these three groups of statements as conveying PBI. The math and science was clearly evident in these
distinctly different sentiments, interpreting them as we had projects, but the directing influence of the classroom stu-
intended. dents was not always at the forefront. Neither was the
The lowest affinity score for any student in either cohort broader goal of ‘‘doing with understanding’’ (Barron et al.
on the presurvey, on a scale from 0 to 12, was a 3 and the 1998, p. 274) always in evidence.
highest a 12. However, a majority of the students (77%) Composite implementation scores were calculated for
fell in the medium affinity category, with scores from 5 to the ATs who were interviewed, as described in the Method
8. Twenty-three percent of responses fell in the highest section. The AT who reported no implementation received
category, with scores from 9 to 12. The average score was an implementation score of 0. Of those who reported that
7.4 with a standard deviation of 1.6, indicating that on they implemented PBI, implementation scores ranged from
average the respondents felt more positive toward PBI than 0.7 to 2.8 (very nearly completely authentic implementa-
negative. Affinity scores were also calculated for the tion, according to the expert rubric) with a standard devi-
postsurvey responses and changed little on average. The ation of 0.5. In general, very few individual elements were
average postsurvey score was 7.2 with a standard deviation rated as a 3, or full implementation. The largest group of
of 1.7. The highest postsurvey score was a 10 and the implementations was rated between minimal and moderate
lowest a 4, coinciding with the very slight shift in the (nine out of those interviewed). Seven implementations
average toward more neutral responses. were rated between moderate and full, although all but one
of those were closer to moderate. One implementation was
Implementation of PBI rated as less than minimal.
Thus, as might be expected of first-time teachers, the
Information regarding how the ATs implemented PBI (if at ATs did not, in general, implement PBI at a level that
all) came from self-reports on the postsurvey; interviews; would be consistent with an authentic implementation, as
and, in a few cases, actual classroom observations. In described by education researchers specializing in PBI. In
response to the postsurvey question, nearly half of the large part, the preservice teachers we interviewed agreed
respondents, 35 out of 71, said that they had implemented PBI minimally or not at all. Slightly more than 10% of the ATs who responded said that their students did a project of some kind, and another 10% described a design project they assigned to their students, in which a product of some kind was required. Nine percent of the respondents simply said that their students had engaged in group work of some kind, and 7% described having had their students use multiple sources to gather information. The other most common responses were having students engage in inquiry, assigning student-centered activities, and having a unit as part of the curriculum (6% each). Four percent of respondents indicated that the only PBI they had implemented was to have students do laboratory work. Thus, the majority of students did not implement all seven elements, by their own report, and nearly half did not report any part of their apprentice teaching experience as being PBI.

Among the interviewed students, one AT (who apprentice taught in a 9th grade biology classroom) reported no […] with this assessment. In their postimplementation interviews, some expressed reservations about whether they had implemented PBI authentically, describing their instruction as "not as pure project-based as we possibly learned," "it wasn't how we were kind of taught to do things in PBI," "it wasn't project-based, like, I guess that there was no driving question," and "just a little different than what I had been shown." The preservice teachers gave various reasons for assessing themselves this way: "While I think I learned some of the ideals in PBI of what it should look like, I didn't learn practically how to think those through until I actually taught it," "the students weren't really given any lessons by me," and "it's very difficult to implement that [pure PBI] with the resources that I had and the timeframe that I had."

Some of those who did report that their implementation was authentic focused on only one aspect of PBI or a superficial characterization, such as the presence of a unifying theme. For example, when asked whether she had implemented PBI authentically as she understood it, one
Table 4 Descriptions of projects implemented by ATs

AT 1. Driving question/task: (1) What would the decoration on a Coke can look like flattened out? (2) How can we measure the height of a cell phone tower with a clinometer? Learner product: (1) 2D net of Coke can decoration; (2) measurement of height. Investigation: (1) students investigated what the 2D net of an image would be; (2) students made clinometers from instructions and discovered ways to use them to measure the height of a cell phone tower. Science/math addressed: 2D nets of 3D objects; properties of right triangles. Collaborative: yes. Length: 1 day (each).

AT 2. Driving question: What makes bridges fall down? Learner product: model of bridge, scale drawings, description of design. Investigation: students designed and tested bridges with the goal of greatest weight to payload ratio. Science/math addressed: properties of triangles, triangle formulas. Collaborative: yes. Length: 2–3 weeks.

AT 3. Driving question: What makes bridges break? Learner product: model of bridge, scale drawings, description of design. Investigation: students designed and tested bridges with the goal of greatest weight to payload ratio. Science/math addressed: properties of triangles, triangle formulas. Collaborative: yes. Length: 2–3 weeks.

AT 4. Task: Create a map of the Earth indicating important geologic features and use it to predict future geologic trends. Learner product: map and presentation. Investigation: students were assigned roles in investigating properties of Earth regions, creating an annotated map, and making predictions. Science/math addressed: properties of plate boundaries; causes of volcanoes, earthquakes. Collaborative: yes. Length: 2 weeks.

AT 5. Driving question: student-generated driving questions involving DNA technology (e.g., cloning). Learner product: written report and presentation. Investigation: students researched and reported on genetic issues of their own choosing. Science/math addressed: genetics, DNA. Collaborative: yes. Length: 3 weeks.

AT 6. Driving question: Are my friends making me sick? Learner product: report. Investigation: students read case studies, performed laboratory investigations, made report. Science/math addressed: microbes, pathogens. Collaborative: group labs, individual product. Length: 1 week.

AT 7. Driving question: How do robots work? What makes them move? Learner product: robot. Investigation: students built a robot and documented its programming; competed against other robots in completing an assigned task. Science/math addressed: control algorithms, programming language. Collaborative: yes. Length: 3 weeks.

AT 8. Task: Create a series of drawings of 2D and 3D objects, scale them, and calculate properties; use what you have learned to create an original image. Learner product: booklet. Investigation: students created a booklet with prescribed exercises and an open-ended summative activity creating an image. Science/math addressed: properties of 2D and 3D figures (surface area, volume), scale factors (proportions). Collaborative: no. Length: 2 weeks.

AT 9. Task: Come up with a system of inequalities to meet the following constraints; model data using linear inequalities. Learner product: system of inequalities; predictions and answers to questions. Investigation: students completed a series of challenges involving linear inequalities and data. Science/math addressed: linear inequalities. Collaborative: yes. Length: 3 days.

AT 10. Task: Create your own tessellation tile that would cover the floor with no gaps (and the instructions for making it). Learner product: tessellation tile and instructions. Investigation: students created their own tiles and described how they used them to fill a plane, incorporating translation, rotation, reflection. Science/math addressed: translation, rotation, reflection, plane filling (graphing). Collaborative: yes. Length: 1 week.

AT 11. Task: Make a travel brochure for the planet of your choice. Learner product: brochure. Investigation: students researched characteristics of planets and created a brochure. Science/math addressed: properties of planets. Collaborative: yes. Length: 4 days.

AT 12. Task: Construct your own parabola/ellipse and mechanically discover its focus, vertex, and directrix. Learner product: parabola or ellipse. Investigation: students constructed geometric figures and measured their properties. Science/math addressed: properties of parabolas and ellipses. Length: 2 days.

AT 13. Driving question: How do we figure out quality of our water? Learner product: report. Investigation: students engaged in lab tests of various kinds and researched water issues. Science/math addressed: water quality, ion concentration, acid–base chemistry. Collaborative: group labs, individual product. Length: 8 days.

AT 14. Task: See how natural selection (predation, reproduction) works through a simulation (game). Learner product: none. Investigation: students engaged in a simulation of natural selection with assigned parts. Science/math addressed: natural selection. Collaborative: whole class with assigned parts. Length: 1 day.
J Sci Educ Technol
Table 4 continued
Driving question: Are natural disasters getting worse? Learner product: report.
Task: Build a self-contained ecosystem that will allow the most organisms to survive in it. Learner product: terrarium. Investigation: […] organisms and observed in media and reported. Length: 2 weeks.

[…] because it was the theme of, "Are your friends making you sick?" So we examined different aspects of bacteria, how they make people sick, how they make people well. So I think so."

[…] deviation of 1.5.

Since so few students (and none of those interviewed) scored in the lowest category, the graph in Fig. 1 includes only the medium (to the left of the dividing grid line) and high (right of the dividing grid line) affinity scores. Implementation scores are divided into minimal, moderate, and full, by the three horizontal grid lines. For the entire group, there was no apparent relationship between affinity score and implementation score. A strongly expressed belief in the efficacy of PBI did not necessarily translate into an authentic implementation of PBI. When examining the data for the individual cohorts, however, trends emerged.

[Fig. 1: Implementation score (0–3) versus affinity score (4–16); diamonds: Cohorts 1–2, squares: Cohort 3]
In Fig. 1, the diamond markers indicate data from the first two cohorts combined, both from the first year of the study. Recall that students in these two cohorts were strongly encouraged to implement the units that they or other students had developed in the PBI class. In these two cohorts, the ATs in the medium affinity category came closest to satisfying the criteria for PBI defined by experts (upper left sector). Although the small numbers and spread in the data do not justify an exact fit, a downward trend appears; ATs in these cohorts with higher affinity scores were not more likely to enact authentic implementations, according to these data.

In contrast, teachers in the third cohort (indicated by squares in Fig. 1) who scored in the highest implementation bin were all from the high-affinity category (upper right sector of the graph). Although, again, the small numbers and large spread in the data do not merit an exact fit, there does seem to be a trend toward more authentic implementation with higher affinity. All the ATs with less than minimal implementation scores were from this cohort, and they all had the lowest affinity score. Recall that a univariate analysis found no significant difference between the cohorts on the affinity scores themselves. This pattern of the contrasting trends between the two groups, and the general pattern of data distribution, held true for each element of the implementation rubric separately as well, indicating the average scores are reflective of the authenticity of the implementation of each element separately.

Of course, these data represent a select subgroup who volunteered to be interviewed, and they may show a selection bias. As mentioned above, none of the four ATs whose affinity scores were 4 or less on either the pre- or postsurvey volunteered to be interviewed. However, the fact that the average affinity score for the subgroup who were interviewed was not significantly different from the average for the entire sample we studied speaks against a large sample bias.

Discussion

The preservice teachers surveyed tended to focus on the more general characteristics of PBI as a student-centered approach (hands-on, discovery learning, group work) and on its more superficial aspects (length or existence of an activity labeled as a "project"), as opposed to the unique characteristics identified by experts (e.g., driving question; tangible outcome or product; authentic, student-driven task) or the elements necessary for "doing with understanding" (cognitive tools, continuous assessment, and other scaffolds; Barron et al. 1998, p. 272). This finding is perhaps not surprising given that novices of all sorts tend to focus on more superficial aspects of tasks (Chi et al. 1981). Further, although these preservice teachers had opportunities to study, observe, and design project-based curriculum units in previous coursework, until their apprentice teaching semester they had only limited opportunities to actually implement PBI. Repeating the words of one preservice teacher after implementing a PBI unit, "While I think I learned some of the ideals in PBI [previous course] of what it should look like, I didn't learn practically how to think those through until I actually taught it."

On the other hand, the apprentice teaching semester itself did not seem to change these characterizations significantly. Given that the majority of ATs either did not implement PBI at all, or did not implement it in an authentic way, this is not surprising. Only the emphasis on length in the descriptions of PBI seemed to change as a result of the apprentice teaching experience. We conjecture that as students began to rationalize their understanding of PBI with what they were able to accomplish and witness in classrooms, they lost the emphasis on length. In particular, the regimen of curriculum requirements forced students to limit themselves to the time periods given in prescribed curriculum guides. If the unit was associated with triangles, it needed to fit within the assigned period of study for triangles, regardless of how much time their students might need to respond to a driving question or task. In the words of one preservice teacher after apprentice teaching,

Time, I think, is the biggest issue. You just felt like you had to move on and you had to be a certain place at a certain time in the [prescribed curriculum]. I don't think there's any way I could have fit that [the unit he had originally planned] in.

Another stated, "But, you know, being the student teacher, I just had to go by the guidelines that they had set, and we had this certain amount of time. Nobody else was doing it this way."

Indeed, time was one of the most commonly cited barriers to implementation on both the pre- and postsurveys. In general, the barriers to implementing PBI cited on the survey prior to apprentice teaching were the same ones identified at the end of the apprentice teaching semester. These appear to have been realistic perceptions, as the same barriers, particularly time and a prescribed curriculum, were often described in depth in the interviews. The drop off in respondents citing that "PBI won't work" as a barrier to implementation may indicate that increased experience with the actual curriculum implemented in schools led to a recognition that the fact that something does not work does not prevent it from being implemented; further, the fact that it does work does not necessarily make it easier, or even possible, to implement. All we can say is that by the end of the semester, these ATs no longer gave a lack of belief in the efficacy of PBI as a reason why they did not implement it.
In fact, the ATs who had more restrained views of the efficacy of PBI but still enacted full implementations (upper left sector of graph in Fig. 1) felt that the experience benefited their students. They did not report the precollege students' behavior or preparation as a barrier in their interviews. One AT stated, "I would suggest [to future ATs] to at least try it, because it's a good experience. It's certainly different than just doing it lesson by lesson." Similarly, another said, "I think if you want them [teachers] to do a project, I think it's a good experience, because some of us who start our first year of teaching are going to go in there and be a little intimidated." Another AT stated,

I think in terms of [the high school students], I think it was [successful] in some ways. It gives them hands-on experience. It gives them… I don't know. Doing something that appeals to creative people. Like, one of the kids in my class that I would have never thought would get excited about anything got really excited about it and [was one of the most successful].

One participant noted, "I think it taught my students to be really self-reliant and to depend on their group members." Another cited classroom management:

I think that without even instructing [the classroom students] or telling them what to do, they, they were self-motivated, and I think that having projects like this really gets kids to actively think, and you're not trying to force them to do anything, but they ask questions on their own, and they're looking for alternative ways to modify their creation, and they're looking around to see how other kids did their projects, so I think classroom management is pretty good.

Another AT believed that the PBI was "a good foundation" for later coursework, adding "I think it's always more useful when you have something that's practical and not something that's abstract."

Some of the ATs with moderate- to full-implementation scores continued to express concerns about the barriers to implementation:

I think that you have to take in for account that some students are going into the schools that have a set curriculum, like that's just unavoidable. It happens. I got lucky that my school had this. They have a couple each semester, and I just happened to be in one of the 6 weeks where it happened. But some schools don't have that.

Even these teachers, however, were adamant that they would implement their PBI units again, albeit possibly with refinements.

Surprisingly, students with higher affinity scores (those on the right half of the graph in Fig. 1) were not always more likely to carry out authentic implementations of PBI. One might speculate that the students who were more realistic about PBI, those who saw some value in the approach but also were cognizant of the barriers to implementation and the tradeoffs likely to be encountered in implementing the approach, were best prepared to enact PBI in the classroom. The lack of a trend toward more authentic implementation scores corresponding to higher affinity scores clearly is strongly influenced by those ATs from the first two cohorts who adopted PBI, albeit reluctantly, in response to programmatic encouragement. The pro-PBI stance of the preparation program in the Apprentice Teaching course during that time might have raised the implementation scores of those with low affinity, leveling off the affinity versus implementation trend or even reversing it.

It is also possible that the students who fell in the high-affinity category had a simplistic understanding of the method, that it only involved something which might be called a project, or that any sort of discovery, student-centered learning would constitute PBI. Thus, basing their unqualified endorsement on that categorization, the students did not include all the essential elements recognized by experts in their implementation. This conclusion is supported by the large number of ATs who simply described PBI as being inquiry, discovery learning, or group work.

Some preservice teachers with high affinity for PBI in Cohort 3, which received no added support toward implementing PBI as part of apprentice teaching, might have experienced (or perceived) barriers to authentic implementation that they were unable or unwilling to surmount. One AT explained her reasons for not implementing more authentic PBI:

We moved so fast. It's actually kind of… I had never been in a class where we moved so fast. They take a test once a week in the class. And I was in honors classes in high school and I don't remember moving that fast… it was just a lot of planning for a class that was already moving so fast… And I would have wanted to do a long-term project, and I did think about it. I would have wanted to do it, but I just… they complained every day (jokingly complained) about the amount of homework and stuff. Because pre-AP [Advanced Placement] kids, they have a lot of homework in all their other classes, too. But I could definitely (do PBI) in the future when I felt more prepared to implement stuff like that, I could definitely do it. And I think it would be something worthwhile to do. So, that was my reasons, we tested
every week and gave them mounds of homework. So, as a first-semester teacher, that's pretty daunting to see and deal with.

For this AT, scheduling and prescribed curriculum constraints were insurmountable. For others, the need to align with what their mentor teacher was doing inhibited implementation:

I would have liked to use it, and it would have been really nice, but as a student teacher, what you're doing, what you do is based on what your cooperating teacher does, and … there's also that element of, "We've got to get through all this other information and get ready for the TAKS [high-stakes, standardized Texas Assessment of Knowledge and Skills] test."

Apprentice teachers also cited the benefits of practical experience with PBI:

So I think I understand PBI better now after the class and after having to teach it, than I did necessarily learning it in the class. Because while I think I learned some of the ideals in PBI of what it should look like, I didn't learn practically how to think those through until I actually taught it. And then it gives you that experience to reflect on and say, "Now I can do it."

Another said,

I'm going to know now a little bit about what the pitfalls might be or what I need to say ahead of time or maybe how much time I need to spend doing what. I really think that it's been helpful to have this kind of first dry run, if you will.

Conclusions and Implications

We see implications for preservice teacher programs arising from each of our findings. First, practical design features, like the design principles espoused by Barron et al. (1998), need to be made explicit in teacher preparation coursework. The preservice teachers we studied were much more conversant with inquiry-oriented and collaborative instruction than with the design elements that experts view as essential to making PBI work. Our teachers were generally comfortable with small scale, highly structured, inquiry-oriented activities, possibly tied together by a theme, as evidenced in many of the projects in Table 4. However, they needed encouragement, time, and experience to see how these smaller activities might be incorporated, possibly as benchmark lessons, into full projects driven by open-ended, authentic questions or tasks, becoming more student directed while still meeting the same learning goals.

As noted above, one AT summarized this well by saying, "While I think I learned some of the ideals in PBI of what it should look like, I didn't learn practically how to think those through until I actually taught it." Another said,

I think it was worth doing… for me especially. [I was] learning some practical things about how to teach projects and how to make them or how to think them through really well beforehand to where everything that you're teaching them is needed and relevant towards the end goal of a project. So it was helpful for me.

Second, novice teachers need opportunities and encouragement to implement PBI in their classrooms. In response to the expectation that they would implement a PBI unit, some of the ATs in our first two cohorts did so very successfully, even though they were cool to the idea initially. Our results show that an emphasis on implementing PBI as part of apprentice teaching can pay off in terms of more reluctant certification candidates actually implementing PBI, even authentically. None of the ATs who actually implemented PBI in our study described the experience as negative. Our results corroborate those of Clayton (2007), who found that for some 1st- and 2nd-year teachers, changes in thinking about reform curriculum must be preceded by actually implementing the curriculum: Learning comes with doing. Clayton also noted that the positive pressure to enact curriculum projects provided by cohort-based professional development "was the support that was necessary to inspire risk taking in an accountability environment" (p. 227).

These findings run counter to stage models of teacher development that assert that "the beginning science teacher is just concerned with surviving and can only consider the most pressing issues related to teaching" (Luft et al. 2007, p. 25). Lowering expectations about what kinds of curriculum novice teachers can implement may deny them important opportunities for growth; such opportunities are most likely to come to fruition when accompanied by the support of a professional development program. As one AT said,

If you've tried a project, good or bad, you tried a project, you know how it went, you know what you'd probably change or what you'd keep or what you would adapt for other projects, it gives you a little bit of insight for the future, a little bit of hope.

Finally, an unsupportive school environment can serve as a major impediment to novice teachers' intentions and desires to implement PBI, despite encouragement and
support from professional development programs. This is particularly true for teachers in training who are still operating under the auspices of a classroom mentor teacher, but also for other novices (Clayton 2007) and, in fact, all teachers. In the words of one AT we interviewed,

I think that [to require ATs to implement PBI] would have been great—in Candy Land. [laughs] I think there's got to be a lot more work between [the certification program] or whoever is doing this and the school district and the cooperating teachers. … When some of those problems are fixed, then you're going to have an environment where the teachers can implement what they do. But I could not imagine being a student teacher coming in without those resources and trying to scramble to implement something like that.

Therefore, care should be taken in situating apprentice teachers in the most supportive environments possible. In the end, those supporting the professional development of teachers must have high expectations for success, but also must provide the necessary scaffolding.

Limitations and Further Work

[…] necessarily generalize to other programs. Similar research on how preservice teachers in other programs view PBI and how their conceptions relate to implementation in apprentice (student) teaching is also needed. As noted above, this work represents only a first step in responding to the Windschitl (2005) call for longitudinal evidence on the effectiveness of particular elements of teacher preparation programs. A study of program graduates' implementation of PBI as they move into their own classrooms is also needed to resolve the question of how these future teachers would act absent the supervision of a mentor teacher, but also further in time from the influence of program coursework. Such a study of program graduates has been implemented at a minimal level, and plans are underway for a more thorough investigation.

Acknowledgments This work was funded, in part, through three National Science Foundation grants: Collaboratives for Excellence in Teacher Preparation (CETP) grant NSF-DUE-9953187, NSF-EEC-9876363 (VaNTH-ERC) and NSF-DUE-0831811 (UTeach Engineering). We would like to thank the reviewers and editors for their constructive input. We would also like to thank Rachel Barrera, Nourah Caskey, Theodore Chao, Thanapun Charlee, Adem Ekmekci, Jessica Gordon, Amy Moreland, Jenny Mount, Jiyoon Park, Mina Rios, Kate Walker and Candace Walkington for their assistance in interviewing and coding. The ideas expressed in this work are those of the authors and do not necessarily reflect the ideas of the granting agencies.
Appendix A

1. What do you consider to be the key elements of Project-Based Instruction? In other words,
how would you recognize PBI in a secondary math or science classroom?
2. Rank how much you agree with each of the following statements.
a. In theory, PBI represents best practices in secondary math and science instruction; all
instruction should be done in this format.
Strongly agree Agree Disagree Strongly disagree
b. In practice, PBI represents best practices in secondary math and science instruction; all
instruction should be done in this format.
Strongly agree Agree Disagree Strongly disagree
e. PBI is useful as a motivator to get students to learn material. PBI should serve as a
reward in secondary math and science classrooms but is not a way to convey content to
students.
Strongly agree Agree Disagree Strongly disagree
f. PBI is a distraction in secondary math and science classrooms. This format of instruction
does not contribute to learning.
Strongly agree Agree Disagree Strongly disagree
3. Briefly describe how you plan to implement PBI this semester (if at all). Please include the
source for any curriculum materials you will be using.
4. Which of the following best characterizes your feelings about how the [program] courses have
prepared you to implement PBI?
a. I feel completely prepared for a full implementation of PBI.
b. I feel fairly well prepared to implement PBI at some level.
c. I don't feel that I am any better prepared to implement PBI than I was before I entered
[program].
d. I feel less confident and enthusiastic about PBI after my [program] experiences to date
than I did before.
Please explain your answer and list what experiences in [program] courses were particularly
helpful in preparing you to or discouraging you from implementing PBI.
5. What do you see as possible barriers to implementing your plans for PBI?
Appendix B
See Table 5.
Table 5 PBI implementation rubric (levels 1–3 for each element)

Driving question. Level 1: question is supplied by instructor; predetermined answer. Level 2: students have some say in selecting or narrowing the question; requires multiple data sources; answer not completely constrained. Level 3: question is meaningful to students; a real-world problem with multiple, interdisciplinary data sources; answer not previously known.

Tangible product. Level 1: tangible product of some kind. Level 2: students use new concepts and apply information to create a tangible product. Level 3: students use new concepts and apply information to create a tangible product; includes multiple ways of representing information.

Investigation. Level 1: hands-on, minds-on activity. Level 2: student-driven, complex task. Level 3: authentic, student-driven, complex task; students learn concepts, apply information, and represent knowledge in multiple ways.

Cognitive tools. Level 1: access to learning tools of some kind. Level 2: access to multiple tools. Level 3: cognitively oriented collaboration and visualization tools.

Collaborative activity. Level 1: students report results to others. Level 2: task includes collaboration between students to generate the product. Level 3: task requires collaboration between students and others to generate the product.

Assessment. Level 1: some form of formative assessment. Level 2: authentic formative and summative assessments. Level 3: authentic assessment requiring multiple forms of knowledge representation.

Scaffolding. Level 1: some form of scaffolding. Level 2: instructor provides scheduling milestones; inquiry is seeded with powerful ideas. Level 3: scheduling milestones, benchmark lessons, and social structures to facilitate collaborative learning.
Appendix C
Interview protocol.
Were you able to use this unit that you designed in the PBI class at all?

If YES, and they implemented the project they designed during apprenticeship:
• What was the driving question of your PBI unit?
• What deep principles were you trying to cover?
• How did your PBI unit work in terms of: classroom management? student learning in math/science? student motivation or engagement?
• In what way do you consider this to be an authentic implementation of project-based instruction?
• What are some barriers or difficulties you encountered while implementing this unit?

If YES, but they implemented PBI, though not with the project they designed:
• What was the source of your unit?
• What was the driving question of your PBI unit?
• What deep principles were you trying to cover?
• How did your PBI unit work in terms of: classroom management? student learning in math/science? student motivation or engagement?
• In what way do you consider this to be an authentic implementation of project-based instruction?
• In what terms was this unit successful? (in terms of student learning… if not, in any terms?)

If NO, they did not implement PBI at all:
• Why not?
• What experiences from class influenced you not to implement PBI?
• What would have had to happen in order for you to implement PBI?
• What do you consider to be the ideal environment for PBI implementation?
• What elements of PBI, if any, did you incorporate during your apprentice teaching?