Promoting Data Final For Release 0
This report is funded through a generous grant from the Bill and Melinda Gates Foundation.
© 2013 New America Foundation. This report carries a Creative Commons license, which permits noncommercial re-use of New America content when proper attribution is provided. This means you are free to copy, display, and distribute New America's work, or include our content in derivative works, under the following conditions: Attribution. You must clearly attribute the work to the New America Foundation, and provide a link back to www.Newamerica.net. Noncommercial. You may not use this work for commercial purposes without explicit prior permission from New America. Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one. For the full legal code of this Creative Commons license, please visit www.creativecommons.org. If you have any questions about citing or reusing New America content, please contact us.
Contents
Introduction
The Federal Role in Educational Data
Teachers' Data Usage in Classrooms: The Next Step
What Assessments Lead to Good Data?
Federal Efforts to Improve the Use of Data in Classrooms: An Overview
Figure 1: Statewide Longitudinal Data Systems: Budget History
Figure 2: Race to the Top Awards: 2010-Present
Conclusion
Notes
Introduction
Throughout the past decade, states and school districts across the United States have designed new ways to expand and inform teachers' use of data in K-12 classrooms. The shift is, in part, a function of the growing availability of student data. As a result of federal requirements and state initiatives, states now collect more data on students, teachers, and academic environments than ever before. Every state maintains a student-level longitudinal data system, and many of them provide important data points to school district officials, school leaders, and teachers to help inform instruction. In short, education stakeholders in school districts across the country have a variety of facts, figures, and statistics at their disposal.

The data requirements states face today are largely a product of the No Child Left Behind (NCLB) Act of 2001. In requiring states to publicly report both aggregated and disaggregated data on student demographics and achievement annually, NCLB forced them to develop more sophisticated methods of tracking such information. Subsequent federal policies furthered that trend. Taking their cues from the Data Quality Campaign, a non-profit organization that launched in 2005 to help build a data-driven educational world, lawmakers passed the America COMPETES Act in 2007. The law spells out a dozen critical elements that states must build into their data systems within a certain period of time. The America COMPETES Act also created a new competitive grant program, Statewide Longitudinal Data Systems (SLDS) Grants, to help states establish or expand student data systems.1

States have made substantial headway in building new data systems and collecting and culling these new data points. The Data Quality Campaign, which tracks states' yearly progress on the data systems, found in its 2012 annual
report that every state had at least the foundation for a data system. However, that report also noted a sobering reality: most states do very little to train teachers and administrators in how to use these data to inform and improve classroom instruction.2

A few states, however, are bucking that trend and working to arm teachers with the tools they need to use information effectively in their classrooms. Through data-oriented professional development programs, they are giving teachers new skills that allow them to better evaluate students' learning deficits and differentiate instruction based on students' needs. A teacher equipped to analyze and respond to student data might, for example, examine his third-graders' math scores and discover that while most students are grasping the new concepts, others are struggling with basic skills that prevent them from mastering more advanced math concepts. He could then provide those students intensive interventions, allowing them to catch up with the rest of the class.

This paper explores two states pursuing such professional development programs, Oregon and Delaware, whose efforts are models for both the successes and challenges other states are likely to face when implementing such programs. Their efforts share many common themes, which are instructive for policymakers who hope to replicate their accomplishments in other states. As federally funded efforts, both the Oregon and Delaware programs also demonstrate that the federal government can play a significant role in providing opportunities for innovation across the nation. This paper also examines how federal grant programs can be better targeted to encourage states and districts to adopt new efforts to promote classroom data use and to implement them fully.
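The triage described above, spotting students whose weak prerequisite skills block new material, can be sketched in a few lines. The score data and the mastery cutoff below are invented placeholders, not figures from any state system.

```python
# Hypothetical sketch: flag students whose prerequisite-skill scores
# suggest they need intervention before tackling new material.
# All scores and the cutoff of 70 are illustrative assumptions.

scores = {
    "Ava":    {"basic_skills": 88, "new_concepts": 91},
    "Ben":    {"basic_skills": 54, "new_concepts": 48},
    "Carmen": {"basic_skills": 92, "new_concepts": 85},
    "Dev":    {"basic_skills": 61, "new_concepts": 52},
}

CUTOFF = 70  # assumed mastery threshold

# Students below the cutoff on basic skills get intensive interventions.
needs_intervention = [
    name for name, s in scores.items()
    if s["basic_skills"] < CUTOFF
]

print(needs_intervention)  # → ['Ben', 'Dev']
```

The same pass over real assessment data would use whatever fields a district's system actually exposes; the point is only that the analysis itself is simple once teachers know what to look for.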
The Federal Role in Educational Data

The No Child Left Behind Act launched a decade of development in state educational data systems. With its passage in 2001, states and school districts faced a new statutory requirement to produce publicly available report cards that included student achievement and graduation rate data, both in the aggregate and by racial and socioeconomic subgroup.3 At the time, many states were ill-equipped to track these data. Although states were not required under the law to create new data systems, the logistical task of reporting so much information de facto necessitated the creation of longitudinal data systems.4
Initially, states had little support as they hurried to build out their data systems and meet the new reporting demands Congress placed on them. In 2005, however, Congress established the Statewide Longitudinal Data Systems (SLDS) grant program to provide competitive, three- to five-year grants to help states establish or expand their longitudinal databases. Since then, 47 states and the District of Columbia have received at least one grant. Lawmakers passed the America COMPETES Act in 2007 to further shore up states' efforts.5 The law identified 12 principles of aligned data systems. Those 12 elements contained both basic data points to provide teachers with valuable information about student learning and several more advanced components. These include unique student identifiers to link data across grades; data validity assessment tools; student test scores and transcript information; and alignment between data from K-12 and postsecondary institutions.6 Essentially, the America COMPETES Act established a framework for the distribution of federal dollars to create and grow statewide data systems and focus states' efforts on the critical components of a usable data system.

In 2009, Congress began requiring SLDS award recipients to incorporate the 12 elements outlined in the America COMPETES Act.7 Lawmakers injected an additional $250 million into the SLDS grant program through the American Recovery and Reinvestment Act (ARRA). Funding for the SLDS program, for which regular appropriations have fluctuated between a high of $65 million in fiscal year 2009 and a low of $38 million in 2012, has been contingent on recipients implementing the 12 elements ever since.8 The strong push for those key elements has paid off, according to the U.S. Department of Education. The most recent report for the SLDS grant program, which examined all fifty states and the District of Columbia in 2010, shows that over a dozen states now include all 12 of the America COMPETES elements, and only four have fewer than half of the elements.9
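The "unique student identifier" element is the piece that makes longitudinal analysis possible: a stable ID lets records collected in different grades be joined back to the same student. A minimal sketch, with entirely invented records and field names:

```python
# Two years of (hypothetical) records keyed by a stable student ID.
# The IDs, names, and scores are invented for illustration only.

grade3 = {"S001": {"name": "Ava", "math": 72}, "S002": {"name": "Ben", "math": 55}}
grade4 = {"S001": {"math": 80}, "S002": {"math": 58}}

# Join the two years on the shared identifier to compute growth per student.
growth = {
    sid: grade4[sid]["math"] - rec["math"]
    for sid, rec in grade3.items()
    if sid in grade4
}

print(growth)  # → {'S001': 8, 'S002': 3}
```

Without the shared key, the same join would have to match on names or birthdates, which is exactly the kind of fragile linkage the America COMPETES elements were meant to eliminate.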
skills as a condition of certification and as a requirement for approval of teacher preparation programs.12 Federal efforts to guide state data systems have also been lacking. The SLDS program requires that stakeholders have access to the data, but only suggests that recipients of SLDS funds should offer professional development to assist data users in understanding and using the data.13 In the words of one school district official, "We see more data drowning than data scarcity, and too much data doesn't do much good either."14 Teachers must both understand the demographic and achievement data available about their students and be able to analyze and apply those data to instructional practice. Other education agencies, such as teacher preparation programs, also fall short of this imperative. A National Council on Teacher Quality study noted that most teacher training programs do not include training in data literacy. Only 2 percent of 180 teacher preparation programs surveyed sufficiently explored the analytical skills needed to understand student data from assessments, and none of the programs gave teacher candidates the training necessary to apply data in their classrooms.15
Another obstacle to using data to inform instruction is the actual information available in state longitudinal data systems. Many of the data points currently available to teachers are not the rich, informative metrics necessary to help educators design targeted student educational plans. Instead, most of the data available to teachers come from end-of-year summative exams. By the time accountability assessments have been administered and teachers receive the scores, the students have usually moved on to the next grade. The most powerful data for teachers' instructional improvements actually come from formative exams, those administered throughout the instructional process and used to mold future instructional steps.16 These exams allow teachers to correct their instruction mid-process and offer opportunities to diagnose and address student needs before the school year ends. Because NCLB penalizes schools and districts based on their summative exam scores, it is hardly surprising that most states and schools use those data as the primary measure of student performance.18 However, this singular focus on summative scores both restricts teachers' ability and willingness to use data in the classroom and hinders students' potential for academic success.
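The contrast drawn above can be made concrete with a small sketch: a summative score arrives once, after instruction ends, while formative checkpoints accumulate during the year and can flag a downward trend while there is still time to intervene. The checkpoint scores below are invented.

```python
# Illustrative only: one student's scores across four formative checkpoints
# during the year. A summative exam would yield a single score after the fact.
formative_checks = [72, 68, 63, 59]

def trend(points):
    """Average change between consecutive checkpoints."""
    deltas = [b - a for a, b in zip(points, points[1:])]
    return sum(deltas) / len(deltas)

slope = trend(formative_checks)
if slope < 0:
    # A declining trend mid-year is actionable; a low summative score is not.
    print(f"declining by {abs(slope):.1f} points per check; intervene now")
```

Nothing about the arithmetic is sophisticated; the value lies entirely in when the data arrive.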
Funding for the SLDS program has fluctuated in recent years. In fiscal year 2012, the Department of Education had more than $38 million to distribute to states. Through the 2009 American Recovery and Reinvestment Act, Congress funneled an additional $250 million into the program.20 Since the original grant competition in 2005, the Department of Education has spent more than $500 million creating education data infrastructures in nearly every state. Those federal dollars have laid significant groundwork for state data systems. Still, an analysis from the Data Quality Campaign said that the hardest work remains: making those data available and useful for teachers and administrators.21

The Obama Administration launched the Race to the Top competitive grant program in its first term through ARRA. Congress originally endowed the Race to the Top (RttT) fund with $4.35 billion: $4 billion for grants to states to make systemic reforms, and $350 million to fund cooperative, cross-state development of high-quality assessments.22 In its first round, only Delaware and Tennessee won grants to implement comprehensive school reform plans. Race to the Top applicants were required to write plans that included efforts toward improving the quality and use of education data, one of the four reform pillars spelled out in the legislation. Moreover, the application required states to document the sophistication of their statewide data systems, including the number of America COMPETES Act data elements they had.23 Though RttT was designed as a one-time investment in school reform, Congress has continued to provide funding for it each year since. After the first-round winners were announced in 2010, the Department of Education conducted a second round of grants using the remaining ARRA funds in late 2010. In 2011 the Department distributed a third round of grants using funds provided for the program through fiscal year 2011 regular appropriations.24 In total, 21 states and the District of Columbia had received funds for the Race to the Top K-12 program by the end of the third round of grants, and 46 states and the District of Columbia had applied for at least one round of the competition.25 Since then, the Department of Education has expanded the program's mission to include both an Early Learning Competition and a District competition, also with a focus on data.

Finally, the Improving Teacher Quality (ITQ) State Grants program provides funding to every state and the District of Columbia for teacher quality activities under Title II, Part A of the No Child Left Behind Act. Unlike the Statewide Longitudinal Data Systems and Race to the Top grant programs, both of which are awarded on a competitive basis, ITQ grants are formula-funded. That means every state, and about 95 percent of all local educational agencies, receives some funds through the program.26 Funds are designed for recruiting high-quality teachers, shrinking student-to-teacher ratios in the classroom, providing professional development, and improving teacher and principal preparation programs.27 Research from the U.S. Department of Education suggests that over the past decade, states have increasingly devoted a larger share of Title II funds to professional development, up to 44 percent of funds in 2012 from 27 percent in 2003.28 It is unclear how much of the Title II-funded professional development work is focused on enhancing teachers' understanding of data use in the classroom. However, the program's statutory language explicitly allows recipients to use funds this way.29 While there are many federal programs available to assist in the development and use of student data collection systems, states rarely use them expressly to provide the access and training necessary to ensure that teachers use these data to improve instruction.

Figure 1: Statewide Longitudinal Data Systems: Budget History

Fiscal Year    Funding ($ millions)
2007           49.2
2008           48.3
2009           335.0*
2010           58.3
2011           42.2
2012           38.1

Sources: New America Foundation, U.S. Department of Education. *Includes American Recovery and Reinvestment Act funding.

Figure 2: Race to the Top Awards: 2010-Present. Total Amount Awarded ($ millions): 600; 3,325; 200; 445; 133; 373.
Oregon and Delaware are both working hard to ensure that teachers have the skills necessary to put education data to work in their classrooms. Both operate comprehensive training programs that provide valuable models for this type of work. The Oregon DATA Project is a statewide program that relies on Statewide Longitudinal Data Systems grants, as well as additional grants from the U.S. Department of Education's Institute of Education Sciences (IES), to provide training for teachers to support best practices in the use of data. The Delaware Race to the Top Data Coach Program is funded with federal RttT dollars. Both projects provide insight into the federal role in a new field of professional development: helping teachers improve their instruction with the use of student data. Though both states have struggled to create effective, valuable programs, their experiences provide lessons to states that seek to follow in their footsteps.
The Oregon DATA Project (ODP) provides teachers with voluntary, job-embedded professional development opportunities to learn to use data to improve instruction. Teachers engage with their colleagues in finding innovative ways to use data to assess student skills and craft responsive lesson plans. Teachers are grouped in teams, known as Professional Learning Communities (PLCs), which are run by trained staff who guide teachers through the ODP-created curriculum.
links K-12 data with the state's community colleges and public universities. The Longitudinal Growth Model provides data on student progress to schools and school districts.31 However, even with all of those data available, Oregon teachers lacked the expertise to use the information effectively, and in some cases didn't even have access to the statistics they needed. As Garrison stated, many of Oregon's school districts assumed an "if you build it, they will come" approach, but quickly found that no one did come.32 As a former educator, Garrison had discovered the power of data. In her school, she created data teams to examine and analyze student information and build new reform strategies based on their discoveries. Throughout that process, her school made substantial strides and narrowed the achievement gap between low- and higher-income students.33 In 2007, the Oregon Department of Education tapped Garrison to write a grant proposal for the U.S. Department of Education's Statewide Longitudinal Data Systems competition. That proposal ultimately launched the Oregon DATA Project with a $4.7 million grant to advance Oregon's work. This work included developing a statewide professional development program to encourage the analysis and use of data from the Oregon Assessment of Knowledge and Skills, interim district-level measures of student achievement, and formative assessments. It also provided teachers and administrators access to track student progress and create reports. ODP launched in the 2007-08 school year.
tances between school districts, such a statewide effort was unheard of in Oregon at the time.
Sometimes, only a small group of teachers at a school would agree to participate, but the program expanded as word of mouth spread and interest snowballed.
As a condition of joining the ODP, districts were required first to identify time during the regular workday for teachers to meet in their PLCs. This guaranteed that teachers would have the time necessary to incorporate data analysis into their workdays. Some administrators took the challenge very seriously; in La Grande School District, for instance, Superintendent Larry Glaze appealed to the school board and won late start times every Monday so that staff could meet in small groups with their data coaches before classes began.36 Districts worked with ODP staff to develop implementation plans at both the
school and PLC levels, with PLC plans tailored to teachers' skill levels and needs. Garrison worked with school districts to identify trainers for the Professional Learning Communities from among teachers, administrators, and other stakeholders. Each district took its own approach to selecting trainers. In several districts, every school administrator went through the process to become a certified ODP trainer. That training, Garrison said, gave those administrators the foundation necessary to support their teachers in their use of data. The ODP team designed the trainer curriculum so it could be exported to remote or online training sessions. As of 2013, the state has certified 600 trainers. According to ODP staff, however, there were not enough volunteers with sufficient data experience around the state to conduct the trainer certification courses. As a result, ODP staff led the majority of the trainings.37 More than 10,000 people across the state have received some form of training to lead PLCs, and the ODP training sessions are always full.
According to Garrison, the trainer certification process comprised one of the most important parts of the ODP. She found that trainers must already be comfortable with student data before coming into the certification classes so that they could dive into deeper issues with data; as a result, the instructors are better able to provide consistent, reliable assistance for teachers.38 The trainer certification has transformed since the initial sessions. It began as a formal presentation in a classroom-style meeting, similar to other professional development programs, and developed into a workshop that combines traditional classroom instruction with small-group exercises that lead a team through a data analysis. ODP staff members first present the essential components of the Oregon DATA Project to potential trainers. Later, they combine that instruction with hands-on exercises in smaller groups that allow workshop participants to better understand and master advanced subjects. As the final task of the training session, participants are given a dataset and asked to analyze it and produce a plan for improving classroom instruction based on their findings. ODP executive team members observe and score their performance. Not all participants in the training sessions earn a state certification. ODP staff counsel potential trainers who do not initially meet the ODP standards until their data analysis skills improve. Today, the Oregon DATA Project has expanded into more school districts, and ODP staff no longer conduct most trainings themselves. Though staff members conduct a handful of statewide trainings each year, the sessions are now run regionally by trained district staff. Certified teachers and administrators from districts around the state provide hours of training and follow-up coaching and support each year to new participants, and Garrison conducts regular check-ins via webinar to assist districts as they bump up against new hurdles.39
In schools that fully adopted the Oregon DATA Project, data use has become a way of life.
Additionally, Garrison found that it was vital for participants to feel comfortable using data, even if it exposed their weaknesses. Otherwise, they would be reluctant to dive deeply into potential problems in their classrooms. ODP also found that teachers needed to feel secure enough with their data teams to take professional cues from other teachers who demonstrated better results and apply those ideas in their own classrooms.40 Initially, PLC work focused on examining school district- and school-level summative data (the student subgroup testing data available for all schools as a result of NCLB reporting mandates) and understanding how to use those metrics in developing curricula and evaluating academic programs.41 Because summative data can only provide after-the-fact analysis, they are far less useful to classroom teachers than the interim or formative assessments that occur mid-instruction. The Oregon DATA Project is now piloting training around formative assessments.
Next Level Evaluation, Inc., an independent contractor, conducted an evaluation of the Oregon DATA Project for the 2009-10 and 2010-11 school years, which was completed in spring 2011.43 The evaluation found that the data project has resulted in statistically significant improvements in student test scores at participating schools (Figure 3). While students from ODP schools performed below their peers at non-ODP schools in reading in the 2008 school year, they outperformed them just four years later. In mathematics, students at ODP schools closed the gap in scores compared to students at non-ODP schools over the same four-year period. The independent evaluation also found that the ODP has had significant effects in promoting data- and evidence-based instructional decision-making. A prior analysis of schools with data systems found that fewer than half of teachers in those schools felt they had the skills necessary to use data for advanced decision-making purposes. The Oregon DATA Project analysis found approximately a 10 percent increase in the proportion of teachers who believed they had those skills in the one-year period from 2010 to 2011 alone.44 As a condition of ODP participation, districts are frequently evaluated in on-site visits from ODP staff. Garrison meets with districts as often as they need, face-to-face or via online virtual meetings, to check in, review progress, and identify strengths and weaknesses.46 She does the same with regional staff every six weeks. This allows staff to address regional needs and to allocate resources where they are most needed. Through these check-ins, Garrison has identified a number of positive outcomes from ODP participation. She believes that the program has unified school districts across a geographically expansive state where there is typically limited cross-regional interaction.47 It has created networks of teachers who communicate across the state where previously none existed. The new training mechanisms have advanced professional development for Oregon's teachers and administrators in valuable and meaningful ways, reinvigorating schools and instilling a culture of data use.48
The Oregon DATA Project's successes were not achieved without challenges. The ODP team found that success was contingent on addressing teachers' concerns about using data to assess instruction. ODP had to respond to participants' needs and questions before teachers were willing to learn to use data and apply that knowledge in the classroom. This meant ensuring that teachers had adequate time during the regular school day to examine data and plan related instructional changes. Job-embedded training has proven critical in eliciting teacher buy-in and resolving educators' concerns. However, some challenges are more fundamental. The Oregon DATA Project primarily uses summative data gleaned from statewide accountability tests, as well as
interim and formative data where available. Though teachers recognize the importance of formative data, the state has neither the time nor the capacity to develop standardized formative assessments. The ODP team has provided some training on such assessment data, but not enough to fill this need.
competition and received another three-year, $10.5 million grant from IES to continue its work through June 2013; approximately $1.0 million of that goes to the ODP.50 After that, the future of the program's funding is uncertain. As Garrison said, Oregon is among the "walking wounded" of the economic recession. Without federal assistance, the state will likely be unable to continue a project of this magnitude. ODP staff members are exploring grant opportunities, state dollars, and foundation contributions to continue the program.
Figure 3 sources: New America Foundation, Oregon DATA Project. Note: Oregon raised the cut scores for the 2010-11 math exam, explaining the decline in scores for that year.45
The Delaware Data Coach Program was one component of the state's winning RttT grant application in 2010. It places trained professionals in schools to teach educators and school leaders to analyze student data to improve their instructional efforts and outcomes. Through this professional development, which is required of core subject teachers in grades 2 through 12, educators have a forum to collaborate and practice analyzing student data. Teachers meet in Professional Learning Communities, like those implemented in the Oregon DATA Project.
every K-12 student a unique identifier so that each can be tracked longitudinally; the Statewide Pupil Accounting System (eSchoolPLUS) tracks data for all public school students; and the Delaware Educator Data System (DEEDS) captures teacher data, from participation in educator preparation programs to certification and professional development.52
In spite of all the systems in place, though, state education official Donna Mitchell had to acknowledge that the state was "data rich and information poor."53 To improve the accessibility and usability of these data, the state's Race to the Top application included creating the Delaware Comprehensive Assessment System, a computer-based system to provide more valuable student assessment scores to teachers. This setup would support teachers in using those and other data in the classroom to improve instruction.54 With fewer than 130,000 students in public education across the state, Delaware is one of the smallest states in the country.55 That gives the project's leader in the state Department of Education, as well as data coaches across the state, greater ability to meet regularly with education stakeholders than in other states. As a result, the Delaware Data Coach Program was better able to respond to teachers' needs and to correct implementation mid-course.56

Timeline: Delaware Data Coach Program
- Race to the Top grant competition announced.
- Delaware begins application process.
- Delaware submits RttT application.
- Department of Education announces Delaware and Tennessee are RttT Round One winners.
- Delaware launches data coach pilot program in 20 schools around the state.
- Delaware expands Data Coach Program to all schools in the state, with data coaches participating in Professional Learning Community meetings.
- State releases survey results from 2011-12 participants (August 2012).
mentation. In total, the projects leadership comprises five members, including a Wireless Generation official and a lead data coach who was hired specifically for the program.59 Prospective coaches completed rigorous interviews with the Delaware Department of Education and Wireless Generation. Each candidate was given the same data set and 48 hours to prepare an analysis and recommendations to the projects leadership in a simulated school presentation. Applicants also had to lead a mock Professional Learning Community meeting for teachers and school administrators to demonstrate their coaching skills.60 Each of the coaches has an average of 14 years of education experience, including several years of classroom teaching.61 Once they were hired, the coaches went through a ten-day training program known as the Coaches Institute. They were introduced to the components of the Delaware project and familiarized with the terminology and culture of the project.62 After the Coaches Institute, they were sent out to their assigned school districts. The coaches participate in continued professional development each month, and meet with other coaches at least twice a month.63 A monthly session brings together all 29 state data coaches, as well as the Delaware Department of Education and Wireless Generation staff assigned to the project. During these meetings, they reflect on their efforts and participate in continuous professional development.64 Data coaches lead PLCs to provide teachers the skills necessary to harness academic data to improve instruction. Coaches also work individually or in small groups with teachers who require additional instruction and observe teachers in the classroom to provide instructional input.
Much like Oregon, Delaware sought to address this skills vacuum through a professional development programthe Data Coach Program.
Delaware also organized teacher participants in Professional Learning Communities. Rather than training school-based personnel to lead data-use professional development, the Delaware Department of Education officials contracted with an outside organization to provide coaches to do this work.
Data Coach Recruitment and Project Design: A Partnership with Wireless Generation
Rigorous selection of data coaches is one of the Delaware programs key components. Beginning in January 2011, the Delaware Department of Education contracted with New York-based Wireless Generation to provide data coaching services to schools. Wireless Generation administers the Data Coach Program, but state officials oversee the imple-
facilitators alongside the data coaches. Additionally, the state's teacher evaluation system aligns with the implementation of the Data Coach Program, requiring teachers to participate in the PLCs and maintain student data on formative assessments, attendance, and other classroom factors.69, 70 The teacher evaluation system takes into account whether teachers use data to tailor their instruction to students based on the outcomes of those assessments.71 Teachers who absorb the data skills taught during the PLCs and apply them in their classrooms receive higher scores on their evaluations. During PLCs every other week, the coaches model and then facilitate teachers' efforts to analyze data on student progress and apply the results in the classroom; in the intervening weeks, the teachers lead their own conversations.72 The PLCs give teachers the opportunity to learn through application, as they analyze their own students' data. Each district has leeway to focus on the issues most central to its needs. While some districts focus on math for virtually all of their coaching sessions, others place more emphasis on reading. The coaches also conduct classroom observations to help teachers apply the lessons learned in the PLCs to their classes.73
Cycles of Inquiry: Analyzing data to identify students who need additional help, developing new teaching strategies, assessing student growth, analyzing the results, and repeating the cycle for the next curriculum topic
Collaborative Data Conversations: Providing low-stakes opportunities for discussions and collaboration with colleagues76
Once teachers have internalized these four components, they will be well equipped to use student data to improve their instruction. Eventually, the PLCs are expected to be self-sustaining, eliminating the need for coaches. According to Donna Mitchell, the project's director, "The challenge was determining how to build capacity at the local level so that when the money is gone, [the data coaches] will have coached themselves out of a job."77
end of the 2012-13 school year.81 By the start of the 2012-13 school year, the Data Coach Program had 29 coaches working in all 41 school districts in the state. Delaware's 200 schools contain roughly 1,500 Professional Learning Communities, which engage approximately 7,500 educators in the use of data in classrooms.82
The pilot showed that most schools were more advanced than expected, already at phase two or phase three at the start of the Data Coach Program.
Each school district selected one of two models for data coaches. Initially, 40 percent of districts adopted the Direct Facilitation model, in which Wireless Generation's data coaches work directly with schools' PLCs through weekly or bi-weekly meetings.83 The remaining 60 percent of districts selected the Coach-the-Coach model, in which a teacher or administrator in each PLC takes on the responsibilities of a data coach and leads the PLC. Since that time, the data coaches have worked to coach themselves out of a job, and very few local educational agencies remain under the Direct Facilitation model. The Coach-the-Coach model was designed for schools that already had PLCs as part of their professional development strategies. At these schools, the teachers or administrators designated to take on data coach responsibilities were originally required to attend the Wireless Generation data coach training program for four hours each month.84 However, the coach-candidates struggled to find the required hours. As a result, the program's leaders adapted the Coach-the-Coach method to better account for the amount of time teachers had available. Teacher coaches rarely complete their four hours of training in one or two sittings; instead, the professional development is spaced out over the course of the month.85
To accurately measure the work of the Professional Learning Communities, data coaches rate each PLC monthly based on how well the participating teachers understand the data analysis material and use the relevant strategies in their classrooms. Coaches rate the PLCs green, yellow, or red and report the ratings to the state. That way, state leaders and coaches know at a glance how the PLCs are doing. No PLC stays in the red zone for long without a rapid intervention from the coaches. Coaches also keep more detailed metrics on their PLCs between the monthly official ratings, from which they can identify and isolate problems and design detailed action plans in response.87 By the 2012-13 school year, most PLCs were rated green, with few yellow or red ratings, a significant improvement over the program's first year of implementation.88 According to project leader Mitchell, the schools that demonstrated some improvement in student assessments on mid-year tests tended to be those with the cleanest implementation of the Professional Learning Communities. They typically had few logistical or scheduling disputes and enjoyed the support of teachers and administrators within the school.89
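The monthly rating rollup described above is simple enough to sketch in code. The snippet below is a purely hypothetical illustration of such a workflow; the function and PLC names are invented for this example and do not come from the Delaware program or Wireless Generation's tools.

```python
# Hypothetical sketch of a monthly PLC rating rollup like the one described
# in the text. All names and data here are illustrative, not from the program.
from collections import Counter

RATINGS = ("green", "yellow", "red")

def summarize(plc_ratings):
    """Tally the month's ratings and flag PLCs needing rapid intervention."""
    counts = Counter(plc_ratings.values())
    flagged = sorted(plc for plc, rating in plc_ratings.items()
                     if rating == "red")
    summary = {rating: counts.get(rating, 0) for rating in RATINGS}
    return summary, flagged

summary, needs_help = summarize({
    "PLC-01": "green",
    "PLC-02": "red",
    "PLC-03": "yellow",
    "PLC-04": "green",
})
# summary is {"green": 2, "yellow": 1, "red": 1}; needs_help is ["PLC-02"]
```

The point of the at-a-glance scheme is exactly this kind of aggregation: the state sees only the color tallies, while coaches use the flagged list to target their interventions.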
Additionally, the survey showed that the most effective PLCs were those that school administrators regularly attended. Administrators were most likely to attend the PLCs in elementary schools: nearly 60 percent of elementary school teachers said administrators frequently or almost always attended the PLCs, whereas only 36 percent of high school teachers said the same.92
Eighty-seven percent of teachers felt that looking at student data gave them valuable information for providing differentiated instruction to their students.
The Oregon DATA Project and the Delaware Data Coach Program were born of different approaches, but shared a common philosophy: student data are of little use unless teachers can harness that information in the classroom to tailor instruction to student needs. Other states seeking to implement similar efforts have much to learn from Oregon and Delaware.
Both projects are predicated on guaranteed planning time for participating teachers. Each participating school is required to give teachers a set amount of time to regularly take part in the instructional and collaborative portions of the project. Although in some cases this led teachers to resent the lost classroom planning time, it contributed strongly to the success of the projects by enhancing cooperative efforts and building team-based use of the data.
Several federal programs already exist that states and school districts can leverage to improve the use of data in the classroom through job-embedded professional development. Without clear guidance and financial and technical support, however, states are unlikely to use these programs for that purpose. Policymakers can reshape these existing federal programs in key ways to encourage states and school districts to take on these new instructional efforts.
est-performing schools; and the expansion of data systems and their use. As the Delaware project shows, the RttT program already gives states a significant opportunity to implement data projects. However, the data portion of RttT is only a small slice of the overall grant program; the data systems component accounts for only 9 percent of the points in the RttT application rubric, and using data to improve instruction makes up less than 40 percent of that section. States that receive grants must implement a multipronged, comprehensive reform effort, in which the data elements could be lost in the shuffle. RttT is not a well-targeted, exclusively data-oriented program, and major data reforms could likely be better accomplished through a program with a narrower focus.
A redesigned SLDS program that focuses on the use, rather than simply the existence, of data could become a subgrant, like the one Oregon received for its Oregon DATA Project.
Additionally, Race to the Top began as a one-time funding infusion through the American Recovery and Reinvestment Act of 2009. Though Congress has appropriated funding for the program every year since, including nearly $550 million in fiscal year 2013, pre-sequestration, continued financial support is far from certain. President Obama's fiscal year 2014 budget request included funding only for a higher education RttT competition, and it remains to be seen whether Congress will appropriate any funding for the program in 2014. Particularly in a challenging fiscal environment, it would be ill-advised for states to rely on the continued existence of such a recently developed program.
for expanding data-driven instruction. Currently, most of the funds are used for class-size reduction or isolated professional development activities. The program issues funds through a formula, rather than competitively, which means every state and most school districts receive a portion of the funds. The program is an ongoing component of the U.S. Department of Education's annual budget, and the policymaking community operates with the tacit understanding that the program will receive at least approximately the same level of funding it did in the prior year. To better leverage the nearly $2.5 billion program to promote data-focused professional development, the Department of Education could extend or change the grant program's professional development goals. Similarly, federal dollars could be buttressed with state and local dollars to ensure state-level oversight and local-level implementation. Though the Improving Teacher Quality State Grants program already allows schools and districts to use the funds for professional development projects like those in Oregon and Delaware, Congress should, when it reauthorizes NCLB, explicitly promote those activities within the program. Incentives should be built into the existing program, possibly as a set-aside of federal funds, to focus specifically on data-oriented and job-embedded professional development activities.
Conclusion
Federal policy has made great strides in the past decade with respect to the collection and availability of student data. A number of policies, taken together, now ensure that states collect and report data on student achievement on state standardized tests, and virtually every state has a longitudinal data system in place. Yet those policies hold greater promise than simply making more information available on student performance. If policymakers
focus on those student data at the classroom level and arm teachers with the skills necessary to use the data to inform instruction, they can open new doors for teachers and students alike. Many more states could join the ranks of Oregon and Delaware if policymakers ensure that states have sufficient funding, and sufficient incentives, to develop their own models of professional development in data literacy.
Notes
1. America COMPETES Act, P.L. 110-69, 110th Cong., U.S. Government Printing Office: http://www.gpo.gov/fdsys/pkg/PLAW-110publ69/pdf/PLAW-110publ69.pdf.
2. "Data for Action 2012: Focus on People to Change Data Culture," Data Quality Campaign (2012): http://dataqualitycampaign.org/files/DFA2012%20Annual%20Report.pdf.
3. No Child Left Behind Act of 2001, P.L. 107-110, 107th Cong., U.S. Department of Education: http://www2.ed.gov/policy/elsec/leg/esea02/107-110.pdf.
4. Nancy Smith, "Building Longitudinal Data Systems in Kansas and Virginia," in A Byte at the Apple: Rethinking Education Data for the Post-NCLB Era, ed. Marci Kanstoroom and Eric C. Osberg (Washington, DC: Thomas B. Fordham Institute, 2008), 116-142: http://www.edexcellencemedia.net/publications/2008/200811_abyteattheapple/20081117_ByteAtTheApple.pdf.
5. America COMPETES Act, P.L. 110-69, 110th Cong., U.S. Government Printing Office: http://www.gpo.gov/fdsys/pkg/PLAW-110publ69/pdf/PLAW-110publ69.pdf.
6. U.S. Department of Education, "Statewide Longitudinal Data Systems" fact sheet (Washington, DC: U.S. Department of Education, 2009): http://www2.ed.gov/programs/slds/factsheet.pdf.
7. "Alignment Between the DQC's 10 Essential Elements and the America COMPETES Act's 12 Elements," Data Quality Campaign, March 2011: http://dataqualitycampaign.org/files/COMPETES_vs.pdf; and "Summary of Discretionary Funds, FY 2008-FY 2013 President's Budget," U.S. Department of Education, February 2012: http://www2.ed.gov/about/overview/budget/budget13/summary/appendix1.pdf.
8. Ibid.
9. "Characteristics of Statewide Student Data Systems 2009-2010," National Center for Education Statistics, U.S. Department of Education, July 2011: http://nces.ed.gov/programs/slds/summary.asp.
10. Cheryl James-Ward, Douglas Fisher, Nancy Frey, and Diane Lapp, Using Data to Focus Instructional Improvement (Alexandria, VA: ASCD, 2013), 57.
11. "Data for Action 2012: Focus on People to Change Data Culture," Data Quality Campaign (2012): http://dataqualitycampaign.org/files/DFA2012%20Annual%20Report.pdf.
12. Ibid.
13. "Grants for Statewide, Longitudinal Data Systems," CFDA Number 84.372, Institute of Education Sciences, U.S. Department of Education, September 2011: http://ies.ed.gov/funding/pdf/2012_84372.pdf.
14. Barbara Means, Christine Padilla, Larry Gallagher, and SRI International, "Use of Education Data at the Local Level: From Accountability to Instructional Improvement," U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, 2010: http://www2.ed.gov/rschstat/eval/tech/use-of-education-data/use-of-education-data.pdf.
15. "What Teacher Preparation Programs Teach About K-12 Assessment," National Council on Teacher Quality, March 2012: http://www.nctq.org/p/edschools/docs/assessment_publication.pdf.
16. Cheryl James-Ward, Douglas Fisher, Nancy Frey, and Diane Lapp, Using Data to Focus Instructional Improvement (Alexandria, VA: ASCD, 2013), 24.
17. Tracy Huebner, "What Research Says About... Balanced Assessment," Educational Leadership 67.3 (November 2009): 85-86. Available online: http://www.ascd.org/publications/educational-leadership/nov09/vol67/num03/Balanced-Assessment.aspx.
18. Julie A. Marsh, John F. Pane, and Laura S. Hamilton, "Making Sense of Data-Driven Decision Making in Education: Evidence from Recent RAND Research," RAND Corporation, 2006: http://www.rand.org/content/dam/rand/pubs/occasional_papers/2006/RAND_OP170.pdf.
19. "Statewide Longitudinal Data Systems Grant Program: Grantee States," National Center for Education Statistics, U.S. Department of Education, 2012: http://nces.ed.gov/programs/slds/stateinfo.asp.
20. "Department of Education Budget Tables," U.S. Department of Education, 2013: http://www2.ed.gov/about/overview/budget/tables.html?src=ct.
21. "Data for Action 2012: Focus on People to Change Data Culture," Data Quality Campaign (2012): http://dataqualitycampaign.org/files/DFA2012%20Annual%20Report.pdf.
22. Justin Hamilton, "Delaware and Tennessee Win First Race to the Top Grants," U.S. Department of Education, March 29, 2010: http://www.ed.gov/news/press-releases/delaware-and-tennessee-win-first-race-top-grants.
23. "Race to the Top Program Webinar: Understanding the Application," U.S. Department of Education, November 24, 2009: http://www2.ed.gov/programs/racetothetop/webinar-understanding-the-application.pdf.
24. "Phase 3, Race to the Top: Notice of Proposed Requirements," Federal Register 76 (12 September 2011): 59124: http://www2.ed.gov/programs/racetothetop/interim-notice-proposed-requirements.pdf.
25. "Department of Education Awards $200 Million to Seven States to Advance K-12 Reform," U.S. Department of Education, December 23, 2011: http://www.ed.gov/news/press-releases/department-education-awards-200-million-seven-states-advance-k-12-reform.
26. "Findings from the 2011-12 Survey on the Use of Funds Under Title II, Part A," U.S. Department of Education, March 2012: http://www2.ed.gov/programs/teacherqual/finalfindings32312.pdf.
27. "Improving Teacher Quality State Grants: Overview," U.S. Department of Education, 2009: http://www2.ed.gov/programs/teacherqual/index.html.
28. "Findings from the 2011-12 Survey on the Use of Funds Under Title II, Part A," U.S. Department of Education, March 2012: http://www2.ed.gov/programs/teacherqual/finalfindings32312.pdf.
29. 20 USC 2113(c)(11); and 20 USC 2123(a)(3)(B)(v).
30. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
31. "Oregon Growth Model," Oregon Department of Education, 2012: http://www.ode.state.or.us/search/page/?id=3797.
32. Mickey Garrison, phone interview by Jennifer Cohen Kabaker, May 1, 2012.
33. Ibid.
34. "The Oregon DATA Project: Summary," Oregon Department of Education, Summer 2009: http://www.oregondataproject.org/system/files/ProjectSummaryBW_2009-0810_0.pdf.
35. Ibid.
36. "School Start Times," La Grande School District, 2013: http://www.lagrandesd.org/school-start-times.
37. Mickey Garrison, phone interview by Jennifer Cohen Kabaker, May 1, 2012.
38. Mickey Garrison, phone interview by Clare McCann, March 27, 2013.
39. Ibid.
40. Ibid.
41. Mickey Garrison, phone interview by Jennifer Cohen Kabaker, May 1, 2012.
42. Ibid.
43. Karee E. Dunn, "Oregon DATA Project Final Evaluation Report," Next Level Evaluation, Inc., 2011: http://oregondataproject.org/content/evaluation-report.
44. Barbara Means, Christine Padilla, Larry Gallagher, and SRI International, "Use of Education Data at the Local Level: From Accountability to Instructional Improvement," U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, 2010: http://www2.ed.gov/rschstat/eval/tech/use-of-education-data/use-of-education-data.pdf; and Karee E. Dunn, "Oregon DATA Project Final Evaluation Report," Next Level Evaluation, Inc., 2011: http://oregondataproject.org/content/evaluation-report.
45. Ibid.
46. Mickey Garrison, phone interview by Jennifer Cohen Kabaker, May 1, 2012.
47. Ibid.
48. Ibid.
49. Mickey Garrison, phone interview by Clare McCann, March 27, 2013.
50. Ibid.
51. "Data Coaches Project Overview" (presentation, Wireless Generation, August 24, 2011).
52. "Race to the Top Application for Initial Funding: The State of Delaware," CFDA Number 84.395A, U.S. Department of Education, January 2010: http://www2.ed.gov/programs/racetothetop/phase1-applications/delaware.pdf.
53. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
54. "Race to the Top Application for Initial Funding: The State of Delaware," CFDA Number 84.395A, U.S. Department of Education, January 2010: http://www2.ed.gov/programs/racetothetop/phase1-applications/delaware.pdf.
55. "Delaware," Federal Education Budget Project, New America Foundation, 2013: http://febp.newamerica.net/k12/DE.
56. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
57. "Race to the Top Application for Initial Funding: The State of Delaware," CFDA Number 84.395A, U.S. Department of Education, January 2010: http://www2.ed.gov/programs/racetothetop/phase1-applications/delaware.pdf.
58. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
59. Ibid.
60. Ibid.
61. Ibid.; and "Full-time Data Coach," Wireless Generation, 2011: www.doe.k12.de.us/news/2011/FullTimeDataCoachPosting.doc.
62. "Data Coach Pilot Program Summary Report," Wireless Generation, 2011.
63. Mickey Garrison, phone interview by Clare McCann, March 27, 2013.
64. Ibid.
65. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
66. Laura Bornfreund, "An Ocean of Unknowns: Risks and Opportunities in Using Student Achievement Data to Evaluate PreK-3rd Grade Teachers" (Washington, DC: New America Foundation, May 14, 2013): http://newamerica.net/publications/policy/an_ocean_of_unknowns.
67. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
68. Mickey Garrison, phone interview by Clare McCann, March 27, 2013.
69. "Delaware Performance Appraisal System: DPAS II Guide Revised for Teachers," Delaware Department of Education, 2012: http://www.doe.k12.de.us/csa/dpasii/files/DPASTeachFullGuide.pdf: Criterion 4b.
70. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
71. "Delaware Performance Appraisal System: DPAS II Guide Revised for Teachers," Delaware Department of Education, 2012: http://www.doe.k12.de.us/csa/dpasii/files/DPASTeachFullGuide.pdf: Criterion 1d.
72. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
73. "Taking Action with Data: A Methodology and Framework for Data Coaching Services," Wireless Generation, 2011.
74. "Response to Request for Proposals for Professional Services to Provide Data Coaches," Wireless Generation, August 31, 2010.
75. Ibid.
76. "Taking Action with Data: A Methodology and Framework for Data Coaching Services," Wireless Generation, 2011.
77. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
78. "Data Coach Pilot Program Summary Report," Wireless Generation, 2011.
79. Ibid.
80. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
81. "Data Coaches Project Overview" (presentation, Wireless Generation, August 24, 2011).
82. "Request for Proposals for Professional Services to Provide Data Coaches," RFP Number DOE 2011-02, Delaware Department of Education, 2011: http://bidcondocs.delaware.gov/DOE/DOE_100810Data-Rev_rfp.pdf; and Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
83. "Race to the Top Delaware Report: Year 2, School Year 2011-2012," U.S. Department of Education, February 2013: http://www2.ed.gov/programs/racetothetop/performance/delaware-year-2.pdf.
84. "Data Coach Pilot Program Summary Report," Wireless Generation, 2011.
85. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
86. "Statement of Agreement," DOE-C11-41, Wireless Generation and Delaware Department of Education, January 13, 2011.
87. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
88. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
89. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
90. Donna Mitchell, phone interview by Clare McCann, April 12, 2013.
91. "Professional Learning Community Participant Survey Report (2011-2012)," Delaware Department of Education, August 2012: http://www.doe.k12.de.us/tleu_files/PLCSurveyReport_2012.pdf.
92. Ibid.
93. Donna Mitchell, phone interview by Jennifer Cohen Kabaker, June 12, 2012.
94. Ibid.
95. Ibid.
96. Mickey Garrison, phone interview by Clare McCann, March 27, 2013; and Ibid.
1899 L Street, NW Suite 400 Washington, DC 20036 Phone 202 986 2700 Fax 202 986 3696
www.newamerica.net