Since its founding, SREE has dedicated itself to improving the rigor and quality of educational research. It has done so in part by emphasizing the importance of high-quality social science methods as a means of reaching causal conclusions about the impacts of initiatives designed to improve student outcomes. Many researchers, program developers, and policymakers now acknowledge the benefits of designs strong enough to support causal claims about educational effectiveness.

Strong designs alone do not necessarily yield unambiguous findings. In some instances, research evidence about specific programs appears consistent, such as findings about the role of early interventions in improving reading skills or the effects of small class size on educational performance and college enrollment. Yet well-designed and well-executed studies can yield quite different conclusions about the efficacy and replicability of these types of interventions. Studies of school choice, charter schools, after-school and expanded learning time initiatives, professional development, and whole-school reform models, among others, have yielded results that may be interpreted as mixed or inconclusive. Contradictory findings exist about many initiatives, independent of the scale and complexity of the intervention.

Persistently mixed results have serious consequences, including undermining the credibility of findings, diminishing the willingness of potential school, community, and district partners to participate in studies that use the strongest designs, and prolonging the search for definitive results to guide program and policy decisions. The end result may be missed opportunities to learn systematically, whether through purposeful re-examination of program design and implementation or through deliberate review of how contradictory findings may inform research design, measurement, and analysis.

The theme of the Spring 2013 SREE Conference, Capitalizing on Contradiction: Learning from Mixed Results, highlights the importance of stepping back from the details of individual studies to consider the lessons to be learned from results across multiple studies of specific initiatives, programs, and interventions. We encourage symposium, panel, and paper presentations that address contradictions or inconsistencies in prior research while continuing to heed the role of rigorous experimental or quasi-experimental designs.

Topics of particular interest include:

  • How may contradictory findings be used in program design and implementation?
  • How may contradictory findings improve the quality of future research?
  • How may the conversation among practitioners, policymakers, and researchers be expanded to acknowledge both consistency and contradiction in robust research studies?
  • How may we better communicate to practitioners what contradictory findings suggest about practice and program development?
  • Which statistical methods may we use or develop to examine and test for (1) robustness of results across studies and (2) contradictory or mixed results across studies? (A minimal sketch of one such method follows this list.)
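
As one illustration of the final question, here is a minimal sketch, assuming Python with numpy and scipy, of standard heterogeneity diagnostics for a set of study-level effect sizes: Cochran's Q test, the I² statistic, and a DerSimonian-Laird random-effects pooled estimate. The effect sizes, sampling variances, and number of studies are invented for illustration and are not drawn from any of the literatures mentioned above.

```python
# Hypothetical sketch: quantifying mixed results across studies with
# Cochran's Q, I^2, and a DerSimonian-Laird random-effects estimate.
import numpy as np
from scipy import stats

# Invented example: standardized effect sizes and sampling variances
# from five hypothetical studies of the same intervention.
effects = np.array([0.35, 0.10, -0.05, 0.28, 0.02])
variances = np.array([0.010, 0.015, 0.012, 0.020, 0.008])

# Fixed-effect (inverse-variance) pooled estimate.
w = 1.0 / variances
pooled_fe = np.sum(w * effects) / np.sum(w)

# Cochran's Q: weighted squared deviations from the pooled estimate,
# approximately chi-square with k - 1 df under homogeneity.
Q = np.sum(w * (effects - pooled_fe) ** 2)
df = len(effects) - 1
p_het = stats.chi2.sf(Q, df)

# I^2: share of total variation attributable to between-study
# heterogeneity rather than sampling error.
I2 = max(0.0, (Q - df) / Q) * 100

# DerSimonian-Laird estimate of between-study variance (tau^2),
# then a random-effects pooled estimate and its standard error.
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)
w_re = 1.0 / (variances + tau2)
pooled_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

print(f"Q = {Q:.2f} (df = {df}), p = {p_het:.3f}")
print(f"I^2 = {I2:.1f}%, tau^2 = {tau2:.4f}")
print(f"Random-effects estimate = {pooled_re:.3f} (SE {se_re:.3f})")
```

A significant Q (or a large I²) signals that the studies disagree more than sampling error alone would predict, which is one formal way to decide whether a body of findings should be treated as genuinely mixed rather than merely noisy.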

Sections

  • Early Childhood Education
  • Instructional Improvement
  • School Climate and Culture
  • Transitions for Youth
  • Education Policy
  • Research Methods