SREE Webinars & Workshops
UPCOMING
An Introduction to Open Science Practices for Educational Effectiveness Research
Monday, April 24, 2023, 1:00pm - 2:00pm ET
Speaker: Sean Grant, University of Oregon
This webinar will provide an overview of open science for education researchers. Topics will include factors motivating interest in transparency and reproducibility, research practices associated with open science, and stakeholders engaged in, and impacted by, open science reform efforts. The webinar will conclude with a discussion about the open science policies at the Journal of Research on Educational Effectiveness.

Culturally Responsive and Equitable Evaluation (CREE) Learning Opportunities
Data Equity Workshop Series
This series of short workshops will explore practical tools and choice points for aligning educational research with equity goals. Each workshop will be facilitated by the We All Count team and will include a 45-minute presentation, a 15-minute educational case study example, and 30 minutes of live discussion and Q&A. Each workshop is an independent event: attend as many or as few as you’d like, in any order. Recordings will be made available for those unable to attend the live workshop. Register once to receive the link to join any of the workshops.

Equity and Causal Analysis with Quasi-Experimental Design: Models and Methods for Equity
Monday, December 12, 2022, 1:00pm - 2:30pm ET
There are many ways to design a study that sets out to discover what works, how it works, and who it works for. Aligning these studies with equity goals involves several key choice points for the research team, along with participatory approaches for ensuring the models and methods reflect the community’s needs. We’ll look at these choices and how to make them effectively.
Key Elements of Equity for RCTs: How to Embed Transparency and Account for Cultural Bias
Tuesday, January 17, 2023, 1:00pm - 2:30pm ET
The randomized controlled trial (RCT) is a bedrock method for understanding cause and effect, and these studies are often required before educational initiatives can be approved and deployed. We’ll explore how to embed transparency and accountability in an RCT while maintaining scientific rigor.
PREVIOUS
Causal Analysis and Machine Learning: How to Account for Equity October 17, 2022
Machine learning and predictive analytics are being developed and deployed across many educational institutions. This includes purchased proprietary data and tools as well as custom models built to assess and support effective impact. Knowing how to align these models with equity is a critical skill for anyone building models, or making decisions with them, in the sector.
People from Multiple Races or Identities: Best Practices in Handling Complex Social Identity Data September 14, 2022
As data collection becomes more culturally responsive, the social identity data we’re handling is becoming more complex. From federally available datasets to local research databases, the choices made in how to handle these nuanced data have both human and mathematical implications.
Social Identity Variables in Analysis: When and How to Include Them July 12, 2022
Whether doing descriptive, predictive or causal analysis, the question of when and how to include social identity variables in a model and subsequent interpretation of the results is critical for data equity. We’ll look at how to make this decision in a rigorous way based on the question you’re trying to answer, what is being measured by the social identity data, and the purpose of the analysis.
Culturally Responsive & Equitable Evaluation (CREE) Training Part 1: Foundations of Data Equity (Fall 2021) Register here to view the recordings.
In collaboration with We All Count, SREE offered a Culturally Responsive & Equitable Evaluation (CREE) training for education researchers, developed from We All Count’s seven-step Data Equity Framework and targeted specifically to the SREE community. During the three-part training, researchers learned how to make their research and evaluations more culturally responsive, inclusive, and equitable, including steps they can take at every stage of the research process, from formulating the research question through dissemination.
Culturally Responsive & Equitable Evaluation (CREE) Training Part 2: Advancing the Data Equity Framework (Spring 2022) Recording will be posted soon
This live, online workshop is designed for people who have completed the initial Foundations of Data Equity workshop. We will dive deeper into the nuances of how to apply the Data Equity Framework to your work as well as look at a wider variety of potential applications.
SREE-Researchers of Color webinar
Considerations for Convening Reading/Thinking/Struggling Groups June 29, 2022 Presenter: Claire Mackevicius, Northwestern University
This session will provide an opportunity for people to plan how to organize, convene, and support reading/thinking/struggling groups, especially those centered on topics relegated to the margins of mainstream academic spaces (particularly quantitative education research). There will be a brief presentation of considerations people might engage before establishing a group. These considerations will then be put into practice around a brief pre-reading, Coalitional Refusal in a Neoliberal Academy (Gonzales & Shotton, 2022). Participants will engage in reflections and exercises to concretize how to commit to action going forward.
Researchers of Color Brown Bag - QuantCrit June 15, 2021 Presenter: Wendy Castillo, Princeton University
‘QuantCrit’ (Quantitative Critical Race Theory) is a rapidly developing approach that seeks to challenge and improve the use of statistical data in social research by applying the insights of Critical Race Theory (CRT). Scholars have adopted multiple different approaches to this task. This presentation of QuantCrit is intended to provide concrete strategies for more critical quantitative research and explore a range of questions that prompt users to be engaged critics, weighing the plausibility of the study, and questioning how the material was produced, analyzed, and presented.
This session is the first in a series that will be held quarterly. The purpose of the brown bag series is to give researchers of color and allies a forum in which to gather, explore relevant issues, and network.
Use of Research Evidence: Increase the Impact of Your Research: A Workshop Series on Strategies for Knowledge Mobilization
Strategies for Knowledge Mobilization is a workshop series for SREE members, academic researchers, private research staff, and graduate students interested in making their work impactful, usable, accessible, and engaging to practitioners and policymakers. Through case studies and surveys of nearly 4,500 educators and 350 education researchers nationwide, the Center for Research Use in Education (CRUE) has found that researchers whose work has the most salient impacts on policy and practice tend to collaborate with schools and districts, leverage intermediary organizations to connect their research with practice communities, and use multifaceted, targeted dissemination strategies. Through four sessions facilitated by CRUE staff, participants will learn how to achieve greater impact based on findings from our studies and specific advice from researchers and practitioners at the forefront of connecting research, practice, and policy. Sessions will cover topics key to effective knowledge mobilization and engagement with non-academic audiences, including school/district and intermediary stakeholders.
Workshops will be highly participatory and will provide practical knowledge, skills, and tools that can be used right away. Join us online this spring for one or multiple sessions! Registration is required for each one. Recordings will be made available at a later date.
SESSION 4: Leveraging the Intermediary Space: Connecting with Influential Organizations to Expand Your Reach June 16, 2022
Webinar Recording
Relationships between research and practice are most often mediated; educators rarely connect directly with research or researchers but rather turn to organizations they know and trust when seeking out information and strategies. Partnering with these intermediary organizations is a promising knowledge mobilization strategy. In this webinar, researchers from the Center for Research Use in Education will highlight findings from their national survey of nearly 4,500 educators on the types of organizations and media sources that educators rely on for research information. Representatives of key intermediary organizations will share their work linking research and practice, as well as opportunities for researchers to engage with them and/or their audiences and constituents.
The panel will include:
- Dr. David Barnes, Associate Executive Director at the National Council of Teachers of Mathematics
- Dr. Lisa Thomas, Associate Director at the American Federation of Teachers
- Tiffany Neill, Executive Director of Curriculum and Instruction at the Oklahoma State Department of Education
- Matt Dawson, Director of Efficacy and Implementation Research at Curriculum Associates
The session will end with a discussion of how researchers can strategically identify and work with intermediaries in their fields and disciplines.
SESSION 3: Actionability and Compatibility: Meeting the Research Needs of Educators June 2, 2022
Webinar Recording
Effective knowledge mobilization strategies meet the needs of an audience, yet we consistently hear concerns about the relevance and usefulness of educational research. This session focuses on better understanding educator perspectives on educational research, specifically exploring what makes research actionable and relevant. We will hear directly from a panel of educators about their research use practices, including their perspectives on findings from the Center for Research Use in Education, and we will have an open conversation with the panel about how researchers can produce and communicate research that better meets educators’ needs.
The panel includes a district superintendent, staff from a district research office, a teacher, and Kim Marshall, author of the weekly Marshall Memo, written to keep educators well-informed on current research and best practices.
SESSION 2: What Is Knowledge Mobilization and What Does It Look Like? May 19, 2022
Webinar Recording
Knowledge mobilization (KMb) is the process of helping to connect research with practitioners, policymakers, and the broader community. Even though there is a broad and multidisciplinary literature on KMb, most education researchers are not familiar with it. As such, it’s a field that can be confusing, with varying terminology, definitions, and understandings of what ‘good’ KMb looks like. This workshop will help to clarify what KMb is and provide tools and resources to support researchers in developing KMb strategies. In addition, attendees will hear from, and have a chance to engage with, acclaimed education researchers who are utilizing a variety of knowledge mobilization strategies, including direct engagement with education stakeholders, to increase the impact of their research.
Dr. Samantha Shewchuk will lead this session featuring expert insights on knowledge mobilization strategies to achieve impact. Dr. Nanette Dietrich (Professor, Educational Foundations, Millersville University) and Dr. Carolyn Sattin-Bajaj (Associate Professor, Gevirtz Graduate School of Education, UC Santa Barbara) will present their experiences engaging in knowledge mobilization efforts.
SESSION 1: The Evidence about Evidence Use: New Findings from a National Survey of Teachers and School Administrators May 5, 2022
Webinar Recording
The Center for Research Use in Education (CRUE) has spent the last 6 years studying schools’ research use. In this kick-off session for the Strategies for Knowledge Mobilization Workshop Series, CRUE researchers will share their findings—highlighting a number of barriers and facilitators to successful use of research evidence by educators working in schools and districts. Participants will be invited to brainstorm and share expertise on how the research community can maximize their effectiveness in mobilizing research evidence based on CRUE’s results. This workshop sets the stage for successive workshops over the coming two months.
The presenters for this session will be Dr. Henry May, Dr. Elizabeth Farley-Ripple, Dr. Samantha Shewchuk, and Dr. Kati Tilley, researchers at the Center for Research Use in Education, an IES-funded research and development center.
Ask Me Anything May 6, 2022
Recording Not Available
Join senior scholars Henry May and Elizabeth Farley-Ripple from the Center for Research Use in Education for an hour of conversation and questions. SREE Board Member Ruth Neild will moderate the session. The purpose of the Ask Me Anything hour is to provide an informal opportunity for SREE members to have the kind of conversations that might take place at the conclusion of an in-person seminar or at a reception. Questions can be about the first webinar in the series or about the use of research evidence in general. Students and early career scholars are especially encouraged to attend.
Applying Standards for Excellence in Education Research
The Institute of Education Sciences (IES) has launched the Standards for Excellence in Education Research (SEER) to make education research more transparent, actionable, and focused on consequential outcomes. To support this effort, IES is producing a series of practical guides for researchers on how to implement SEER to improve the quality and relevance of their studies. SREE is excited to partner with IES to sponsor webinars, each covering the recommendations from a specific guide aligned with the SEER. These webinars are free, open to the public, and relevant to all researchers who seek to ensure their studies are useful to policymakers and educators. Registration for each webinar is required to receive the link to join.
UPCOMING WEBINAR:
Webinar 3: Sharing Study Data: Implications for Education Researchers March 28, 2023, 2:00pm - 3:30pm ET Speakers: John Czajka, Mathematica; Jessica Logan, Vanderbilt University; and John Diamond, MDRC
Register Here
In the era of open science practices, researchers are more frequently being asked to share data from their studies. Multiple federal agencies and private funders have added policies and requirements for their grantees and contractors to share study data. Consistent with this priority and the Standards for Excellence in Education Research (SEER), the Institute of Education Sciences (IES) requires researchers to make data from IES-funded studies available to others, while also protecting the rights of study participants and meeting the requirements of data providers. Such policies, along with growing interest in open science practices, mean that many education researchers are looking for practical, achievable strategies for making their data available to others.
In this 90-minute webinar, Ruth Neild from Mathematica will present an overview of IES’s guide, Sharing Study Data: A Guide for Education Researchers, and a panel of researchers will share their expertise and experiences with three key aspects of sharing study data:
- Managing disclosure risks
- Documenting and organizing data for other researchers
- Depositing data for access by other researchers
The session will include a panel discussion and audience Q&A.
Panelists:
- John Czajka is a Mathematica Senior Fellow Emeritus. Dr. Czajka has expertise in disclosure risk management and vast experience with statistical applications of program administrative data, the analysis of survey data from large national samples, panel studies, and specialized surveys.
- Jessica Logan is an Associate Professor of Special Education at Vanderbilt University’s Peabody College. Dr. Logan is a quantitative methodologist who focuses on improving the data management and data quality practices of researchers in the education sciences, encouraging transparency in science, and improving education researchers’ statistical conclusion validity.
- John Diamond is MDRC’s lead for Data Management and Integrity and a senior research associate for the Postsecondary Education policy area. He chairs MDRC’s Data Integrity Board, Data Management Initiative, and Data Security Team, and coordinates MDRC’s data sharing with ICPSR at the University of Michigan.
PREVIOUS WEBINARS:
Webinar 2: The BASIE (BAyeSian Interpretation of Estimates) Framework for Interpreting Findings from Impact Evaluations June 13, 2022 Speakers: John Deke and Mariel Finucane, Mathematica
Webinar Recording
In studies that evaluate the impacts of educational interventions, educators and policymakers typically want to know whether an intervention improved outcomes. However, researchers cannot provide a simple yes/no answer because all impact estimates have statistical uncertainty. Researchers have often used statistical significance and p-values to assess this uncertainty, but these statistics are frequently misinterpreted and cannot tell educators and policymakers how likely it is that the intervention had an effect.
This webinar lays out a Bayesian framework for interpreting impact estimates without the pitfalls of relying only on p-values. This framework allows researchers to calculate the probability an intervention had a meaningful effect, given the impact estimate and prior evidence on similar interventions. The webinar will explain key concepts and introduce a convenient, easy-to-use Excel tool for applying this framework. With these concepts and tools, researchers can extract more accurate and interpretable lessons from impact findings to support informed decisions about educational interventions. The webinar is based on IES’s recently released guide on this topic.
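For readers who want to see the mechanics behind this, here is a minimal sketch in R of the normal-normal updating that this style of Bayesian interpretation rests on. All numbers are hypothetical, and the guide's Excel tool (not this sketch) is the intended entry point.

```r
# A minimal sketch of Bayesian interpretation of an impact estimate:
# combine the estimate with a normal prior based on evidence from similar
# interventions, then report the probability of a meaningful effect.
# All inputs are hypothetical (effect-size units).
prior_mean <- 0.03   # typical effect among similar interventions (assumed)
prior_sd   <- 0.10   # spread of effects across similar interventions (assumed)
est        <- 0.15   # impact estimate from the new study
se         <- 0.08   # standard error of that estimate

# Posterior under a normal-normal model: a precision-weighted average
post_var  <- 1 / (1 / prior_sd^2 + 1 / se^2)
post_mean <- post_var * (prior_mean / prior_sd^2 + est / se^2)
post_sd   <- sqrt(post_var)

# Probability of any positive effect, and of an effect of at least 0.05 SD
p_positive   <- 1 - pnorm(0,    post_mean, post_sd)
p_meaningful <- 1 - pnorm(0.05, post_mean, post_sd)
round(c(p_positive = p_positive, p_meaningful = p_meaningful), 2)
```

Note how the posterior mean shrinks the study’s estimate toward the prior evidence; that shrinkage is the framework’s guard against over-interpreting a single noisy estimate.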
Dr. John Deke is an economist and senior fellow at Mathematica. His work has focused primarily on the statistical methodologies used in impact evaluations and systematic reviews, especially in K-12 education.
Dr. Mariel Finucane is a principal statistician at Mathematica. Her work uses Bayesian hierarchical modeling and tree-based methods to study social policies.
Webinar 1: Enhancing the Generalizability of Impact Studies in Education May 16, 2022 Speakers: Elizabeth Tipton, Northwestern University, and Robert Olsen, George Washington University
Webinar Recording
Generalizability in education research indicates how well the results of a study apply to broader populations of interest to educators and policymakers. However, in studies that evaluate the impacts of educational interventions, the generalizability of study findings is often unclear because each study is conducted with only a specific sample. Educators and policymakers may question whether such findings can help them make decisions about how to best support the populations they serve.
This webinar lays out recommendations that researchers can take to enhance generalizability when planning impact studies in education. It walks researchers through the key steps involved in identifying a target population of schools, developing a list of schools in this population, and selecting a representative sample. It also provides an overview of steps to recruit schools into the study, assess and adjust for differences between the sample and population, and report generalizability appropriately. The webinar is based on IES’s recently released guide on this topic.
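As a deliberately simplified illustration of the assess-and-adjust step, the sketch below post-stratifies a study sample to a target population on a single school characteristic. The shares are invented, and the guide covers richer adjustment methods.

```r
# Reweight a study sample to match the target population on one
# school characteristic (all shares invented for illustration).
pop_share    <- c(urban = 0.30, suburban = 0.45, rural = 0.25)  # population
sample_share <- c(urban = 0.50, suburban = 0.35, rural = 0.15)  # study sample

# Post-stratification weight per stratum: population share / sample share.
# Over-represented strata (urban here) are weighted down.
weights <- pop_share / sample_share
round(weights, 2)
#>    urban suburban    rural
#>     0.60     1.29     1.67
```

A reweighted impact estimate then averages stratum-level results with these weights rather than treating every sampled school equally.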
Dr. Elizabeth Tipton is an Associate Professor of Statistics, a Faculty Fellow in the Institute for Policy Research, and the co-Director of the Statistics for Evidence Based Policy and Practice (STEPP) Center at Northwestern University. Her research focuses on the development of methods for improving generalizations from experiments, both through the design and analysis of field experiments and through the use of meta-analysis.
Dr. Rob Olsen is a Senior Fellow at the George Washington Institute of Public Policy at George Washington University. His research focuses on the generalizability of randomized trials in education. Olsen is the Principal Investigator of a study testing different sampling methods for selecting sites for impact studies, and he is the co-Principal Investigator for a study testing different regression methods for predicting the impacts of educational interventions in individual schools using data from multi-site randomized trials. Finally, he consults for national research organizations on how to design and implement impact studies for greater generalizability.
Critical Perspectives in Quantitative Methods Series
Previous Webinars
Critical Perspectives in Quantitative Methods Series Webinar 7: Pedagogical Possibilities of Critical Quantification
February 13, 2023 | 1:00pm - 2:30pm ET Speaker: Derek A. Houston, Southern Illinois University Edwardsville
This conversation will explore the pedagogical considerations and possibilities of critical quantification, and will also offer thoughts on the complexities of sustaining critical quantification over time.
Webinar Recording
Critical Perspectives in Quantitative Methods Series Webinar 6: A Conversation with Foundations about Critical Education Research December 2, 2022 | 1:00pm - 2:30pm ET Speakers: Krystal Villanosa, Spencer Foundation, Kevin Close, Spencer Foundation, and Stephen Glauser, Russell Sage Foundation
For this discussion, we invite Program Officers from private foundations to offer their perspectives about what a shift toward utilizing critical quantitative methods means for both researcher- and foundation-initiated funding opportunities. They will also offer thoughts about how scholars, practitioners, policymakers, and graduate students can and should be more intentional about criticality in quantitative educational research.
Webinar Recording
Critical Perspectives in Quantitative Methods Series Webinar 5: Designing a Critical Race Mixed Methods Study October 21, 2022 Speaker: Jessica T. DeCuir-Gunby, PhD, Professor of Educational Psychology, USC Rossier School of Education, University of Southern California
Webinar Recording
DeCuir-Gunby will explain Critical Race Mixed Methodology (CRMM), the combining of Critical Race Theory and mixed methods research. She will focus on the relationship between a researcher’s inquiry worldview and methodological choices, centering on the role of positionality. The workshop will explain the role theory plays in guiding the mixed methods research process. A review of the basic mixed methods designs will be provided along with a discussion of the major components of a mixed methods study. Participants will have the opportunity to reimagine their own traditional mixed methods studies into CRMM studies. Implications will be provided for conducting critical race research in education.
Critical Perspectives in Quantitative Methods Series Webinar 4: A Critical Perspective on Measurement: MIMIC Models to Identify and Remediate Racial (and Other) Forms of Bias May 20, 2022 Speakers: Matthew Diemer, University of Michigan & Aixa Marchand, Rhodes College
Webinar Recording Slide Deck Article
Sound measurement is foundational to quantitative methodology. Despite the problematic history of measurement, it can be repurposed for critical and equitable ends. MIMIC (Multiple Indicators, Multiple Causes) models simply and efficiently test whether a measure means the same thing, and can be measured in the same way, across (e.g., racial/ethnic and/or gender) groups. This talk considers the affordances and limitations of MIMIC models for critical quantitative methods in detecting and remediating racial, ethnic, gendered, and other forms of bias in items and in measures.
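For a concrete sense of the approach, here is a minimal MIMIC sketch in R using the lavaan package, with simulated data; the variable names and effect sizes are invented for illustration and are not from the talk.

```r
# A minimal MIMIC model in lavaan with simulated data (illustrative only).
library(lavaan)

set.seed(1)
n     <- 500
group <- rbinom(n, 1, 0.5)        # hypothetical binary group indicator
f     <- 0.3 * group + rnorm(n)   # latent construct, higher mean in group 1
dat <- data.frame(
  group = group,
  x1 = 0.8 * f + rnorm(n, 0, 0.6),
  x2 = 0.8 * f + 0.4 * group + rnorm(n, 0, 0.6),  # x2 built to be biased
  x3 = 0.8 * f + rnorm(n, 0, 0.6),
  x4 = 0.8 * f + rnorm(n, 0, 0.6)
)

model <- '
  f =~ x1 + x2 + x3 + x4   # multiple indicators of one construct
  f ~ group                # "multiple causes": group difference in latent mean
  x2 ~ group               # direct effect: flags bias in x2 beyond true
                           # group differences in f
'
fit <- sem(model, data = dat)
summary(fit, standardized = TRUE)
```

A significant `x2 ~ group` path indicates the item functions differently across groups even after accounting for real differences in the construct, which is the kind of bias the talk describes detecting and remediating.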
Matthew Diemer is a Professor in the Combined Program in Education & Psychology (CPEP) and Educational Studies programs at the University of Michigan. Diemer harnesses advanced quantitative methods to examine how young people develop critical consciousness - the capacity to reflect on, negotiate, and challenge racial, ethnic, socioeconomic and other constraints in school, college, work, and civic/political institutions. His research is currently funded by the Spencer Foundation, William T. Grant Foundation, Institute of Education Sciences, the Mental Research Institute, National Institutes of Health, and the National Science Foundation. Diemer teaches Psychometrics and Structural Equation Modeling courses and provides statistical consultation to the campus community at the University of Michigan. He has delivered invited lectures and workshops on these and other quantitative topics domestically and internationally. He also served as the Statistical Consultant for the Psychology of Women Quarterly. Diemer was nominated for the "Golden Apple" teaching award and, in 2019, received the Provost's Teaching Innovation Prize for Advancing Diversity, Equity & Inclusion via Advanced Quantitative Methods, both at the University of Michigan.
Aixa Marchand is an assistant professor of psychology and educational studies at Rhodes College. Dr. Marchand graduated with a Ph.D. in education and psychology and a certificate in African American Studies from the University of Michigan in 2019. Her main research focuses on the attributions that Black parents make about educational inequities and how these attributions may relate to their school engagement. Other related research inquiries include illuminating how students and parents of color critically analyze school structures; elucidating how familial processes, such as familism and parent racial socialization, impact adolescents’ academic outcomes and socioemotional wellbeing; and the use and development of rigorous methodological tools to address societal inequities.
Critical Perspectives in Quantitative Methods Series Webinar 3: Racial and Ethnic Identities and Administrative Data April 8, 2022 Speakers: Dominique J. Baker, Southern Methodist University & Samantha Viano, George Mason University
Webinar Recording
Constructing and using race and ethnicity measures for administrative or secondary data sets presents many logistical and theoretical challenges. This workshop provides guidance on some of these challenges and potential solutions based on the literature.
Dr. Dominique Baker is an assistant professor of education policy in the Annette Caldwell Simmons School of Education and Human Development and a faculty affiliate of the Data Science Institute at Southern Methodist University. Her research focuses on the way that education policy affects and shapes the access and success of minoritized students in higher education.
Dr. Samantha Viano is an assistant professor in the College of Education and Human Development at George Mason University. Her research focuses on evaluating policies and assessing school contexts that predominantly affect minoritized student populations and their teachers, including policies on school safety and security, online credit recovery, teacher retention, and methods for studying racial equity.
Critical Perspectives in Quantitative Methods Series Webinar 2: All Else Being Equal (When It’s Not Equal): Applying Theories on Race in Quantitative Models and Research February 25, 2022 Speaker: Richard Blissett, University of Georgia
Webinar Recording Slide Deck
The purpose of this discussion is to emphasize the role of critical consciousness in quantitative research design and communication, as well as to integrate critical perspectives on race and racism into practical decisions about the conduct of quantitative research that seeks to account for race (e.g., regression).
Richard Blissett is an assistant professor of educational policy in the Department of Lifelong Education, Administration, and Policy at the University of Georgia. Their research primarily focuses on attitudes and ideologies in the politics of educational equity and justice, as well as the spaces in which public ideologies translate into policy action. Work in this area has included research on democratic structures in education governance (particularly, school boards) as well as anti-racism activism and social movements in education. They are also the principal investigator of the Democracy and Equity in Education Politics research group, and the principal coordinator for the Just Education Policy institute.
Critical Perspectives in Quantitative Methods Series Webinar 1: Introduction, Historical Origins and Future Possibilities October 22, 2021 Speakers: Veronica Velez, Western Washington University, Nichole Garcia, Rutgers University, and Jay Garvey, University of Vermont
Webinar Recording Slide Deck
This session will introduce attendees to the contours of critical quantitative research, particularly QuantCrit (a methodological subfield of Critical Race Theory) by (a) examining the history, assumptions, and principles of QuantCrit and other critical perspectives in quantitative research and methods; and (b) illustrating these approaches in practice. Attendees will learn about the historical origins of critical quantitative methods, contemporary uses of critical quantitative methods, and possibilities for advancements in critical quantitative methods.
2020 & 2021 Workshops & Webinars
Designing Simulations for Power Analysis (and Other Things): A Hands-on Workshop Series Using R
Part 1: May 20 & 27, 2021 | Part 2: June 3 & 10, 2021
Instructors: James E. Pustejovsky, University of Wisconsin - Madison & Luke Miratrix, Harvard University
Recordings available to registrants.
Course Description: This course will cover how to design and program Monte Carlo simulations using R. Monte Carlo simulations are an essential tool of inquiry for quantitative methodologists and students of statistics, useful both for small-scale or informal investigations and for formal methodological research. As a practical example, simulations can be used to conduct power analyses for complex research designs such as multisite and cluster randomized trials (potentially with varying cluster sizes or attrition). Simulations are also critical for understanding the strengths and limitations of quantitative analytic methods. In many situations, more than one modeling approach is possible for addressing the same research question (or estimating the same target parameter). Simulations can be used to compare the performance of one approach versus another, which is useful for informing the design of analytic plans (such as plans included in pre-registered study protocols). As an example of the type of questions that researchers might encounter in designing an analytic plan: In analysis of a multi-site experiment, what are the benefits and costs of using a model that allows for cross-site impact variation?
This course will cover best practices of simulation design and how to use simulation to be a more informed and effective quantitative analyst. We will show how simulation frameworks allow for rapid exploration of the impact of different design choices and data concerns, and how simulation can answer questions that are hard to answer using direct computation (e.g., with power calculators or mathematical formulas). Simulation can even give more accurate answers than “the math” in some cases! Consider algebraic formulas based on asymptotic approximations that might not “kick in” when sample sizes are moderate. This is a particular concern with hierarchical data structures that include 20-40 clusters, as is typical of many large-scale randomized trials in education research.
Course structure: Our course will consist of four webinars, each 1.5 hours in length, delivered over four weeks. We will begin by describing a set of general principles for designing simulations and demonstrating how to implement those principles with code. We will then dive into how to think about data generating processes as a core element of simulation. Next, we will give a standard recipe for designing and implementing multi-factor simulations (simulations that explore the role of different factors all at once). We will illustrate this design and build process by walking through (and modifying) a simulation for conducting a power analysis for multisite experiments. In this case study we will discuss how to build simulations component-wise to keep things orderly, how to standardize one’s models to keep different scenarios comparable, and how to visualize results to interpret and present findings. We will also introduce parts of the “tidyverse,” a suite of packages that can greatly ease the coding burden of this type of work. The course will be hands-on, with students running and modifying code to solve exercises throughout, so as to maximize the utility of the content. There will be small, optional “homework” assignments provided between the sessions, which will involve studying, modifying, and adapting provided R code.
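To give a flavor of the exercises, here is a minimal, self-contained sketch (not course material) of a simulation-based power analysis for a simple two-level cluster randomized trial; the course works with richer multisite designs.

```r
# Simulation-based power analysis for a cluster randomized trial.
# Design parameters are illustrative, not course defaults.
sim_one_trial <- function(J = 30, n = 25, effect = 0.25, icc = 0.15) {
  treat   <- rep(rbinom(J, 1, 0.5), each = n)        # cluster-level assignment
  cluster <- rep(seq_len(J), each = n)
  y <- rep(rnorm(J, 0, sqrt(icc)), each = n) +       # random cluster effects
       effect * treat +
       rnorm(J * n, 0, sqrt(1 - icc))                # student-level noise
  fit <- lme4::lmer(y ~ treat + (1 | cluster))
  abs(coef(summary(fit))["treat", "t value"]) > 1.96 # crude significance test
}

set.seed(42)
mean(replicate(500, sim_one_trial()))  # power = rejection rate across trials
```

Looping `sim_one_trial()` over a grid of values for J, n, effect, and icc yields exactly the kind of multi-factor simulation the course builds up to.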
Prior experience needed: Students should have some familiarity with R. At the minimum, you should know how to load data, plot your data, and run linear regressions. Ideally, you should also be comfortable working in RStudio or another integrated development environment.
Data Collection Methods for Cost-Effectiveness Analysis of Educational Interventions May 18, 2021 Presenters: Rebecca Davis, University of Pennsylvania & Viviana Rodriguez, Columbia University
No recording available.
The Center for Benefit-Cost Studies of Education at the University of Pennsylvania is proud to partner with SREE to offer a webinar on data collection in cost analysis. Cost studies offer important context to effectiveness work and are increasingly required by funders, yet the “how to” of cost estimation is still ambiguous to many researchers. This webinar will offer clarity on the data collection phase of the cost estimation process. Using the ingredients method (Levin, McEwan, Belfield, Bowden, & Shand, 2018), we will explore data collection methods useful to researchers hoping to include estimation of costs in their existing studies or in funding proposals. This workshop will cover how to develop a cost data collection plan, potential sources of data, and potential pitfalls to avoid. We will discuss how integrating cost data collection with other study elements can help in efficiently adding cost estimation to a larger study. A preliminary understanding of the ingredients method is recommended but not required, and a brief introduction will be provided. Additional materials will be shared to help participants successfully plan each phase of their cost analysis.
Panel Data Methods for Policy Evaluation in Education Research April 20, 2021 Presenter: Avi Feller, University of California - Berkeley
Recording available for registrants.
Many important interventions and policy changes in education occur at an aggregate level, such as the level of the school, district, or state; prominent examples include school finance policies, curriculum development, and accountability changes. These policies are often difficult to evaluate experimentally, and education researchers instead rely on research designs based on repeated observations ("panel data") at the aggregate level. For example, we might estimate the impact of a new reading program using school-level average test scores at multiple time points surrounding its introduction. In this workshop, we will review the growing set of statistical tools for estimating causal effects with panel data of this form. We will first review common methods, such as difference-in-differences, fixed effects models, and Comparative Interrupted Time Series, as well as key conceptual issues, such as changes in measurement. We will then discuss complications that arise when treatment timing varies. Finally, we will briefly introduce some more recent methods that also incorporate matching and weighting. Throughout, we will use plenty of cartoons and bad jokes.
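As a taste of the starting point, the sketch below simulates school-by-year scores and fits the canonical two-way fixed effects difference-in-differences regression using the fixest package. Every name and number is invented for illustration.

```r
# Difference-in-differences with simulated school-by-year panel data.
library(fixest)

set.seed(7)
panel <- expand.grid(school = 1:100, year = 2015:2020)
treated <- sample(1:100, 50)                       # schools adopting in 2018
panel$adopted <- as.integer(panel$school %in% treated & panel$year >= 2018)
panel$score <- 0.02 * (panel$year - 2015) +        # common time trend
               rnorm(100, 0, 0.3)[panel$school] +  # fixed school effects
               0.15 * panel$adopted +              # true program effect
               rnorm(nrow(panel), 0, 0.2)          # idiosyncratic noise

# School and year fixed effects, SEs clustered by school
did <- feols(score ~ adopted | school + year, data = panel, cluster = ~school)
summary(did)
```

When adoption is staggered across years rather than shared, the workshop’s caution applies: this simple two-way fixed effects estimate can be biased, which motivates the more recent methods the session introduces.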
Designing and Reporting Your Study to Facilitate Evidence Clearinghouse Reviews and Meta-Analysis March 15, 2021 Presenter: Sandra Wilson, Abt Associates
No recording available.
Have you ever received a request for more information about your intervention research from someone doing a systematic review or meta-analysis? Would you like to learn more about how to report your study and its findings to facilitate inclusion in an evidence clearinghouse or meta-analysis? This 90-minute webinar will involve a comprehensive discussion of the types of study information needed by systematic reviews, meta-analyses, and evidence clearinghouses when reviewing intervention research.
The webinar will begin with a brief overview of systematic review and meta-analysis methods and their purposes. The presenter will then highlight the common types of information needed by evidence reviewers when identifying and locating studies, screening them for eligibility, assessing their quality, and extracting information about the study characteristics and findings, including technical information about study design and methods, study findings, and characteristics of interventions, comparison groups, study participants, and implementation strategies. The webinar will also review the variety of reporting guides and resources that are available for researchers to facilitate study reporting.
Testing the Effectiveness of Strategies to Promote Research Use: Learning from Studies in Education and Mental Health Settings March 5, 2021 Organizer: Ruth Neild, Mathematica Speakers: Kimberly Becker, University of South Carolina; Bruce Chorpita, University of California - Los Angeles; Aubyn Stahmer, University of California - Davis Discussant: Adam Gamoran, William T. Grant Foundation
Webinar Recording
This is the second webinar in a series focused on the use of research evidence in education. During this moderated discussion, two research teams will describe their studies that rigorously test the effectiveness of strategies for promoting research use in mental health and education settings.
This session, funded by the William T. Grant Foundation, is part of SREE’s virtual convening, Examining Education Research through the 2020 Lens.
Making Research Matter: Insights from Research on Research Use February 10, 2021 Presenter: Vivian Tseng, William T. Grant Foundation

Webinar Recording Presentation Slides
Many of us in the research community conduct research because we hope it will make a difference in policy or practice, and yet research often fails to have the kind of impact that we aspire to achieve. In this presentation, Vivian Tseng will discuss what we know from research on the use of research evidence in policy and practice. She will discuss when, how, and under what conditions research is used, in addition to what it takes to improve the use of research evidence. Vivian will draw upon the William T. Grant Foundation’s support for over 60 studies on this topic over the past dozen years, as well as insights from studies in other countries and from sectors as diverse as environmental policy, health, and human services. Our hope is that what you learn from research on research use may help you: 1) develop your own studies of research use, and 2) inform your efforts to get your research used more frequently and productively in policy and practice.
Bayesian Interpretation of Impact Estimates from Education Evaluations July 28, 2020 Presenters: John Deke and Mariel Finucane, Mathematica
Webinar Recording
This webinar will illustrate the pitfalls of misinterpreting statistical significance in evaluations of education interventions; describe BASIE (BAyeSian Interpretation of Estimates), an evidence-based alternative to p-values that assesses the probability an intervention had a meaningful effect; and provide examples of BASIE in action, including a simple spreadsheet tool. The webinar will be appropriate for people without any familiarity with Bayesian methods, as well as those with some knowledge who are interested in learning about the use of Bayesian methods in educational evaluations. There will be an opportunity for Q&A at the end of the session.
John Deke is a senior fellow at Mathematica with 20 years of experience designing evaluations of education interventions. Mariel Finucane is a senior statistician at Mathematica who has led Bayesian analyses on evaluations spanning multiple fields, including health and education.
Proposing Cost and Cost-Effectiveness Analyses July 10, 2020 Presenter: Brooks Bowden, University of Pennsylvania
Webinar Recording Presentation Slides
This short session provides guidance and tips for those applying for IES research grants. The session builds upon basic knowledge of costs to provide examples of how to design a study to integrate a cost component and meet the quality standards set forth by the ingredients method (Levin et al., 2018) and the IES SEER standards. The session is tailored to the current IES RFA goal/research structure.
REES 101: A Guided Tour of Study Registration Presenter: Jessaca Spybrook, Western Michigan University
Webinar Recording
