
March-April 2021 Newsletter

In This Issue...

  1. Registration Open for April and May Research Methods Webinars
  2. 2021 Summer Graduate Student Fellowship Applications Will Open May 1
  3. Use of Research Evidence Webinar Recordings Now Available
  4. SREE to Offer Virtual Workshop: Designing Simulations for Power Analysis (and Other Things): A Hands-on Workshop Series Using R
  5. Members in the News
  6. SREE's First Virtual Convening Was a Success
  7. Dates to Keep in Mind: SREE 2021 Conference Timeline
  8. Institutional Member Corner: Insight Policy Research
  9. Institutional Member Corner: MDRC
  10. Institutional Member Corner: Decision Information Resources

Registration Open for April and May Research Methods Webinars

Panel Data Methods for Policy Evaluation in Education Research 
Tuesday, April 20, 2021, Noon-1:30 PM EDT 

Presenter: Avi Feller, University of California, Berkeley

Registration Rates: Nonmembers ($50), Members ($25), Student members (FREE) 

Many important interventions and policy changes in education occur at an aggregate level, such as the level of the school, district, or state; prominent examples include school finance policies, curriculum development, and accountability changes. These policies are often difficult to evaluate experimentally, and education researchers instead rely on research designs based on repeated observations ("panel data") at the aggregate level. For example, we might estimate the impact of a new reading program using school-level average test scores at multiple time points surrounding its introduction. In this workshop, we will review the growing set of statistical tools for estimating causal effects with panel data of this form. We will first review common methods, such as difference-in-differences, fixed effects models, and Comparative Interrupted Time Series, as well as key conceptual issues, such as changes in measurement. We will then discuss complications that arise when treatment timing varies. Finally, we will briefly introduce some more recent methods that also incorporate matching and weighting. Throughout, we will use plenty of cartoons and bad jokes.
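
To make the setup concrete, here is a minimal sketch of a two-period difference-in-differences estimate in R using simulated school-level scores. It is not part of the webinar materials, and the variable names, sample sizes, and effect size are all hypothetical.

```r
# Hypothetical two-period difference-in-differences (DiD) example with
# simulated school-level average test scores. All numbers are invented.
set.seed(2021)

n_schools <- 200
schools <- data.frame(
  school  = rep(1:n_schools, each = 2),
  post    = rep(c(0, 1), times = n_schools),          # pre/post period indicator
  treated = rep(rbinom(n_schools, 1, 0.5), each = 2)  # adopted the new reading program?
)

# Scores combine a school effect, a common time trend, and a true program effect of 0.20
school_effect <- rep(rnorm(n_schools, 0, 0.5), each = 2)
schools$score <- 50 + school_effect +
  1.0 * schools$post +
  0.20 * schools$treated * schools$post +
  rnorm(nrow(schools), 0, 1)

# The treated-by-post interaction is the DiD estimate of the program's impact
did_fit <- lm(score ~ treated * post, data = schools)
summary(did_fit)$coefficients["treated:post", ]
```

The interaction coefficient recovers the simulated program effect under the usual parallel-trends assumption; the webinar takes up what happens when such assumptions are questionable or when treatment timing varies across units.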

 
Data Collection Methods for Cost-Effectiveness Analysis of Educational Interventions 
Tuesday, May 18, 2021, Noon-1:30 PM EDT 

Presenters: Rebecca Davis, University of Pennsylvania & Viviana Rodriguez, Columbia University 

Registration Rates: Nonmembers ($50), Members ($25), Student members (FREE) 

The Center for Benefit-Cost Studies of Education at the University of Pennsylvania is proud to partner with SREE to offer a webinar on data collection in cost analysis. Cost studies offer important context to effectiveness work and are increasingly required by funders, yet the “how to” of cost estimation is still ambiguous to many researchers. This webinar will offer clarity on the data collection phase of the cost estimation process. Using the ingredients method (Levin, McEwan, Belfield, Bowden, & Shand, 2018), we will explore data collection methods useful to researchers hoping to include estimation of costs in their existing studies or in funding proposals. This workshop will cover how to develop a cost data collection plan, potential sources of data, and potential pitfalls to avoid. We will discuss how integrating cost data collection with other study elements can help add cost estimation to a larger study efficiently. A preliminary understanding of the ingredients method is recommended but not required, and a brief introduction will be provided. Additional materials will be shared to help participants successfully plan each phase of their cost analysis.
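
As a rough illustration of the bookkeeping behind the ingredients method, the R sketch below totals hypothetical ingredient costs for a tutoring program and expresses them per student. It is not drawn from the webinar materials; every ingredient, quantity, and price is made up for the example.

```r
# Hypothetical ingredients-method tally for a tutoring program.
# Each ingredient is listed with a quantity and a unit price; total and
# per-student costs follow directly. All figures are invented.
ingredients <- data.frame(
  ingredient = c("Tutor time (hours)", "Coordinator time (hours)",
                 "Training sessions", "Student materials (sets)"),
  quantity   = c(1200, 300, 4, 150),
  unit_price = c(30, 45, 500, 12)
)

ingredients$cost <- ingredients$quantity * ingredients$unit_price

n_students       <- 150
total_cost       <- sum(ingredients$cost)
cost_per_student <- total_cost / n_students

ingredients
c(total = total_cost, per_student = cost_per_student)
```

In practice, the data collection plan discussed in the webinar is what fills in a table like this: identifying every ingredient, choosing price sources, and recording quantities consistently.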


2021 Summer Graduate Student Fellowship Applications Open May 1

SREE, in collaboration with Grantmakers for Education's (GFE) Data Impact Group, is pleased to offer the SREE Summer Fellows Program again this year. SREE student members will have the opportunity to spend the summer working on a 'real world' research project, answering a question the philanthropic community is asking to help inform its work. For more information on the program and application requirements, visit the SREE website. Projects will be posted around May 1.

May 3 – May 20: Fellowship applications accepted 

May 22 – May 26: Application review and semi-finalist selection 

June 1 – June 3: Interviews 

June 4: Final selection and notification 

June 15 – September 3: Fellows’ research period


Use of Research Evidence Webinar Recordings Now Available

The following recordings can be accessed for free on the SREE webinar page: 

  • Making Research Matter: Insights from Research on Research Use, given by Vivian Tseng, William T. Grant Foundation 

  • Testing the Effectiveness of Strategies to Promote Research Use: Learning from Studies in Education and Mental Health Settings, featuring Kimberly Becker, University of South Carolina; Bruce Chorpita, University of California, Los Angeles; Aubyn Stahmer, University of California, Davis; and Adam Gamoran, William T. Grant Foundation (Discussant)

Be on the lookout for additional SREE webinars on the topic of use of research evidence in the coming months!

SREE to Offer Virtual Workshop: Designing Simulations for Power Analysis (and Other Things): A Hands-on Workshop Series Using R

May 20, May 27, June 3, and June 10, 3:00 PM - 4:30 PM EDT

Instructors: James E. Pustejovsky, University of Wisconsin–Madison & Luke Miratrix, Harvard University

Registration Open

Registration Rates: Nonmembers ($130), Members ($90), Student members ($25)

Course Description: This course will cover how to design and program Monte Carlo simulations using R. Monte Carlo simulations are an essential tool of inquiry for quantitative methodologists and students of statistics, useful both for small-scale or informal investigations and for formal methodological research. As a practical example, simulations can be used to conduct power analyses for complex research designs such as multisite and cluster randomized trials (potentially with varying cluster sizes or attrition). Simulations are also critical for understanding the strengths and limitations of quantitative analytic methods. In many situations, more than one modeling approach is possible for addressing the same research question (or estimating the same target parameter). Simulations can be used to compare the performance of one approach versus another, which is useful for informing the design of analytic plans (such as plans included in pre-registered study protocols). As an example of the type of question researchers might encounter in designing an analytic plan: in the analysis of a multisite experiment, what are the benefits and costs of using a model that allows for cross-site impact variation?
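
For a flavor of the approach, here is a minimal R sketch (not the instructors' course code) that simulates a simple two-arm trial many times and estimates power as the share of replications with p < 0.05; the sample size, effect size, and significance level are arbitrary choices for illustration.

```r
# Minimal Monte Carlo power simulation for a simple two-arm randomized trial.
# Sample size, effect size, and alpha are arbitrary illustrative choices.
set.seed(42)

simulate_trial <- function(n_per_arm = 100, effect = 0.25) {
  control   <- rnorm(n_per_arm, mean = 0,      sd = 1)
  treatment <- rnorm(n_per_arm, mean = effect, sd = 1)
  t.test(treatment, control)$p.value  # p-value from one simulated trial
}

n_reps   <- 2000
p_values <- replicate(n_reps, simulate_trial())

# Estimated power: proportion of simulated trials that reject at alpha = 0.05
mean(p_values < 0.05)
```

Power analyses for more complex designs, such as multisite or cluster randomized trials with varying cluster sizes, follow the same pattern: simulate data from the assumed design, analyze each replication as planned, and summarize performance across replications.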

More information is available on the SREE website.


Members in the News

Lindsay Page

This summer, Lindsay will transition from the University of Pittsburgh to Brown University to join the Education Department and the Annenberg Institute for School Reform as the Annenberg Associate Professor of Education Policy.

 

Heather McCambly

Heather is completing her PhD in Human Development and Social Policy at Northwestern University and has taken a position as an Assistant Professor of Critical Higher Education Policy at the University of Pittsburgh's School of Education in the Department of Educational Foundations, Organizations, and Policy.

Greg Duncan

The National Academies' Division of Behavioral and Social Sciences and Education (DBASSE) has announced the 2021 Spring Webinar Series of the Hauser Policy Impact Fund. SREE Board Member Greg Duncan will be a speaker on the topic of Ending Child Poverty: Examining Poverty Trends and Policy Implications. More information can be found here.


SREE's First Virtual Convening Was a Success

From February 22 to March 5, presenters from across the country came together virtually to examine and discuss the role of research as it applies to the pandemic, inequities in education, and other topics. The first SREE Virtual Convening drew more than 150 individuals, who logged on to attend 10 sessions and 2 networking opportunities over the two weeks!

Thank you to the presenters, organizers, and attendees who made the first Virtual Convening a success! 

Registered attendees may click here to access the recordings at any time.



Dates to Keep in Mind: SREE 2021 Conference Timeline

Call for Papers Timeline 

February 25, 2021: Abstract submission site opens. 
April 15, 2021: Abstract submission deadline. 
July 1, 2021: Decision notifications sent. 
July 1, 2021: Preliminary program online. 
July 1, 2021: Registration opens. 
August 21, 2021: Early registration deadline.


Institutional Member Corner: Insight Policy Research

We are very excited to announce that Insight Policy Research turned 20 last month! Our work continues and builds on the strong foundation our principals created two decades ago—addressing issues that affect vulnerable populations. Our work over these years has involved rigorous program evaluations, technical assistance that effectively translates evidence-based research into practice, and cutting-edge data analytics and visualizations.  

Today, Insight’s researchers, data analysts, and learning and improvement experts reach thousands of policymakers, practitioners, educators, and program participants across the United States. Our success results from taking on projects we are passionate about, working collaboratively, and consistently exceeding client expectations. That is the Insight way.

We extend a big thank-you to all our clients and industry partners we have collaborated with to reduce barriers, enhance outcomes, and ensure equity across health, education, labor, and food and nutrition policy areas. We are particularly grateful for the relationships we have built throughout the past two decades and look forward to jumping into the next decade together!


Institutional Member Corner: MDRC

March marked the release of version 1.0 of MDRC’s The Higher Education Randomized Controlled Trial (THE-RCT) restricted-access dataset on ICPSR. THE-RCT is the largest individual-participant dataset from higher education randomized controlled trials, containing data from 25+ RCTs encompassing 45+ higher education institutions and 55,000+ students. MDRC researchers are currently using this database to explore research questions such as: 

  • Once a program is over, are student gains maintained? Do they fade out? Do they grow? To find out what we've learned (and to see how the database can be used), see here. 

  • Which program components (for example, enhanced advising) are associated with larger improvements in student success? 

  • What size of improvement in student success should various programs expect to achieve? 

  • Do short-term program effects predict long-term program effects? 

MDRC and ICPSR hope this dataset will inspire new scholarship that will help improve outcomes for low-income, underrepresented, and underprepared students, who have long been a focus of MDRC’s higher education studies. It also can serve as an excellent resource for methodological research and teaching and learning. The dataset will continue to grow as MDRC and other organizations add additional higher education RCTs. For more information on using the database, visit ICPSR. If you’re conducting a higher education RCT and want to include it in THE-RCT, please reach out to [email protected] or [email protected]. 

Acknowledgements: Funding for The Higher Education Randomized Controlled Trials (THE-RCT) project was provided by the Institute of Education Sciences, U.S. Department of Education, through grant R305A190161 to MDRC. Opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education. The database for THE-RCT was created with generous support from Arnold Ventures.


Institutional Member Corner: Decision Information Resources

Dr. Sylvia R. Epps, Chief Operating Officer at Decision Information Resources, Inc. (DIR), has been selected as an Affiliate Researcher for the Center for Culturally Responsive Evaluation and Assessment (CREA) at the University of Illinois at Urbana-Champaign. CREA brings together researchers and practitioners to address the growing need for policy-relevant studies that understand the nature and influence of cultural norms, practices, and expectations in the design, implementation, and evaluation of social and educational interventions. At DIR, an African American-owned research and evaluation firm based in Houston, Texas, Dr. Epps also serves as the Director of Research Operations, where she has directed multiple large-scale data collection and national evaluation projects, all focused on advancing social policies and improving programs for underserved populations and racial minorities.

Dr. Epps' selection as a CREA Affiliate is a significant step toward expanding DIR's commitment to conducting culturally responsive research. For more than 36 years, DIR has offered unique perspectives and adaptive methodologies that have been particularly effective in working with diverse groups and developing culturally responsive evaluation practices that support diversity, equity, and inclusion. Dr. Epps is pleased to be among the esteemed, like-minded scholars who serve as CREA Affiliate Researchers and looks forward to both contributing to and learning from others who apply a racial equity lens to their work, employ a participatory approach to evaluation projects, and raise awareness about incorporating contextual factors into evaluation methodologies.

To learn more, visit DIR's website at www.dir-online.com.