It is important to plan for and build in data collection to measure the effectiveness of your system-building work. Collecting data on key measures of effectiveness will help you monitor and improve your system-building work on an ongoing basis and document its value to funders, partners, and policymakers.

The Noyce Foundation asked state networks to engage in an ambitious set of data collection efforts that went broader (measuring the impact of system-building work on the supply of STEM programming across the state) and deeper (measuring the impact of STEM system-building work on program quality and individual students) than most states had previously undertaken. This work was challenging for networks, but it resulted in many lessons learned and a refined set of tools and processes for measuring STEM system-building work. These tools and processes provide an important foundation on which you can build your own efforts to measure effectiveness.

A helpful starting point for designing your data collection is to lay out a basic logic model that includes your desired outcomes, the strategies you have prioritized for implementation, the measures you will use to track implementation of those strategies and progress toward outcomes, and the data collection strategy you will use (i.e., who will collect what data from whom?).

As you design your data collection plan, be realistic about the capacity of your network staff and the providers you work with to collect, report, and analyze data. Many of the Noyce grantees found that they needed additional administrative staff to manage the data requirements of the Noyce grant. They also found that programs required significant support to collect data through instruments such as the Dimensions of Success (DoS) tool or the Common Instrument. Once again, you will likely have greater success and achieve greater sustainability in your data collection efforts if you integrate your data measures into the data requirements of key afterschool funders in your state. The system-building strategies and state examples below highlight approaches Noyce grantees took to measure the effectiveness of their efforts.

System Building Strategies & Tips

  • Align your data collection efforts to the goals and outcomes you are trying to achieve. Articulate a clear logic model that connects your measures to the particular strategies you are implementing and the outcomes you are seeking to achieve.
  • Be realistic about the time, expertise, and resources needed to collect, analyze, and use data effectively. It is better to start small and collect quality data than to create a data plan that is unrealistic to implement and frustrating for staff and partners.
  • Use existing tools for measuring the quality and outcomes of informal science education in afterschool and tap into ongoing data collection efforts within afterschool programs in your state.
  • Seek to institutionalize key measures of the supply and quality of STEM programming into ongoing data collection efforts in the afterschool field, including use of the DoS and YPQA observation tools by 21st Century Community Learning Center (21st CCLC) and state-funded programs, and integration of Common Instrument measures into ongoing data collection from 21st CCLC programs.

State Examples

The Missouri AfterSchool Network partnered with the state Department of Education to ensure that observations of 21st CCLC and licensed school-age child care programs include observation of STEM programming through the YPQA – STEM. The network also helped the state integrate Common Instrument questions into the ongoing data collected to evaluate 21st CCLC programs. They are refining the instrument and will make a web-based version of it publicly available.

Connecting data to professional development and training – both the Indiana Afterschool Network and Beyond School Bells, the Nebraska afterschool network, are exploring how to use DoS as a continuous improvement tool to identify which training and professional development supports are needed to improve STEM quality. In Nebraska, the network is partnering with Click2SciencePD, housed at the University of Nebraska-Lincoln, and with city intermediaries to connect insights from DoS trainings and observations to the 20 essential skills on the Click2SciencePD site.

Michigan Afterschool Partnership – the partnership's strong relationship with the Michigan STEM Partnership (part of the STEMx network) recently resulted in the inclusion of afterschool and summer programs in the MI STEM registry. Funding for the registry has been included in the state budget so that it will be maintained and updated regularly. The initiative has also funded five STEM hubs to increase awareness of, interest in, and participation in STEM classes and industries. In the most recent round of funding, 70 percent of funds were allocated to afterschool and summer programs.

The University of California, Irvine conducted an evaluation of The Power of Discovery: STEM² (PoD) initiative for the 2013-14 academic year. The report analyzes the program's workforce participation, changes to the PoD initiative, and the relationships among activities, programs, staff beliefs, and the resulting student outcomes. It finds that the first year of PoD implementation was, overall, an effective application of STEM programming.

Tools & Resources

National Resources

Dimensions of Success

Dimensions of Success (DoS) is an observational tool that measures 12 dimensions of STEM quality in out-of-school time. The tool is a four-level rubric, with a level of quality associated with each dimension. A handbook detailing the rubrics, sample ratings, data reports, and summaries can be found here.

Common Instrument

The Common Instrument is a survey for youth ages 10 and older that measures their interest and engagement in science in informal and out-of-school time settings.

Youth Program Quality Assessment and School-Age Program Quality Assessment

The Youth Program Quality Assessment (YPQA) is an instrument used to evaluate the quality of youth programs and identify staff training needs. Its dual purpose supports both monitoring of programming and self-assessment by program staff. The tool has been validated for youth in grades 4-12.