Afterschool Focus: Making Meaning Out of Data

High-quality afterschool programs don’t happen by accident. They are designed and implemented around a vision and supporting goals. When youth participate in high-quality programming that supports this vision and these goals, they are expected to have positive developmental experiences that lead to the intended outcomes. But how can this be verified?

Regular data analysis should be part of every afterschool program. When 21st CCLC programs collect and analyze data on program implementation, they can determine whether their programs are on track to meet their goals and can make any changes needed to reach these goals. As Marion Baldwin, from the Illinois Quality Afterschool team, explains, “One reason that this is a continuous process is because things often change and adjustments are sometimes required. By routinely looking at where you are and what must happen to get where you want to be, you will have a better chance of achieving your goals and realizing your program vision.”1


Planning for Data Analysis

To make data analysis a manageable activity, 21st CCLC teams will want to prioritize data that can help them measure progress toward program goals. Neal Naftzger, a principal researcher with the American Institutes for Research® who has studied afterschool and youth programming for nearly two decades, advises, “It comes down to being parsimonious with the pieces of data that you are going to look at.” He suggests programs focus on adopting three or four key performance indicators. “Elevate the things that are important for your program and your community. The reality is that in each given local context, individual stakeholders have different values for what they want to accomplish with you. And that could be your school stakeholders, your community stakeholders, your partners.”

When analyzing data on the implementation of a program, teams should consider the following categories and questions:

  • Adherence: Is the program being implemented as outlined in the submitted 21st CCLC application? Do activities or lessons reflect the activity description or lesson plan?
  • Exposure: How much of the programming did students participate in? This can include the number of sessions or contacts, attendance, or duration of sessions.
  • Quality: Is the program designed and delivered according to best practices? For example, are staff focused on clear instruction and ensuring that all children understand the concepts and material being taught? Are activities well organized and engaging? Do staff respond positively to students and speak in a warm, enthusiastic manner? Tools such as the Afterschool for Children and Teens Now (ACT Now) Coalition's Illinois Quality Program Self-Assessment can help afterschool programs assess performance on the basis of indicators aligned with the Illinois Statewide Quality Afterschool Standards.
  • Engagement: How are participants responding to the program? This may include their level of interest in a particular activity, the extent to which they find activities to be meaningful, and whether the program fosters community and positive relationships.2
Two broad types of data can help answer these questions.

Quantitative data are countable information. Examples include the number of days youth attend a 21st CCLC program, the number of special events a program hosts, standardized test scores, and classroom grades. This information can be shared as statistics, such as averages and percentages.

Qualitative data provide descriptive and conceptual information. Sources include surveys, interviews, activity observations, meeting minutes, lessons, and student portfolios.

When analyzed together, quantitative and qualitative data provide a more complete picture of a program than either type of data alone. For example, the quantitative measure of attendance will tell us whether a student participation goal has been met, but student and parent surveys will indicate what students liked most about the program and why they did or did not attend.3
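As a concrete illustration, the attendance statistics mentioned above take only a few lines to compute. This is a minimal sketch in Python; the attendance counts, enrollment figure, and participation goal are all hypothetical, not drawn from any actual program.

```python
# Hypothetical daily attendance counts for one 21st CCLC activity (one number per session)
attendance = [18, 22, 17, 25, 20, 19, 23, 21]
enrolled = 30          # hypothetical number of students enrolled in the activity
goal_rate = 0.65       # hypothetical participation goal: 65% average attendance

sessions = len(attendance)                 # exposure: number of sessions held
average = sum(attendance) / sessions       # average daily attendance
rate = average / enrolled                  # share of enrolled students attending on a typical day

print(f"Sessions held: {sessions}")
print(f"Average attendance: {average:.1f} students ({rate:.0%} of enrollment)")
print(f"Participation goal {'met' if rate >= goal_rate else 'not met'}")
```

Even a simple summary like this can anchor a data discussion; the qualitative follow-up (surveys, interviews) then explains *why* the numbers look the way they do.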

For guidance in selecting indicators and the data to support them, teams can consult the Every Hour Counts Measurement Framework, which provides information on system-, program-, and youth-level data.4 Tools like the Beyond the Bell Data Discussion Tool can help teams determine which data to analyze, how to present them, and the programming decisions that the data may inform.


Having Regular Data Discussions

Plan to allow time to fine-tune the data analysis process, especially if this is a new practice for a specific 21st CCLC program. “I like the idea of trying to develop a set of performance metrics that you’re going to commit to over the course of three or four years. It will probably take you a year to figure out what those metrics are and how to measure and analyze them,” says Naftzger. “Don’t cast the net super-wide because you can only go an inch deep on some of those, but really prioritize because there’s a cost in terms of accessing data, analyzing data, and really working the data,” he says. 

If afterschool programs have additional bandwidth, Naftzger also suggests pursuing an action-research question for a specified period of time. He adds that afterschool team members might ask themselves, "Maybe we have noticed that we can’t get any of our sixth- and seventh-graders to come to the program. So, we’re going to spend the school year just trying to unpack why that is the case. Is it because we’re not offering the right activities? Are there other things that they are involved in? Does the program have a reputation among the students? If so, what is it, and how can we change it?"

Formative data show a program’s progress while it is underway. They help identify and “form” solutions to challenges. For example, if a review of attendance records shows that students are not attending a program, a brief student survey can be administered to get their input. These formative data might show that students are not attending because of a scheduling conflict or because they do not find the topic as engaging as was hoped. This information makes a midcourse correction possible and can get the program back on track.
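A formative check of this kind can be as simple as flagging the weeks when attendance dips below an agreed threshold, so the team knows when to follow up with a survey. This sketch uses hypothetical weekly attendance rates and a hypothetical 60% trigger.

```python
# Hypothetical weekly attendance rates collected while the program is underway
weekly_rates = {"Week 1": 0.80, "Week 2": 0.74, "Week 3": 0.52, "Week 4": 0.48}
threshold = 0.60  # hypothetical trigger for a midcourse check-in

# Flag the weeks that fall below the threshold
flagged = [week for week, rate in weekly_rates.items() if rate < threshold]
if flagged:
    print(f"Low attendance in: {', '.join(flagged)} — consider a brief student survey")
```

The point is not the code itself but the habit: decide in advance what signal will prompt a conversation, then let the data start that conversation.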

Summative data tell us about outcomes and “summarize” results at the end of an extended period of programming. They include student assessments, end-of-semester surveys, and local evaluations. Summative data reveal whether program goals have been achieved.5

Just as they can support data analysis, stakeholders can play an essential role in making meaning from data. All stakeholders (program staff, caregivers, community partners, youth) should be engaged in reviewing data and planning for improvement. Those leading conversations should encourage all staff and stakeholders to weigh in. Data shouldn’t be analyzed in a vacuum, and stakeholders may perceive data in different ways. They may bring additional ideas and resources for addressing challenges identified during data analysis. Baldwin suggests making sure that data and evaluation findings are formatted in easy-to-understand ways before they are shared with staff and stakeholders. 

Depending on what type of data are collected, teams will want to discuss where they met their goals, what worked and what didn’t, which activities were implemented with fidelity, and how engaged their students were. They may want to identify any themes in the data, asking themselves, “Is there anything surprising? Is there anything that is not surprising?” Discussions may take into account extenuating circumstances that arose during the academic year (for example, when COVID-19 disrupted programs in 2020) as well as issues like staff turnover or resource allocation, all of which could affect programming. Data analysis is part of the larger continuous improvement process. In summary, teams should:

  • Focus on what is most important for a program and community.
  • Adopt three or four key performance indicators.
  • Take time to fine-tune the data analysis process.
  • Use action research to explore specific challenges.
  • Engage stakeholders.
  • Continually reflect on goals.

The final step in reflection is asking, “Did we meet our goals? How can we make the changes needed to meet our vision and goals?” This is an opportunity to change course and implement the adjustments needed for success. It is also an opportunity to share the successes that make a 21st CCLC program a valuable component of student achievement! The Beyond the Bell Continuous Improvement Reflection Tool can help teams integrate data analysis into ongoing work toward program quality.




American Institutes for Research. (2019). Data: Your continuous improvement reality check.

American Institutes for Research. (2022). Youth engagement in practice.

Every Hour Counts. (2021). Putting data to work for young people: A framework for measurement, continuous improvement, and equitable systems.

Texas Education Agency, American Institutes for Research, and Diehl Consulting Group. (2019). Texas ACE Local Evaluation Guide.

You for Youth. (2019). The continuous improvement launch pad.



1 American Institutes for Research (AIR), 2019. 

2 AIR, 2022; Texas Education Agency, AIR, & Diehl Consulting Group, 2019.

3 You for Youth, 2019.

4 Every Hour Counts, 2021.

5 You for Youth, 2019.