Program Profile: Behind Every Successful Fox Valley Park District 21st CCLC Site Is a Strong Evaluation Collaboration

The Fox Valley Park District (FVPD) 21st CCLC afterschool program in West Aurora School District 129 has become so successful that there are waiting lists for admission to the program’s four sites. Parents and students alike value the afterschool program, and the students’ younger siblings look forward to attending the program when they are eligible. Many factors have contributed to the 21st CCLC’s success and popularity. Among those factors is a close collaboration with the program’s evaluators to use evaluation results to drive continuous program improvement. 

For the past 10 years, the FVPD 21st CCLC team has worked with evaluators from Aurora University for their local evaluation. The two evaluators, Dr. Christine Bruhn and Jessica Ortiz, are more than just researchers or number crunchers. They are partners and problem solvers. Lead evaluator Jessica Ortiz says about the evaluators’ relationships with FVPD, “I feel like it is really important that the [21st CCLC] staff know we are there for them, that we support them. It’s really a partnership.” She adds, “Their [FVPD 21st CCLC] staff are very invested in the program and in the students. And that helps me tremendously. I think they see the value in evaluation; they are invested in helping the program be as efficient as possible.” 


“Their [FVPD 21st CCLC] staff are very invested in the program and in the students. . . . I think they see the value in evaluation; they are invested in helping the program be as efficient as possible.”

—Jessica Ortiz, lead evaluator for FVPD 21st CCLC program


Partnership and Flexibility Are Key

To make the most of their evaluation, the evaluators and 21st CCLC team collaboratively plan the evaluation. Together, they determine what data will help 21st CCLC leadership and other stakeholders understand the program’s strengths and challenges and find the most effective ways to collect data. Bruhn, who serves as the principal investigator of the Aurora University evaluation team, sees flexibility and nimbleness as an essential part of the process, providing an example of how the 21st CCLC tried new approaches to data collection when the original plan was unsuccessful. “For the parent surveys, initially, we had paper and pencil surveys. But you know middle schoolers. . . [the surveys] weren’t being seen by parents,” says Bruhn. “So, we shifted to the phone. Now they are finding phone isn’t as effective as it once was, so they are exploring other options.” 

Bruhn notes the importance of collecting different streams of data, whether from the school district, the schools themselves, the 21st CCLC sites, the students, or the parents. “Being able to capture that range of experiences has really helped us to be able to understand more deeply. It isn’t always easy to gather data that are above and beyond strictly what’s required. I think having done so has helped us to really be able to inform program stakeholders in a more authentic, thoughtful way.” In addition to collecting data for the annual evaluation, the evaluators are quick to respond to 21st CCLC requests for additional tools that can gauge youth interest in certain programming or gather staff feedback, providing the ad hoc data that program staff need to guide their decision making.


Focusing on Continuous Program Improvement

Program director Karen Harkness says that her team embraces evaluation results as an opportunity for learning and continuous improvement. “I don’t ever take it personally, thinking, ‘Oh I haven’t done my job or our program staff haven’t done a good job’ or ‘We’ve missed the boat’ because it is never presented in that format,” she says. Harkness notes that the 21st CCLC team’s collaboration with their evaluators helps staff see the evaluation process as an opportunity for the program. “Both Chris and Jessica are always supportive. If we need help, they’re there.” 

Like Harkness, program coordinator Katie Kulakowski views the evaluation as an opportunity for program improvement instead of just a grant requirement. “The evaluation data demonstrate our program’s effectiveness and provide valuable insight on how we can better serve our students throughout the year, not just in one given time frame.” 

“The evaluation data demonstrate our program’s effectiveness and provide valuable insight on how we can better serve our students.”

—Katie Kulakowski, FVPD 21st CCLC


When the 21st CCLC team receives evaluation results, staff review the results for each site, discussing what is working, what isn’t, and how best to allocate the program’s resources. After discussing the evaluation results, the 21st CCLC team develops an action plan and timeline for each site so they can adjust programming accordingly. For example, evaluation results may show that one site needs more professional development on social and emotional learning (SEL) while another needs more focus on team building. Kulakowski notes that collecting data from diverse stakeholders is important to understanding the program. “By hearing their [stakeholders’] needs, we can also facilitate the progress with programming,” she says. “That’s one of the benefits of being so collaborative with the community. When we see a need at one of our schools, we can fine-tune what we’re pushing in as support. It takes a village, that’s for sure.” 

One example of a programming change in response to evaluation results involved activities related to SEL and English language arts. The evaluation report identified needs in this area for one site, so the 21st CCLC team developed programming that included art journaling and gratitude. “We were able to develop programming that was specific to the needs of our evaluation for SEL and language,” says Ortiz. “And it is kind of nice—the kids think they are getting something new and cool.” 

The evaluators and 21st CCLC program staff alike believe afterschool in West Aurora is truly a community-wide effort and resource. “The community itself, the parents, and the students have come to rely on this program,” says Ortiz. “This is something they kind of expect and really appreciate, but I don’t think it is taken for granted.”

Highlights from the program’s student and parent surveys include the following:

  • 95% of students across all sites said that the program helped them be “more involved” and “make new friends.”
  • 89% of students reported that the program helped them “get homework done,” “try harder,” and “do better.”
  • 82% of students reported that the program helped them “feel good” about themselves and “give new things a try.” 
  • 100% of parents reported that the program was a safe environment for their students. 
  • 97% of parents reported that the program helped their students get homework done and improve behavior; in addition, they felt that the program was constructive, reported that staff were positive, and reported that their students enjoyed the program. 
  • 97% of parents also reported that they were satisfied with the program and said that they were involved in their students’ education.
  • 94% of parents felt that the program helped their students improve their grades and was a positive influence on youth.