Program Evaluation

Research and evaluation play a vital role in helping policymakers, program funders, and education leaders understand the impacts that specific education policies or programs have on children and the adults who serve them. In an era of scarce resources, leaders more than ever need access to reliable, objective data to understand how they can most effectively invest their time, energy, dollars, and staffing.

APA's priorities for research and evaluation are to: 1) tailor projects to meet the particular needs of our clients; 2) gather input from multiple sources to create a broad perspective; and 3) provide clear and understandable findings. Our tailored approach allows us to advise clients on the types of outcome measures and research designs that make the most sense for them. As one of three organizations that make up REL Central – the federal education laboratory for the Central region of the country – APA is also able to build alliances and networks of stakeholders who are interested in particular areas of education data and research. (Please click here for more information on APA's work with REL Central, and click here for the REL Central website.) This enables us to connect clients with an even wider base of experience and knowledge.

APA staff members have expertise in conducting a wide range of program evaluation and research activities, from rigorous randomized controlled trials, to quasi-experimental study designs, to a mix of quantitative and qualitative data gathering through surveys, interviews, and focus groups. These approaches provide clients with both:
  • Formative feedback – to help identify areas where programs are most effective and where they can be improved; and
  • Summative feedback – to help clients understand how their programs affect students, teachers, or whoever their target population may be.

APA has found that, when these two types of feedback are combined effectively, and findings are clearly explained, education leaders are empowered to take decisive action to modify, improve, eliminate, or expand existing programs and policies.

Please click on the links below to learn more about some of our recent research and evaluation projects. To download a fact sheet summarizing our work in this area, please click here.

The Denver Preschool Program (DPP) was approved by voters in November 2006 and connects Denver-area families with high-quality preschool. DPP focuses on increasing access to preschool, providing preschool choice to families, and raising the quality of preschool programs in Denver. Since 2006, APA has led the evaluation team for DPP, helping the program understand how parents, children, and preschool providers use DPP and identifying strengths and weaknesses in the program's design. To inform the annual program evaluation, APA conducts parent and provider surveys, focus groups, and interviews, and reviews program data. APA also works closely with Clayton Early Learning on the child outcomes evaluation, and recently began analyzing 3rd grade TCAP results, using a propensity score matching approach, to assess the persistence of DPP's impact.
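Propensity score matching, mentioned above, pairs each program participant with a demographically similar non-participant so that outcome differences can more credibly be attributed to the program rather than to who enrolled. The sketch below is a minimal, hypothetical illustration of the matching step only (greedy 1:1 nearest-neighbor matching on pre-estimated scores); the function name and inputs are ours, and this is not APA's actual methodology or code.

```python
def propensity_score_match(treated_scores, control_scores):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    Each score is the estimated probability (e.g. from a logistic
    regression on family and child characteristics) that a child
    participates in the program. Returns (treated_index, control_index)
    pairs; each comparison child is matched at most once.
    """
    available = set(range(len(control_scores)))
    pairs = []
    for i, score in enumerate(treated_scores):
        if not available:
            break  # ran out of unmatched comparison children
        # pick the unmatched comparison child with the closest score
        j = min(available, key=lambda k: abs(control_scores[k] - score))
        pairs.append((i, j))
        available.remove(j)
    return pairs

# Illustrative scores only: two participants, three comparison children.
pairs = propensity_score_match([0.8, 0.3], [0.31, 0.5, 0.79])
# Outcomes (e.g. 3rd grade assessment results) would then be compared
# within the matched pairs to estimate the program's impact.
```

Real evaluations typically add refinements this sketch omits, such as caliper limits on acceptable score distance and checks that the matched groups are balanced on observed covariates.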
APA is the evaluator for three programs currently funded through Mile High United Way's Social Innovation Fund (SIF). The SIF program leverages $1.8 million each year in federal funds to create up to $6.6 million in literacy investments in Colorado's communities. In particular, Mile High United Way supports a series of innovative programs designed to improve literacy outcomes for children from birth to age eight. Each program is required to include a rigorous evaluation component that measures: 1) how consistently and effectively the programs are being implemented; and 2) the impacts of the program on child literacy and learning. Each evaluation plan must undergo significant external review, obtaining approval both from national experts in program evaluation design and from an independent Institutional Review Board (IRB).

The three programs that APA evaluates use different approaches to improve literacy outcomes, and each requires its own evaluation design:

  • Providers Advancing Student Outcomes (PASO): This program provides training to preschool providers who serve primarily low-income, Hispanic communities. These providers tend to be family, friends, or neighbors of the children they serve and often have little or no training in early childhood literacy or education. PASO provides such training through a year-long course that includes classwork and in-home visits to preschool providers. APA conducted a formative evaluation of PASO in 2011 designed to inform program leaders and funders about aspects of the program that could be enhanced or improved. APA used a mix of surveys, interviews, observations, focus groups, and reviews of program data to offer a set of objective recommendations. PASO leaders accepted and implemented several recommended changes, such as offering providers the opportunity to earn a nationally recognized credential for participating in the program. APA is now studying how the program is implemented across several communities and has developed an assessment-based approach to measure the program's impacts on children and the providers who serve them over a three-year period.
  • Reading Partners: This program provides tutoring services directly to struggling elementary school readers using a national reading curriculum and trained volunteers. APA worked closely with Reading Partners leaders as well as representatives of Mile High United Way to create an evaluation design that met not only our client's needs but also the rigorous design requirements of external evaluation reviewers. APA's approach began with an implementation evaluation that allowed for an analysis of how consistently the program operated across multiple school and district sites. This work led to the development of an approved study that will use a statistical approach (propensity score matching) to examine program impacts on students over a three-year period based on district reading assessments. The design of this impact study required considerable flexibility to accommodate the shifting needs of participating schools, districts, and students, as well as the Reading Partners program. The design will also remain flexible to accommodate the rapidly changing district assessment environment.
  • Summer Advantage: This program seeks to reduce summer learning loss in children by providing five weeks of summer academic instruction in reading and math along with enrichment activities and field trips. In consultation with program leaders and Mile High United Way, APA developed an approved evaluation design which led to an analysis in 2013 to determine whether the program was being implemented consistently across schools, and to understand any challenges associated with maintaining program fidelity. This evaluation included site visits, interviews with key program and school district staff and leadership, and a review of existing program data and materials. The implementation evaluation also informed the design of APA's impact evaluation of the program, which will use statistical analysis to understand how the program affects student performance on district reading and math assessments.
  • Jeffco Summer of Early Learning: APA is conducting an impact evaluation of the Jeffco Summer of Early Learning (JSEL) program. JSEL is a summer program designed to provide academic support and enrichment to elementary school children in the Jefferson County School District in Colorado. The APA evaluation is part of a multi-year effort to understand the impact of the program on student achievement.

APA has conducted numerous evaluations for school districts, state departments of education, and national and local foundations on a variety of topics:

  • A formative evaluation of a program in Mississippi designed to improve student performance in one of the poorest areas of the state. APA conducted site visits and data analysis to provide insight into which aspects of the program were having an impact on educators and children in the targeted communities. Recommendations were used by the program to help prioritize staff deployment.
  • An analysis for the State Department of Education in Louisiana of schools that are "beating the odds" in terms of their student demographics and high performance. APA created a data tool to help identify beat-the-odds schools, and worked with state leaders to conduct site visits, interviews, and other activities to understand the conditions and policies that enabled these schools to succeed.
  • An evaluation of teacher turnover in high-need elementary schools to help school and district leaders identify root causes of teacher turnover, and to offer recommended strategies to address it. APA's recommendations were adopted by district leaders, who responded by developing an innovative mentoring program to support new teachers.
  • An evaluation of a state-level program designed to work with targeted school districts to close student achievement gaps. APA conducted school and district site visits and ran focus groups with teachers, school leaders, district leaders, and state department of education personnel. APA also gathered and analyzed student performance data for each district to determine impacts of the program over a multi-year period. Recommendations were then provided and presented to state agency personnel.
  • An analysis of a school district’s modification of its existing teacher evaluation system. APA developed a district-wide survey of teachers to gather data on how the new system was perceived and implemented by teachers. Focus groups with school and district leaders also informed a set of recommendations that were presented to the district. These recommendations were used as the basis of follow-up conversations with school leaders to improve the new evaluation system.