Teen Pregnancy Prevention Evaluation Technical Assistance: A Case Study

Bibliographic Details
Published in: Evaluation Review, Vol. 46, no. 1, pp. 32-57
Main Authors: Knab, Jean; Cole, Russell
Format: Journal Article
Language:English
Published: Los Angeles, CA: SAGE Publications, 01-02-2022
Description
Summary: Purpose: This case study discusses Mathematica’s experience providing large-scale evaluation technical assistance (ETA) to 65 grantees across two cohorts of Teen Pregnancy Prevention (TPP) Program grants. The grantees were required to conduct rigorous evaluations with specific evaluation benchmarks. This case study provides an overview of the TPP grant program, the evaluation requirements, the ETA provider and other key stakeholders, and the ETA provided to the grantees. Finally, it discusses the successes, challenges, and lessons learned from the effort. Conclusion: One important lesson learned is that there are two related evaluation features, strong counterfactuals and sufficient target sample sizes, that funders should attend to prior to selecting awardees, because they are not easy to change through ETA. In addition, if the funder is focused on particular outcomes (for TPP, the goal was to improve sexual behavior outcomes), it should prioritize studies with an opportunity to observe differences in these outcomes across conditions; several TPP grantees served young populations in which sexual behavior outcomes were not observed or were rare, limiting the opportunity to observe impacts. Unless funders are attentive to weeding out evaluations with critical limitations during the funding process, requiring grantees to conduct impact evaluations supported by ETA might unintentionally foster internally valid, yet underpowered, studies that show nonsignificant program impacts. The TPP funder was able to overcome some of the limitations of the grantee evaluations by funding additional evidence-building activities, including federally led evaluations and a large meta-analysis of the effort, as part of a broader learning agenda.
ISSN: 0193-841X, 1552-3926
DOI: 10.1177/0193841X20975279