by Molly Baltman, Assistant Director/Grantmaking
Photo: Foundation employees and other funders discuss strategy.
To launch the Communities Program’s Unified Outcomes Project, over 70 people representing McCormick Foundation grantees, public funders, private foundations and McCormick Foundation staff gathered for an afternoon to discuss evaluation in the areas of child trauma and child abuse prevention. Led by evaluator Dr. Tania Rempert, the meeting was the first in a series of grantee convenings to discuss which outcomes and indicators are most meaningful to practitioners and agencies, to streamline evaluation, and to provide a forum for agency representatives to talk about evaluation practices at their organizations.
We hope that through the upcoming summer workgroups, grantees will learn and share evaluation approaches and practices that build on the high-quality services they already provide. Through this work, we aim to identify common indicators and outcomes the field can use when evaluating program impact (in context with other factors such as program models, staff credentials, financial stability, need, and barriers), and to provide the aggregated data back to grantees for benchmarking and continued learning and improvement.
During the June 17th meeting, funders posed three questions to grantees to better understand their struggles, needs and feedback around evaluation:
Question #1: What kind of support around data collection would you find helpful?
Grantees were interested in computer-generated versions of research outcome-based tools that record responses and indicators, funding for data collection and reporting (including training for new staff), guidance on systems to help analyze the data, access to technical assistance, narrative-based evaluative questions to accompany measurement tools, and a universal database to align data collection. There was also interest in funders aligning their outcome reporting requirements.
Question #2: What kinds of outcomes requested/required are NOT HELPFUL? What would you prefer?
Grantees wanted benchmark targets based on research, clarity around the City’s questionnaire and how it correlates with reporting, the opportunity to choose which tools and outcomes are used, more awareness of the “ceiling effect” that constrains child-reported outcomes, the ability to report a mixture of qualitative and quantitative data, less focus on outputs (which are not as useful as outcomes), and less granular demographic breakdown requirements in reporting.
Question #3: What kind of support around evaluation would you find most helpful?
Grantees were interested in access to evaluators who could provide on-site technical assistance for creating systems to collect, aggregate, and verify data. Respondents wanted training on participatory evaluation methods and data collection, funding for statistical software and consulting on analyzing outcome data, a library of recommended evidence-based tools organized by category, free evaluation tools, and the opportunity to participate in learning communities to compare and discover new tools. From funders, they wanted clarity on minimum and desired expectations, grants to support continuing education in evaluation, training on best practices for administering evaluation tools, ideas for motivating staff to ensure more valid responses, and an easy-to-use report that pulls together all of the data.
Photo: Dr. Tania Rempert works on evaluation with grantees.