PROGRAM EVALUATION

Evaluation doesn’t have to be onerous. Yes, it probably requires some extra work but, like balancing a checkbook, it is an essential activity for staying on top of what’s really going on. You don’t work as hard as you do to deliver a social or environmental program for the public good only to not know how things are REALLY working and whether you’re actually moving the needle. Things can look great on paper … but are they rolling out as you intend them to?

Every nonprofit or government organization running a program for the public good needs to know it is a good steward of the resources it is given – both for accountability and for expanding the pool of supporters who are fans of your work. Flying blind is never a good idea – you need information collected systematically so you can make any needed choices to improve a program and/or have solid footing to back up claims of the “good work” you do.

Do any of these sound familiar?

It costs too much, we don’t have the money …
We don’t have time right now … we’re too busy …
We don’t really know how …
We’re a bit concerned about what we might learn …
We don’t want to look bad to our funders …
We know what we’re doing – we’ve always done the program this way and been successful …

We can help you! We’ve done this a lot, so we know how to design and implement evaluations so you get the high-quality information you need for making decisions and promoting your work. And … we talk like program people (not academics…) because that’s where we’ve come from. It’s not enough that we as evaluators understand what the results mean; our job is to make sure you understand your own results, in plain English.

TYPES OF EVALUATION WE DO

PROCESS EVALUATION

WHAT: Studies HOW program activities are delivered, including fidelity of implementation across sites or service providers. Looks at program, management, and infrastructure together. Helps determine the capacity of an organization to deliver on its promised outcomes. Process evaluation helps stakeholders see how a program outcome or impact was achieved.

WHEN: Program improvement; checking fidelity of service model implementation, especially across multiple sites.

IMPACT EVALUATION

WHAT: Assessment of changes that can be attributed to a particular intervention or program model. Seeks to answer cause-and-effect questions and to demonstrate that changes are due to the program, not outside factors, by using a comparison or control group.

WHEN: Deciding whether to expand, modify, or eliminate a particular program; fuller accountability.

EVALUATION SERVICES WE OFFER

There are many ways we can be of assistance, depending on your specific evaluation needs.

We can do the “FULL MEAL DEAL” – work with you to do everything soup to nuts for the evaluation: plan the evaluation, develop all the data collection tools and processes, collect and analyze the data, and write a report with recommendations.

We also provide “A LA CARTE” services so you can get just the specific support you need – whether you are doing an internal evaluation, only want help with pieces of the evaluation process, or want coaching as you work with other evaluators.

We also have extensive experience providing evaluation training for nonprofit and government agency staff – both in person and via webinar.

Contact Susan Hyatt today to discuss your specific evaluation needs and/or request a proposal from us.

A LA CARTE SERVICES WE OFFER

EVALUATION PLANNING

We work with you to write the plan for your evaluation, including the purpose and type of evaluation, your specific research questions (what you want to know), identification of key stakeholders, data collection methodology(ies), data sources and tools, data analysis and statistical testing, timelines, and reporting.

DATA COLLECTION

We develop or identify data collection tools (e.g., surveys, observation, focus group or interview protocols, pre/post-tests) and create procedures, systems, and timelines to access the data … and then we get to it.

Our team can collect all the data, share data collection responsibilities with your staff, and/or access information you have already collected.

DATA ANALYSIS

We can run the analysis for data you have already collected or that we collect for you.

Qualitative data (the words): content analysis to identify key themes and determine how often each theme is mentioned.

Quantitative data (the numbers): descriptive and inferential statistical testing (using software such as SPSS or SAS) to summarize data from a sample or to determine the probability that an observed difference between groups is a dependable one rather than one that might have happened by chance.
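For readers who like to peek under the hood, here is a minimal sketch of the kind of group comparison described above. It uses Python and the SciPy library rather than SPSS or SAS, the scores are entirely made-up, and the independent-samples t-test shown is just one of many tests we might choose depending on your data.

    # Minimal sketch: comparing outcome scores for a program group vs. a comparison group.
    # All numbers below are made up for illustration only.
    from scipy import stats

    program_group = [72, 85, 78, 90, 66, 81, 77, 88]     # e.g., post-program assessment scores
    comparison_group = [65, 70, 74, 68, 72, 61, 75, 69]  # e.g., scores from a similar group not served

    # Descriptive statistics: summarize each group.
    print("Program mean:", sum(program_group) / len(program_group))
    print("Comparison mean:", sum(comparison_group) / len(comparison_group))

    # Inferential statistics: the t-test estimates the probability (p-value) that a
    # difference this large could have happened by chance alone.
    t_stat, p_value = stats.ttest_ind(program_group, comparison_group)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p-value (e.g., < 0.05) suggests a dependable difference

In plain English: the descriptive numbers tell you what happened in your sample, and the p-value tells you how confident you can be that the difference between the groups is real rather than luck.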

BOOK SUE

To book Sue for a speaking engagement or to provide a workshop, please call 720-593-0808 or email us at info@bigpurposebigimpact.com.

SAMPLES OF EVALUATIONS WE’VE DONE

The following is a list of selected program evaluations designed and implemented by Susan Hyatt and her team for a variety of nonprofit organizations and government agencies.