Evaluation Guidelines

RRF Foundation for Aging promotes the use of evaluation as part of every RRF grant in order to:

  • Encourage applicants and grantees to become more effective learning organizations, gathering and systematically analyzing important client and program information about the nature, reach, quality, and efficiency of the services they provide
  • Enable the Foundation to better understand the value of the investments we make and the lessons that funded projects can teach us about how to invest our grant dollars more effectively in the future
  • Add to knowledge in the field about best practices in services for older adults by supporting, when appropriate, rigorous experimental or quasi-experimental outcome studies

RRF uses three categories of evaluation:

  • Implementation – Providing practical lessons that emerge from putting a new project into action
  • Process – Generating a blueprint of a program in action
  • Outcome – Determining if a program can improve one or more targeted results

Implementation evaluation asks about the practical lessons that emerge from putting a new project into action. Rarely does a project go off without a hitch, and lessons learned during implementation help organizations identify whether an approach needs to be modified and what critical next steps are required. In turn, these lessons can help others avoid the same pitfalls. Finally, they teach the Foundation important lessons that can help improve our grantmaking.

Implementation evaluation is the appropriate focus for the evaluation of:

  • Demonstration projects for training or service delivery where the intervention/training model is still undergoing development
  • Projects that seek to replicate an existing model in one or more new settings, or with a different population
  • Planning and seed grants
  • Service expansion grants
  • Technical assistance grants
  • Advocacy and community organizing grants

Key Questions

The following questions may be used to develop an effective implementation evaluation. These questions may help applicants create an outline of how they plan to gather information about their project. The list is intended to be illustrative: some questions may not be relevant to all projects, and applicants may want to include other questions that are not listed below.

  • What is your program model (goals, objectives, activities, resource inputs, short- and long-term outcomes, types of clients/participants targeted, timeframe, budget, etc.)?
  • What aspects of your original program model were implemented as planned and what had to be changed?
  • Why were revisions needed?
  • What changes were made, and why did you select these new approaches and discard other options?
  • What aspects of the program do you feel worked particularly well, and why?
  • Is there evidence that any unintended outcomes occurred, either positive or negative, for the program, its staff, or participants? For example, did you receive unexpected publicity, attract new volunteers, connect to new partner organizations, or identify and meet unexpected client needs? Alternatively, did the project cause stress among or between staff, divert staff from their other responsibilities to clients, or cost more than expected?
  • How do you explain unintended outcomes?
  • Did you confront any barriers that were not anticipated?
  • Will you do things differently now, based on lessons learned to date?
  • What next steps will you take and/or do you recommend to further revise the model and why?
  • Are there conditions under which you would recommend that this program or service not be used, and why?

Process evaluation documents how a program operates by describing the characteristics of clients and staff, the nature of services offered and methods of delivery, and patterns of service use, essentially generating a blueprint of the program in action. Effective process evaluation will allow applicants to:

  • Describe how funds were used
  • Provide a guide to others wishing to replicate the project and study the outcomes of a model program
  • Describe what the “intervention” consisted of in reality, not just as designed

Process evaluation is appropriate for:

  • Direct service and training projects
  • Conferences
  • Model and demonstration projects

Key Questions

The following questions may be used to develop an effective process evaluation. These questions may help applicants create an outline of how they plan to gather information about their project. The list is intended to be illustrative: some questions may not be relevant to all projects, and applicants may want to include other questions that are not listed below.

  • What were the goals and specific objectives of the project?
  • For each objective, what specific steps were taken and how were they accomplished?
  • For each objective or program component, what resources/inputs were needed (type, numbers, and time commitments of staff, physical space(s), equipment, volunteers, etc.)?
  • What type(s) of client(s) did each program element target?
  • What were the characteristics of clients actually served (age, gender, health status, living situation, family status, cognitive status, functional status, etc.)?
  • Were the characteristics of clients/participants in line with the targeted population? If not, why not? Were any type(s) of clients underrepresented? If so, why do you feel these groups were not reached?
  • How many clients/participants received each service? Was this more or less than your goal? Why do you think demand was higher or lower than expected?
  • How many units of each type of service/program component did clients receive (e.g., hours, rides, course sessions, friendly visits, days of adult day care, rehabilitation sessions, etc.)?
  • How much did the program cost? How did this break down for individual parts of larger projects?
  • How satisfied were clients with services provided? Were there any aspects of program operation that clients or staff recommended changing and why?

Outcome evaluation is what most people think of when they hear the term evaluation. It focuses on determining whether a program improves one or more targeted outcomes for those served (e.g., health, mental health, quality of life, risk of falling, re-hospitalization rates, etc.). Outcome evaluation requires that targeted clients be compared to a control group that is similar to them in every way except that its members are not exposed to the program being studied.

Outcome evaluation is often expensive and time-consuming, and it requires the involvement of experts with documented knowledge of and experience with evaluation research and statistics. The Foundation generally funds outcome studies only when the proposed project is likely to be replicable and has already been pilot-tested to document that it is feasible to implement. Specifically, outcome evaluation is relevant for applicants who propose to test the effects of programs that are innovative, replicable, and already shown to be feasible.

RRF has developed two sets of guidelines for outcome evaluation: one for applicants with limited research expertise and a second for experienced researchers.  Applicants with limited research and evaluation experience are encouraged to include funds for an expert evaluation consultant in their program budget.

These guidelines are presented to help applicants with limited research experience to develop effective outcome evaluations for model or demonstration projects:

  • Outcomes: List the specific, measurable outcomes of your planned activities that your evaluation will test. You may wish to phrase these as research questions or hypotheses.
  • Research design: Discuss the approach that will be used to test whether the project is achieving these outcomes. Will you be able to use a randomized experiment, where you randomly assign participants to either your program (i.e., a “treatment” group) or to a non-treatment (i.e., “control”) group? Or will you compare outcomes for your participants with those of a comparison group of similar individuals who do not participate? (A brief sketch of what this can look like follows this list.)
  • Sampling plan: Describe how you will find and enroll people in your program and comparison group. List the criteria for who can/cannot participate and the size of each group. If participants are not randomly placed in study groups, describe how the people in your comparison group are likely to be similar to and different from those you will serve with your project.
  • Measures: Discuss the information or data you will gather and what form it will take.
    • List all outcomes you want to test for (dependent variables) and the indicators (measures) you will use to represent each. For example, improved health status can be defined as fewer hospitalizations, or as a lower rate of nursing home placement, or as higher functional status scores.
    • You may also want to collect information about those characteristics of participants that could shape how well your program works (intervening variables), listing the indicators you will use to measure these (e.g., gender, age, income, functional status, disease status, residential status). These data can help you if your findings suggest that your program wasn’t as successful as you would have liked, because they allow you to test if the project worked for some participants and not for others.
  • Data collection plan: Describe how you will collect your information. Who will collect each type of data, at what points in time, and how (e.g., via telephone or personal interviews, review of records, mailed survey, ratings by a nurse or social worker, etc.)?
  • Data analysis plan: Discuss how you will analyze your findings. Applicants are encouraged to think through different patterns of possible results and the conclusions they would draw from each. They are also encouraged to contact a local college or university to identify a statistical consultant who can conduct statistical testing to determine whether results are meaningful, and who can help explain, for purposes of the proposal, which statistics would be used and why each was chosen. If a consultant will be engaged, please name this individual and include his or her resume.
  • Other concerns: Discuss how you will address human subjects concerns such as confidentiality, informed consent, and risk.
  • Budget: Include an overall budget, and then separate out the portion of this budget that represents costs for evaluation, including data collection (e.g., personnel, copying, transportation, phone, postage, etc.); data management (checking for accuracy and entering data); and data analysis (software purchases, analysis time, consultant fees).
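
For applicants less familiar with these terms, the sketch below illustrates in Python what random assignment and a simple two-group outcome comparison might look like, as referenced in the research design and data analysis items above. It is a minimal, hypothetical example rather than an RRF requirement: the participant labels, simulated scores, and the choice of a two-sample t-test are assumptions made only for illustration.

    # Hypothetical sketch: random assignment to treatment/control groups and a
    # simple comparison of one outcome between the groups. All names and
    # numbers are invented for illustration; they are not RRF requirements.
    import random

    from scipy import stats  # provides the two-sample t-test

    random.seed(1)  # makes this example reproducible

    # 1. Random assignment: shuffle the enrolled participants and split the list.
    participants = [f"participant_{i}" for i in range(60)]
    random.shuffle(participants)
    treatment_group = participants[:30]
    control_group = participants[30:]

    # 2. After the program, collect one outcome score per person
    #    (e.g., a functional status score). Scores are simulated here.
    treatment_scores = [random.gauss(74, 8) for _ in treatment_group]
    control_scores = [random.gauss(70, 8) for _ in control_group]

    # 3. A two-sample t-test asks whether the difference between the group
    #    means is larger than chance alone would be likely to produce.
    t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
    mean_diff = sum(treatment_scores) / 30 - sum(control_scores) / 30
    print(f"Difference in mean scores: {mean_diff:.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

A statistical consultant can advise on whether a t-test or another procedure is appropriate for a given design and outcome measure.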

The following may serve as a guideline for applicants who have more training in evaluation research as they develop effective outcome evaluations for model or demonstration projects or training programs:

  • State your research questions or hypotheses.
  • Detail your research design. Will you use an experimental or quasi-experimental design? If quasi-experimental, what type? Will it include a non-equivalent comparison group? If so, who will be included in this group? What potential threats to the internal validity of the design do you anticipate?
  • Present your sampling plan. Discuss inclusion/exclusion criteria and sample sizes and, if possible, provide a statistical power calculation (a brief illustration follows this list). Address the potential for sample attrition and how it may affect the composition of study groups and, therefore, the validity of conclusions. Discuss generalizability.
  • Discuss measurement, including operational definitions for all dependent, antecedent, and intervening variables. Discuss level of measurement for each.  If existing scales are to be used, cite the references for each and discuss pros and cons. If new measures are to be used, describe the process by which they are/will be developed and tested.
  • Describe your data collection plan: who will collect each set of data, how they will do so (e.g., telephone, review of records, personal interviews, mailed survey, etc.), when they will do so, and how you will handle problems such as missing data or non-response to surveys.
  • Include a detailed data analysis plan.  Discuss the stages you will use to analyze your data. List all statistics you will run, clarify how your design meets the assumptions of each, and discuss their appropriateness given your levels of measurement for different variables and given sample sizes and estimated distribution of responses across categories on nominal or ordinal measures. Address any multivariate techniques designed to assess differential impact of the intervention or to allow you to control for antecedent or intervening variables. Discuss different potential findings that might emerge, what conclusions you would draw from each, and what additional data analyses you would run as a result. Please name any statistical consultants you will rely on and include their resumes with your proposal.
  • Discuss next steps. Clarify what you hope to learn, what you will not be able to ascertain with the proposed research, and what future steps you would take as a function of different patterns of outcomes from your work.
  • Discuss how you will address issues of confidentiality, informed consent, and other human subjects concerns.
  • Include separate budget items for evaluation costs such as data collection (e.g., personnel, copying, transportation, purchase of datasets if applicable, etc.); data management (checking for accuracy and entering data); and data analysis (software purchases, analysis time not included in other staff salary figures, and consultant fees).
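
As one illustration of the sample size and power considerations noted in the sampling plan item above, the following hypothetical sketch shows how a power calculation for a simple two-group design might be run in Python using the statsmodels package. The effect size, alpha, and power targets are assumptions chosen only for the example, not RRF thresholds.

    # Hypothetical sketch: statistical power calculation for a two-group
    # comparison using statsmodels. The effect size, alpha, and power targets
    # below are illustrative assumptions, not RRF requirements.
    from statsmodels.stats.power import TTestIndPower

    power_analysis = TTestIndPower()

    # Participants needed per group to detect a medium standardized effect
    # (Cohen's d = 0.5) with 80% power at a two-sided alpha of .05.
    n_per_group = power_analysis.solve_power(effect_size=0.5, power=0.80, alpha=0.05)
    print(f"Required sample size per group: {n_per_group:.0f}")

    # Conversely: the power this design would achieve with 40 people per group.
    achieved_power = power_analysis.solve_power(effect_size=0.5, nobs1=40, alpha=0.05)
    print(f"Power with 40 participants per group: {achieved_power:.2f}")

In practice, the assumed effect size should be grounded in pilot data or published studies of similar interventions, and the calculation should reflect the specific design and analysis you propose.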
