
Outcome Evaluation Guidelines for Non-Researchers

These guidelines are intended to help applicants with limited research experience develop effective outcome evaluations for model or demonstration projects:

  • Outcomes: List the specific, measurable outcomes of your planned activities that your evaluation will test. You may wish to phrase these as research questions or hypotheses.
  • Research design: Discuss the approach that will be used to test whether the project is achieving these outcomes. Will you be able to use a randomized experiment, in which you randomly assign participants either to your program (the “treatment” group) or to a non-treatment (“control”) group? Or will you compare outcomes for your participants with those of a comparison group of similar individuals who do not participate? (A brief sketch of what random assignment can look like appears after this list.)
  • Sampling plan: Describe how you will find and enroll people in your program and comparison group. List the criteria for who can/cannot participate and the size of each group. If participants are not randomly placed in study groups, describe how the people in your comparison group are likely to be similar to and different from those you will serve with your project.
  • Measures: Discuss the information or data you will gather and what form it will take.
    • List all outcomes you want to test for (dependent variables) and the indicators (measures) you will use to represent each. For example, improved health status can be defined as fewer hospitalizations, a lower rate of nursing home placement, or higher functional status scores.
    • You may also want to collect information about participant characteristics that could shape how well your program works (intervening variables), listing the indicators you will use to measure these (e.g., gender, age, income, functional status, disease status, residential status). These data can help if your findings suggest the program was less successful than you had hoped, because they allow you to test whether the project worked for some participants but not for others.
  • Data collection plan: Describe how you will collect your information. Who will collect each type of data, at what points in time, and how (e.g., via telephone or personal interviews, review of records, mailed survey, ratings by a nurse or social worker, etc.)?
  • Data analysis plan: Discuss how you will analyze your findings. Applicants are encouraged to think through different patterns of possible results and the conclusions they would draw from each. They are also encouraged to contact a local college or university to identify a statistical consultant who can conduct statistical testing to determine whether results are meaningful, and who can explain in the proposal which statistical tests will be used and why each was chosen. If a consultant will be engaged, please name this individual and include his or her resume. (A simple illustration of the kind of test a consultant might run appears after this list.)
  • Other concerns: Discuss how you will address human subjects concerns such as confidentiality, informed consent, and risk.
  • Budget: Include an overall budget, and then separate out the portion of this budget that represents costs for evaluation, including data collection (e.g., personnel, copying, transportation, phone, postage, etc.); data management (checking for accuracy and entering data); and data analysis (software purchases, analysis time, consultant fees).
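
Illustration: random assignment. For applicants who want to see what random assignment can look like in practice, the short sketch below (written in Python, purely for illustration; the roster size and participant names are hypothetical) shows one way to split an enrollment list into treatment and control groups at random.

    import random

    # Hypothetical enrollment roster; in practice this would come from your intake records.
    participants = [f"Participant {i}" for i in range(1, 41)]  # 40 enrollees

    random.seed(2024)             # a fixed seed makes the assignment reproducible and easy to document
    random.shuffle(participants)  # put the roster in random order

    midpoint = len(participants) // 2
    treatment_group = participants[:midpoint]  # will receive the program
    control_group = participants[midpoint:]    # will not receive the program

    print(f"Treatment group: {len(treatment_group)} participants")
    print(f"Control group:   {len(control_group)} participants")

Random assignment of this kind helps ensure that, on average, the two groups are similar before the program begins, so later differences in outcomes can more credibly be attributed to the program.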
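
Illustration: comparing outcomes between groups. As a rough sketch of the kind of statistical test a consultant might perform, the example below compares hypothetical post-program functional status scores for a treatment and a control group using an independent-samples t-test (Python with the SciPy library; the scores are invented for the example and are not real data).

    from scipy import stats

    # Hypothetical post-program functional status scores (higher = better functioning).
    treatment_scores = [72, 68, 75, 80, 66, 77, 73, 70, 79, 74]
    control_scores   = [65, 70, 62, 68, 64, 66, 71, 63, 67, 69]

    # Independent-samples t-test: is the difference in group means larger than chance alone would produce?
    t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)

    print(f"t statistic: {t_stat:.2f}")
    print(f"p-value: {p_value:.3f}")
    if p_value < 0.05:
        print("The group difference is statistically significant at the 0.05 level.")
    else:
        print("The group difference is not statistically significant at the 0.05 level.")

A consultant can advise whether a simple test like this, a regression model that adjusts for intervening variables, or another approach best fits your design.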

Have Questions?
773-714-8080
info@rrf.org