Program Evaluation: Alternative Approaches and Practical Guidelines, 4th edition

Published by Pearson (July 14, 2021) © 2011

  • Jody L. Fitzpatrick, University of Colorado at Denver
  • James R. Sanders, Western Michigan University
  • Blaine R. Worthen, Utah State University
  • Lori A. Wingate, Western Michigan University
eTextbook access details

  • Instant access once purchased
  • Pay monthly or upfront; minimum 4-month subscription
  • Anytime, anywhere learning with the Pearson+ app
  • 14-day refund guarantee

eTextbook features

  • Search, highlight, and take notes
  • Easily create flashcards
  • Access to partners and offers

Print edition details

  • A print text
Contents
Preface

PART ONE: INTRODUCTION TO EVALUATION

1. Evaluation's Basic Purposes, Uses, and Conceptual Distinctions
Informal versus Formal Evaluation
A Brief Definition of Evaluation and Other Key Terms
Differences between Evaluation and Research
The Purposes of Evaluation
Roles and Activities of Professional Evaluators
Uses and Objects of Evaluation
Some Basic Types of Evaluation
Formative and Summative Evaluation
Needs Assessment, Process, and Outcome Evaluations
Internal and External Evaluations
Evaluation's Importance and Its Limitations

2. Origins and Current Trends in Modern Program Evaluation
The History and Influence of Evaluation in Society
Early Forms of Formal Evaluation
Program Evaluation: 1800-1940
Program Evaluation: 1940-1964
The Emergence of Modern Program Evaluation: 1964-1972
Evaluation Becomes a Profession: 1973-1989
1990 to the Present: History and Current Trends
Spread of Evaluation to Other Countries
Non-Evaluators Take On Internal Evaluation Responsibilities
A Focus on Measuring Outcomes and Impact
Considering Organizational Learning and Evaluation's Larger Potential Impacts

3. Political, Interpersonal, and Ethical Issues in Evaluation
Evaluation and Its Political Context
How Is Evaluation Political?
Suggestions for Working within the Political Environment
Establishing and Maintaining Good Communications
Maintaining Ethical Standards: Considerations, Issues, and Responsibilities for Evaluators
What Kinds of Ethical Problems Do Evaluators Encounter?
Ethical Standards in Evaluation
Protection of Human Subjects and the Role of Institutional Review Boards
Reflecting on Sources of Bias and Conflicts of Interest
Ethics beyond a Code of Ethics

PART TWO: ALTERNATIVE APPROACHES TO PROGRAM EVALUATION

4. Alternative Views of Evaluation
Diverse Conceptions of Program Evaluation
Origins of Alternative Views of Evaluation
Philosophical and Ideological Differences
Methodological Backgrounds and Preferences
Classifications of Evaluation Theories or Approaches
Existing Categories and Critiques
A Classification Schema for Evaluation Approaches

5. First Approaches: Expertise- and Consumer-Oriented Approaches
The Expertise-Oriented Approach
Developers of the Expertise-Oriented Evaluation Approach and Their Contributions
Formal Professional Review Systems: Accreditation
Informal Review Systems
Ad Hoc Panel Reviews
Ad Hoc Individual Reviews
Influences of the Expertise-Oriented Approach: Uses, Strengths, and Limitations
The Consumer-Oriented Evaluation Approach
The Developer of the Consumer-Oriented Evaluation Approach
Applying the Consumer-Oriented Approach
Other Applications of the Consumer-Oriented Approach
Influences of the Consumer-Oriented Approach: Uses, Strengths, and Limitations

6. Program-Oriented Evaluation Approaches
The Objectives-Oriented Evaluation Approach
The Tylerian Evaluation Approach
Provus's Discrepancy Evaluation Model
A Schema for Generating and Analyzing Objectives: The Evaluation Cube
Logic Models and Theory-Based Evaluation Approaches
Logic Models
Theory-Based or Theory-Driven Evaluation
How Program-Oriented Evaluation Approaches Have Been Used
Strengths and Limitations of Program-Oriented Evaluation Approaches
Goal-Free Evaluation

7. Decision-Oriented Evaluation Approaches
Developers of Decision-Oriented Evaluation Approaches and Their Contributions
The Decision-Oriented Approaches
The CIPP Evaluation Model
The UCLA Evaluation Model
Utilization-Focused Evaluation
Evaluability Assessment and Performance Monitoring
How the Decision-Oriented Evaluation Approaches Have Been Used
Strengths and Limitations of Decision-Oriented Evaluation Approaches

8. Participant-Oriented Evaluation Approaches
Evolution of Participatory Approaches
Developers of Participant-Oriented Evaluation Approaches and Their Contributions
Robert Stake and His Responsive Approach
Egon Guba and Yvonna Lincoln: Naturalistic and Fourth Generation Evaluation
Participatory Evaluation Today: Two Streams and Many Approaches
Categories of Participatory Approaches
Differences in Current Participatory Approaches
Developmental Evaluation
Empowerment Evaluation
Democratically Oriented Approaches to Evaluation
Looking Back
How Participant-Oriented Evaluation Approaches Have Been Used
Research on Involvement of Stakeholders
Use of Approaches by Developers
Strengths and Limitations of Participant-Oriented Evaluation Approaches
Strengths of Participatory Approaches
Limitations of Participatory Approaches

9. Other Current Considerations: Cultural Competence and Capacity Building
The Role of Culture and Context in Evaluation Practice and Developing Cultural Competence
Growing Attention to the Need for Cultural Competence
Why Is Cultural Competence Important?
Evaluation's Roles in Organizations: Evaluation Capacity Building and Mainstreaming Evaluation
Mainstreaming Evaluation
Evaluation Capacity Building
Limitations to Mainstreaming Evaluation and Capacity Building

10. A Comparative Analysis of Approaches
A Summary and Comparative Analysis of Evaluation Approaches
Cautions about the Alternative Evaluation Approaches
Evaluation Approaches Are Distinct but May Be Mixed in Practice
"Discipleship" to a Particular Evaluation "Model" Is a Danger
Calls to Abandon Pluralism and Consolidate Evaluation Approaches into One Generic Model Are Still Unwise
The Choice of Evaluation Approach Is Not Empirically Based
Contributions of the Alternative Evaluation Approaches
Comparative Analysis of Characteristics of Alternative Evaluation Approaches
Eclectic Uses of the Alternative Evaluation Approaches
Drawing Practical Implications from the Alternative Evaluation Approaches

PART THREE: PRACTICAL GUIDELINES FOR PLANNING EVALUATIONS

11. Clarifying the Evaluation Request and Responsibilities
Understanding the Reasons for Initiating the Evaluation
Direct, Informational Uses of Evaluation
Noninformational Uses of Evaluation
Conditions under Which Evaluation Studies Are Inappropriate
Evaluation Would Produce Trivial Information
Evaluation Results Will Not Be Used
Evaluation Cannot Yield Useful, Valid Information
The Type of Evaluation Is Premature for the Stage of the Program
Propriety of Evaluation Is Doubtful
Determining When an Evaluation Is Appropriate: Evaluability Assessment
How Does One Determine Whether a Program Is Evaluable?
Checklist of Steps for Determining When to Conduct an Evaluation
Using an Internal or External Evaluator
Advantages of External Evaluations
Advantages of Internal Evaluations
Advantages of Combining Internal and External Evaluation
Checklist of Steps for Determining Whether to Use an External Evaluator
Hiring an Evaluator
Competencies Needed by Evaluators
Possible Approaches to Hiring an Evaluator
Checklist of Questions to Consider in Selecting an Evaluator
How Different Evaluation Approaches Clarify the Evaluation Request and Responsibilities

12. Setting Boundaries and Analyzing the Evaluation Context
Identifying Stakeholders and Intended Audiences for an Evaluation
Identifying Stakeholders to Be Involved in the Evaluation and Future Audiences
Importance of Identifying and Involving Various Stakeholders
Describing What Is to Be Evaluated: Setting the Boundaries
Factors to Consider in Characterizing the Object of the Evaluation
Using Program Theory and Logic Models to Describe the Program
Methods for Describing the Program and Developing Program Theory
Dealing with Different Perceptions
Re-Describing the Program as It Changes
A Sample Description of an Evaluation Object
Analyzing the Resources and Capabilities That Can Be Committed to the Evaluation
Analyzing Financial Resources Needed for the Evaluation
Analyzing Availability and Capability of Evaluation Personnel
Analyzing Technological and Other Resources and Constraints for Evaluations
Analyzing the Political Context for the Evaluation
Variations Caused by the Evaluation Approach Used
Determining Whether to Proceed with the Evaluation

13. Identifying and Selecting the Evaluation Questions and Criteria
Identifying Useful Sources for Evaluation Questions: The Divergent Phase
Identifying Questions, Concerns, and Information Needs of Stakeholders
Using Evaluation Approaches as Heuristics
Using Research and Evaluation Work in the Program Field
Using Professional Standards, Checklists, Guidelines, and Criteria Developed or Used Elsewhere
Asking Expert Consultants to Specify Questions or Criteria
Using the Evaluator's Professional Judgment
Summarizing Suggestions from Multiple Sources
Selecting the Questions, Criteria, and Issues to Be Addressed: The Convergent Phase
Who Should Be Involved in the Convergent Phase?
How Should the Convergent Phase Be Carried Out?
Specifying the Evaluation Criteria and Standards
Absolute Standards
Relative Standards
Remaining Flexible during the Evaluation: Allowing New Questions, Criteria, and Standards to Emerge

14. Planning How to Conduct the Evaluation
Developing the Evaluation Plan
Selecting Designs for the Evaluation
Identifying Appropriate Sources of Information
Identifying Appropriate Methods for Collecting Information
Determining Appropriate Conditions for Collecting Information: Sampling and Procedures
Determining Appropriate Methods and Techniques for Organizing, Analyzing, and Interpreting Information
Determining Appropriate Ways to Report Evaluation Findings
Work Sheets to Summarize an Evaluation Plan
Specifying How the Evaluation Will Be Conducted: The Management Plan
Estimating and Managing Time for Conducting Evaluation Tasks
Analyzing Personnel Needs and Assignments
Estimating Costs of Evaluation Activities and Developing Evaluation Budgets
Establishing Evaluation Agreements and Contracts
Planning and Conducting the Metaevaluation
The Development of Metaevaluation and Its Use Today
Some General Guidelines for Conducting Metaevaluations
A Need for More Metaevaluation

PART FOUR: PRACTICAL GUIDELINES FOR CONDUCTING AND USING EVALUATIONS

15. Collecting Evaluative Information: Design, Sampling, and Cost Choices
Using Mixed Methods
Evaluation Controversies over Methodology
A Definition and Discussion of Mixed Methods
Designs for Collecting Descriptive and Causal Information
Descriptive Designs
Case Studies
Cross-Sectional Designs
Time-Series Designs
Causal Designs
Experimental Designs
Quasi-Experimental Designs
Mixed Method Designs
Sampling
Sample Size
Selecting a Random Sample
Using Purposive Sampling
Cost Analysis
Cost-Benefit Analysis
Cost-Effectiveness Studies

16. Collecting Evaluative Information: Data Sources and Methods, Analysis and Interpretation
Common Sources and Methods for Collecting Information
Existing Documents and Records
Identifying Sources and Methods for Original Data Collection: A Process
Observations
Surveys
Interviews
Focus Groups
Tests and Other Methods for Assessing Knowledge and Skill
Planning and Organizing the Collection of Information
Technical Problems in Data Collection
Analysis of Data and Interpretation of Findings
Data Analysis
Interpreting Data

17. Reporting Evaluation Results: Maximizing Use and Understanding
Purposes of Evaluation Reporting and Reports
Different Ways of Reporting
Important Factors in Planning Evaluation Reporting
Accuracy, Balance, and Fairness
Tailoring Reports to Their Audience(s)
Timing of Evaluation Reports
Strategies to Communicate and Persuade
Appearance of the Report
Human and Humane Considerations in Reporting Evaluation Findings
Delivering Negative Messages
Key Components of a Written Report
Executive Summary
Introduction to the Report
Focus of the Evaluation
Brief Overview of the Evaluation Plan and Procedures
Presentation of Evaluation Results
Conclusions and Recommendations
Minority Reports or Rejoinders
Appendices
Suggestions for Effective Oral Reporting
A Checklist for Good Evaluation Reports
How Evaluation Information Is Used
Models of Use
Steps to Take to Influence Evaluation Use
Reporting and Influence

18. The Future of Evaluation
Predictions Concerning the Profession of Evaluation
A Vision for Evaluation
Conclusion

Appendix A

References

Author Index

Subject Index

This fixed-layout publication may lack compatibility with assistive technologies: images lack alternative text descriptions, and the text does not reflow. The publication contains no content hazards known to cause adverse physical reactions.
